Error-Correcting Neural Networks for Two-Dimensional Curvature Computation in the Level-Set Method
We present an error-correcting strategy, based on neural error modeling, for approximating two-dimensional curvature in the level-set method. Our main contribution is a redesigned hybrid solver (Larios-Cárdenas and Gibou (2021) [1]) that relies on numerical schemes to enable machine-learning operations on demand. In particular, our routine features double prediction to harness curvature symmetry invariance in favor of precision and stability. As in [1], the core of this solver is a multilayer perceptron trained on circular- and sinusoidal-interface samples. Its role is to quantify the error in numerical curvature approximations and emit corrected estimates for selected grid vertices along the free boundary. These corrections arise in response to preprocessed context level-set, curvature, and gradient data. To promote neural capacity, we have adopted sample negative-curvature normalization, reorientation, and reflection-based augmentation. Our system also incorporates dimensionality reduction, well-balancedness, and regularization to minimize outlying effects. The training approach is likewise scalable across mesh sizes; for this purpose, we have introduced dimensionless parametrization and probabilistic subsampling during data production. Together, these elements have improved the accuracy and efficiency of curvature calculations around under-resolved regions. In most experiments, our strategy has outperformed the numerical baseline at twice the number of redistancing steps while requiring only a fraction of the cost.
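To make the hybrid idea concrete, the following is a minimal sketch, not the authors' implementation, of error-corrected curvature with double prediction: a standard finite-difference estimate is computed from a 3x3 level-set stencil, an error-quantifying MLP (here a placeholder, `error_mlp`) proposes a correction from preprocessed, negative-curvature-normalized and dimensionless features, and the correction is averaged over the original and a reflected stencil to exploit symmetry invariance. The helper names, feature layout, and normalization details are assumptions for illustration only.

```python
import numpy as np

def numerical_curvature(phi_stencil: np.ndarray, h: float) -> float:
    """Second-order curvature from a 3x3 level-set stencil via central differences."""
    phi_x  = (phi_stencil[1, 2] - phi_stencil[1, 0]) / (2 * h)
    phi_y  = (phi_stencil[2, 1] - phi_stencil[0, 1]) / (2 * h)
    phi_xx = (phi_stencil[1, 2] - 2 * phi_stencil[1, 1] + phi_stencil[1, 0]) / h**2
    phi_yy = (phi_stencil[2, 1] - 2 * phi_stencil[1, 1] + phi_stencil[0, 1]) / h**2
    phi_xy = (phi_stencil[2, 2] - phi_stencil[2, 0]
              - phi_stencil[0, 2] + phi_stencil[0, 0]) / (4 * h**2)
    denom = (phi_x**2 + phi_y**2) ** 1.5 + 1e-12
    return (phi_xx * phi_y**2 - 2 * phi_x * phi_y * phi_xy + phi_yy * phi_x**2) / denom

def error_mlp(features: np.ndarray) -> float:
    """Placeholder for a trained error-quantifying MLP; returns a curvature correction."""
    return 0.0  # a real model would map preprocessed features to the numerical error

def corrected_curvature(phi_stencil: np.ndarray, h: float) -> float:
    """Hybrid estimate: numerical baseline plus an averaged (double) neural correction."""
    kappa_num = numerical_curvature(phi_stencil, h)

    # Negative-curvature normalization: present a single concavity pattern to the net.
    sign = -1.0 if kappa_num > 0 else 1.0
    base = sign * phi_stencil

    # Dimensionless features: stencil values scaled by h plus the estimate h * kappa.
    feats   = np.append(base.ravel() / h, sign * kappa_num * h)
    # Second prediction on the reflected stencil, exploiting symmetry invariance.
    feats_r = np.append(np.fliplr(base).ravel() / h, sign * kappa_num * h)

    correction = 0.5 * (error_mlp(feats) + error_mlp(feats_r))
    return sign * (sign * kappa_num + correction)
```

As a usage note, such a routine would only be invoked for interface vertices flagged as under-resolved; elsewhere the plain numerical estimate suffices, which is what keeps the hybrid approach cheap.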