Neural preconditioning via Krylov subspace geometry

Alessandro Coclite
Member of the Collaboration Group

In press

Abstract

We propose a geometry-aware strategy for training neural preconditioners tailored to parametrized linear systems arising from the discretization of mixed-dimensional partial differential equations (PDEs). Such systems are typically ill-conditioned due to embedded lower-dimensional structures and are solved using Krylov subspace methods. Our approach learns an approximation of the inverse operator through a two-stage training framework: an initial static pretraining phase, based on residual minimization, followed by a dynamic fine-tuning phase that incorporates solver convergence dynamics into the training process via a novel loss functional. This dynamic loss is defined by the principal angles between the residuals and the Krylov subspaces, and it is evaluated using a differentiable implementation of the Flexible GMRES algorithm, which enables backpropagation through both the Arnoldi process and the Givens rotations. The resulting neural preconditioner is explicitly optimized to enhance early-stage convergence and reduce iteration counts across a family of 3D–1D mixed-dimensional problems exhibiting geometric variability in the 1D domain. Numerical experiments show that our solver-aligned approach significantly improves convergence rate, robustness, and generalization.
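To make the dynamic loss concrete, below is a minimal PyTorch sketch (not the authors' implementation) of a differentiable flexible Arnoldi recursion with a principal-angle loss. It accumulates the squared sine of the angle between the initial residual b and the growing search space A·span(z_1, ..., z_k); for (F)GMRES with zero initial guess this sine equals the relative residual norm after k steps, so shrinking the angles is equivalent to accelerating early-stage convergence. The preconditioner network, the QR factorization used here in place of the paper's Givens-rotation update, and all sizes and hyperparameters are illustrative assumptions.

import torch

def principal_angle_loss(A, b, precond, m=10):
    # Minimal sketch, not the authors' code: m steps of a flexible
    # Arnoldi recursion with a learned preconditioner, plus a loss built
    # from the principal angles between the initial residual b (x_0 = 0)
    # and the search spaces A*span(z_1..z_k). For (F)GMRES, sin(theta_k)
    # equals the relative residual norm after k steps.
    beta = torch.linalg.norm(b)
    V = [b / beta]                     # orthonormal Krylov basis
    Z = []                             # preconditioned directions z_k = M_k(v_k)
    loss = b.new_zeros(())
    for k in range(m):
        z = precond(V[k])              # flexible step: M_k may vary with k
        Z.append(z)
        w = A @ z
        for v in V:                    # modified Gram-Schmidt orthogonalization
            w = w - torch.dot(v, w) * v
        V.append(w / (torch.linalg.norm(w) + 1e-30))
        # Orthonormal basis of the search space A*span(z_1..z_k); a QR
        # factorization stands in for the Givens-rotation update.
        Q, _ = torch.linalg.qr(A @ torch.stack(Z, dim=1))
        cos2 = torch.sum((Q.T @ b) ** 2) / beta ** 2
        loss = loss + (1.0 - cos2)     # sin^2(theta_k)
    return loss

# Hypothetical usage: one fine-tuning step of a small dense preconditioner.
n = 64
A = torch.eye(n) + 0.05 * torch.randn(n, n)
b = torch.randn(n)
net = torch.nn.Sequential(torch.nn.Linear(n, n), torch.nn.Tanh(), torch.nn.Linear(n, n))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
opt.zero_grad()
loss = principal_angle_loss(A, b, net, m=8)
loss.backward()                        # backpropagate through the Arnoldi recursion
opt.step()

In the two-stage scheme described in the abstract, a loss of this kind would drive the dynamic fine-tuning phase, following a static pretraining phase based on residual minimization.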
Neural preconditioning via Krylov subspace geometry / Dimola, Nunzio; Coclite, Alessandro; Zunino, Paolo. - In: BOLLETTINO DELLA UNIONE MATEMATICA ITALIANA. - ISSN 1972-6724. - PRINT. - (In press). [10.1007/s40574-025-00522-2]
Files in this product:

File: 2025_Neural_preconditioning_via_Krylov_subspace_geometry_firstonline.pdf
Access: open access
Description: First online
Type: Publisher's version
License: Creative Commons
Size: 982.28 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11589/294800
Citations
  • Scopus: 0
  • Web of Science: 0