Mesh denoising and inpainting using the total variation of the normal and a shape Newton approach


Bibliographic Details
Main Authors: Baumgärtner, Lukas (Author), Bergmann, Ronny (Author), Herzog, Roland (Author), Schmidt, Stephan (Author), Vidal-Núñez, José (Author), Weiss, Manuel (Author)
Format: Article (Journal)
Language: English
Published: February 28, 2025
In: SIAM Journal on Scientific Computing
Year: 2025, Volume: 47, Issue: 1, Pages: A300-A324
ISSN: 1095-7197
DOI: 10.1137/24M1646121
Online Access: Publisher, license required, full text: https://doi.org/10.1137/24M1646121
Publisher, license required, full text: https://epubs.siam.org/doi/10.1137/24M1646121
Author Notes:Lukas Baumgärtner, Ronny Bergmann, Roland Herzog, Stephan Schmidt, José Vidal-Núñez, and Manuel Weiß
Description
Summary: We analyze the performance of a variant of the Newton method with quadratic regularization for solving composite convex minimization problems. At each step of our method, we choose a regularization parameter proportional to a certain power of the gradient norm at the current point. We introduce a family of problem classes characterized by the Hölder continuity of either the second or third derivative. Then we present the method with a simple adaptive search procedure allowing an automatic adjustment to the problem class with the best global complexity bounds, without knowing specific parameters of the problem. In particular, for the class of functions with a Lipschitz continuous third derivative, we get the global rate, which was previously attributed to third-order tensor methods. When the objective function is uniformly convex, we justify an automatic acceleration of our scheme, resulting in a faster global rate and local superlinear convergence. The switching between the different rates (sublinear, linear, and superlinear) is automatic. Again, no a priori knowledge of parameters is needed for this.
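The core step described in the summary, a Newton step whose quadratic regularization weight is proportional to a power of the current gradient norm, can be sketched as follows. This is a minimal illustration on a toy smooth convex function, not the paper's method: the adaptive search procedure and the composite (nonsmooth) term are omitted, and the constant `c` and exponent `alpha` are hypothetical choices.

```python
import numpy as np

def regularized_newton(grad, hess, x0, c=1.0, alpha=0.5, tol=1e-10, max_iter=50):
    """Newton iteration with gradient-norm-dependent quadratic regularization.

    At each step, solve (H(x) + lam * I) d = -g(x) with lam = c * ||g(x)||**alpha,
    then update x <- x + d. Sketch only; c and alpha are illustrative values.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        lam = c * gnorm**alpha                      # regularization tied to gradient norm
        H = hess(x) + lam * np.eye(x.size)          # regularized Hessian
        x = x - np.linalg.solve(H, g)               # damped Newton step
    return x

# Toy strictly convex objective f(x) = sum(x**4)/4 + sum(x**2)/2, minimizer x = 0.
grad = lambda x: x**3 + x
hess = lambda x: np.diag(3.0 * x**2 + 1.0)
x_star = regularized_newton(grad, hess, np.array([2.0, -1.5]))
```

Because `lam` vanishes as the gradient does, the iteration behaves like a damped method far from the solution and approaches a pure Newton step near it, which is what enables the superlinear local rate mentioned in the summary.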
Item Description: Available online: January 31, 2025
Viewed on July 30, 2025
Physical Description:Online Resource