Optimal convergence rates of deep neural networks in a classification setting

Bibliographic details
Author: Meyer, Joseph Theo
Document type: Journal article
Language: English
Published: 7 December 2023
In: Electronic Journal of Statistics
Year: 2023, Volume: 17, Issue: 2, Pages: 3613-3659
ISSN:1935-7524
DOI:10.1214/23-EJS2187
Online access: Publisher, open access, full text: https://doi.org/10.1214/23-EJS2187
Publisher, open access, full text: https://projecteuclid.org/journals/electronic-journal-of-statistics/volume-17/issue-2/Optimal-convergence-rates-of-deep-neural-networks-in-a-classification/10.1214/23-EJS2187.full
Author statement: Joseph T. Meyer
Description
Abstract: We establish optimal convergence rates up to a log factor for a class of deep neural networks in a classification setting under a restraint sometimes referred to as the Tsybakov noise condition. We construct classifiers based on empirical risk minimization in a general setting where the boundary of the Bayes rule can be approximated well by neural networks. Corresponding rates of convergence are proven with respect to the misclassification error using an additional condition that acts as a requirement for the “correct noise exponent”. It is then shown that these rates are optimal in the minimax sense. For other estimation procedures, similar convergence rates have been established. Our first main contribution is to prove that the rates are optimal under the additional condition. Secondly, our main theorem establishes almost optimal rates in a generalized setting. We use this to show optimal rates which circumvent the curse of dimensionality.
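For orientation, the Tsybakov noise condition and the empirical risk minimization step mentioned in the abstract are sketched below in their standard textbook form; the paper's exact constants, exponent notation, and network class may differ, so this is a generic formulation rather than the paper's precise assumptions.

% Standard Tsybakov noise (margin) condition with noise exponent q >= 0,
% where \eta(x) = P(Y = 1 | X = x) is the regression function:
\[
  \mathbb{P}_X\bigl(\lvert \eta(X) - \tfrac{1}{2}\rvert \le t\bigr) \le C\, t^{q}
  \qquad \text{for all } t > 0.
\]
% Classifiers are built by empirical risk minimization over a class \mathcal{F}
% of deep neural networks (generic sketch; the paper specifies \mathcal{F}):
\[
  \hat{f}_n \in \operatorname*{arg\,min}_{f \in \mathcal{F}}
  \frac{1}{n} \sum_{i=1}^{n} \mathbf{1}\{f(X_i) \ne Y_i\},
\]
% and performance is measured by the excess misclassification error
% \mathbb{E}\,\mathcal{R}(\hat{f}_n) - \mathcal{R}(f^{*}), where f^{*} is the Bayes rule.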
Description: Viewed on 15 April 2024
Description: Online resource