Abstract
Mean field theory has been successfully used to analyze deep neural networks (DNNs) in the infinite-size limit. Given the finite size of realistic DNNs, we utilize large deviation theory and path integral analysis to study the deviation of functions represented by DNNs from their typical mean field solutions. The parameter perturbations investigated include weight sparsification (dilution) and binarization, which are commonly used in model simplification, for both ReLU and sign activation functions. We find that random networks with ReLU activation are more robust to parameter perturbations than their counterparts with sign activation, which arguably reflects the simplicity of the functions they generate.
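Although the paper's analysis is analytic (large deviation theory and path integrals), the perturbation experiment it describes can be illustrated numerically. Below is a minimal sketch, assuming Gaussian random weights with 1/√N scaling; the function names, network sizes, and dilution fraction are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

def random_deep_net(x, weights, activation):
    """Propagate an input through a fully connected random network."""
    h = x
    for W in weights:
        # 1/sqrt(N) scaling, as is standard in mean field analyses
        h = activation(W @ h / np.sqrt(W.shape[1]))
    return h

def dilution_sensitivity(activation, depth=5, width=200,
                         dilution_p=0.1, trials=50, seed=0):
    """Estimate the relative output deviation under random weight dilution.

    Hypothetical experiment: zero out a fraction `dilution_p` of the
    weights and measure the normalized change in the network output.
    """
    rng = np.random.default_rng(seed)
    devs = []
    for _ in range(trials):
        weights = [rng.standard_normal((width, width)) for _ in range(depth)]
        x = rng.standard_normal(width)
        y = random_deep_net(x, weights, activation)
        # Dilute: sparsify each weight matrix with an independent random mask
        diluted = [W * (rng.random(W.shape) > dilution_p) for W in weights]
        y_pert = random_deep_net(x, diluted, activation)
        devs.append(np.linalg.norm(y - y_pert) / np.linalg.norm(y))
    return np.mean(devs)

relu = lambda h: np.maximum(h, 0.0)
print("ReLU deviation:", dilution_sensitivity(relu))
print("sign deviation:", dilution_sensitivity(np.sign))
```

Consistent with the abstract's finding, one would expect the ReLU network's relative output deviation to come out smaller than the sign network's at the same dilution fraction, though this toy estimate is no substitute for the paper's large deviation analysis.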
| Field | Value |
|---|---|
| Original language | English |
| Article number | 104002 |
| Journal | Journal of Physics A: Mathematical and Theoretical |
| Volume | 53 |
| Issue number | 10 |
| DOIs | |
| State | Published - 20 Feb 2020 |
| Externally published | Yes |
Keywords
- deep neural networks
- function sensitivity
- large deviation theory
- path integral