Approximation Error Estimates by Noise-injected Neural Networks
  • Keito AKIYAMA
Tohoku University, Aobayama Campus

Corresponding Author: [email protected]

One-hidden-layer feedforward neural networks are described as functions with many real-valued parameters. As the number of parameters grows, neural networks can approximate a wider variety of functions (the universal approximation property). The essentially optimal order of the approximation bound was already derived in 1996. Motivated by numerical experiments indicating that neural networks whose parameters carry stochastic perturbations perform better than ordinary neural networks, we explored the approximation properties of neural networks with stochastic perturbations. In this paper, we derive the quantitative order of the variance of the stochastic perturbations needed to achieve the essentially optimal approximation order.
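The setting above can be sketched in code: a one-hidden-layer network evaluated with and without Gaussian noise injected into its inner parameters. This is an illustrative toy, not the paper's construction; the activation (tanh), the target function, the noise level `sigma`, and the least-squares fit of the outer weights are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def one_hidden_layer(x, W, b, v, sigma=0.0, rng=rng):
    """Evaluate f(x) = sum_j v_j * tanh(W_j x + b_j).

    If sigma > 0, i.i.d. Gaussian perturbations of standard deviation
    sigma are added to the inner parameters (W, b) before evaluation.
    """
    if sigma > 0.0:
        W = W + rng.normal(0.0, sigma, W.shape)
        b = b + rng.normal(0.0, sigma, b.shape)
    hidden = np.tanh(x @ W.T + b)  # hidden-layer features
    return hidden @ v

# Toy target on [-1, 1] (an assumption for illustration)
n_hidden = 50
W = rng.normal(size=(n_hidden, 1))
b = rng.normal(size=n_hidden)
x = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
g = np.sin(np.pi * x).ravel()

# Fit the outer weights v by least squares on the noise-free features
H = np.tanh(x @ W.T + b)
v, *_ = np.linalg.lstsq(H, g, rcond=None)

# Compare sup-norm errors without and with parameter noise
err_clean = np.max(np.abs(one_hidden_layer(x, W, b, v) - g))
err_noisy = np.max(np.abs(one_hidden_layer(x, W, b, v, sigma=1e-3) - g))
print(err_clean, err_noisy)
```

With a small `sigma`, the perturbed network's error stays close to the unperturbed one; the paper's question is how fast the variance must shrink so that the essentially optimal approximation order is preserved.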
08 Aug 2023  Submitted to Mathematical Methods in the Applied Sciences
08 Aug 2023  Assigned to Editor
08 Aug 2023  Submission Checks Completed
16 Aug 2023  Review(s) Completed, Editorial Evaluation Pending
16 Aug 2023  Reviewer(s) Assigned
05 Nov 2023  Editorial Decision: Revise Major