Approximation by perturbed neural network operators
Applicationes Mathematicae 42 (2015), 57-81
MSC: 41A17, 41A25, 41A30, 41A36.
DOI: 10.4064/am42-1-5
Abstract
This article determines the rate of convergence to the unit operator for each of three newly introduced perturbed normalized neural network operators with one hidden layer. The rates are expressed through the modulus of continuity of the function involved, or of its high-order derivative, which appears on the right-hand side of the associated Jackson-type inequalities. The activation function is very general; in particular, it can derive from any sigmoid or bell-shaped function, and the right-hand sides of our convergence inequalities do not depend on the activation function. The sample functionals are of Stancu, Kantorovich, or quadrature type. Applications are given for the first derivative of the function involved.
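To fix ideas, the following is a generic sketch of the two standard notions the abstract refers to; the specific operators and constants are defined in the paper itself, and the operator $A_n$, the constant $C$, and the rate exponent $\alpha$ below are placeholders, not the paper's actual quantities.

```latex
% First modulus of continuity of f on an interval I (standard definition):
\omega_1(f,\delta) \;=\; \sup_{\substack{x,y \in I \\ |x-y| \le \delta}} |f(x) - f(y)|, \qquad \delta > 0.

% A Jackson-type inequality has the generic shape
% (A_n a positive linear operator approximating f, here hypothetical):
\bigl| A_n(f,x) - f(x) \bigr| \;\le\; C \,\omega_1\!\left(f, \tfrac{1}{n^{\alpha}}\right),
\qquad n \in \mathbb{N},\ \alpha > 0,
```

In the results described above, the right-hand side of such inequalities involves only the modulus of continuity of $f$ (or of a high-order derivative of $f$) and is independent of the chosen activation function.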