A component-free derivation of backpropagation in deep learning
Training a neural network consists of estimating the network’s parameters by minimizing a cost function. A standard solution method is stochastic gradient descent. The challenge is to compute the gradient of the underlying cost function, which is carried out by the backpropagation algorithm. Most derivations of the latter are based on the component-wise application of the chain rule of multivariable calculus. In this work we provide an alternative in the framework of calculus in normed spaces. We review the Fréchet derivative and generalizations of the chain rule and the Leibniz formula. We then introduce an abstract inner product space of weights and biases and compute the derivative of the output map of an artificial neural network. Backpropagation for the gradient of an arbitrary cost function then follows from a simple application of the chain rule. The computational implementation is straightforward. The novel expression for the derivative of the output map simplifies retraining, as well as other post-processing of the neural network.
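To make the role of the chain rule concrete, the following is a minimal sketch of gradient computation by backpropagation for a single-hidden-unit network with squared-error cost. It is an illustration of the classical component-wise approach that the paper contrasts with, not the paper's component-free formulation; all function and variable names are illustrative assumptions.

```python
import math

def forward(x, w1, b1, w2, b2):
    """Forward pass: z = w1*x + b1, a = tanh(z), y = w2*a + b2."""
    z = w1 * x + b1
    a = math.tanh(z)
    y = w2 * a + b2
    return z, a, y

def backprop(x, target, w1, b1, w2, b2):
    """Gradient of the cost C = (y - target)^2 / 2 with respect to
    each parameter, obtained by repeated application of the chain rule."""
    z, a, y = forward(x, w1, b1, w2, b2)
    dC_dy = y - target              # dC/dy
    dC_dw2 = dC_dy * a              # dC/dw2 = dC/dy * dy/dw2
    dC_db2 = dC_dy                  # dy/db2 = 1
    dC_da = dC_dy * w2              # propagate back through the output layer
    dC_dz = dC_da * (1.0 - a * a)   # tanh'(z) = 1 - tanh(z)^2
    dC_dw1 = dC_dz * x              # dz/dw1 = x
    dC_db1 = dC_dz                  # dz/db1 = 1
    return dC_dw1, dC_db1, dC_dw2, dC_db2
```

The backward pass reuses the intermediate quantities of the forward pass, which is exactly the efficiency that makes backpropagation attractive for stochastic gradient descent.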