Consider a perturbation vector, whose size is defined via the standard Euclidean inner product (14.121)
To this end, let us change notation and interpret a vector as a function (16.37). Then the notion of a multivariate function (16.240) generalizes to that of a functional, which maps a function into a complex number
Consider a perturbation vector, whose size is defined via the inner product (16.70)
The Gateaux derivative is the functional analysis generalization of the directional derivative.
In particular, we denote the Gateaux derivative in the direction of the Dirac delta (16.84) as follows
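For concreteness, the standard definitions can be sketched as follows (with generic symbols $S$, $f$, $\delta$, $\epsilon$, which need not match the book's notation): the Gateaux derivative of a functional $S$ at the function $f$ in the direction of a perturbation $\delta$ is the limit

```latex
D_{\delta}S[f] \;\equiv\; \lim_{\epsilon \to 0}
   \frac{S[f + \epsilon\,\delta] - S[f]}{\epsilon}.
```

The Gateaux derivative in the direction of the Dirac delta $\delta^{(t)}$ centered at a point $t$ is then $D_{\delta^{(t)}}S[f]$, which yields one number for each point $t$.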
The Frechet derivative is the functional analysis generalization of the gradient.
for all vectors. Hence, we can interpret the gradient (16.250) as a map from the vector space into itself
for all vectors, where the pairing is the standard Euclidean dot product (14.121)
Note that this is in the form (14.158), where the directional derivative is linear in the direction for a fixed base point.
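As a numerical illustration of this linearity (a sketch with an arbitrary test function, not taken from the text): the directional derivative of a smooth function, approximated by finite differences, is linear in the direction when the base point is held fixed.

```python
import numpy as np

# Hypothetical smooth function f : R^2 -> R, chosen only for illustration.
def f(x):
    return x[0] ** 2 + 3.0 * x[0] * x[1]

def directional_derivative(f, x, v, eps=1e-6):
    """Central finite-difference approximation of the directional derivative of f at x along v."""
    return (f(x + eps * v) - f(x - eps * v)) / (2.0 * eps)

x = np.array([1.0, 2.0])        # fixed base point
u = np.array([1.0, 0.0])
v = np.array([0.0, 1.0])

# Linearity in the direction: the derivative along a*u + b*v equals
# a times the derivative along u plus b times the derivative along v.
a, b = 2.0, -1.5
lhs = directional_derivative(f, x, a * u + b * v)
rhs = a * directional_derivative(f, x, u) + b * directional_derivative(f, x, v)
print(abs(lhs - rhs) < 1e-6)
```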
The Frechet derivative is the unique linear functional that recovers the Gateaux derivative (16.248) of the function in a generic direction
where the pairing is the inner product (16.70)
This is analogous to the fact that the gradient is the unique vector that recovers the directional derivative (16.253) in the finite dimensional case. Notice that the Riesz representation theorem (16.88) then tells us that the Frechet derivative is the Riesz representation of the Gateaux derivative, viewed as a linear functional of the direction.
Unlike in the finite dimensional case, there may be cases where the Gateaux derivatives (16.249) are well defined and yet the Frechet derivative (16.260) is not [W], though such cases are beyond the scope of the present discussion.
When both derivatives are defined, there is a simple connection between the two: by choosing the perturbation in the inner product representation of the Frechet derivative (16.257) to be the Dirac delta, and by using the sifting property (16.83) of the Dirac delta, we obtain that the Frechet derivative (16.257) at any point is the Gateaux derivative in the direction of the Dirac delta (16.249) centered at that point
for all points, similar to how each entry of the gradient is the corresponding partial derivative (16.251) in the finite dimensional case.
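A discrete analogue of this identity can be sketched numerically (the functional and symbols below are illustrative choices, not the book's): once a function is sampled on a grid, a functional becomes an ordinary multivariate function of the samples, the Dirac delta becomes a canonical basis vector scaled by the inverse grid spacing, and the Gateaux derivative in that direction recovers the Frechet derivative at the corresponding grid point.

```python
import numpy as np

# Toy functional S[f] = integral of f(t)^2 dt, discretized on a grid.
h = 0.01                        # grid spacing
t = np.arange(0.0, 1.0, h)
f = np.sin(2 * np.pi * t)       # sampled function

def S(f):
    return np.sum(f ** 2) * h

def gateaux(S, f, delta, eps=1e-6):
    """Central finite-difference Gateaux derivative of S at f in direction delta."""
    return (S(f + eps * delta) - S(f - eps * delta)) / (2.0 * eps)

# Discrete Dirac delta centered at grid point k: a basis vector scaled by 1/h,
# so that its discretized integral sum_i delta_i * h equals one.
k = 25
dirac_k = np.zeros_like(f)
dirac_k[k] = 1.0 / h

# The Frechet derivative of S is the function 2 f(t); its value at the grid
# point t_k matches the Gateaux derivative in the Dirac direction.
frechet_at_k = 2.0 * f[k]
print(np.isclose(gateaux(S, f, dirac_k), frechet_at_k))
```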
Similar to (16.255), for a given functional, the Frechet derivative provides us with the first order Taylor expansion of the functional
where the norm is the one induced by the inner product (16.258).
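The first order expansion can be checked numerically (again a sketch on a discretized toy functional of my choosing, not the book's example): the remainder of the first order Taylor expansion shrinks quadratically as the perturbation shrinks.

```python
import numpy as np

# Toy functional S[f] = integral of sin(f(t)) dt, discretized on a grid; its
# Frechet derivative is the function cos(f(t)) (a standard calculus-of-variations result).
h = 0.01
t = np.arange(0.0, 1.0, h)
f = t ** 2                      # base function
delta = np.cos(3 * np.pi * t)   # perturbation direction

def S(f):
    return np.sum(np.sin(f)) * h

def inner(g1, g2):
    return np.sum(g1 * g2) * h  # discretized inner product

frechet = np.cos(f)             # Frechet derivative of S at f, as a function

# The remainder S[f + s*delta] - S[f] - <frechet, s*delta> is O(s^2):
remainders = []
for s in (1e-1, 1e-2, 1e-3):
    r = S(f + s * delta) - S(f) - inner(frechet, s * delta)
    remainders.append(abs(r))
# Each tenfold reduction of the perturbation shrinks the remainder ~100-fold.
print(remainders[0] / remainders[1], remainders[1] / remainders[2])
```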
The second order Frechet derivative is the functional analysis generalization of the Hessian.
In multivariate calculus, it is convenient to arrange the second-order derivatives in matrix form in the Hessian (15.50). We can interpret the Hessian as a map from the vector space into the matrices, that is, into the linear maps from the vector space into itself
The left hand side in (16.263) collects the terms of the first order Taylor expansion (16.260). The calculation of the inner product on the right hand side follows from the linear action (16.262) and the dot product (14.121)
In the context of functional analysis, the second order Frechet derivative [W] is a map from the function space (16.75) into the kernels, that is, into the linear operators from the function space into itself
The left hand side in (16.267) collects the terms of the first order Taylor expansion (16.260). The calculation of the inner product on the right hand side follows from the linear action (16.266) and the inner product (16.70)
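The kernel interpretation can also be sketched numerically (an illustrative toy functional of my choosing, not the book's): for the discretized functional of the integral of sin(f(t)), the second order Frechet derivative is the diagonal kernel K(t, s) = -sin(f(t)) δ(t - s), whose linear action on a perturbation is pointwise multiplication; including this quadratic term makes the Taylor remainder shrink cubically.

```python
import numpy as np

# Toy functional S[f] = integral of sin(f(t)) dt, discretized on a grid.
h = 0.01
t = np.arange(0.0, 1.0, h)
f = t ** 2                  # base function
delta = np.exp(-t)          # perturbation direction

def S(f):
    return np.sum(np.sin(f)) * h

def inner(g1, g2):
    return np.sum(g1 * g2) * h      # discretized inner product

# First order Frechet derivative: the function cos(f(t)).
first = np.cos(f)

# Second order Frechet derivative: the kernel K(t, s) = -sin(f(t)) delta(t - s);
# its linear action on a perturbation is pointwise multiplication by -sin(f(t)).
def second_applied(g):
    return -np.sin(f) * g

# With the quadratic term included, the Taylor remainder shrinks as O(s^3).
rems = []
for s in (1e-1, 1e-2):
    taylor2 = (S(f) + inner(first, s * delta)
               + 0.5 * inner(second_applied(s * delta), s * delta))
    rems.append(abs(S(f + s * delta) - taylor2))
print(rems[0], rems[1])   # a tenfold smaller perturbation: ~1000x smaller remainder
```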