### 17.8 Functional calculus

Key points

• The directional derivative (17.221) becomes the Gateaux derivative (17.223) in a function space.
• The gradient (17.225) becomes the Frechet derivative (17.231) in a function space, and the Hessian (17.237) becomes the second-order Frechet derivative (17.241).

In multivariate calculus, discussed in Section 16.1, we start from a function (16.22) that takes as input a vector $x$ with $\bar{\jmath}$ entries

 $f \colon x \in \mathbb{R}^{\bar{\jmath}} \mapsto f(x) \in \mathbb{R}$. (17.215)

Consider a perturbation vector $\delta x$, whose size $\eta$ is defined by the standard Euclidean inner product (15.139)

 $\|\delta x\|_2^2 \equiv \sum_{j=1}^{\bar{\jmath}} (\delta x_j)^2 \leq \eta^2$, (17.216)

with $\eta$ small. The purpose of calculus is to study the effect of the perturbation (17.216) on the function (17.215) when applied to a generic vector $x$

 $f(x + \delta x) \approx f(x) + \cdots$. (17.217)

Here we generalize calculus, which operates on $\mathbb{R}^{\bar{\jmath}}$, to functional analysis, which operates on a space of functions, such as the square-integrable functions (17.72), as in Table 17.1.

To this purpose, let us use the notation $g$ for vectors in $L^2_\mu(T)$, rather than $x$, and let us interpret a vector as a function (17.38). Then, the notion of multivariate function (17.215) generalizes to that of a functional, which maps a function to a complex number

 $F \colon g \in L^2_\mu(T) \mapsto F[g] \in \mathbb{C}$. (17.218)

Consider a perturbation function $\delta g$, whose size $\eta$ is defined by the inner product (17.67)

 $\|\delta g\|^2 \equiv \int_T \delta g^*(t)\, \delta g(t)\, d\mu(t) \leq \eta^2$, (17.219)

with $\eta$ small. The purpose of functional calculus is to study the effect of the perturbation (17.219) on the functional (17.218) when applied to a generic function $g$

 $F[g + \delta g] \approx F[g] + \cdots$. (17.220)

#### 17.8.1 Gateaux derivative

The Gateaux derivative is the functional analysis generalization of the directional derivative.

In multivariate calculus we compute the directional derivative (16.42) of a function (17.215) in the direction of a vector $u$ as follows

 $D_u f(x) \equiv \lim_{\epsilon \to 0} \frac{1}{\epsilon} \left( f(x + \epsilon u) - f(x) \right)$. (17.221)

Then the partial derivative (16.23) is the special case of a directional derivative (16.43) in the direction of the $j$-th canonical basis vector $\delta^{(j)}$ (15.29)

 $\partial_j f(x) \equiv \lim_{\epsilon \to 0} \frac{1}{\epsilon} \left( f(x + \epsilon \delta^{(j)}) - f(x) \right)$. (17.222)

The Gateaux derivative [W] of a functional (17.218) in the direction of a function $h$ (17.72) is defined similarly, as follows

 $D_h F[g] \equiv \lim_{\epsilon \to 0} \frac{1}{\epsilon} \left( F[g + \epsilon h] - F[g] \right)$. (17.223)

In particular, we denote the Gateaux derivative in the direction of the Dirac delta $\delta^{(t)}$ (17.81) as follows

 $\partial_t F[g] \equiv \lim_{\epsilon \to 0} \frac{1}{\epsilon} \left( F[g + \epsilon \delta^{(t)}] - F[g] \right)$. (17.224)
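As a minimal numerical sketch (the functional, the grid, and the direction below are illustrative assumptions, not taken from the text), consider discretizing $T = [0, 1]$ on a uniform grid and the functional $F[g] = \int_T g(t)^2\, d\mu(t)$, whose Gateaux derivative in the direction $h$ is $2 \int_T g(t) h(t)\, d\mu(t)$. The limit in (17.223) can then be approximated by a finite difference in $\epsilon$:

```python
import numpy as np

# Discretize T = [0, 1] on a uniform grid; integrals against dmu become dt-weighted sums.
n = 1000
t = np.linspace(0.0, 1.0, n)
dt = t[1] - t[0]

# Illustrative functional (an assumption): F[g] = integral of g(t)^2 over T,
# whose Gateaux derivative in the direction h is 2 * integral of g(t) h(t).
def F(g):
    return np.sum(g**2) * dt

g = t**2          # base function
h = t             # perturbation direction

# Gateaux derivative (17.223), approximated with a small but finite epsilon.
eps = 1e-6
D_h_F = (F(g + eps * h) - F(g)) / eps

# Analytic value on the same grid: 2 * integral of g(t) h(t) dmu(t).
analytic = 2 * np.sum(g * h) * dt

print(D_h_F, analytic)   # both close to 2 * integral of t^3 dt = 1/2
```

The finite-difference value and the analytic inner product agree up to terms of order $\epsilon$, mirroring the limit in (17.223).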

#### 17.8.2 Frechet derivative

The Frechet derivative is the functional analysis generalization of the gradient.

In multivariate calculus, for a differentiable function $f$ we can stack the partial derivatives (17.222) into the gradient (16.29)

 $\nabla f(x) \equiv (\partial_1 f(x), \ldots, \partial_{\bar{\jmath}} f(x))'$, (17.225)

or equivalently

 $[\nabla f(x)]_j \equiv \partial_j f(x)$, (17.226)

for all $j = 1, \ldots, \bar{\jmath}$. Hence, we can interpret the gradient (17.225) as a map from the vector space into itself

 $\nabla f \colon x \in \mathbb{R}^{\bar{\jmath}} \mapsto \nabla f(x) \in \mathbb{R}^{\bar{\jmath}}$. (17.227)

The gradient (17.225) is in fact the unique vector-valued function (17.227) that recovers the directional derivative (17.221) as in (16.44)

 $D_u f(x) = \langle u, \nabla f(x) \rangle_2$, (17.228)

for all vectors $u$, where $\langle \cdot, \cdot \rangle_2$ is the standard Euclidean dot product (15.139)

 $\langle u, \nabla f(x) \rangle_2 = \sum_{j=1}^{\bar{\jmath}} u_j\, \partial_j f(x)$. (17.229)

Note that this is in the form (15.146), where the directional derivative is linear in $u$ for fixed $x$.

With the gradient (17.225) we can perform a first order Taylor expansion (16.139) of a function (17.215), which we rewrite in compact form as

 $f(x + \delta x) - f(x) = \langle \delta x, \nabla f(x) \rangle_2 + o(\|\delta x\|_2)$, (17.230)

where $\|\cdot\|_2$ is the norm induced by the dot product (17.229), and $o(\|\delta x\|_2)$ denotes terms that vanish faster than $\|\delta x\|_2$ [W].
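A quick finite-dimensional check (the function $f$ below is an illustrative assumption): the finite-difference directional derivative (17.221) should match the inner product with the gradient (17.228).

```python
import numpy as np

# Illustrative function (an assumption): f(x) = sum_j x_j^4, with gradient 4 x^3.
def f(x):
    return np.sum(x**4)

def grad_f(x):
    return 4 * x**3

rng = np.random.default_rng(0)
x = rng.normal(size=5)     # generic point
u = rng.normal(size=5)     # generic direction

# Directional derivative (17.221) by a finite difference in epsilon ...
eps = 1e-6
D_u = (f(x + eps * u) - f(x)) / eps

# ... recovered by the dot product with the gradient, as in (17.228)-(17.229).
inner = np.dot(u, grad_f(x))

print(D_u, inner)   # the two agree up to the finite-difference error
```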

In the context of functional analysis, the Frechet derivative [W] is a map, similar to the gradient (17.227), from the function space (17.72) into itself

 $\nabla F \colon g \in L^2_\mu(T) \mapsto \nabla F[g] \in L^2_\mu(T)$. (17.231)

The Frechet derivative $\nabla F[g]$ is the unique function that recovers the Gateaux derivative (17.223) of the functional in a generic direction $h$

 $D_h F[g] = \langle h, \nabla F[g] \rangle$, (17.232)

where $\langle \cdot, \cdot \rangle$ is the inner product (17.67)

 $\langle h, \nabla F[g] \rangle = \int_T h^*(t)\, \nabla F[g](t)\, d\mu(t)$. (17.233)

This is analogous to the fact that the gradient is the unique vector that recovers the directional derivative (17.228) in the finite dimensional case. Notice that the Riesz representation theorem (17.85) then tells us that $\nabla F[g]$ is the Riesz representation of the Gateaux derivative $D_h F[g]$, viewed as a linear functional of $h$.

Unlike in the finite dimensional case, there may be occurrences where the Gateaux derivatives (17.224) are well-defined and yet the Frechet derivative (17.235) is not [W], though such occurrences are beyond the scope of the present discussion.

When both derivatives are defined, there is a simple connection between the two: by considering $h = \delta^{(t)}$ in the inner product representation of the Frechet derivative (17.232), and using the sifting property (17.80) of the Dirac delta, we obtain that the Frechet derivative (17.232) at any point is the Gateaux derivative in the direction of the Dirac delta (17.224) at that point

 $\nabla F[g](t) = \partial_t F[g]$, (17.234)

for all points $t \in T$, similar to how the $j$-th entry of the gradient is the partial derivative (17.226) in the finite dimensional case.
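The identification (17.234) can be checked numerically (the functional and the grid below are illustrative assumptions): on a grid, the Dirac delta at node $t_i$ becomes a spike of mass $1/\Delta t$, so the Gateaux derivative in that direction recovers the Frechet derivative pointwise.

```python
import numpy as np

n = 500
t = np.linspace(0.0, 1.0, n)
dt = t[1] - t[0]

# Illustrative functional (an assumption): F[g] = integral of g(t)^2 over T,
# whose Frechet derivative is the function nabla F[g](t) = 2 g(t).
def F(g):
    return np.sum(g**2) * dt

g = np.exp(-t)

# Discretized Dirac delta at grid node i: mass 1/dt at that node, so that
# the dt-weighted sum against any h returns h(t_i) (sifting property (17.80)).
i = 250
delta_i = np.zeros(n)
delta_i[i] = 1.0 / dt

# Gateaux derivative in the direction of the Dirac delta (17.224) ...
eps = 1e-8
partial_t = (F(g + eps * delta_i) - F(g)) / eps

# ... equals the Frechet derivative at that point, as in (17.234).
print(partial_t, 2 * g[i])
```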

Similar to (17.230), for a given functional $F$, the Frechet derivative provides us with the first order Taylor expansion of the functional

 $F[g + \delta g] - F[g] = \langle \delta g, \nabla F[g] \rangle + o(\|\delta g\|)$, (17.235)

where $\|\cdot\|$ is the norm induced by the inner product (17.233).

#### 17.8.3 Second order derivative

The second order Frechet derivative is the functional analysis generalization of the Hessian.

In multivariate calculus, it is convenient to arrange the second-order derivatives in matrix form in the Hessian (16.50). We can interpret the Hessian as a map from the vector space into a matrix, that is, a linear operator from the vector space into itself

 $\nabla^2 f \colon x \in \mathbb{R}^{\bar{\jmath}} \mapsto \nabla^2 f(x) \in L(\mathbb{R}^{\bar{\jmath}})$, (17.236)

where $L(\mathbb{R}^{\bar{\jmath}})$ denotes the space of linear operators from $\mathbb{R}^{\bar{\jmath}}$ into itself. The Hessian (17.236) operates on an arbitrary vector $y$ via the matrix-vector multiplication (15.70)

 $[\nabla^2 f(x)[y]]_j \equiv \sum_{j'=1}^{\bar{\jmath}} \partial^2_{j,j'} f(x)\, y_{j'}$. (17.237)

With the Hessian (17.236) we can perform a second-order Taylor expansion (16.140) of a function (17.215), which we rewrite in compact form as

 $f(x + \delta x) - f(x) - \langle \delta x, \nabla f(x) \rangle_2 = \frac{1}{2} \langle \delta x, \nabla^2 f(x)[\delta x] \rangle_2 + o(\|\delta x\|_2^2)$. (17.238)

The left hand side in (17.238) collects the terms of the first order Taylor expansion (17.230). The calculation of the inner product on the right hand side follows from the linear action (17.237) and the dot product (15.139)

 $\langle \delta x, \nabla^2 f(x)[\delta x] \rangle_2 = \sum_{j,j'=1}^{\bar{\jmath}} \partial^2_{j,j'} f(x)\, \delta x_j\, \delta x_{j'}$. (17.239)
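For a quadratic function the expansion (17.238) is exact, which makes it easy to verify numerically (the matrix and the points below are illustrative assumptions):

```python
import numpy as np

# Illustrative function (an assumption): quadratic form f(x) = x' A x with A
# symmetric, so that the gradient is 2 A x and the Hessian is the constant 2 A.
A = np.array([[2.0, 0.5, 0.0],
              [0.5, 1.0, 0.3],
              [0.0, 0.3, 3.0]])

def f(x):
    return x @ A @ x

x = np.array([1.0, -2.0, 0.5])
dx = 1e-3 * np.array([0.3, -0.1, 0.2])

grad = 2 * A @ x          # gradient (17.225)
hess = 2 * A              # Hessian (17.236)

# Second-order Taylor expansion (17.238): for a quadratic, the remainder
# o(||dx||^2) vanishes, so both sides agree up to floating point.
lhs = f(x + dx) - f(x) - np.dot(dx, grad)
rhs = 0.5 * np.dot(dx, hess @ dx)

print(lhs, rhs)
```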

In the context of functional analysis, the second order Frechet derivative [W] is a map from the function space (17.72) into a kernel, that is, a linear operator from the function space into itself

 $\nabla^2 F \colon g \in L^2_\mu(T) \mapsto \nabla^2 F[g] \in L(L^2_\mu(T))$. (17.240)

The second order Frechet derivative (17.240) operates on an arbitrary function $h$ via kernel-function integration (17.64)

 $(\nabla^2 F[g][h])(t) \equiv \int_T \nabla^2 F[g](t, u)\, h(u)\, d\mu(u)$. (17.241)

With the second order Frechet derivative (17.240) we can perform a second-order Taylor expansion of a functional, similar to (17.238)

 $F[g + \delta g] - F[g] - \langle \delta g, \nabla F[g] \rangle = \frac{1}{2} \langle \delta g, \nabla^2 F[g][\delta g] \rangle + o(\|\delta g\|^2)$. (17.242)

The left hand side in (17.242) collects the terms of the first order Taylor expansion (17.235). The calculation of the inner product on the right hand side follows from the linear action (17.241) and the inner product (17.67)

 $\langle \delta g, \nabla^2 F[g][\delta g] \rangle = \int_T \delta g^*(t) \left( \int_T \nabla^2 F[g](t, u)\, \delta g(u)\, d\mu(u) \right) d\mu(t)$. (17.243)
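As with (17.238), the expansion (17.242) is exact for a quadratic functional, which the following sketch verifies on a grid (the kernel and the functions below are illustrative assumptions). Here $F[g] = \int_T \int_T K(t,u)\, g(t)\, g(u)\, d\mu(t)\, d\mu(u)$ with a real symmetric kernel $K$, so that $\nabla F[g](t) = 2 \int_T K(t,u)\, g(u)\, d\mu(u)$ and $\nabla^2 F[g](t,u) = 2 K(t,u)$.

```python
import numpy as np

n = 200
t = np.linspace(0.0, 1.0, n)
dt = t[1] - t[0]

# Illustrative symmetric kernel on the grid (an assumption): K(t, u) = exp(-|t - u|).
K = np.exp(-np.abs(t[:, None] - t[None, :]))

# Quadratic functional F[g] = double integral of K(t, u) g(t) g(u).
def F(g):
    return g @ K @ g * dt**2

g = np.sin(np.pi * t)
dg = 1e-2 * np.cos(3 * np.pi * t)

# Frechet derivative (17.231) on the grid: 2 * integral of K(t, u) g(u) dmu(u).
grad = 2 * (K @ g) * dt

# Second-order Taylor expansion (17.242): exact for this quadratic functional,
# with the kernel action (17.241) computed as a dt-weighted matrix product.
lhs = F(g + dg) - F(g) - np.dot(dg, grad) * dt
rhs = 0.5 * np.dot(dg, 2 * (K @ dg) * dt) * dt

print(lhs, rhs)
```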