1.1.3 Interpretation of Lagrange Multipliers
Consider the following problem:
$$\min_{x} \; f(x) \quad \text{subject to} \quad h_i(x) = b_i, \quad i = 1, \ldots, p,$$
where $b \in \mathbb{R}^p$ is a perturbation vector for the constraint right-hand sides. Let
$$L(x, \lambda, b) = f(x) + \sum_{i=1}^{p} \lambda_i \left( b_i - h_i(x) \right). \tag{1.13}$$
Calculating the partial derivative of the Lagrange function with respect to the perturbation vector, we have
$$\frac{\partial L(x, \lambda, b)}{\partial b_i} = \lambda_i, \quad i = 1, \ldots, p, \tag{1.14}$$
which, since $f\left(x^*(b)\right) = L\left(x^*(b), \lambda^*(b), b\right)$ at the optimum, yields
$$\frac{\partial f(x^*)}{\partial b_i} = \lambda_i^*, \quad i = 1, \ldots, p. \tag{1.15}$$
Hence, the Lagrange multipliers can be interpreted as a measure of sensitivity of the objective function with respect to the perturbation vector of the constraints at the optimum point $x^*$.
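As a quick numerical illustration of this interpretation (a hypothetical example, not taken from the text; the quadratic data and the helper `solve_eqp` are illustrative choices), the sketch below solves an equality-constrained quadratic program through its KKT system and compares the multiplier with a finite-difference estimate of $\partial f^*/\partial b$:

```python
import numpy as np

# Sketch (hypothetical example): for
#   min 0.5 x'Qx + c'x   s.t.   Ax = b,
# with Lagrangian L = 0.5 x'Qx + c'x + lambda'(b - Ax),
# the multipliers should satisfy  d f*(b)/db = lambda*.

def solve_eqp(Q, c, A, b):
    """Solve the equality-constrained QP via its linear KKT system."""
    n, m = Q.shape[0], A.shape[0]
    # Stationarity: Qx + c - A' lambda = 0,  feasibility: Ax = b
    K = np.block([[Q, -A.T],
                  [A, np.zeros((m, m))]])
    sol = np.linalg.solve(K, np.concatenate([-c, b]))
    x, lam = sol[:n], sol[n:]
    return x, lam, 0.5 * x @ Q @ x + c @ x

Q = np.array([[2.0, 0.5],
              [0.5, 1.0]])
c = np.array([-1.0, 1.0])
A = np.array([[1.0, 1.0]])   # single constraint x1 + x2 = b
b = np.array([1.0])

x_star, lam_star, f_star = solve_eqp(Q, c, A, b)

eps = 1e-6
_, _, f_pert = solve_eqp(Q, c, A, b + eps)
print("lambda*                :", lam_star)
print("finite-difference df/db:", (f_pert - f_star) / eps)
```

The two printed values agree to within the finite-difference error, which is exactly the sensitivity statement above.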
1.2 Concepts of Multi‐parametric Programming
1.2.1 Basic Sensitivity Theorem
Having covered the essentials of optimization for the purposes of this book, the objective of this subchapter is to introduce the role of parameters in an optimization formulation. In this context, the following multi-parametric programming problem is considered:
$$
\begin{aligned}
z(\theta) = \min_{x} \;\; & f(x, \theta) \\
\text{s.t.} \;\; & g_i(x, \theta) \leq 0, \quad \forall i \in \mathcal{I}, \\
& h_j(x, \theta) = 0, \quad \forall j \in \mathcal{J},
\end{aligned} \tag{1.16}
$$
where $x \in \mathbb{R}^n$ is the vector of the continuous optimization variables, $\theta \in \Theta \subseteq \mathbb{R}^q$ is the vector of the uncertain parameters, and the sets $\mathcal{I}$ and $\mathcal{J}$ correspond to the inequality and equality constraint sets, respectively.
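To make the role of the parameter concrete, consider a deliberately simple instance of (1.16) (a hypothetical illustration, not an example from the book): $\min_x (x - \theta)^2$ subject to $0 \leq x \leq 1$. Its optimizer is the explicit piecewise function $x^*(\theta) = \min\{\max\{\theta, 0\}, 1\}$, i.e. the solution itself becomes a function of the parameter. The short sketch below confirms this by solving the problem over a grid of parameter values:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical one-dimensional instance of a multi-parametric program:
#   z(theta) = min_x (x - theta)^2   s.t.  0 <= x <= 1.
# The exact parametric solution is x*(theta) = clip(theta, 0, 1).

def solve_for_theta(theta):
    res = minimize_scalar(lambda x: (x - theta) ** 2, bounds=(0.0, 1.0), method="bounded")
    return res.x

for theta in np.linspace(-0.5, 1.5, 5):
    x_num = solve_for_theta(theta)
    x_exact = np.clip(theta, 0.0, 1.0)
    print(f"theta = {theta:5.2f}   numerical x* = {x_num:.4f}   explicit x*(theta) = {x_exact:.4f}")
```

Multi-parametric programming asks precisely for such explicit maps $x^*(\theta)$ and $z(\theta)$, together with the regions of $\Theta$ in which they are valid.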
Theorem 1.1 (Basic Sensitivity Theorem, [1])
Let a general multi-parametric programming problem be described by (1.16). Assume that the functions defining problem (1.16) are twice differentiable in $x$, and that their gradients with respect to $x$ and the constraints are once continuously differentiable in $\theta$ in a neighborhood of $\theta_0$. In addition, assume that the second-order sufficient conditions for a local minimum of the problem hold at $x_0$ with associated Lagrange multipliers $\lambda_0$ and $\mu_0$. Lastly, let the gradients $\nabla_x g_i(x_0, \theta_0)$ (for those $i$ such that $g_i(x_0, \theta_0) = 0$) and $\nabla_x h_j(x_0, \theta_0)$, $j \in \mathcal{J}$, be linearly independent (i.e. LICQ holds), and let $\lambda_{0,i} > 0$ for those $i$ such that $g_i(x_0, \theta_0) = 0$, i.e. strict complementary slackness (SCS) holds.
Then, the first-order sensitivity results for a second-order local minimizing point $x_0$ are known as the basic sensitivity theorem (BST), and the following properties hold:
(i) $x_0$ is a local isolated minimizing point of the problem at $\theta_0$, and the associated Lagrange multipliers $\lambda_0$ and $\mu_0$ are unique.
(ii) For $\theta$ in a neighborhood of $\theta_0$, there exists a unique, once continuously differentiable vector function $y(\theta) = \left[ x(\theta)^T, \lambda(\theta)^T, \mu(\theta)^T \right]^T$, with $y(\theta_0) = \left[ x_0^T, \lambda_0^T, \mu_0^T \right]^T$, such that $x(\theta)$ satisfies the second-order sufficient conditions for a local minimum of the problem with associated unique Lagrange multipliers $\lambda(\theta)$ and $\mu(\theta)$.
(iii) For $\theta$ near $\theta_0$, the set of binding inequalities is unchanged, SCS holds, and the binding constraint gradients are linearly independent at $x(\theta)$.
Proof
See [1].
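The hypotheses of the theorem can be checked numerically for a concrete problem. The sketch below uses a small parametric quadratic program (a hypothetical example; the data, the active-set guess `active`, and the tolerances are illustrative assumptions, not from the reference) and verifies LICQ, strict complementary slackness, feasibility of the remaining constraints, and the second-order condition at the computed point:

```python
import numpy as np

# Hypothetical parametric QP used to check the theorem's hypotheses at theta_0:
#   min 0.5 x'Qx + (c + F theta)'x   s.t.  A x <= b.
Q = np.eye(2)
c = np.zeros(2)
F = -np.eye(2)
A = np.array([[1.0, 1.0],    # x1 + x2 <= 1
              [-1.0, 0.0],   # -x1     <= 0
              [0.0, -1.0]])  # -x2     <= 0
b = np.array([1.0, 0.0, 0.0])

theta0 = np.array([1.0, 1.0])
active = [0]                              # assumed active set at theta0
A_act, b_act = A[active], b[active]

# Solve the KKT system for the assumed active set (treated as equalities):
# Qx + (c + F theta0) + A_act' lam = 0,   A_act x = b_act
n, m = Q.shape[0], A_act.shape[0]
K = np.block([[Q, A_act.T],
              [A_act, np.zeros((m, m))]])
rhs = np.concatenate([-(c + F @ theta0), b_act])
sol = np.linalg.solve(K, rhs)
x0, lam_act = sol[:n], sol[n:]

licq = np.linalg.matrix_rank(A_act) == m            # active gradients independent
scs = np.all(lam_act > 1e-9)                        # strictly positive multipliers
feasible = np.all(A @ x0 <= b + 1e-9)               # remaining constraints satisfied
second_order = np.all(np.linalg.eigvalsh(Q) > 0)    # Hessian positive definite

print("x0 =", x0, " lambda_active =", lam_act)
print("LICQ:", licq, " SCS:", scs, " feasible:", feasible, " 2nd-order:", second_order)
```

When all four checks pass, the theorem guarantees that the optimal active set, and hence the functional form of $x(\theta)$, persists in a neighborhood of $\theta_0$.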
If there exist Lagrange multipliers $\lambda$ and $\mu$ such that the first-order KKT conditions hold, then we have:
$$
\begin{aligned}
\nabla_x f(x, \theta) + \sum_{i \in \mathcal{I}} \lambda_i \nabla_x g_i(x, \theta) + \sum_{j \in \mathcal{J}} \mu_j \nabla_x h_j(x, \theta) &= 0, \\
\lambda_i \, g_i(x, \theta) &= 0, \quad \lambda_i \geq 0, \quad \forall i \in \mathcal{I}, \\
h_j(x, \theta) &= 0, \quad \forall j \in \mathcal{J},
\end{aligned} \tag{1.17}
$$
and the vector $\psi(y, \theta)$, collecting the left-hand sides of (1.17) for $y = \left[ x^T, \lambda^T, \mu^T \right]^T$, is defined as follows:
$$
\psi(y, \theta) = \begin{bmatrix}
\nabla_x f(x, \theta) + \sum_{i \in \mathcal{I}} \lambda_i \nabla_x g_i(x, \theta) + \sum_{j \in \mathcal{J}} \mu_j \nabla_x h_j(x, \theta) \\
\lambda_i \, g_i(x, \theta), \;\; \forall i \in \mathcal{I} \\
h_j(x, \theta), \;\; \forall j \in \mathcal{J}
\end{bmatrix}. \tag{1.18}
$$
Furthermore, if there exists $y(\theta_0) = \left[ x(\theta_0)^T, \lambda(\theta_0)^T, \mu(\theta_0)^T \right]^T$ for which
$$\psi\left(y(\theta_0), \theta_0\right) = 0, \tag{1.19}$$
the Basic Sensitivity Theorem holds, and the system $\psi\left(y(\theta), \theta\right) = 0$ is identically satisfied in a neighborhood around $\theta_0$ and can be differentiated with respect to $\theta$ to yield explicit expressions for the partial derivatives of the vector function $y(\theta)$.
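For a problem in which $\psi$ is linear in $y$, these partial derivatives can be computed in closed form via the implicit function theorem, $\nabla_\theta y(\theta_0) = -\left( \nabla_y \psi \right)^{-1} \nabla_\theta \psi$ evaluated at $\left(y(\theta_0), \theta_0\right)$. The following sketch (a hypothetical equality-constrained parametric quadratic program; the data and the helper `solve_y` are illustrative assumptions, not from the book) builds the KKT residual, applies this formula, and checks the result against a finite-difference estimate:

```python
import numpy as np

# Hypothetical equality-constrained parametric QP (illustrative data):
#   min_x 0.5 x'Qx + (c + F theta)'x   s.t.   A x = b + E theta,
# with y = [x; mu] and KKT residual
#   psi(y, theta) = [ Qx + c + F theta + A'mu ;  A x - b - E theta ].
Q = np.array([[2.0, 0.0],
              [0.0, 1.0]])
c = np.array([1.0, -1.0])
F = np.array([[1.0], [0.0]])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
E = np.array([[0.5]])

def solve_y(theta):
    """Solve psi(y, theta) = 0 exactly (the system is linear in y for this QP)."""
    m = A.shape[0]
    M = np.block([[Q, A.T],
                  [A, np.zeros((m, m))]])
    rhs = np.concatenate([-(c + F @ theta), b + E @ theta])
    return np.linalg.solve(M, rhs)

theta0 = np.array([0.0])

# Differentiating psi(y(theta), theta) = 0 gives
#   grad_theta y = -(grad_y psi)^{-1} grad_theta psi.
M0 = np.block([[Q, A.T],
               [A, np.zeros((A.shape[0], A.shape[0]))]])   # grad_y psi at (y(theta0), theta0)
N0 = np.vstack([F, -E])                                    # grad_theta psi
dy_dtheta = -np.linalg.solve(M0, N0)

# Finite-difference check of the same derivative
eps = 1e-6
fd = (solve_y(theta0 + eps) - solve_y(theta0)) / eps
print("implicit-function-theorem  dy/dtheta:", dy_dtheta.ravel())
print("finite-difference estimate dy/dtheta:", fd)
```

Because this particular QP depends linearly on $\theta$, the computed derivative describes $y(\theta)$ exactly; for general nonlinear problems it only provides the local, first-order behavior discussed next.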
The first-order estimate of the variation of an isolated local solution $x(\theta)$ of (1.16) and the associated unique Lagrange multipliers $\lambda(\theta)$ and $\mu(\theta)$ can be approximated, given that the solution $y(\theta_0)$ at $\theta_0$ is known and that the Jacobian of $\psi$ with respect to $y$ and $\theta$, evaluated at $\left(y(\theta_0), \theta_0\right)$, is available.
In particular, let $y(\theta)$ be the concatenation of the vectors