
Conversation

franckgaga (Member) commented Nov 28, 2025

The info dictionary now includes the following fields in both methods:

  • :∇J or :nablaJ: the gradient of the objective function $\mathbf{\nabla} J$
  • :∇²J or :nabla2J: the Hessian of the objective function $\mathbf{\nabla^2} J$
  • :∇g or :nablag: the Jacobian of the inequality constraint $\mathbf{\nabla g}$
  • :∇²ℓg or :nabla2lg: the Hessian of the inequality Lagrangian $\mathbf{\nabla^2} \ell_{\mathbf{g}}$
  • :∇geq or :nablageq: the Jacobian of the equality constraint $\mathbf{\nabla g_{eq}}$
  • :∇²ℓgeq or :nabla2lgeq: the Hessian of the equality Lagrangian $\mathbf{\nabla^2} \ell_{\mathbf{g_{eq}}}$

I don't use the preparation mechanism, since getinfo is only meant for troubleshooting and is already a relatively expensive function that allocates new arrays at each call.
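As an illustration, the new fields are read like any other getinfo entry. This is only a sketch, not code from this PR: `mpc` and `ry` are placeholder names for an already-designed nonlinear controller and its output setpoint.

```julia
using ModelPredictiveControl
# `mpc` is assumed to be an already-designed NonLinMPC controller and
# `ry` an output setpoint vector (both placeholders, not from this PR):
u    = moveinput!(mpc, ry)   # solve the optimal control problem
info = getinfo(mpc)          # troubleshooting dictionary
info[:∇J]                    # gradient of the objective (alias: info[:nablaJ])
info[:∇²ℓg]                  # Hessian of the inequality Lagrangian (alias: info[:nabla2lg])
```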

franckgaga (Member, Author) commented Nov 28, 2025

Quick question @odow. If I want to retrieve the value of the $\mu$ vector of a nonlinear constraint defined with a VectorNonlinearOracle after solving, should I use JuMP.dual or JuMP.shadow_price? I need this value to compute the Hessians of the two Lagrangians defined above at the solution.

edit: also, for Ipopt, why do both functions return a vector with nx elements (the number of decision variables) instead of ng or ngeq (the number of inequality and equality constraints)? My understanding is that it should be the vector of Lagrange multipliers, no?
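For reference, the convention I have in mind (a reconstruction of the standard definition, not quoted from any package documentation) has one multiplier per constraint row:

```latex
% Lagrangian term of the inequality constraints g(x) <= 0, with one
% multiplier per row of g, so mu is a vector of n_g elements:
\ell_{\mathbf{g}}(\mathbf{x}, \boldsymbol{\mu}) = \boldsymbol{\mu}^\top \mathbf{g}(\mathbf{x}),
\qquad
\mathbf{\nabla^2} \ell_{\mathbf{g}} = \sum_{i=1}^{n_g} \mu_i \, \mathbf{\nabla^2} g_i(\mathbf{x})
```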

franckgaga (Member, Author) commented Nov 30, 2025

To be a bit more explicit, here's a simple example, namely the first example of jump-dev/Ipopt.jl#468 (comment):

using JuMP, Ipopt, MathOptInterface
set = MathOptInterface.VectorNonlinearOracle(;
    dimension = 2,
    l = [-Inf],
    u = [1.0],
    eval_f = (ret, x) -> (ret[1] = x[1]^2 + x[2]^2),
    jacobian_structure = [(1, 1), (1, 2)],
    eval_jacobian = (ret, x) -> ret .= 2.0 .* x,
    hessian_lagrangian_structure = [(1, 1), (2, 2)],
    eval_hessian_lagrangian = (ret, x, u) -> ret .= 2.0 .* u[1],
)
model = Model(Ipopt.Optimizer)
set_silent(model)
@variable(model, x)
@variable(model, y)
@objective(model, Max, x + y)
@constraint(model, c, [x, y] in set)
optimize!(model)
@show value(x), value(y)
@show dual(c)

Why does the dual variable vector have two elements instead of one?

(value(x), value(y)) = (0.707106783471979, 0.707106783471979)
dual(c) = [-0.9999999949684055, -0.9999999949684055]

Isn't it supposed to be the same thing as the Lagrange multiplier (the u argument of the eval_hessian_lagrangian function)? There is only one constraint in the model above.

Thanks.

franckgaga (Member, Author):

I asked the question on Discourse, since it may be useful for others. You can answer me here: https://discourse.julialang.org/t/lagrange-multipliers-of-a-vectornonlinearoracle-after-solve/134236
