
Conversation

@odow (Member) commented Jan 30, 2026

Closes #1971

There aren't many good options here.

My trivial test case is (x + y)^2 / 2, that is, 0.5 * x' * Q * x with Q = [1 1; 1 1].

The permissive options all fail:

julia> using LDLFactorizations, LinearAlgebra, QDLDL, SparseArrays

julia> A = [1.0 1.0; 1.0 1.0]
2×2 Matrix{Float64}:
 1.0  1.0
 1.0  1.0

julia> Q = SparseArrays.sparse(A);
julia> LinearAlgebra.cholesky(Q)
ERROR: PosDefException: matrix is not positive definite; Factorization failed.
Stacktrace:

julia> LinearAlgebra.ldlt(Q)
ERROR: ZeroPivotException: factorization encountered one or more zero pivots. Consider switching to a pivoted LU factorization.
Stacktrace:

julia> QDLDL.qdldl(Q)
ERROR: Zero entry in D (matrix is not quasidefinite)
Stacktrace:

LDLFactorizations.jl works:

julia> LDLFactorizations.ldl(Q)
LDLFactorizations.LDLFactorization{Float64, Int64, Int64, Int64}(true, false, false, 2, [2, -1], [1, 0], [2, 2], [1, 2], [1, 2], [1, 2, 2], Int64[], Int64[], [2], [1.0], [1.0, 0.0], [0.0, 0.0], [1, 1], 0.0, 0.0, 0.0, 2)

but it requires us to add a GPL dependency.

This PR implements an option I don't think we've previously considered: do something that isn't upper triangular...

Is there something obviously wrong that I'm missing?
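
Concretely, the approach amounts to something like the following sketch (a minimal illustration using the Q from the REPL session above, not necessarily the exact PR code; the clamping of round-off is an added detail): it builds a square, non-triangular factor U with Q ≈ U' * U from an eigendecomposition.

using LinearAlgebra, SparseArrays

Q = SparseArrays.sparse([1.0 1.0; 1.0 1.0])
E = LinearAlgebra.eigen(LinearAlgebra.Symmetric(Matrix(Q)))
# Clamp tiny negative round-off to zero before taking square roots.
U = LinearAlgebra.Diagonal(sqrt.(max.(E.values, 0.0))) * E.vectors'
@assert U' * U ≈ Q   # Q = V * Λ * V' = (√Λ * V')' * (√Λ * V')
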

@mlubin (Member) commented Jan 30, 2026

I don't think the eigenvalue decomposition is wrong, just potentially expensive and, as a result, surprising to users.

@odow (Member, Author) commented Jan 30, 2026

The alternative here is to error. I don't know: do people solve problems with very large Q matrices, need to use this bridge, and also worry about performance?

@odow (Member, Author) commented Jan 30, 2026

The suggestion from #1971 is to add LDLFactorizations as a package extension.
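
For reference, a package extension would look roughly like the sketch below; the extension name, the hook function, and the UUID placeholder are illustrative only, not how it would necessarily be wired into this package.

# Project.toml (sketch): declare LDLFactorizations as a weak dependency.
#
#   [weakdeps]
#   LDLFactorizations = "<UUID from LDLFactorizations' Project.toml>"
#
#   [extensions]
#   LDLFactorizationsExt = "LDLFactorizations"

# ext/LDLFactorizationsExt.jl (sketch): loaded only when the user also loads
# LDLFactorizations, so the GPL dependency stays optional.
module LDLFactorizationsExt

import LDLFactorizations
import SparseArrays

# Hypothetical hook: the parent package would call this instead of the dense
# eigendecomposition fallback whenever the extension is loaded.
_sparse_ldl(Q::SparseArrays.SparseMatrixCSC) = LDLFactorizations.ldl(Q)

end  # module
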

@odow marked this pull request as draft January 30, 2026 03:40
# triangular, but nothing says that it has to be.
E = LinearAlgebra.eigen(LinearAlgebra.Symmetric(Matrix(Q)))
if !all(isreal, E.values) || minimum(E.values) < 0
error("Matrix is not PSD")
A Member commented on the diff above:
Maybe give the minimal eigenvalue in the error message.
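
For example, the check in the diff could report it along these lines (a sketch of that suggestion, assuming the Q and the LinearAlgebra import from the diff above; not final wording):

E = LinearAlgebra.eigen(LinearAlgebra.Symmetric(Matrix(Q)))
λ_min = minimum(E.values)
if !all(isreal, E.values) || λ_min < 0
    # Include the offending eigenvalue so users can judge whether it is a
    # modeling error or just numerical round-off.
    error("Matrix is not PSD: the minimum eigenvalue is $λ_min")
end
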

@blegat (Member) commented Jan 30, 2026

Using eigenvalues as a fallback sounds like a good idea; maybe we should allow a small tolerance. Historically, we have allowed 1e-10 to be considered as zero:

if MOI.Utilities.isapprox_zero(diff, 1e-10)
return nothing
end
if MOI.Utilities.isapprox_zero(diff, 1e-8)
i, j = ij
@warn(
"The entries ($i, $j) and ($j, $i) of the matrix are " *
"almost identical, but a constraint has been added " *
"to ensure their equality because the largest " *
"difference between the coefficients is smaller than " *
"1e-8 but larger than 1e-10. This usually means that " *
"there is a modeling error in your formulation.",
)
end

We can use a very tight tolerance and only loosen it upon request. Maybe starting with 0 is good, but it should be easy to find examples of reasonable PSD matrices with slightly negative eigenvalues like -1e-15.
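
A sketch of what that could look like; the function name, the keyword, and its default are placeholders rather than an agreed API:

import LinearAlgebra

# Placeholder name and default; the tolerance could be loosened (e.g. to 1e-10)
# upon request, as discussed above.
function _psd_factor(Q; tol::Float64 = 0.0)
    E = LinearAlgebra.eigen(LinearAlgebra.Symmetric(Matrix(Q)))
    λ_min = minimum(E.values)
    if λ_min < -tol
        error("Matrix is not PSD: the minimum eigenvalue is $λ_min")
    end
    # Eigenvalues in [-tol, 0) are treated as round-off and clamped to zero.
    return LinearAlgebra.Diagonal(sqrt.(max.(E.values, 0.0))) * E.vectors'
end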


Successfully merging this pull request may close: Sparse LDL for QuadtoSOC (#1971).