Conversation


@arnavk23 arnavk23 commented Oct 18, 2025

I fixed the "storage types cannot be promoted to a concrete type" error (issue #329) by making the code robust when promote_type(storage_type(op1), storage_type(op2)) yields a non-concrete type. The fix:

- In operations.jl: the * and + operators
- In cat.jl: the hcat and vcat operators

When promotion yields an abstract type, the code now falls back to a concrete operand storage type, or to Vector{T} as a last resort. This allows GPU-backed operators and mixed-storage scenarios to work without hard failures.
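
The fallback strategy described above could be sketched as follows. This is a hypothetical illustration with invented names (`fallback_storage_type` is not necessarily the helper used in src/abstract.jl), not the exact implementation:

```julia
# Illustrative sketch only; the real helper in src/abstract.jl may differ.
function fallback_storage_type(S1::Type, S2::Type)
    S = promote_type(S1, S2)
    isconcretetype(S) && return S     # normal path: promotion gave a concrete type
    isconcretetype(S1) && return S1   # otherwise prefer a concrete operand type
    isconcretetype(S2) && return S2
    # last resort: a plain Vector with the promoted element type
    return Vector{promote_type(eltype(S1), eltype(S2))}
end
```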

Closes #329

@arnavk23
Author

@tmigot Please review this PR.

@tmigot
Member

tmigot commented Oct 19, 2025

What does the example in the issue do with this PR?

Author

arnavk23 commented Oct 19, 2025

@tmigot Here is the example from issue #329 run with this PR applied:

julia> using CUDA, NLPModels, NLPModelsTest, LinearOperators

julia> V = CuArray{Float64, 1, CUDA.Mem.DeviceBuffer}
CuArray{Float64, 1, CUDA.Mem.DeviceBuffer}

julia> nlp = BROWNDEN(V)
BrowndenNLPModel - Model with 4 variables and 0 constraints

julia> nvar = nlp.meta.nvar
4

julia> xc = V(undef, nvar)
4-element CuArray{Float64, 1, CUDA.Mem.DeviceBuffer}:
0.0
0.0
0.0
0.0

julia> Hs = V(undef, nvar)
4-element CuArray{Float64, 1, CUDA.Mem.DeviceBuffer}:
0.0
0.0
0.0
0.0

julia> H = hess_op!(nlp, xc, Hs)
Linear operator
nrow: 4
ncol: 4
eltype: Float64
symmetric: true
hermitian: true

julia> cg_op_diag = V(undef, nvar)
4-element CuArray{Float64, 1, CUDA.Mem.DeviceBuffer}:
0.0
0.0
0.0
0.0

julia> cg_op = opDiagonal(cg_op_diag)
Linear operator
nrow: 4
ncol: 4
eltype: Float64
symmetric: true
hermitian: false

julia> ZHZ = cg_op' * H * cg_op
Linear operator
nrow: 4
ncol: 4
eltype: Float64
symmetric: false
hermitian: false

julia> size(ZHZ)
(4, 4)

julia> typeof(ZHZ)
LinearOperator{Float64, Int64, var"#..."#..., var"#..."#..., var"#..."#..., CuArray{Float64, 1, CUDA.Mem.DeviceBuffer}}

julia> v = V(undef, nvar); v .= 1.0;

julia> result = ZHZ * v
4-element CuArray{Float64, 1, CUDA.Mem.DeviceBuffer}:
0.0
0.0
0.0
0.0

…at.jl consistently (tests still pass functionally; only the known two tiny allocation assertions remain).
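
For reference, the abstract-promotion situation can also arise on the CPU when two operators carry different storage types. The following is a hypothetical, CPU-only sketch (it assumes opDiagonal accepts any AbstractVector and that the operator's storage type follows the vector's type; whether this exact pair triggered the pre-fix error is not confirmed here):

```julia
using LinearOperators

A = opDiagonal(ones(4))                  # storage type Vector{Float64}
B = opDiagonal(view(ones(5), 1:4))       # storage type is a SubArray
# promote_type of these two storage types is abstract, the case this PR
# guards against; with the fallback, the product gets a concrete storage type.
C = A * B
y = C * ones(4)                          # apply the composed operator
```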

codecov bot commented Oct 24, 2025

Codecov Report

❌ Patch coverage is 61.53846% with 5 lines in your changes missing coverage. Please review.
✅ Project coverage is 93.77%. Comparing base (32dbc5e) to head (3dfa5ff).
⚠️ Report is 48 commits behind head on main.

| Files with missing lines | Patch % | Lines |
|---|---|---|
| src/abstract.jl | 44.44% | 5 Missing ⚠️ |
Additional details and impacted files
@@            Coverage Diff             @@
##             main     #382      +/-   ##
==========================================
- Coverage   95.00%   93.77%   -1.23%     
==========================================
  Files          17       20       +3     
  Lines        1100     1156      +56     
==========================================
+ Hits         1045     1084      +39     
- Misses         55       72      +17     


Development

Successfully merging this pull request may close these issues.

Error for product of two operators LinearOperatorException("storage types cannot be promoted to a concrete type")
