
fix: promote the output array of initialization to the tunable eltype#4469

Open
SebastianM-C wants to merge 1 commit into SciML:master from SebastianM-C:smc/init_promotion

Conversation


@SebastianM-C SebastianM-C commented Apr 23, 2026

This addresses the cases where we generate output arrays from a fully constant RHS, which would end up being incompatible with the eltype of the parameters in ForwardDiff and other similar contexts.

Checklist

  • Appropriate tests were added
  • Any code changes were done in a way that does not break public API
  • All documentation related to code changes were updated
  • The new code follows the
    contributor guidelines, in particular the SciML Style Guide and
    COLPRAC.
  • Any new documentation only uses public API

Additional context

After digging through what's going on in #4457 with Claude, I think I found a fix. The issue is that we generate observed functions that hardcode `Vector{Float64}` regardless of the eltype of the tunables, which ends up defeating the u0 promotion mechanisms that we have.

This should fix #4457 and probably #3924 too; I think the underlying issue is the same, but #4457 is the simpler reproducer.
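To illustrate the failure mode described above, here is a minimal, hypothetical sketch (not code from this PR or the MTK codebase): an observed-style function that hardcodes its output buffer as `Vector{Float64}` throws a `Float64(::ForwardDiff.Dual)` conversion error when the tunables are `Dual` numbers, while promoting the buffer to `eltype(p)` works.

```julia
using ForwardDiff

# Broken variant: the output buffer eltype is hardcoded to Float64,
# so writing a Dual into it fails with a conversion error.
function observed_hardcoded(p)
    out = Vector{Float64}(undef, 2)
    out[1] = p[1] + 1.0
    out[2] = 2.0            # fully constant RHS entry
    return out
end

# Fixed variant: promote the output buffer to the tunables' eltype,
# so Dual numbers (and other number types) pass through unchanged.
function observed_promoted(p)
    out = Vector{eltype(p)}(undef, 2)
    out[1] = p[1] + 1.0
    out[2] = 2.0
    return out
end

f(p) = sum(observed_promoted(p))
ForwardDiff.gradient(f, [3.0])          # works

g(p) = sum(observed_hardcoded(p))
# ForwardDiff.gradient(g, [3.0])        # throws: cannot convert Dual to Float64
```

The names `observed_hardcoded`/`observed_promoted` are illustrative only; the general fix is the same idea: allocate with the promoted eltype of the tunables instead of a fixed `Float64`.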

This addresses the cases where we generate output arrays from a fully constant RHS, which would end up being incompatible with the eltype of the parameters in ForwardDiff and other similar contexts.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>


Development

Successfully merging this pull request may close these issues.

Float64(::ForwardDiff.Dual{..}) error when using initialization_eqs
