
Conversation

ChrisRackauckas-Claude

Summary

  • Fixed the repeated evaluation of `fx0 = f(x)` inside the loop in `finite_difference_gradient!`
  • Reduced function evaluations from 2N to N + 1 for forward-difference gradients
  • Maintained full compatibility with both cached and uncached function values

Problem

As reported in #202, when computing forward differences for gradients, `f(x)` was evaluated N times inside the loop (once per iteration) on top of the N evaluations at perturbed inputs, for 2N function evaluations in total, as the sketch below illustrates.
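A minimal sketch of the problematic pattern (illustrative only, not the actual FiniteDiff source; `forward_grad_naive!` and `ϵ` are hypothetical names):

```julia
# Illustrative sketch of the pre-fix pattern (not the actual FiniteDiff code).
# The base-point value f(x) is recomputed on every loop iteration, so a
# gradient of length N costs 2N evaluations of f.
function forward_grad_naive!(df, f, x; ϵ = sqrt(eps(eltype(x))))
    for i in eachindex(x)
        fx0 = f(x)                  # N base-point evaluations: one per iteration
        x[i] += ϵ
        df[i] = (f(x) - fx0) / ϵ    # N perturbed evaluations
        x[i] -= ϵ                   # restore x before the next component
    end
    return df
end
```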

Solution

  • Moved the `fx0 = f(x)` computation outside the loop (see the sketch after this list)
  • Used the conditional assignment `fx0 = typeof(fx) != Nothing ? fx : f(x)` to reuse a cached value when one is available
  • Simplified the loop body by eliminating conditional branches
  • Applied the same optimization to both the real and complex gradient computations
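A hedged sketch of the hoisted version, using the conditional assignment quoted above (again illustrative; the real `finite_difference_gradient!` has a richer cache-based API, and `forward_grad!` is a hypothetical name):

```julia
# Illustrative sketch of the fix (not the actual FiniteDiff source). The base
# evaluation is hoisted out of the loop, and a user-supplied cached value `fx`
# is reused when present, for N + 1 evaluations total (N when `fx` is cached).
function forward_grad!(df, f, x, fx = nothing; ϵ = sqrt(eps(eltype(x))))
    fx0 = typeof(fx) != Nothing ? fx : f(x)   # one base evaluation, or zero if cached
    for i in eachindex(x)
        x[i] += ϵ
        df[i] = (f(x) - fx0) / ϵ              # one perturbed evaluation per component
        x[i] -= ϵ
    end
    return df
end
```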

Test plan

  • Verified gradient accuracy remains unchanged
  • Confirmed the function evaluation count drops from 2N to N + 1 (a counting sketch follows this list)
  • Ran existing test suite (passed core functionality tests)
  • Applied SciMLStyle formatting
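One way to confirm the count is to wrap `f` in a counter, as in this sketch (it calls the hypothetical `forward_grad!` defined above, not the library routine):

```julia
# Count evaluations of f through a closure and check the N + 1 total.
calls = Ref(0)
counted_f(x) = (calls[] += 1; sum(abs2, x))

N = 10
x = rand(N)
df = similar(x)
forward_grad!(df, counted_f, x)
@assert calls[] == N + 1   # one base evaluation plus N perturbed evaluations
```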

Performance Impact

For an input vector of length N, this cuts function evaluations from 2N to N + 1 (e.g. 2000 down to 1001 for N = 1000), roughly halving the cost and yielding a significant speedup for expensive functions.

Fixes #202

🤖 Generated with Claude Code

@ChrisRackauckas merged commit 72f5d07 into JuliaDiff:master on Aug 16, 2025
6 checks passed

