
Releases: optimagic-dev/optimagic

v0.5.2

23 Jul 17:08
a1b09f3


Summary

This minor release adds support for two additional optimizer libraries:

  • Nevergrad: A library for gradient-free optimization developed by Facebook Research.
  • Bayesian Optimization: A library for constrained Bayesian global optimization with Gaussian processes.

In addition, this release includes several bug fixes and improvements to the documentation. Many contributions in this release were made by Google Summer of Code (GSoC) 2025 applicants, with @gauravmanmode and @spline2hg being the accepted contributors.

Pull Requests

  • #620 Uses interactive plotly figures in documentation (@timmens).
  • #618 Improves bounds processing when no bounds are specified (@timmens).
  • #615 Adds pre-commit hook that checks mypy version consistency (@timmens).
  • #613 Exposes converter functionality (@spline2hg).
  • #612 Fixes results processing to work with new cobyla optimizer (@janosg).
  • #610 Adds needs_bounds and supports_infinite_bounds fields to algorithm info (@gauravmanmode).
  • #608 Adds support for plotly >= 6 (@hmgaudecker, @timmens).
  • #607 Returns run_explorations results in a dataclass (@r3kste).
  • #605 Enhances batch evaluator checking and processing, introduces the internal BatchEvaluatorLiteral literal, and updates CHANGES.md (@janosg, @timmens).
  • #602 Adds optimizer wrapper for bayesian-optimization package (@spline2hg).
  • #601 Updates pre-commit hooks and fixes mypy issues (@janosg).
  • #598 Fixes and adds links to GitHub in the documentation (@hamogu).
  • #594 Refines newly added optimizer wrappers (@janosg).
  • #591 Adds multiple optimizers from the nevergrad package (@gauravmanmode).
  • #589 Rewrites the algorithm selection pre-commit hook in pure Python to address issues with bash scripts on Windows (@timmens).
  • #586 and #592 Ensure the SciPy disp parameter is exposed for the following SciPy algorithms: slsqp, neldermead, powell, conjugate_gradient, newton_cg, cobyla, truncated_newton, trust_constr (@sefmef, @TimBerti).
  • #585 Exposes all parameters of SciPy's BFGS optimizer in optimagic (@TimBerti).
  • #582 Adds support for handling infinite gradients during optimization (@Aziz-Shameem).
  • #579 Implements a wrapper for the PSO optimizer from the nevergrad package (@r3kste).
  • #578 Integrates the intersphinx-registry package into the documentation for automatic linking to up-to-date external documentation (@Schefflera-Arboricola).
  • #576 Wraps oneplusone optimizer from nevergrad (@gauravmanmode, @gulshan-123).
  • #572 and #573 Fix bugs in error handling for parameter selector processing and constraints checking (@hmgaudecker).
  • #570 Adds a how-to guide for adding algorithms to optimagic and improves internal documentation (@janosg).
  • #569 Implements a threading batch evaluator (@spline2hg).
  • #568 Introduces an initial wrapper for the migrad optimizer from the iminuit package (@spline2hg).
  • #567 Makes the fun argument optional when fun_and_jac is provided (@gauravmanmode).
  • #563 Fixes a bug in input harmonization for history plotting (@gauravmanmode).
  • #552 Refactors and extends the History class, removing the internal HistoryArrays class (@timmens).
  • #485 Adds bootstrap weights functionality (@alanlujan91).
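The threading batch evaluator added in #569 evaluates a function on a batch of inputs using threads instead of processes. A minimal sketch of the idea, using only the standard library (this is not optimagic's actual implementation, which adds error handling and more configuration):

```python
from concurrent.futures import ThreadPoolExecutor


def threading_batch_evaluator(func, arguments, n_cores=2):
    """Evaluate func on each element of arguments using a thread pool.

    Hypothetical sketch of a threading batch evaluator; the function name
    and signature are illustrative, not optimagic's public API.
    """
    with ThreadPoolExecutor(max_workers=n_cores) as pool:
        return list(pool.map(func, arguments))


results = threading_batch_evaluator(lambda x: x**2, [1, 2, 3])
# results == [1, 4, 9]
```

Threads are a good fit when the objective function releases the GIL (e.g. it is I/O-bound or calls into compiled code), since they avoid the serialization overhead of process-based evaluators.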

v0.5.1

13 Nov 17:02
84be118


Summary

This is a minor release that introduces the new algorithm selection tool and several
small improvements.

To learn more about the algorithm selection feature, check out the following resources:

Pull Requests

  • #549 Add support for Python 3.13 (@timmens)
  • #550 and #534 implement the new algorithm selection tool (@janosg)
  • #548 and #531 improve the documentation (@ChristianZimpelmann)
  • #544 Adjusts the results processing of the nag optimizers to be compatible
    with the latest releases (@timmens)
  • #543 Adds support for numpy 2.x (@timmens)
  • #536 Adds a how-to guide for choosing local optimizers (@mpetrosian)
  • #535 Allows algorithm classes and instances in estimation functions
    (@timmens)
  • #532 Makes several small improvements to the documentation (@janosg)

v0.5.0

26 Aug 14:10
786247c


Summary

This is a major release with several breaking changes and deprecations. In this
release we started implementing two major enhancement proposals and renamed the package
from estimagic to optimagic (while keeping the estimagic namespace for the estimation
capabilities).

The implementation of the two enhancement proposals is not complete and will likely
take until version 0.6.0. However, all breaking changes and deprecations (with the
exception of a minor change in benchmarking) are already implemented such that updating
to version 0.5.0 is future proof.

Pull Requests

  • #500 removes the dashboard, the support for simopt optimizers and the
    derivative_plot (@janosg)
  • #502 renames estimagic to optimagic (@janosg)
  • #504 aligns maximize and minimize more closely with SciPy. All related
    deprecations and breaking changes are listed below. As a result, SciPy code that
    uses minimize with the arguments x0, fun, jac and method will run without changes
    in optimagic. Similarly, OptimizeResult gets some aliases so it behaves more like
    SciPy's.
  • #506 introduces the new Bounds object and deprecates lower_bounds,
    upper_bounds, soft_lower_bounds and soft_upper_bounds (@janosg)
  • #507 updates the infrastructure so we can make parallel releases under the names
    optimagic and estimagic (@timmens)
  • #508 introduces the new ScalingOptions object and deprecates the
    scaling_options argument of maximize and minimize (@timmens)
  • #512 implements the new interface for objective functions and derivatives
    (@janosg)
  • #513 implements the new optimagic.MultistartOptions object and deprecates the
    multistart_options argument of maximize and minimize (@timmens)
  • #514 and #516 introduce the NumdiffResult object that is returned from
    first_derivative and second_derivative. It also fixes several bugs in the
    pytree handling in first_derivative and second_derivative and deprecates
    Richardson Extrapolation and the key (@timmens)
  • #517 introduces the new NumdiffOptions object for configuring numerical
    differentiation during optimization or estimation (@timmens)
  • #519 rewrites the logging code and introduces new LogOptions objects
    (@schroedk)
  • #521 introduces the new internal algorithm interface.
    (@janosg and @mpetrosian)
  • #522 introduces the new Constraint objects and deprecates passing
    dictionaries or lists of dictionaries as constraints (@timmens)
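The SciPy alignment from #504 means a call written against scipy.optimize should run unchanged after swapping the import for optimagic's minimize. A sketch using SciPy itself (the objective and gradient here are illustrative):

```python
import numpy as np
from scipy.optimize import minimize  # per #504, optimagic's minimize accepts the same call


def sphere(x):
    """Simple convex objective: sum of squares."""
    return float(np.sum(x**2))


def sphere_gradient(x):
    """Analytic gradient of the sphere function."""
    return 2 * x


# The x0, fun, jac and method arguments are exactly the names
# that now work identically in optimagic.
res = minimize(fun=sphere, x0=np.array([1.0, 2.0]), jac=sphere_gradient, method="L-BFGS-B")
```

Only the argument names listed in the release notes (x0, fun, jac, method) are covered by this compatibility claim; other SciPy-specific options may still differ.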

Breaking changes

  • When providing a path for the argument logging of the functions
    maximize and minimize and the file already exists, the default
    behavior is to raise an error now. Replacement or extension
    of an existing file must be explicitly configured.
  • The argument if_table_exists in log_options has no effect anymore and a
    corresponding warning is raised.
  • OptimizeResult.history is now an optimagic.History object instead of a
    dictionary. Dictionary-style access is implemented but deprecated. Other dictionary
    methods might not work.
  • The result of first_derivative and second_derivative is now an
    optimagic.NumdiffResult object instead of a dictionary. Dictionary-style access is
    implemented, but other dictionary methods might not work.
  • The dashboard is removed.
  • The derivative_plot is removed.
  • Optimizers from Simopt are removed.
  • Passing callables with the old internal algorithm interface as algorithm to
    minimize and maximize is not supported anymore. Use the new
    Algorithm objects instead. For examples see: https://tinyurl.com/24a5cner
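The transition pattern behind the History and NumdiffResult changes, where attribute access is the new API and dictionary-style access still works but warns, can be sketched like this (a hypothetical illustration, not optimagic's actual classes):

```python
import warnings
from dataclasses import dataclass


@dataclass
class Result:
    """Hypothetical result object illustrating the deprecation pattern.

    Attribute access is the supported API; subscript access is kept
    working for old code but emits a DeprecationWarning.
    """

    derivative: float

    def __getitem__(self, key):
        warnings.warn(
            f"Dictionary access result[{key!r}] is deprecated; use attribute access.",
            DeprecationWarning,
        )
        return getattr(self, key)


res = Result(derivative=2.0)
value_new = res.derivative  # new style, silent
with warnings.catch_warnings():
    warnings.simplefilter("ignore", DeprecationWarning)
    value_old = res["derivative"]  # old style, warns
```

This is why "other dictionary methods might not work": only `__getitem__` is shimmed, so calls like `.keys()` or `.items()` on the new objects have no equivalent.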

Deprecations

  • The criterion argument of maximize and minimize is renamed to fun (as in
    SciPy).
  • The derivative argument of maximize and minimize is renamed to jac (as in
    SciPy).
  • The criterion_and_derivative argument of maximize and minimize is renamed
    to fun_and_jac to align it with the other names.
  • The criterion_kwargs argument of maximize and minimize is renamed to
    fun_kwargs to align it with the other names.
  • The derivative_kwargs argument of maximize and minimize is renamed to
    jac_kwargs to align it with the other names.
  • The criterion_and_derivative_kwargs argument of maximize and minimize is
    renamed to fun_and_jac_kwargs to align it with the other names.
  • Algorithm-specific convergence and stopping criteria are renamed to align them
    more closely with NLopt and SciPy names.
    • convergence_relative_criterion_tolerance -> convergence_ftol_rel
    • convergence_absolute_criterion_tolerance -> convergence_ftol_abs
    • convergence_relative_params_tolerance -> convergence_xtol_rel
    • convergence_absolute_params_tolerance -> convergence_xtol_abs
    • convergence_relative_gradient_tolerance -> convergence_gtol_rel
    • convergence_absolute_gradient_tolerance -> convergence_gtol_abs
    • convergence_scaled_gradient_tolerance -> convergence_gtol_scaled
    • stopping_max_criterion_evaluations -> stopping_maxfun
    • stopping_max_iterations -> stopping_maxiter
  • The arguments lower_bounds, upper_bounds, soft_lower_bounds and
    soft_upper_bounds are deprecated and replaced by optimagic.Bounds. This affects
    maximize, minimize, estimate_ml, estimate_msm, slice_plot and several
    other functions.
  • The log_options argument of minimize and maximize is deprecated. Instead,
    LogOptions objects can be passed under the logging argument.
  • The class OptimizeLogReader is deprecated and redirects to
    SQLiteLogReader.
  • The scaling_options argument of maximize and minimize is deprecated. Instead, a
    ScalingOptions object can be passed under the scaling argument, which previously
    accepted only a bool.
  • Objective functions that return a dictionary with the special keys "value",
    "contributions" and "root_contributions" are deprecated. Instead, likelihood and
    least-squares functions are marked with a mark.likelihood or mark.least_squares
    decorator. There is a detailed how-to guide that shows the new behavior. This affects
    maximize, minimize, slice_plot and other functions that work with objective
    functions.
  • The multistart_options argument of minimize and maximize is deprecated. Instead,
    a MultistartOptions object can be passed under the multistart argument.
  • Richardson Extrapolation is deprecated in first_derivative and second_derivative.
  • The key argument is deprecated in first_derivative and second_derivative.
  • Passing dictionaries or lists of dictionaries as constraints to maximize or
    minimize is deprecated. Use the new Constraint objects instead.
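Call sites that use the old argument names can be migrated mechanically. A hypothetical helper (not part of optimagic) that applies the renames listed above:

```python
# Maps the deprecated maximize/minimize argument names from this release
# to their new SciPy-style counterparts.
RENAMED_ARGUMENTS = {
    "criterion": "fun",
    "derivative": "jac",
    "criterion_and_derivative": "fun_and_jac",
    "criterion_kwargs": "fun_kwargs",
    "derivative_kwargs": "jac_kwargs",
    "criterion_and_derivative_kwargs": "fun_and_jac_kwargs",
}


def modernize_kwargs(kwargs):
    """Return a copy of kwargs with deprecated names replaced by new ones.

    Hypothetical migration helper; unknown names pass through unchanged.
    """
    return {RENAMED_ARGUMENTS.get(name, name): value for name, value in kwargs.items()}


new = modernize_kwargs({"criterion": "my_fun", "algorithm": "scipy_lbfgsb"})
# new == {"fun": "my_fun", "algorithm": "scipy_lbfgsb"}
```

Note that this only covers the simple renames; the Bounds, ScalingOptions, MultistartOptions and Constraint migrations change argument types, not just names, and need manual attention.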

v0.5.0rc2

22 Aug 15:52
95891f5


Pre-release

Second release candidate for optimagic 0.5.0

This release removes nlopt as a mandatory pip dependency because the pip installation of nlopt is unstable.

v0.5.0rc1

22 Aug 15:13
a0c192e


Pre-release

First release candidate for version 0.5.0


v0.4.7

15 Jul 08:17
21cf398



This release contains minor improvements and bug fixes. It is the last release before
the package is renamed to optimagic and two large enhancement proposals are
implemented.

v0.4.6

05 Jun 15:43
d25d4c2


This release drastically improves the optimizer benchmarking capabilities, especially
with noisy functions and parallel optimizers. It makes tranquilo and numba optional
dependencies and is the first version of estimagic to be compatible with Python
3.11.

  • #464 Makes tranquilo and numba optional dependencies (@janosg)
  • #461 Updates docstrings for process_benchmark_results (@segsell)
  • #460 Fixes several bugs in the processing of benchmark results with noisy
    functions (@janosg)
  • #459 Prepares benchmarking functionality for parallel optimizers
    (@mpetrosian and @janosg)
  • #457 Removes some unused files (@segsell)
  • #455 Improves a local pre-commit hook (@ChristianZimpelmann)

v0.4.5

10 Apr 11:39
582caa4


v0.4.4

17 Feb 10:58
01a4f9c


v0.4.3

13 Dec 16:11
0335246


This is a minor release that mainly improves the installation.

  • #416 pins a minimum scipy version and adds numba as pip dependency. It also adds bounds support for scipy neldermead (@janosg)