A Python package for constrained global optimization using Bayesian inference and Gaussian processes.
BayesianOptimization is a Python library for performing constrained global optimization using Bayesian inference and Gaussian processes. It is designed to efficiently find the maximum of an unknown, expensive-to-evaluate function by balancing exploration of new areas with exploitation of known promising regions.
Data scientists, machine learning engineers, and researchers who need to optimize high-cost black-box functions, such as hyperparameter tuning for machine learning models or simulation-based optimization.
Developers choose BayesianOptimization because it minimizes the number of function evaluations required, cutting the cost of each search, and because it offers a robust, pure-Python implementation with support for advanced features such as constrained optimization and non-float parameters.
A Python implementation of global optimization with Gaussian processes.
Minimizes function evaluations by modeling the objective with a Gaussian process and selecting the next point via acquisition functions such as Upper Confidence Bound (UCB) or Expected Improvement (EI), making it well suited to high-cost optimizations such as hyperparameter tuning.
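The loop behind this claim can be sketched in plain Python: fit a Gaussian process to the points evaluated so far, score candidate inputs with a UCB acquisition function, and spend the next expensive evaluation only on the most promising candidate. The toy 1-D sketch below (RBF kernel, grid of candidates, quadratic stand-in objective) is illustrative only; the helper names `gp_posterior`, `ucb`, etc. are not the library's API.

```python
import math

def rbf(a, b, length_scale=1.0):
    """Squared-exponential kernel for 1-D inputs."""
    return math.exp(-((a - b) ** 2) / (2 * length_scale ** 2))

def cholesky(K):
    """Lower-triangular Cholesky factor of a positive-definite matrix."""
    n = len(K)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(max(K[i][i] - s, 1e-12))  # guard rounding
            else:
                L[i][j] = (K[i][j] - s) / L[j][j]
    return L

def solve_spd(L, b):
    """Solve (L L^T) x = b by forward then backward substitution."""
    n = len(L)
    y = [0.0] * n
    for i in range(n):
        y[i] = (b[i] - sum(L[i][k] * y[k] for k in range(i))) / L[i][i]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(L[k][i] * x[k] for k in range(i + 1, n))) / L[i][i]
    return x

def gp_posterior(xs, ys, xq, noise=1e-6):
    """Posterior mean and variance of a zero-mean GP at query point xq."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (noise if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    L = cholesky(K)                     # the O(n^3) step in the number of observations
    alpha = solve_spd(L, ys)            # K^-1 y
    k = [rbf(x, xq) for x in xs]
    mu = sum(ki * ai for ki, ai in zip(k, alpha))
    var = rbf(xq, xq) - sum(ki * vi for ki, vi in zip(k, solve_spd(L, k)))
    return mu, max(var, 0.0)

def ucb(mu, var, kappa=2.0):
    """Upper Confidence Bound: high mean (exploit) plus high uncertainty (explore)."""
    return mu + kappa * math.sqrt(var)

# Maximize a toy "expensive" function on [0, 1]; its true optimum is at x = 0.7.
f = lambda x: -(x - 0.7) ** 2
xs, ys = [0.0, 1.0], [f(0.0), f(1.0)]
grid = [i / 100 for i in range(101)]
for _ in range(10):
    # Refit the GP and pick the candidate that maximizes the acquisition.
    x_next = max(grid, key=lambda x: ucb(*gp_posterior(xs, ys, x)))
    xs.append(x_next)
    ys.append(f(x_next))
best_x = xs[max(range(len(ys)), key=ys.__getitem__)]
```

Note how only 12 objective evaluations are spent in total; the GP is refit from scratch at every iteration, which is also where the latency and cubic-scaling caveats below come from.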
Easy to install via pip or conda with a clean API for specifying parameter bounds and running optimizations, lowering the barrier to entry.
Includes constrained optimization and non-float parameter handling, implemented following published research, broadening its applicability to complex real-world problems.
Provides stable, versioned documentation with worked examples and citation guidelines, supporting both academic use and practical adoption.
Fitting Gaussian processes at each iteration adds latency, making it unsuitable for real-time applications or optimizations where function evaluations are very fast.
Requires predefined min and max bounds for all parameters, which can be restrictive if the search space is unknown or dynamically changing.
Gaussian process inference scales cubically with the number of observations, so iterations slow down as a run accumulates evaluations; model quality and efficiency also degrade as the number of parameters grows beyond moderate dimensionality.