BayesOpt: A toolbox for Bayesian optimization, experimental design and stochastic bandits.

Overview

BayesOpt: A Bayesian optimization library

BayesOpt is an efficient implementation of the Bayesian optimization methodology for nonlinear optimization, experimental design and hyperparameter tuning.

Bayesian optimization uses a distribution over functions to build a surrogate model of the unknown function whose optimum we are looking for, and then applies an active learning strategy to select the query points with the greatest potential interest or improvement. It is therefore a sample-efficient method for nonlinear optimization, design of experiments and simulations, and bandit-like problems. It is currently used in many scientific and industrial applications. In the literature it is also called Sequential Kriging Optimization (SKO), Sequential Model-Based Optimization (SMBO) or Efficient Global Optimization (EGO).
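
As a quick illustration, here is a minimal sketch of the C++ interface. It is a hedged example: it assumes the bayesopt/bayesopt.hpp header and the ContinuousModel/evaluateSample API of recent releases, and that the search box defaults to the unit hypercube; see the online documentation for the exact signatures. The user implements the target function, and the library builds the surrogate and selects the query points.

    #include <bayesopt/bayesopt.hpp>   // assumed install path of the C++ API header

    // Toy objective: a shifted quadratic, minimized over the (assumed) default unit box.
    class QuadraticExample : public bayesopt::ContinuousModel
    {
    public:
      QuadraticExample(size_t dim, bopt_params params)
        : ContinuousModel(dim, params) {}

      // Function to be minimized, evaluated at a single query point.
      double evaluateSample(const boost::numeric::ublas::vector<double>& x)
      {
        double sum = 0.0;
        for (size_t i = 0; i < x.size(); ++i)
          sum += (x(i) - 0.3) * (x(i) - 0.3);
        return sum;
      }
    };

    int main()
    {
      const size_t dim = 2;
      bopt_params params = initialize_parameters_to_default();
      QuadraticExample opt(dim, params);
      boost::numeric::ublas::vector<double> best(dim);
      opt.optimize(best);   // builds the surrogate and runs the query-selection loop
      return 0;
    }

The DiscreteModel class follows the same pattern for a finite set of candidate points.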

BayesOpt is licensed under the AGPL and it is free to use. However, if you use BayesOpt in a work that leads to a scientific publication, we would appreciate it if you would kindly cite BayesOpt in your manuscript.

Ruben Martinez-Cantin, BayesOpt: A Bayesian Optimization Library for Nonlinear Optimization, Experimental Design and Bandits. Journal of Machine Learning Research, 15(Nov):3735--3739, 2014.

The paper can be found at http://jmlr.org/papers/v15/martinezcantin14a.html

Commercial applications may also acquire a commercial license. Please contact [email protected] for details.

Getting and installing BayesOpt

The library can be downloaded from GitHub: https://github.com/rmcantin/bayesopt

You can also get the cutting-edge version by cloning the repository:

>> git clone https://github.com/rmcantin/bayesopt

The online documentation can be found at http://rmcantin.github.io/bayesopt/html/ and includes an install guide.

Questions and issues

Please use the issue tracker at https://github.com/rmcantin/bayesopt/issues for questions and bug reports.

Copyright (C) 2011-2020 Ruben Martinez-Cantin [email protected]

BayesOpt is free software: you can redistribute it and/or modify it under the terms of the GNU Affero General Public License as published by the Free Software Foundation, version 3 of the License.

BayesOpt is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Affero General Public License for more details.

You should have received a copy of the GNU Affero General Public License along with BayesOpt. If not, see http://www.gnu.org/licenses/.


Comments
  • Build error on Ubuntu 17.04

    I cloned the repo, created a "build" directory inside and ran "cmake .." from within that directory. Finally, I attempted to run "make". The compilation seems to be going well till this point:

    [ 71%] Building CXX object CMakeFiles/bayesopt.dir/utils/fileparser.cpp.o
    /home/vlad/tools/bayesopt/utils/fileparser.cpp: In member function ‘bool bayesopt::utils::FileParser::fileExists()’:
    /home/vlad/tools/bayesopt/utils/fileparser.cpp:88:23: error: cannot convert ‘std::ifstream {aka std::basic_ifstream<char>}’ to ‘bool’ in initialization
         bool result = ifile;
                       ^~~~~
    CMakeFiles/bayesopt.dir/build.make:734: recipe for target 'CMakeFiles/bayesopt.dir/utils/fileparser.cpp.o' failed
    make[2]: *** [CMakeFiles/bayesopt.dir/utils/fileparser.cpp.o] Error 1
    CMakeFiles/Makefile2:67: recipe for target 'CMakeFiles/bayesopt.dir/all' failed
    make[1]: *** [CMakeFiles/bayesopt.dir/all] Error 2
    Makefile:127: recipe for target 'all' failed
    make: *** [all] Error 2

    opened by usptact 4
  • Python 3 support

    There isn't much Python code in this repo at all, just demos really. The interface is written in Cython which emits code that works with Python 2 or 3.

    Because of this, it is trivial to support both 2 and 3 from the same code base.

    Just some changes to print statements within the demos, and a simple bytes/strings conversion on the bopt parameters.

    opened by ericfrederich 3
  • Parallelization for matlab wrapper

    Is there any way to parallelize the function evaluations for the MATLAB implementation of bayesoptcont()?

    If not for the actual optimization (due to the sequential dependency), then at least for the initial function evaluations (set by n_init_samples)? If I can have 20 of those initial samples running simultaneously using HTCondor, it's pretty wasteful to do them serially.

    enhancement 
    opened by san-bil 3
  • Use BayesOPT to optimize categorical variables

    Hey Ruben, Sorry to disturb you. I have a question about categorical variables. My inputs are 8 binary variables (0/1). Here is the running status and the results.

    bayesopt1.txt log1.txt bayesopt2.txt log2.txt

    When "mParameters.noise" is small e.g. 1e-10, there is a error in log1.txt. But when "mParameters.noise" is equal to 1.0, there is not any error. Why did this happen ? This question has been bothering me for a long time. Have a favor.

    Thanks a lot.

    Cui

    opened by csjtx1021 2
  • compatibility with python3

    According to the Python code, it seems to be compatible with Python 3, but when I tried to install it with Python 3.5 and import bayesopt, this error occurred:

    ImportError: dynamic module does not define module export function(PyInit_bayesopt)

    Any idea how to solve this? Many thanks.

    opened by NanyangYe 2
  • Segfault for low discrete parameter space

    I cannot get the DiscreteModel to run without a segmentation fault (the continuous model works fine).

    It might be connected to the discrete parameter space. The error can be reproduced by changing the number of discrete points (line 80) in examples/bo_disc.cpp:

    const size_t nPoints = 1000;  // large space - works fine
    const size_t nPoints = 10;    // small space - segfaults
    

    The first output of valgrind:

    ==64059== Memcheck, a memory error detector
    ==64059== Copyright (C) 2002-2015, and GNU GPL'd, by Julian Seward et al.
    ==64059== Using Valgrind-3.12.0 and LibVEX; rerun with -h for copyright info
    ==64059== Command: ./bin/bo_disc
    ==64059==
    Running C++ interface
    - 12:30:51.665538 INFO: Expected 6 hyperparameters. Replicating parameters and prior.
    - 12:30:51.728427 INFO: Using default parameters for criteria.
    ==64059== Invalid read of size 8
    ==64059==    at 0x43D25A: bayesopt::DiscreteModel::generateInitialPoints(boost::numeric::ublas::matrix<double, boost::numeric::ublas::basic_row_major<unsigned long, long>, boost::numeric::ublas::unbounded_array<double, std::allocator<double> > >&) (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
    ==64059==    by 0x442E43: bayesopt::BayesOptBase::initializeOptimization() (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
    ==64059==    by 0x44383C: bayesopt::BayesOptBase::optimize(boost::numeric::ublas::vector<double, boost::numeric::ublas::unbounded_array<double, std::allocator<double> > >&) (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
    ==64059==    by 0x43528C: main (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
    ==64059==  Address 0x5a94bd8 is 8 bytes after a block of size 240 alloc'd
    ==64059==    at 0x4C2A6F0: operator new(unsigned long) (in /usr/lib64/valgrind/vgpreload_memcheck-amd64-linux.so)
    ==64059==    by 0x43D355: bayesopt::DiscreteModel::generateInitialPoints(boost::numeric::ublas::matrix<double, boost::numeric::ublas::basic_row_major<unsigned long, long>, boost::numeric::ublas::unbounded_array<double, std::allocator<double> > >&) (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
    ==64059==    by 0x442E43: bayesopt::BayesOptBase::initializeOptimization() (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
    ==64059==    by 0x44383C: bayesopt::BayesOptBase::optimize(boost::numeric::ublas::vector<double, boost::numeric::ublas::unbounded_array<double, std::allocator<double> > >&) (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
    ==64059==    by 0x43528C: main (in /home/bach_tm/git_repos_temporary/bayesopt/build/bin/bo_disc)
    ==64059==
    

    Compiler used: gcc / g++ 4.8.5
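
    A hedged workaround sketch, not a confirmed diagnosis: the invalid read inside generateInitialPoints suggests the initial design may assume at least as many candidate points as initial samples, so keeping n_init_samples below nPoints may sidestep the crash while the root cause is investigated:

    // Assumed workaround (fragment), based on the valgrind trace above.
    bopt_params par = initialize_parameters_to_default();
    par.n_init_samples = 5;   // assumption: must not exceed nPoints (= 10 here)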

    opened by tfachmann 1
  • center of search space

    When optimising multidimensional functions, I find that the center of the upper and lower bounds is often sampled. This is the precise center every time, which makes me think that there is a hard-coded probability of sampling the center somewhere, but I've not found where.

    The reason I'm asking is that I would like to benchmark this package, and many of the benchmark functions have their optimum near or at the center of the search space. These are the settings I've used for benchmarking purposes:

    sc_type = SC_MAP
    l_type = L_MCMC
    init_method=1
    n_inner_iterations=5000
    n_iter_relearn=1
    l_all=1
    

    I have also tried force_jump = 0; however, I still have the same problem of sampling the precise center. For example, optimising a 20-dimensional Rosenbrock function in the full space (-5.0, 5.0) for every dimension performs better than reducing the search space to (-5.0, 2.0), because the center gets sampled. Is there a way for me to turn this off to make the benchmark fairer?

    Also, if you have suggestions for settings (other than those I showed) for benchmark purposes, I'd be happy to change them. The run time is not important; I'm interested in reducing the function value as much as possible per iteration.
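
    For reference, a hedged sketch of how the settings listed above would look on the C/C++ side, assuming they map directly onto the bopt_params fields of the same names (the enum values SC_MAP and L_MCMC are taken from the list above):

    // Assumed C/C++ equivalent of the benchmark settings above.
    bopt_params par = initialize_parameters_to_default();
    par.sc_type = SC_MAP;             // hyperparameter score function
    par.l_type = L_MCMC;              // hyperparameter learning method
    par.init_method = 1;
    par.n_inner_iterations = 5000;
    par.n_iter_relearn = 1;
    par.l_all = 1;
    par.force_jump = 0;               // also tried by the reporter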

    opened by MrUrq 1
  • Issue in MATLAB compilation

    Hi, I have been trying to install this on the Windows platform in MATLAB 2018a. After building successfully using mingw32, matlab_compile.m shows the error: "MEX cannot find library 'bayesopt' specified with the -l option. MEX looks for a file with one of the names: libbayesopt.lib, bayesopt.lib. Please specify the path to this library with the -L option."

    The library files generated after building the source code are located in build/lib; however, they are named libbayesopt.a and libnlopt.a, while MATLAB is looking for .lib files. How can this issue be resolved?

    opened by earthat 1
  • noise effect

    Regarding noise, the documentation says: "Too much noise results in slow convergence while not enough noise might result in not converging at all." Why is that?

    opened by zhaozhongch 1
  • Why it doesn't converge to the right value

    Hi, thanks for viewing the question. I run the Bayesian optimization for my problem and get the data below. The first column is the output y and the second column is x:

    317.402,-0.278159    // 10 samples, 70 iterations
    485.787,-0.325544
    489.675,-0.256076
    577.022,-0.244675
    339.859,-0.310618
    289.268,-0.282267
    603.213,-0.241337
    383.85,-0.314427
    428.921,-0.265534
    264.013,-0.296477    // samples end
    296.115,-0.2817
    297.359,-0.2817
    296.15,-0.2817
    295.788,-0.2817
    296.233,-0.2817
    296.172,-0.2817
    295.955,-0.2817
    295.969,-0.2817
    295.711,-0.2817
    295.864,-0.2817
    296.037,-0.2817
    296.08,-0.2817
    .....

    As you can see, the value of x converges to -0.2817; however, when x = -0.296477 the output is smaller (264). In fact, as I ran my function adding 0.001 to x from -0.33 to 0.23, I found that when x is about -0.2917, y outputs the smallest value. In my problem the output is not deterministic, because of an algorithm called RANSAC, but it stays near a certain value: as you can see, when x = -0.2817, y has different values but doesn't change too much. So what might be the problem? Why can't bayesopt find the minimum? I use the default parameters.

    opened by zhaozhongch 1
  • Fixing "cannot convert std::ifstream to bool" (#18)

    According to the GCC 6 porting guide, as mentioned in: https://stackoverflow.com/questions/38659115/make-fails-with-error-cannot-convert-stdistream-aka-stdbasic-istreamchar
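
    In essence (a minimal illustration, not the exact fileparser.cpp patch): since C++11, std::basic_ios::operator bool is explicit, so the copy-initialization rejected by GCC 6 and later needs an explicit conversion:

    #include <fstream>
    #include <string>

    // Illustrative only; names differ in the real fileparser.cpp.
    bool fileExists(const std::string& name)
    {
      std::ifstream ifile(name.c_str());
      // bool result = ifile;                  // error with GCC >= 6 in C++11 mode
      bool result = static_cast<bool>(ifile);  // OK; ifile.good() also works
      return result;
    }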

    opened by usptact 1
  • example: Build + Install on Google Colaboratory w/ Python 3.6

    Hi Ruben,

    Thank you for this very nice Bayesian Optimization library! It works very well, and has some well thought out defaults and features! :-)

    I managed to get this running on Google Colaboratory in a Python 3.6 environment (after many wasted hours), and I just wanted to share how I did this. I have a collaborator who is stuck on Windows, and this looks like it could be a possible solution for us.

    I'm not sure if this would be worth mentioning in your documentation as an option, but maybe another user would find this information helpful.

    This is fairly ugly, so I'd be interested to hear if anyone has any cleaner ways to get this up and running.

    The following are the commands I used to install this within a Google Colaboratory notebook:

    !apt install libboost-dev cmake cmake-curses-gui g++ python3-dev libboost-dev cmake cmake-curses-gui g++ cython3 freeglut3-dev
    rm -rf /usr/include/numpy
    !ln -s /usr/local/lib/python3.6/dist-packages/numpy/core/include/numpy /usr/include/numpy
    !git clone https://github.com/rmcantin/bayesopt
    cd bayesopt/
    !cmake -DBAYESOPT_PYTHON_INTERFACE=ON -DPYTHON_LIBRARY=/usr/lib/python3.6/config-3.6m-x86_64-linux-gnu/libpython3.6m.so -DPYTHON_INCLUDE_DIR=$(python-config --prefix)/include/python3.6 -DPYTHON_NUMPY_INCLUDE_DIR=/usr/lib/python3.6/dist-packages/numpy/core/include . && make && make install
    cp /usr/lib/python2.7/dist-packages/bayes* /usr/lib/python3.6/
    cd python/
    %run demo_distance.py
    

    And here is a sample notebook: https://colab.research.google.com/drive/1ajWJGdrZCdfRML4O6Ltv2NpOE4w_oyFF

    Very excited to optimize some functions now!

    All the best, CJ

    enhancement 
    opened by cjekel 2
  • Access to the surrogate model through C API?

    Hi

    I like your package and wrote a little Julia wrapper.

    One thing I couldn't easily figure out was how to access the surrogate model after fitting (my C++ is a bit limited). Is there an easy way (ideally through functions similar to the ones in the current C API) to access the surrogate model (e.g. inspecting kernel parameters, sampling from the model, or evaluating the mean and sigma for some inputs)?

    opened by jbrea 2
  • Compatibility with python 3.6

    I'm using Python 3.6.3 :: Anaconda custom (64-bit). I get the following error when I try running the examples:

    File "demo_quad.py", line 22, in <module>
        import bayesopt
    ImportError: dynamic module does not define module export function (PyInit_bayesopt)

    I have tried regenerating bayesopt.cpp with Cython and rebuilding and installing the entire code base with the new cpp, with no luck. I know that this issue has cropped up before (#10 and #11), but I still face the same problem with Python 3.6.3. Thanks, Kumar

    opened by krish240574 1
  • Returning minimum of mean instead of minimum sample

    Currently the optimization routine returns the x_i with the smallest observed y_i. While this makes sense for deterministic functions, it doesn't make as much sense for stochastic functions, where the minimum of the mean doesn't always coincide with the sample minimum. For this reason, it would be good to have an option for retrieving the minimum of the mean function instead of the minimum sampled value.

    enhancement 
    opened by akangasr 1
  • Mixed type optimization (continuous, discrete, categorical)

    Has there been any thought of implementing an optimization routine (with an interface to Python) that can optimize multiple types of parameters at once? This is often needed when optimizing machine learning algorithms. I'd really like to use it with deep learning, which has continuous, discrete and categorical hyperparameters.

    enhancement 
    opened by PiranjaF 2