A Theoretical Framework for Bayesian Optimization Convergence

This is a Master's thesis from KTH / Optimeringslära och systemteori (Optimization and Systems Theory)

Author: Alexandre Scotto Di Perrotolo (2018)


Abstract: Bayesian optimization is a well-known class of derivative-free optimization algorithms, used mainly for expensive black-box objective functions. Despite their efficiency, these algorithms lack a rigorous convergence criterion, which makes them more likely to be used as modeling tools than as optimization tools. This master's thesis proposes, analyzes, and tests a globally convergent framework for Bayesian optimization algorithms, that is, one that converges to a stationary point regardless of the initial sample. The framework is designed to preserve the global search behavior for minima while being rigorously monitored to guarantee convergence.
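For context, the sketch below illustrates a generic Bayesian optimization loop of the kind the abstract refers to: a Gaussian process surrogate is fit to the evaluated points and an expected-improvement acquisition function selects the next evaluation. This is a minimal, hypothetical illustration of the algorithm class, not the convergent framework proposed in the thesis; the objective, bounds, and all parameter choices are assumptions made for the example.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expected_improvement(X_cand, gp, y_best, xi=0.01):
    # Expected improvement acquisition for minimization.
    mu, sigma = gp.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-12)
    imp = y_best - mu - xi
    z = imp / sigma
    return imp * norm.cdf(z) + sigma * norm.pdf(z)

def bayesian_optimize(f, bounds, n_init=5, n_iter=25, seed=0):
    # Generic Bayesian optimization loop: fit a GP surrogate to the
    # evaluated points, then pick the candidate that maximizes expected
    # improvement over a random candidate set (illustrative only).
    rng = np.random.default_rng(seed)
    dim = bounds.shape[0]
    X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_init, dim))
    y = np.array([f(x) for x in X])
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    for _ in range(n_iter):
        gp.fit(X, y)
        cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(2048, dim))
        ei = expected_improvement(cand, gp, y.min())
        x_next = cand[np.argmax(ei)]
        X = np.vstack([X, x_next])
        y = np.append(y, f(x_next))
    return X[np.argmin(y)], y.min()

if __name__ == "__main__":
    # Toy "expensive" black-box objective (hypothetical example).
    objective = lambda x: (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2 + np.sin(3 * x[0])
    x_best, y_best = bayesian_optimize(objective, np.array([[-5.0, 5.0], [-5.0, 5.0]]))
    print(x_best, y_best)
```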
