References #
My own understanding of Bayesian optimization is limited; this article mainly draws on the following material 👇
- 【Machine Learning】Understanding Bayesian Optimization in One Article
- Hyperparameter Optimization: Bayesian Optimization and Its Improvement (PBT Optimization)
Advantages and Algorithm Principles #
This section focuses on the advantages of Bayesian optimization and its algorithmic principles. If you are only interested in “how to use it,” you can first learn about the advantages of Bayesian optimization and then skip to MATLAB Usage.
Advantages #
Algorithm Principles #
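The core loop of Bayesian optimization is standard: fit a surrogate model (typically a Gaussian process) to the evaluations made so far, then maximize an acquisition function over the search space to choose the next point to evaluate. The sketch below is a minimal illustration in plain Python (not MATLAB, and independent of how bayesopt implements this internally); the GP posterior and expected-improvement formulas are the textbook ones, and all function names here are my own.

```python
import math
import numpy as np

def rbf(a, b, length=1.0):
    """Squared-exponential kernel between 1-D point sets a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(X, y, Xs, jitter=1e-6):
    """Posterior mean and std of a zero-mean GP at candidate points Xs."""
    K = rbf(X, X) + jitter * np.eye(len(X))
    L = np.linalg.cholesky(K)
    Ks = rbf(X, Xs)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = 1.0 - np.sum(v ** 2, axis=0)  # rbf(x, x) == 1
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    """EI for minimization: E[max(best - f, 0)] with f ~ N(mu, sigma^2)."""
    z = (best - mu) / sigma
    Phi = 0.5 * (1.0 + np.vectorize(math.erf)(z / math.sqrt(2.0)))
    phi = np.exp(-0.5 * z ** 2) / math.sqrt(2.0 * math.pi)
    return (best - mu) * Phi + sigma * phi

def bayes_opt(f, lo, hi, n_init=3, n_iter=10, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, n_init)           # a few random starting points
    y = np.array([f(x) for x in X])
    cand = np.linspace(lo, hi, 201)           # candidate grid for the acquisition
    for _ in range(n_iter):
        mu, sigma = gp_posterior(X, y, cand)  # 1. fit surrogate
        ei = expected_improvement(mu, sigma, y.min())
        x_next = cand[np.argmax(ei)]          # 2. maximize acquisition
        X = np.append(X, x_next)              # 3. evaluate and repeat
        y = np.append(y, f(x_next))
    return X[np.argmin(y)], y.min()

# Minimize (x - 2)^2 on [0, 5]; the loop should home in on x near 2.
x_best, y_best = bayes_opt(lambda x: (x - 2.0) ** 2, 0.0, 5.0)
print(x_best, y_best)
```

The key design point is that the surrogate is cheap to query, so the acquisition function can be optimized densely even though the real objective is expensive; EI balances exploitation (low predicted mean) against exploration (high predictive uncertainty).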
MATLAB Usage #
Code Overview #
% Define optimization variables
vars = [optimizableVariable('x1', [-2, 2])
        optimizableVariable('x2', [-2, 2])];

% Perform Bayesian optimization
results = bayesopt(@objectiveFcn, vars, ...
    'AcquisitionFunctionName', 'expected-improvement-plus', ...
    'MaxObjectiveEvaluations', 30, ...
    'IsObjectiveDeterministic', true, ...
    'Verbose', 1);

% View results
bestPoint = results.XAtMinObjective;
bestObjective = results.MinObjective;
fprintf('Optimal solution x1: %.4f, x2: %.4f\n', bestPoint.x1, bestPoint.x2);
fprintf('Optimal objective value: %.4f\n', bestObjective);

% Define the objective function (in a script, local functions must come
% after the script code; x arrives as a one-row table of the variables)
function y = objectiveFcn(x)
    y = (1 - x.x1)^2 + 100 * (x.x2 - x.x1^2)^2;
end
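The objective used above is the Rosenbrock function, whose global minimum is 0 at (1, 1), inside the [-2, 2] × [-2, 2] search box. A quick check of the formula in plain Python (not MATLAB; the helper name is mine):

```python
def objective(x1, x2):
    """Rosenbrock function, the same formula as objectiveFcn above."""
    return (1 - x1) ** 2 + 100 * (x2 - x1 ** 2) ** 2

print(objective(1.0, 1.0))  # global minimum: 0.0
print(objective(0.0, 0.0))  # 1.0
```

Rosenbrock is a common benchmark for optimizers because its minimum sits in a long, narrow curved valley that is easy to reach but hard to traverse.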
Parameter Explanation #
| Params | Meaning |
|---|---|
| AcquisitionFunctionName | Selects the acquisition function, which determines how the algorithm chooses the next sampling point after each iteration. |
| MaxObjectiveEvaluations | Maximum number of objective function evaluations. |
| IsObjectiveDeterministic | Set to true if the objective function is deterministic (no noise); otherwise set to false. |
| Verbose | Controls the verbosity of the command-line display (0 = none, 1 = default, 2 = detailed). |
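On IsObjectiveDeterministic: an objective is "deterministic" when repeated evaluations at the same point always return the same value. A small illustration in plain Python (not MATLAB; the random noise here is just a stand-in for, say, a cross-validation score that varies between training runs):

```python
import random

def deterministic(x):
    """Same input always gives the same output -> set the flag to true."""
    return (x - 1.0) ** 2

_rng = random.Random(42)  # shared generator, so repeated calls draw new noise

def noisy(x):
    """Output varies between calls at the same x -> set the flag to false."""
    return (x - 1.0) ** 2 + _rng.gauss(0.0, 0.1)

print(deterministic(0.5) == deterministic(0.5))  # True
print(noisy(0.5) == noisy(0.5))                  # False
```

Declaring the objective noisy tells the optimizer to model an observation-noise term instead of forcing its surrogate to interpolate every (possibly contradictory) measurement exactly.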
For specific options for each parameter, refer to the official documentation: bayesopt. The official documentation is very detailed and includes many examples.
One of the essential skills for mathematical modelers is reading documentation 😝