The presentation below, “Using Bayesian Optimization to Tune Machine Learning Models” by Scott Clark of SigOpt is from MLconf.
The talk briefly introduces Bayesian Global Optimization as an efficient way to optimize machine learning model parameters, especially when evaluating different parameters is time-consuming or expensive.
The talk motivates the problem and gives example applications.
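To make the idea concrete, here is a minimal sketch of Bayesian optimization in one dimension, not taken from the talk or from SigOpt's product: a Gaussian-process surrogate models an expensive objective (standing in for, say, a model's validation loss as a function of one hyperparameter), and a lower-confidence-bound acquisition function picks the next point to evaluate. The objective, kernel length scale, and candidate grid are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for an expensive evaluation, e.g. training a
# model and returning its validation loss for hyperparameter x.
def objective(x):
    return np.sin(3.0 * x) + 0.5 * x

def rbf_kernel(a, b, length=0.3):
    diff = a[:, None] - b[None, :]
    return np.exp(-0.5 * (diff / length) ** 2)

def gp_posterior(X, y, Xs, noise=1e-4):
    """Gaussian-process posterior mean/variance at candidate points Xs."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    K_inv = np.linalg.inv(K)
    mu = Ks.T @ K_inv @ y
    var = np.diag(rbf_kernel(Xs, Xs) - Ks.T @ K_inv @ Ks)
    return mu, np.clip(var, 1e-12, None)

# Candidate grid over the search space, plus a few random initial samples.
candidates = np.linspace(0.0, 2.0, 201)
sampled_idx = list(rng.choice(len(candidates), size=3, replace=False))

for _ in range(12):
    X = candidates[sampled_idx]
    y = objective(X)
    mu, var = gp_posterior(X, y, candidates)
    # Lower-confidence-bound acquisition: trade off exploitation
    # (low predicted mean) against exploration (high uncertainty).
    lcb = mu - 2.0 * np.sqrt(var)
    lcb[sampled_idx] = np.inf  # don't re-evaluate known points
    sampled_idx.append(int(np.argmin(lcb)))

X = candidates[sampled_idx]
y = objective(X)
best_x, best_y = X[np.argmin(y)], y.min()
print(f"best x = {best_x:.3f}, best objective = {best_y:.3f}")
```

The point of the exercise is the budget: the surrogate lets the loop concentrate its 15 total evaluations near the minimum rather than sweeping the whole grid, which is what makes the approach attractive when each evaluation means retraining a model.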
Clark also talks about the development of a robust benchmark suite for SigOpt's algorithms, including test selection, metric design, infrastructure architecture, visualization, and comparison to other standard and open source methods.
He discusses how this evaluation framework empowers the company's research engineers to confidently and quickly make changes to its core optimization engine.
The talk ends with an in-depth example of using these methods to tune the features and hyperparameters of a real-world problem, and gives several real-world applications.
The slides for this presentation are available HERE.