Table of Contents

Automatic Machine Learning
Common Pitfalls for Studying the Human Side of Machine Learning
Statistical Learning Theory: a Hitchhiker's Guide
Unsupervised Deep Learning
Adversarial Robustness, Theory and Practice
Visualization for Machine Learning
Counterfactual Inference
Scalable Bayesian Inference
Negative Dependence, Stable Polynomials and All That

Automatic Machine Learning
Speakers: Frank Hutter and Joaquin Vanschoren

Tutorial Session: Automatic Machine Learning

Welcome back to the NeurIPS 2018 Tutorial Session. This tutorial, Automatic Machine Learning, covers the methods underlying the current state of the art in this fast-paced field. Presented by Frank Hutter (University of Freiburg) and Joaquin Vanschoren (Eindhoven University of Technology, OpenML).

Posted by Neural Information Processing Systems on Monday, December 3, 2018

Tutorial summary

Building an end-to-end machine learning model involves a number of steps, such as preprocessing the data, creating features, selecting a model, and tuning the hyperparameters. Automatic Machine Learning, or AutoML, aims to automate these processes, and this tutorial covers the methods underlying the state of the art in AutoML, a highly relevant topic in today's environment. Frank Hutter kicked off the tutorial by discussing the various applications of deep learning and an expert's role in building a successful model. That role can potentially be replaced by an AutoML service that tries to learn the features, architecture and parameters to use from the raw data we provide. Following this basic introduction to AutoML, Frank spoke about the types of hyperparameters and modern approaches to hyperparameter optimisation. This is
broadly divided into three sub-topics:

AutoML as hyperparameter optimization
Blackbox optimization: discusses approaches to blackbox optimization such as grid search, random search and Bayesian optimization.
Beyond blackbox optimization: covers three main approaches, namely hyperparameter gradient descent, extrapolation of learning curves and multi-fidelity optimization. Meta-learning is also part of this aspect.

The next topic Frank covered was Neural Architecture Search, which is again divided into three parts: search space design, blackbox optimization and beyond-blackbox optimization.
Search space design: includes basic neural architecture search spaces such as chain-structured and cell-structured search spaces.
Blackbox optimization for neural architecture search (NAS): Frank covered NAS with reinforcement learning and Bayesian optimization as part of this topic.
Beyond blackbox optimization: a discussion of four main approaches, namely weight inheritance and network morphisms, weight sharing and the one-shot model, multi-fidelity optimization, and meta-learning.

After a short Q&A session with Frank, Joaquin Vanschoren took over for the second half of the tutorial.

More details
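To make the blackbox-optimization idea above concrete, here is a minimal sketch of random search over a hyperparameter space. The objective function and the search space here are hypothetical stand-ins (in practice the objective would be the validation loss of a fully trained model, which the optimizer treats as a black box), not code from the tutorial:

```python
import random

def objective(lr, batch_size):
    # Hypothetical stand-in for validation loss; a real AutoML system
    # would train and evaluate a model here. Best near lr=0.1, batch_size=64.
    return (lr - 0.1) ** 2 + (batch_size - 64) ** 2 / 1e4

def random_search(n_trials=100, seed=0):
    """Blackbox optimization: sample configurations, keep the best one."""
    rng = random.Random(seed)
    best_cfg, best_loss = None, float("inf")
    for _ in range(n_trials):
        cfg = {
            "lr": 10 ** rng.uniform(-4, 0),              # log-uniform learning rate
            "batch_size": rng.choice([16, 32, 64, 128]),  # categorical choice
        }
        loss = objective(**cfg)  # one blackbox evaluation
        if loss < best_loss:
            best_cfg, best_loss = cfg, loss
    return best_cfg, best_loss

best_cfg, best_loss = random_search()
print(best_cfg, best_loss)
```

Grid search would enumerate a fixed lattice of configurations instead of sampling, and Bayesian optimization would fit a surrogate model to the evaluations so far to decide which configuration to try next; both fit the same evaluate-and-compare loop shown here.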