Hyperparameter Search With Optuna: Part 2 – XGBoost Classification and Ensembling
In this article, we use the tree-structured Parzen estimator (TPE) algorithm via Optuna to find hyperparameters for XGBoost on the MNIST handwritten digits classification problem.
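For a flavor of what such a search looks like, here is a minimal sketch of a TPE-driven Optuna study for an XGBoost classifier. The hyperparameter ranges and the use of Scikit-learn's small digits data set (standing in for full MNIST) are illustrative assumptions, not the article's exact setup.

```python
import optuna
import xgboost as xgb
from sklearn.datasets import load_digits
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# small stand-in for MNIST, kept for brevity
X, y = load_digits(return_X_y=True)
X_train, X_valid, y_train, y_valid = train_test_split(X, y, test_size=0.2, random_state=0)

def objective(trial):
    # illustrative search space, not the article's exact ranges
    params = {
        "max_depth": trial.suggest_int("max_depth", 2, 10),
        "learning_rate": trial.suggest_float("learning_rate", 1e-3, 0.3, log=True),
        "n_estimators": trial.suggest_int("n_estimators", 50, 500),
        "subsample": trial.suggest_float("subsample", 0.5, 1.0),
    }
    model = xgb.XGBClassifier(**params)
    model.fit(X_train, y_train)
    return accuracy_score(y_valid, model.predict(X_valid))

# TPESampler is Optuna's tree-structured Parzen estimator sampler (also the default)
study = optuna.create_study(direction="maximize", sampler=optuna.samplers.TPESampler(seed=0))
study.optimize(objective, n_trials=25)
print(study.best_params, study.best_value)
```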
Hyperparameter Search With Optuna: Part 1 – Scikit-learn Classification and Ensembling
Optuna is a Python package for general function optimization. It also provides integrations with many popular machine learning packages that allow pruning algorithms to make hyperparameter searches more efficient. In this article we use Optuna to optimize hyperparameters for Scikit-learn machine learning algorithms.
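To sketch the pruning mechanism mentioned above: a model trained incrementally can report intermediate validation scores so that Optuna's pruner abandons unpromising trials early. The SGDClassifier and the ranges below are illustrative assumptions, not necessarily the article's model or search space.

```python
import numpy as np
import optuna
from sklearn.datasets import load_digits
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_valid, y_train, y_valid = train_test_split(X, y, test_size=0.25, random_state=0)
classes = np.unique(y_train)

def objective(trial):
    alpha = trial.suggest_float("alpha", 1e-6, 1e-1, log=True)
    clf = SGDClassifier(alpha=alpha, random_state=0)
    accuracy = 0.0
    for step in range(30):
        clf.partial_fit(X_train, y_train, classes=classes)  # one incremental pass
        accuracy = clf.score(X_valid, y_valid)
        trial.report(accuracy, step)   # intermediate value seen by the pruner
        if trial.should_prune():       # stop this trial early if it looks unpromising
            raise optuna.TrialPruned()
    return accuracy

study = optuna.create_study(direction="maximize", pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=30)
print(study.best_params)
```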
Paper: DeepSMILES: An Adaptation of SMILES for Use in Machine-Learning of Chemical Structures – O’Boyle and Dalke 2018
SMILES (Simplified Molecular Input Line Entry System) representations of molecules have found many uses in machine learning algorithms, especially those derived from natural language processing techniques. However, they were not designed for machine learning and thus suffer from various syntax issues that can hamper machine learning methods, especially generative methods. DeepSMILES is a modification of SMILES explicitly designed to address these issues.
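As a small illustration, the companion deepsmiles Python package converts between the two notations; the converter options below follow that package's documentation and are an assumption, not a detail taken from this summary.

```python
import deepsmiles

# rewrite both ring closures and branches into DeepSMILES form
converter = deepsmiles.Converter(rings=True, branches=True)

smiles = "c1ccccc1"                      # benzene in standard SMILES
encoded = converter.encode(smiles)       # DeepSMILES: no paired ring-closure digits
decoded = converter.decode(encoded)      # round-trip back to SMILES

print(smiles, "->", encoded, "->", decoded)
```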
Paper: Augmenting Genetic Algorithms with Deep Neural Networks for Exploring the Chemical Space – Nigam et al 2020
In this paper, the authors use a genetic algorithm operating on the SELFIES (SELF-referencIng Embedded Strings) representation of molecules to explore the vast space of small molecules. A neural network is used to guide the exploration process. Also, fitness functions are constructed to generate molecules with specific properties.
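To illustrate the representation itself (the genetic algorithm and the neural-network discriminator are beyond a short snippet), the selfies Python package encodes SMILES into SELFIES and decodes any SELFIES string back to a valid molecule; the aspirin example below is purely illustrative.

```python
import selfies as sf

smiles = "CC(=O)Oc1ccccc1C(=O)O"     # aspirin in SMILES (illustrative molecule)
selfies_str = sf.encoder(smiles)     # SMILES -> SELFIES tokens
roundtrip = sf.decoder(selfies_str)  # every syntactically valid SELFIES decodes to a molecule

print(selfies_str)
print(roundtrip)
```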
Paper: Optuna: A Next-generation Hyperparameter Optimization Framework – Akiba et al 2019
This paper introduces Optuna, a Python package for performing hyperparameter optimization and pruning for machine learning algorithms.
Hyperparameter Search With Bayesian Optimization for Keras (CNN) Classification and Ensembling
In this article we use the Bayesian Optimization (BO) package to determine hyperparameters for a 2D convolutional neural network classifier with Keras.
Hyperparameter Search With Bayesian Optimization for XGBoost Classification and Ensembling
In Hyperparameter Search With Bayesian Optimization for Scikit-learn Classification and Ensembling we applied the Bayesian Optimization (BO) package to the Scikit-learn ExtraTreesClassifier algorithm. Here we do the same for XGBoost.
Hyperparameter Search With Bayesian Optimization for Scikit-learn Classification and Ensembling
Bayesian Optimization (BO) is a lightweight Python package for finding the parameters of an arbitrary function that maximize a given cost function.
In this article, we demonstrate how to use this package to do hyperparameter search for a classification problem with Scikit-learn.
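Here is a minimal sketch of the package's usage pattern, with the ExtraTreesClassifier mentioned in the companion article; the parameter bounds and cross-validation settings are illustrative assumptions rather than the article's exact configuration.

```python
from bayes_opt import BayesianOptimization
from sklearn.datasets import load_digits
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import cross_val_score

X, y = load_digits(return_X_y=True)

def cv_accuracy(n_estimators, max_depth):
    # bayes_opt proposes floats, so cast the integer-valued hyperparameters
    clf = ExtraTreesClassifier(n_estimators=int(n_estimators),
                               max_depth=int(max_depth),
                               random_state=0)
    return cross_val_score(clf, X, y, cv=3).mean()

pbounds = {"n_estimators": (50, 500), "max_depth": (2, 30)}
optimizer = BayesianOptimization(f=cv_accuracy, pbounds=pbounds, random_state=1)
optimizer.maximize(init_points=5, n_iter=20)
print(optimizer.max)   # best cross-validated accuracy and the parameters that produced it
```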
Paper: Automatic Machine Learning by Pipeline Synthesis using Model-Based Reinforcement Learning and a Grammar – Drori et al 2019
We formulate the AutoML problem of pipeline synthesis as a single-player game, in which the player starts from an empty pipeline, and in each step is allowed to perform edit operations to add, remove, or replace pipeline components according to a pipeline grammar.
Paper: An Introduction to Variational Autoencoders – Kingma and Welling 2019
Variational autoencoders provide a principled framework for learning deep latent-variable models and corresponding inference models. In this work, we provide an introduction to variational autoencoders and some important extensions.
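For reference, the central objective in this framework is the evidence lower bound (ELBO) on the log-likelihood, maximized jointly over the generative parameters θ and the inference (encoder) parameters φ:

```latex
\log p_\theta(x) \;\ge\; \mathcal{L}(\theta,\phi;x)
  = \mathbb{E}_{q_\phi(z \mid x)}\!\left[\log p_\theta(x \mid z)\right]
  - D_{\mathrm{KL}}\!\left(q_\phi(z \mid x)\,\|\,p_\theta(z)\right)
```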
Paper: GraphVAE: Towards Generation of Small Graphs Using Variational Autoencoders – Simonovsky and Komodakis 2018
We approach the task of graph generation by devising a neural network able to translate vectors in a continuous code space to graphs. Our main idea is to output a probabilistic fully-connected graph and use a standard graph matching algorithm to align it to the ground truth.
Prediction of Small Molecule Lipophilicity: Part 5 – Ensemble of 2D Convolutional Neural Networks With Morgan (Circular) Fingerprints
Using Morgan (circular) fingerprints as input data, we create an ensemble of 2D convolutional neural networks (CNNs) with Keras, to predict lipophilicity values.
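For context, here is a minimal sketch of building such fingerprints with RDKit; the radius and bit-vector length below are common defaults, not necessarily the values used in the article.

```python
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem

smiles = "CCO"                                                   # ethanol, as an illustrative molecule
mol = Chem.MolFromSmiles(smiles)
fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)   # radius 2, 2048-bit vector
x = np.array(list(fp))                                           # 0/1 vector usable as network input

print(x.shape, int(x.sum()))
```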
Prediction of Small Molecule Lipophilicity: Part 4 – Ensemble of 1D Convolutional Neural Networks With Morgan (Circular) Fingerprints
Using Morgan (circular) fingerprints as input data, we create an ensemble of 1D convolutional neural networks (CNNs) with Keras, to predict lipophilicity values.
Prediction of Small Molecule Lipophilicity: Part 3 – Ensemble of Multilayer Perceptrons With Morgan (Circular) Fingerprints
Using Morgan (circular) fingerprints as input data, we create an ensemble of multilayer perceptrons with Keras, to predict lipophilicity values.
Paper: Auto-Keras: An Efficient Neural Architecture Search System – Jin et al. 2019
Network morphism, which keeps the functionality of a neural network while changing its neural architecture, could be helpful for NAS by enabling more efficient training during the search. In this paper, we propose a novel framework enabling Bayesian optimization to guide the network morphism for efficient neural architecture search. The framework develops a neural network kernel and a tree-structured acquisition function optimization algorithm to explore the search space efficiently.
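For context, a minimal usage sketch of the released Auto-Keras system is shown below (this is the later 1.x Python API, which may differ from the interface described in the paper; the trial count and epochs are illustrative).

```python
import autokeras as ak
from tensorflow.keras.datasets import mnist

(x_train, y_train), (x_test, y_test) = mnist.load_data()

# search over a small number of candidate architectures
clf = ak.ImageClassifier(max_trials=3, overwrite=True)
clf.fit(x_train, y_train, epochs=2)
print(clf.evaluate(x_test, y_test))
```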