Paper: Augmenting Genetic Algorithms with Deep Neural Networks for Exploring the Chemical Space – Nigam et al. 2020
In this paper, the authors use a genetic algorithm operating on the SELFIES (SELF-referencIng Embedded Strings) representation of molecules to explore the vast space of small molecules. A neural network guides the exploration process, and fitness functions are constructed to generate molecules with specific properties.
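To make the setup concrete, here is a minimal genetic-algorithm sketch over token strings in the spirit of the paper. The token alphabet, the toy fitness function, and the GA parameters are invented for illustration; the authors' actual setup uses the `selfies` library, chemistry-aware fitness functions, and a neural-network discriminator term. The key property being exploited is that every SELFIES string decodes to a valid molecule, so random token edits never produce invalid structures.

```python
import random

# Illustrative placeholder alphabet (not the full SELFIES alphabet).
ALPHABET = ["[C]", "[=C]", "[O]", "[N]", "[Branch1]", "[Ring1]"]

def fitness(tokens):
    # Toy objective: reward carbon-rich, longer strings. A real fitness
    # would score a decoded molecule's chemical properties.
    return tokens.count("[C]") + 0.1 * len(tokens)

def mutate(tokens):
    # Random replace/insert/delete; SELFIES-style robustness means any
    # such edit still corresponds to a valid molecule.
    tokens = list(tokens)
    op = random.choice(["replace", "insert", "delete"])
    if op == "replace" and tokens:
        tokens[random.randrange(len(tokens))] = random.choice(ALPHABET)
    elif op == "insert":
        tokens.insert(random.randrange(len(tokens) + 1), random.choice(ALPHABET))
    elif op == "delete" and len(tokens) > 1:
        tokens.pop(random.randrange(len(tokens)))
    return tokens

def evolve(generations=50, pop_size=20):
    random.seed(0)
    population = [[random.choice(ALPHABET)] for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the fitter half, refill with mutated copies of survivors.
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        population = survivors + [
            mutate(random.choice(survivors))
            for _ in range(pop_size - len(survivors))
        ]
    return max(population, key=fitness)

best = evolve()
```

Because the best individual always survives selection, the top fitness is non-decreasing across generations; the neural network in the paper additionally penalizes stagnation to keep the search moving.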
Paper: Optuna: A Next-generation Hyperparameter Optimization Framework – Akiba et al. 2019
This paper introduces Optuna, a Python package for hyperparameter optimization and for pruning unpromising trials in machine learning experiments.
In Hyperparameter Search With Bayesian Optimization for Scikit-learn Classification and Ensembling, we applied the Bayesian Optimization (BO) package to Scikit-learn's ExtraTreesClassifier. Here we do the same for XGBoost.
Paper: Automatic Machine Learning by Pipeline Synthesis using Model-Based Reinforcement Learning and a Grammar – Drori et al. 2019
We formulate the AutoML problem of pipeline synthesis as a single-player game, in which the player starts from an empty pipeline, and in each step is allowed to perform edit operations to add, remove, or replace pipeline components according to a pipeline grammar.
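The game formulation can be sketched in a few lines. The grammar and component names below are invented for illustration (the paper uses a full pipeline grammar and model-based reinforcement learning to choose edits); the point is that every legal edit maps one grammatical pipeline to another:

```python
import random

# Hypothetical components for illustration only.
PREPROCESSORS = ["impute", "scale", "pca"]
ESTIMATORS = ["logreg", "random_forest"]

def is_valid(pipeline):
    # Toy grammar: zero or more preprocessors, then exactly one estimator.
    if not pipeline or pipeline[-1] not in ESTIMATORS:
        return False
    return all(step in PREPROCESSORS for step in pipeline[:-1])

def legal_edits(pipeline):
    """Enumerate (op, index, component) edits that keep the pipeline grammatical."""
    if not pipeline:
        return [("add", 0, est) for est in ESTIMATORS]
    edits = []
    for i in range(len(pipeline)):          # insert before the estimator
        for p in PREPROCESSORS:
            edits.append(("add", i, p))
    for i in range(len(pipeline) - 1):      # remove a preprocessor
        edits.append(("remove", i, None))
    for est in ESTIMATORS:                  # replace the estimator
        if est != pipeline[-1]:
            edits.append(("replace", len(pipeline) - 1, est))
    return edits

def apply_edit(pipeline, edit):
    op, i, comp = edit
    pipeline = list(pipeline)
    if op == "add":
        pipeline.insert(i, comp)
    elif op == "remove":
        pipeline.pop(i)
    else:
        pipeline[i] = comp
    return pipeline

# A "player" starting from the empty pipeline and taking random legal moves.
random.seed(0)
state = []
for _ in range(5):
    state = apply_edit(state, random.choice(legal_edits(state)))
```

In the paper, the random move choice above is replaced by a learned policy that scores edits by the expected quality of the resulting pipeline.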
Paper: An Introduction to Variational Autoencoders – Kingma and Welling 2019
Variational autoencoders provide a principled framework for learning deep latent-variable models and corresponding inference models. In this work, we provide an introduction to variational autoencoders and some important extensions.
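The central object in this framework is the evidence lower bound (ELBO), which the inference model (encoder) $q_\phi(z \mid x)$ and the generative model (decoder) $p_\theta(x \mid z)$ are jointly trained to maximize; as a quick reference:

```latex
\log p_\theta(x) \;\ge\;
\underbrace{\mathbb{E}_{q_\phi(z \mid x)}\!\left[\log p_\theta(x \mid z)\right]}_{\text{reconstruction}}
\;-\;
\underbrace{D_{\mathrm{KL}}\!\left(q_\phi(z \mid x)\,\|\,p(z)\right)}_{\text{regularization}}
```

The first term rewards accurate reconstruction of $x$ from sampled latents, while the KL term keeps the approximate posterior close to the prior $p(z)$.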
Paper: GraphVAE: Towards Generation of Small Graphs Using Variational Autoencoders – Simonovsky and Komodakis 2018
We approach the task of graph generation by devising a neural network able to translate vectors in a continuous code space to graphs. Our main idea is to output a probabilistic fully-connected graph and use a standard graph matching algorithm to align it to the ground truth.
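The "probabilistic fully-connected graph" output can be illustrated with a toy discretization step. The edge probabilities below are made up; in the paper they come from the learned decoder, and a graph matching step (rather than simple thresholding against a fixed node order) aligns the prediction with the ground-truth graph during training:

```python
# Decoder output: a symmetric matrix of edge probabilities over all node
# pairs (a probabilistic fully-connected graph on 4 nodes).
edge_probs = [
    [0.0, 0.9, 0.1, 0.2],
    [0.9, 0.0, 0.8, 0.1],
    [0.1, 0.8, 0.0, 0.7],
    [0.2, 0.1, 0.7, 0.0],
]

def discretize(probs, threshold=0.5):
    # Keep high-probability edges; drop self-loops.
    n = len(probs)
    return [[1 if i != j and probs[i][j] >= threshold else 0
             for j in range(n)] for i in range(n)]

adjacency = discretize(edge_probs)
# The high-probability entries recover the chain graph 0-1-2-3.
```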
Paper: Auto-Keras: An Efficient Neural Architecture Search System – Jin et al. 2019
Network morphism, which keeps the functionality of a neural network while changing its neural architecture, could be helpful for neural architecture search (NAS) by enabling more efficient training during the search. In this paper, we propose a novel framework that enables Bayesian optimization to guide the network morphism for efficient neural architecture search. The framework develops a neural network kernel and a tree-structured acquisition function optimization algorithm to explore the search space efficiently.
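The idea behind a network morphism is that the architecture changes but the function does not, so the child network starts from the parent's accuracy instead of from scratch. This toy sketch widens the hidden layer of a tiny two-layer linear network, Net2WiderNet-style: duplicate one hidden unit and halve its outgoing weights. The weights are arbitrary and the scheme is illustrative, not the paper's exact morphism operations:

```python
def forward(x, w1, w2):
    # hidden = W1 x ; output = W2 hidden (no nonlinearity, for brevity).
    hidden = [sum(w * xi for w, xi in zip(row, x)) for row in w1]
    return [sum(w * h for w, h in zip(row, hidden)) for row in w2]

def widen(w1, w2, unit):
    # Duplicate one hidden unit and halve its outgoing weights, so the
    # widened network computes exactly the same function.
    w1_new = w1 + [list(w1[unit])]
    w2_new = []
    for row in w2:
        row = list(row)
        row[unit] = row[unit] / 2.0
        w2_new.append(row + [row[unit]])
    return w1_new, w2_new

w1 = [[1.0, 2.0], [3.0, -1.0]]   # 2 hidden units, 2 inputs
w2 = [[0.5, -2.0]]               # 1 output unit
x = [1.0, -1.0]
w1_wide, w2_wide = widen(w1, w2, unit=0)
```

Both networks produce identical outputs for every input, which is exactly the property that lets NAS reuse trained weights after an architecture edit.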
Paper: A de novo molecular generation method using latent vector based generative adversarial network – Prykhodko et al. 2019
Deep learning methods applied to drug discovery have been used to generate novel structures. In this study, we propose a new deep learning architecture, LatentGAN, which combines an autoencoder and a generative adversarial neural network for de novo molecular design.
We introduce MolGAN, an implicit, likelihood-free generative model for small molecular graphs that circumvents the need for expensive graph matching procedures or node ordering heuristics of previous likelihood-based methods. Our method adapts generative adversarial networks (GANs) to operate directly on graph-structured data. We combine our approach with a reinforcement learning objective to encourage the generation of molecules with specific desired chemical properties.
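As a rough sketch of how the two objectives interact (the notation is a common convention and may not match the paper's exact formulation), the generator can be trained on a convex combination of the adversarial loss and the reinforcement learning reward term, with a mixing weight $\lambda$:

```latex
\mathcal{L}(\theta) \;=\; \lambda\,\mathcal{L}_{\mathrm{WGAN}}(\theta)
\;+\; (1 - \lambda)\,\mathcal{L}_{\mathrm{RL}}(\theta),
\qquad \lambda \in [0, 1]
```

At $\lambda = 1$ the model only imitates the training distribution; lowering $\lambda$ trades realism for the desired chemical properties scored by the reward network.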
Paper: How Much Chemistry Does a Deep Neural Network Need to Know to Make Accurate Predictions? – Goh et al. 2018
SMILES representations of small molecules are converted to 2D drawings, domain information is added as image channels, and an Inception-ResNet deep convolutional neural network (CNN) is used to solve various cheminformatics problems.