## Paper: Augmenting Genetic Algorithms with Deep Neural Networks for Exploring the Chemical Space – Nigam et al. 2020

In this paper, the authors use a genetic algorithm operating on the SELFIES (SELF-referencIng Embedded Strings) representation of molecules to explore the vast space of small molecules. A neural network guides the exploration process, and fitness functions are constructed to generate molecules with specific target properties.

## Paper: Optuna: A Next-generation Hyperparameter Optimization Framework – Akiba et al. 2019

This paper introduces Optuna, a Python package for performing hyperparameter optimization and pruning for machine learning algorithms.

## Hyperparameter Search With Bayesian Optimization for Keras (CNN) Classification and Ensembling

In this article we use the Bayesian Optimization (BO) package to determine hyperparameters for a 2D convolutional neural network classifier with Keras.

## Hyperparameter Search With Bayesian Optimization for XGBoost Classification and Ensembling

In Hyperparameter Search With Bayesian Optimization for Scikit-learn Classification and Ensembling we applied the Bayesian Optimization (BO) package to the Scikit-learn ExtraTreesClassifier algorithm. Here we do the same for XGBoost.

## Hyperparameter Search With Bayesian Optimization for Scikit-learn Classification and Ensembling

Bayesian Optimization (BO) is a lightweight Python package for finding the parameters that maximize an arbitrary black-box objective function.

In this article, we demonstrate how to use this package to do hyperparameter search for a classification problem with Scikit-learn.
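A sketch of that workflow with the `bayes_opt` package: cross-validated accuracy of an ExtraTreesClassifier is wrapped as the function to maximize, with integer hyperparameters cast inside the wrapper since BO proposes continuous values (the dataset and bounds here are illustrative, not the article's):

```python
from bayes_opt import BayesianOptimization
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# BO proposes floats, so integer hyperparameters are cast inside the wrapper.
def cv_accuracy(n_estimators, max_depth):
    model = ExtraTreesClassifier(
        n_estimators=int(n_estimators),
        max_depth=int(max_depth),
        random_state=0,
    )
    return cross_val_score(model, X, y, cv=3, scoring="accuracy").mean()

optimizer = BayesianOptimization(
    f=cv_accuracy,
    pbounds={"n_estimators": (10, 100), "max_depth": (2, 10)},
    random_state=1,
    verbose=0,
)
optimizer.maximize(init_points=3, n_iter=5)

print(optimizer.max["target"])  # best cross-validated accuracy found
print(optimizer.max["params"])  # hyperparameters that achieved it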

## Paper: Automatic Machine Learning by Pipeline Synthesis using Model-Based Reinforcement Learning and a Grammar – Drori et al. 2019

We formulate the AutoML problem of pipeline synthesis as a single-player game, in which the player starts from an empty pipeline, and in each step is allowed to perform edit operations to add, remove, or replace pipeline components according to a pipeline grammar.

## Paper: An Introduction to Variational Autoencoders – Kingma and Welling 2019

Variational autoencoders provide a principled framework for learning deep latent-variable models and corresponding inference models. In this work, we provide an introduction to variational autoencoders and some important extensions.

## Paper: GraphVAE: Towards Generation of Small Graphs Using Variational Autoencoders – Simonovsky and Komodakis 2018

We approach the task of graph generation by devising a neural network able to translate vectors in a continuous code space to graphs. Our main idea is to output a probabilistic fully-connected graph and use a standard graph matching algorithm to align it to the ground truth.

## Prediction of Small Molecule Lipophilicity: Part 5 – Ensemble of 2D Convolutional Neural Networks With Morgan (Circular) Fingerprints

Using Morgan (circular) fingerprints as input data, we create an ensemble of 2D convolutional neural networks (CNNs) with Keras, to predict lipophilicity values.

## Prediction of Small Molecule Lipophilicity: Part 4 – Ensemble of 1D Convolutional Neural Networks With Morgan (Circular) Fingerprints

Using Morgan (circular) fingerprints as input data, we create an ensemble of 1D convolutional neural networks (CNNs) with Keras, to predict lipophilicity values.

## Prediction of Small Molecule Lipophilicity: Part 3 – Ensemble of Multilayer Perceptrons With Morgan (Circular) Fingerprints

Using Morgan (circular) fingerprints as input data, we create an ensemble of multilayer perceptrons with Keras, to predict lipophilicity values.
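The fingerprint featurization shared by Parts 3–5 can be sketched with RDKit (the radius and bit length below are typical choices, not necessarily the ones used in the articles):

```python
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem

def morgan_fingerprint(smiles, radius=2, n_bits=2048):
    """Convert a SMILES string to a fixed-length Morgan (circular) bit vector."""
    mol = Chem.MolFromSmiles(smiles)
    fp = AllChem.GetMorganFingerprintAsBitVect(mol, radius, nBits=n_bits)
    return np.array(fp, dtype=np.float32)  # network-ready input row

x = morgan_fingerprint("CC(=O)Oc1ccccc1C(=O)O")  # aspirin
print(x.shape, int(x.sum()))
```

The resulting 0/1 vector feeds an MLP directly, or is reshaped to a 1D or 2D grid for the convolutional models in Parts 4 and 5.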

## Paper: Auto-Keras: An Efficient Neural Architecture Search System – Jin et al. 2019

Network morphism, which keeps the functionality of a neural network while changing its neural architecture, could be helpful for NAS by enabling more efficient training during the search. In this paper, we propose a novel framework enabling Bayesian optimization to guide the network morphism for efficient neural architecture search. The framework develops a neural network kernel and a tree-structured acquisition function optimization algorithm to efficiently explore the search space.

## Paper: A de novo molecular generation method using latent vector based generative adversarial network – Prykhodko et al. 2019

Deep learning methods applied to drug discovery have been used to generate novel structures. In this study, we propose a new deep learning architecture, LatentGAN, which combines an autoencoder and a generative adversarial neural network for de novo molecular design.

## Paper: MolGAN: An implicit generative model for small molecular graphs – De Cao and Kipf 2018

We introduce MolGAN, an implicit, likelihood-free generative model for small molecular graphs that circumvents the need for expensive graph matching procedures or node ordering heuristics of previous likelihood-based methods. Our method adapts generative adversarial networks (GANs) to operate directly on graph-structured data. We combine our approach with a reinforcement learning objective to encourage the generation of molecules with specific desired chemical properties.

## Paper: How Much Chemistry Does a Deep Neural Network Need to Know to Make Accurate Predictions? – Goh et al. 2018

SMILES representations of small molecules are converted to 2D drawings, domain-specific information is added as extra image channels, and an Inception-ResNet deep convolutional neural network (CNN) is used to solve various cheminformatics problems.
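The first step of that pipeline, rendering a SMILES string as a 2D image, can be sketched with RDKit (the image size is an assumption, and the paper's extra domain channels are omitted here):

```python
from rdkit import Chem
from rdkit.Chem import Draw

# Parse a SMILES string (caffeine) and render it as a small raster image,
# the kind of pixel input a CNN such as Inception-ResNet consumes.
mol = Chem.MolFromSmiles("Cn1cnc2c1c(=O)n(C)c(=O)n2C")
img = Draw.MolToImage(mol, size=(80, 80))

print(img.size)  # a PIL image, 80 pixels square
```

The paper's "chemistry knowledge" experiments then stack channels such as atom and bond properties on top of this plain drawing.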
