In this interesting paper, the authors use a multi-input neural network to predict various small-molecule properties: one branch is a multilayer perceptron operating on MACCS fingerprints, and the other is an RNN, a 1D CNN, or a 1D CNN-RNN operating on SMILES strings. Across several data sets, their results are usually as good as or better than those of the corresponding single-input models, Chemception, or graph convolutional networks (GCNs).
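To make the two-branch idea concrete, here is a minimal numpy sketch of a forward pass through such a mixed architecture. The layer sizes, toy character vocabulary, and random weights are my own assumptions for illustration, not the paper's actual hyperparameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

# --- Branch 1: MLP over a 167-bit MACCS-style fingerprint vector ---
fp = rng.integers(0, 2, size=167).astype(float)    # stand-in fingerprint bits
W1 = rng.normal(size=(167, 32))
h_fp = relu(fp @ W1)                               # learned features, shape (32,)

# --- Branch 2: 1D CNN over a one-hot encoded SMILES sequence ---
vocab = {c: i for i, c in enumerate("C(=O)N")}     # toy character set (assumption)
smiles = "CC(=O)N"                                 # acetamide, arbitrary example
onehot = np.zeros((len(smiles), len(vocab)))
for t, ch in enumerate(smiles):
    onehot[t, vocab[ch]] = 1.0

kernel = rng.normal(size=(3, len(vocab), 16))      # width-3 convolution, 16 filters
conv = np.stack([relu((onehot[t:t + 3, :, None] * kernel).sum(axis=(0, 1)))
                 for t in range(len(smiles) - 2)]) # shape (T-2, 16)
h_seq = conv.max(axis=0)                           # global max-pool -> (16,)

# --- Merge the two learned feature vectors and predict one property ---
h = np.concatenate([h_fp, h_seq])                  # shape (48,)
W_out = rng.normal(size=(48,))
y_hat = float(h @ W_out)                           # single scalar prediction
print(h.shape, y_hat)
```

The key design point is the concatenation step: each branch learns features in its own representation, and a final dense layer maps the merged feature vector to the property of interest.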
It would have been interesting to see how Morgan fingerprints compare with MACCS fingerprints, and to see GCNs added to the mix. Still, the work done here shows that combining models is a simple way to enhance machine learning approaches for predictive modeling in cheminformatics.
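The Morgan-versus-MACCS comparison suggested above is easy to set up with RDKit (a sketch of my own; RDKit is not part of the paper's pipeline, and ethanol is just an arbitrary test molecule):

```python
from rdkit import Chem
from rdkit.Chem import AllChem, MACCSkeys

mol = Chem.MolFromSmiles("CCO")  # ethanol, arbitrary example molecule

# 167-bit MACCS keys (the fingerprint type used in the paper)
maccs = MACCSkeys.GenMACCSKeys(mol)

# 2048-bit Morgan fingerprint, radius 2 (roughly ECFP4):
# the alternative representation suggested for comparison
morgan = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)

print(maccs.GetNumBits(), morgan.GetNumBits())
```

Swapping the 167-bit MACCS vector for the 2048-bit Morgan vector would only change the input width of the fingerprint branch, so the comparison would be a small modification to the published architecture.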


Code can be found at https://github.com/NU-CUCIS/CheMixNet.

Below is the abstract of CheMixNet: Mixed DNN Architectures for Predicting Chemical Properties using Multiple Molecular Representations.

SMILES is a linear representation of chemical structure that encodes a molecule's connection table and stereochemistry as a line of text, with a grammar denoting atoms, bonds, rings, and chains; this information can be used to predict chemical properties. Molecular fingerprints are a standard, computationally efficient abstract representation in which structural features are encoded as a bit string; they have been used successfully in similarity search, clustering, classification, drug discovery, and virtual screening. SMILES and molecular fingerprints are thus two different representations of the same molecular structure, and several predictive models exist for learning chemical properties from either one. Here, our goal is to build predictive models that can leverage both of these molecular representations. In this work, we present CheMixNet, a set of neural networks for predicting chemical properties from a mixture of features learned from the two molecular representations: SMILES as sequences and molecular fingerprints as vector inputs. We demonstrate the efficacy of the CheMixNet architectures by evaluating them on six different datasets. The proposed CheMixNet models not only outperform candidate neural architectures such as fully connected networks using molecular fingerprints and 1D CNN and RNN models trained on SMILES sequences, but also state-of-the-art architectures such as Chemception and Molecular Graph Convolutions.