Automated machine learning (AutoML), when applied to neural networks, is usually called neural architecture search (NAS): the architecture itself, including the types of blocks (such as inception blocks) and the skip connections, is treated as a set of hyperparameters alongside conventional ones such as the activation function and the dropout rate.
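By way of illustration, such a search space might be written down as follows. This is a hedged sketch with names and value ranges of our own choosing, not any particular library's API:

```python
# Illustrative NAS search space: architectural choices sit next to
# conventional hyperparameters. All names and ranges are our own.
search_space = {
    # architectural hyperparameters
    "num_blocks":       [2, 4, 8],
    "block_type":       ["plain_conv", "inception", "residual"],
    "skip_connections": [False, True],
    # conventional hyperparameters
    "activation":       ["relu", "elu", "swish"],
    "dropout":          [0.0, 0.25, 0.5],
}
```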

A difficulty that arises in NAS is that

neural network architectures do not lie in a Euclidean space and are hard to parameterize as a fixed-length vector.

Also,

A network morphism operation on one layer may change the shapes of some intermediate output tensors, so that they no longer match the input shape requirements of the layers that consume them.
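As a concrete illustration, here is a minimal sketch of our own (in the spirit of Net2Net-style widening, not Auto-Keras internals) showing how widening one dense layer breaks the shape expected by the next layer, and how that next layer must be rebuilt so that the network's function is unchanged:

```python
import numpy as np

# Two dense layers with a ReLU in between; names and shapes are
# illustrative only.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))          # batch of 4, input dim 8
W1 = rng.normal(size=(8, 3))         # layer 1: 8 -> 3
W2 = rng.normal(size=(3, 2))         # layer 2: 3 -> 2

def forward(x, W1, W2):
    return np.maximum(x @ W1, 0) @ W2

# Widen layer 1 from 3 to 5 units by duplicating units 0 and 1.
idx = np.array([0, 1, 2, 0, 1])
W1_wide = W1[:, idx]                 # layer 1's output is now 5-dim

# W2 still expects a 3-dim input, so its shape no longer matches.
# It must be rebuilt, with duplicated rows scaled down so the
# function computed by the network stays identical.
counts = np.bincount(idx)            # how often each old unit was copied
W2_wide = W2[idx, :] / counts[idx][:, None]

assert np.allclose(forward(x, W1, W2), forward(x, W1_wide, W2_wide))
```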

Thus the main point of the paper is that

an efficient neural architecture search with network morphism is proposed, which utilizes Bayesian optimization to guide the search through the space by selecting the most promising operations at each step. To tackle the aforementioned challenges, an edit-distance neural network kernel is constructed. Consistent with the key idea of network morphism, it measures how many operations are needed to change one neural network into another.
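To convey the flavor of such a kernel, here is a toy version of our own: each network is reduced to a list of layer widths, the distance counts (approximately) how many morphism operations separate two networks, and a Gaussian-style transform turns the distance into a similarity. The real kernel in the paper also handles skip connections and solves a layer-matching problem; none of the names below are the paper's.

```python
import math

def layer_distance(a: int, b: int) -> float:
    # Cost of morphing a layer of width a into width b.
    return abs(a - b) / max(a, b)

def edit_distance(net_a: list, net_b: list) -> float:
    # Align layers in order; each unmatched layer costs one insertion.
    short, long = sorted((net_a, net_b), key=len)
    d = sum(layer_distance(a, b) for a, b in zip(short, long))
    return d + (len(long) - len(short))

def kernel(net_a, net_b, rho: float = 1.0) -> float:
    # Similar networks give values near 1, distant ones near 0.
    return math.exp(-rho * edit_distance(net_a, net_b) ** 2)

print(kernel([64, 64, 10], [64, 128, 10]))         # similar -> larger
print(kernel([64, 64, 10], [32, 32, 32, 32, 10]))  # distant -> near 0
```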

The authors have implemented their ideas in Auto-Keras. Unfortunately, we were unable to install the code. We also note that there is no documentation for the latest version, and this has remained the case over the several months we have been monitoring the project. The website carries a request for donations, which suggests that the project has been unable to secure private or government funding, so we should not expect much progress in the near future. This is unfortunate, as the system appears to be both interesting and useful.
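For reference, the project's published examples describe an interface along the following lines. Since we could not install the package ourselves, this sketch simply follows the documented API, and exact signatures may differ between versions:

```python
import autokeras as ak
from tensorflow.keras.datasets import mnist

# Load data and let Auto-Keras search for an image classifier.
(x_train, y_train), (x_test, y_test) = mnist.load_data()

clf = ak.ImageClassifier(max_trials=3)  # number of architectures to try
clf.fit(x_train, y_train, epochs=2)     # each trial trains briefly
print(clf.evaluate(x_test, y_test))
```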

Below is the abstract of "Auto-Keras: An Efficient Neural Architecture Search System."

Neural architecture search (NAS) has been proposed to automatically tune deep neural networks, but existing search algorithms, e.g., NASNet, PNAS, usually suffer from expensive computational cost. Network morphism, which keeps the functionality of a neural network while changing its neural architecture, could be helpful for NAS by enabling more efficient training during the search. In this paper, we propose a novel framework enabling Bayesian optimization to guide the network morphism for efficient neural architecture search. The framework develops a neural network kernel and a tree-structured acquisition function optimization algorithm to efficiently explore the search space. Intensive experiments on real-world benchmark datasets have been done to demonstrate the superior performance of the developed framework over the state-of-the-art methods. Moreover, we build an open-source AutoML system based on our method, namely Auto-Keras. The system runs in parallel on CPU and GPU, with an adaptive search strategy for different GPU memory limits.
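To make the loop described in the abstract concrete, here is a self-contained toy sketch of our own, not the paper's algorithm: it replaces the Gaussian-process posterior and the tree-structured acquisition optimization with a crude kernel-weighted surrogate, and replaces actual training with a stand-in scoring function, but it shows how a kernel over architectures can guide which morphism to apply next.

```python
import math, random

random.seed(0)

def distance(a, b):
    # Toy edit distance between width lists (see the kernel sketch above).
    short, long = sorted((a, b), key=len)
    d = sum(abs(x - y) / max(x, y) for x, y in zip(short, long))
    return d + (len(long) - len(short))

def kernel(a, b):
    return math.exp(-distance(a, b) ** 2)

def train_and_score(net):
    # Stand-in for training the network and measuring validation
    # accuracy; here moderately deep nets happen to score best.
    return random.random() * 0.2 + 1.0 / (1.0 + abs(len(net) - 4))

def morphisms(net):
    # Candidate operations: widen a layer or insert a layer.
    for i in range(len(net) - 1):                   # keep the output layer
        yield net[:i] + [net[i] * 2] + net[i + 1:]  # widen layer i
        yield net[:i + 1] + [net[i]] + net[i + 1:]  # insert after layer i

def acquisition(net, history, beta=0.5):
    # Crude surrogate in place of the paper's GP posterior:
    # kernel-weighted mean of observed scores, plus an exploration
    # bonus for architectures far from everything seen so far.
    w = [kernel(net, h) for h, _ in history]
    mean = sum(wi * s for wi, (_, s) in zip(w, history)) / (sum(w) or 1e-9)
    return mean + beta * (1.0 - max(w, default=0.0))

history = [([64, 10], train_and_score([64, 10]))]
for _ in range(20):
    parent = max(history, key=lambda h: h[1])[0]
    cand = max(morphisms(parent), key=lambda n: acquisition(n, history))
    history.append((cand, train_and_score(cand)))

print(max(history, key=lambda h: h[1]))
```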