This paper introduces Message Passing Neural Networks (MPNNs), a unifying framework for graph-based deep learning, and applies it to quantum-chemistry property prediction, achieving state-of-the-art accuracy on the QM9 benchmark and approaching chemical accuracy on most targets. Its impact includes popularising graph neural networks and influencing subsequent work in cheminformatics, materials discovery, and the broader machine-learning community by demonstrating that learned message passing can replace hand-engineered molecular descriptors.
This paper proposes a quantitative framework for the rise-and-fall trajectory of complexity in closed systems, showing that a coffee-and-cream cellular automaton exhibits a bell curve of apparent complexity when its particles interact, thereby linking information theory with thermodynamics and self-organization.
Supervised learning on molecules has incredible potential to be useful in chemistry, drug discovery, and materials science. Luckily, several promising and closely related neural network models invariant to molecular symmetries have already been described in the literature. These models learn a message passing algorithm and aggregation procedure to compute a function of their entire input graph. At this point, the next step is to find a particularly effective variant of this general approach and apply it to chemical prediction benchmarks until we either solve them or reach the limits of the approach. In this paper, we reformulate existing models into a single common framework we call Message Passing Neural Networks (MPNNs) and explore additional novel variations within this framework. Using MPNNs we demonstrate state-of-the-art results on an important molecular property prediction benchmark; these results are strong enough that we believe future work should focus on datasets with larger molecules or more accurate ground truth labels.
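The abstract's core mechanism, a learned message passing and aggregation procedure over a whole input graph, can be sketched in a few lines. The following is a minimal illustration, not the paper's architecture: it uses toy linear message and update functions, sum aggregation over neighbours, and a sum readout, where a real MPNN would learn these functions (e.g. as neural networks) from data. All function and variable names here are invented for illustration.

```python
import numpy as np

def mpnn_forward(node_feats, edges, num_steps=3, seed=0):
    """Generic MPNN sketch: message phase, update phase, then readout.

    node_feats: (n, d) array of initial node states.
    edges: list of directed (src, dst) pairs.
    """
    rng = np.random.default_rng(seed)
    n, d = node_feats.shape
    # Toy fixed weights standing in for learned message/update functions.
    W_msg = rng.standard_normal((d, d)) * 0.1
    W_upd = rng.standard_normal((2 * d, d)) * 0.1
    h = node_feats.copy()
    for _ in range(num_steps):
        # Message phase: each node sums transformed states of its neighbours.
        m = np.zeros_like(h)
        for src, dst in edges:
            m[dst] += h[src] @ W_msg
        # Update phase: combine each node's state with its aggregated message.
        h = np.tanh(np.concatenate([h, m], axis=1) @ W_upd)
    # Readout phase: a permutation-invariant function of all node states
    # yields a single graph-level vector for property prediction.
    return h.sum(axis=0)

# Usage: a 4-node cycle graph with 2-dimensional node features,
# with edges listed in both directions.
feats = np.ones((4, 2))
edges = [(0, 1), (1, 2), (2, 3), (3, 0),
         (1, 0), (2, 1), (3, 2), (0, 3)]
graph_vec = mpnn_forward(feats, edges)
```

Because messages are summed and the readout sums over nodes, the output is invariant to relabelling the nodes, which is the molecular-symmetry invariance the abstract refers to.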