
A simple neural network module for relational reasoning

This paper introduces Relation Networks (RNs), a plug-and-play neural module that explicitly computes pairwise relations between objects. When appended to standard CNN/LSTM encoders, the module achieves super-human 95.5% accuracy on CLEVR, solves 18 of 20 bAbI tasks, and infers hidden links in dynamic physical systems, inspiring later work on relational reasoning across vision, language, and RL.
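
The module's core computation, as defined in the paper, is a composite function over a set of objects O = {o_1, ..., o_n}:

RN(O) = f_φ( Σ_{i,j} g_θ(o_i, o_j) )

where g_θ computes a relation for each ordered pair of objects and f_φ aggregates the summed pair representations into an output. For question answering, the question embedding is passed as an additional input to g_θ, so the relations computed are conditioned on the query.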

Introduction

Relational reasoning is a central component of generally intelligent behavior, but it has proven difficult for neural networks to learn. In this paper we describe how to use Relation Networks (RNs) as a simple plug-and-play module to solve problems that fundamentally hinge on relational reasoning. We tested RN-augmented networks on three tasks: visual question answering on a challenging dataset called CLEVR, on which we achieve state-of-the-art, super-human performance; text-based question answering using the bAbI suite of tasks; and complex reasoning about dynamic physical systems. Then, using a curated dataset called Sort-of-CLEVR, we show that powerful convolutional networks do not have a general capacity to solve relational questions, but can gain this capacity when augmented with RNs. Our work shows how a deep learning architecture equipped with an RN module can implicitly discover and learn to reason about entities and their relations.
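
Below is a minimal sketch of the RN computation in PyTorch, assuming small MLPs for g_θ and f_φ as in the paper; the layer widths, the class name RelationNetwork, and the forward interface here are illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn

class RelationNetwork(nn.Module):
    """Sketch of a Relation Network: RN(O) = f_phi( sum_{i,j} g_theta(o_i, o_j, q) )."""

    def __init__(self, object_dim, question_dim, hidden_dim=256, num_classes=10):
        super().__init__()
        # g_theta scores each ordered pair of objects, conditioned on the question.
        self.g = nn.Sequential(
            nn.Linear(2 * object_dim + question_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )
        # f_phi maps the summed pair representations to an answer distribution.
        self.f = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, num_classes),
        )

    def forward(self, objects, question):
        # objects: (batch, n, object_dim); question: (batch, question_dim)
        b, n, d = objects.shape
        # Build all n^2 ordered pairs (o_i, o_j) by broadcasting.
        o_i = objects.unsqueeze(2).expand(b, n, n, d)
        o_j = objects.unsqueeze(1).expand(b, n, n, d)
        q = question.unsqueeze(1).unsqueeze(2).expand(b, n, n, question.size(-1))
        pairs = torch.cat([o_i, o_j, q], dim=-1).view(b, n * n, -1)
        # Sum the per-pair relation vectors, then aggregate with f_phi.
        relations = self.g(pairs).sum(dim=1)
        return self.f(relations)
```

In the paper's visual setup, the "objects" are the cells of a CNN feature map tagged with their spatial coordinates, and the question embedding is the final state of an LSTM; any such encoder outputs can be fed to a module like the one sketched above.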

Information

  • Website: arxiv.org
  • Authors: Adam Santoro, David Raposo, David G. T. Barrett, Mateusz Malinowski, Razvan Pascanu, Peter Battaglia, Timothy Lillicrap
  • Published date: 2017/06/05