We introduce a new neural architecture to learn the conditional probability of an output sequence whose elements are discrete tokens corresponding to positions in an input sequence. Such problems cannot be trivially addressed by existing approaches such as sequence-to-sequence models and Neural Turing Machines, because the number of target classes at each step of the output depends on the length of the input, which is variable. Problems such as sorting variable-sized sequences, and various combinatorial optimization problems, belong to this class. Our model solves the problem of variable-size output dictionaries using a recently proposed mechanism of neural attention. It differs from previous applications of attention in that, instead of using attention to blend the hidden units of an encoder into a context vector at each decoder step, it uses attention as a pointer to select a member of the input sequence as the output. We call this architecture a Pointer Net (Ptr-Net). We show that Ptr-Nets can be used to learn approximate solutions to three challenging geometric problems (finding planar convex hulls, computing Delaunay triangulations, and the planar Travelling Salesman Problem) using training examples alone. Ptr-Nets not only improve over sequence-to-sequence models with input attention, but also allow us to generalize to variable-size output dictionaries. We show that the learnt models generalize beyond the maximum lengths they were trained on. We hope our results on these tasks will encourage a broader exploration of neural learning for discrete problems.
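Concretely, at each decoder step the pointer is just an attention softmax taken over the input positions themselves. The following is a minimal NumPy sketch of that step, assuming the additive (Bahdanau-style) scoring u_j = v . tanh(W1 e_j + W2 d) used in the paper; the parameter names (W1, W2, v) and the shapes are illustrative, not the authors' implementation.

```python
import numpy as np

def pointer_attention(enc, dec, W1, W2, v):
    """One pointer step: a distribution over the n input positions.

    Additive attention scores u_j = v . tanh(W1 @ e_j + W2 @ d) for each
    encoder state e_j and the current decoder state d, followed by a
    softmax. Instead of blending encoder states into a context vector,
    the softmax itself is the output: it "points" at an input element.
    """
    scores = np.tanh(enc @ W1.T + dec @ W2.T) @ v   # shape (n,)
    scores -= scores.max()                          # numerical stability
    probs = np.exp(scores)
    return probs / probs.sum()
```

Because the softmax ranges over input positions rather than a fixed vocabulary, the size of the output dictionary automatically grows and shrinks with the input length, which is exactly what a fixed-softmax sequence-to-sequence decoder cannot do.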
This paper proposes Pointer Networks (Ptr-Nets), a novel neural architecture that uses attention as a pointer mechanism to select elements from the input sequence, enabling variable-sized output dictionaries. Unlike traditional sequence-to-sequence models with fixed output spaces, Ptr-Nets can handle combinatorial problems such as finding convex hulls, computing Delaunay triangulations, and the Travelling Salesman Problem. The model learns approximate solutions purely from training examples and generalizes to inputs longer than those seen during training. This work introduces a powerful method for discrete structured prediction with deep learning, expanding the applicability of neural networks to problems with variable output spaces.
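To make the variable-sized output concrete, here is a toy greedy decoder built on the pointer_attention sketch above. The state update (feeding back the selected encoder state) is a deliberate simplification of the paper's LSTM decoder, used only to keep the example self-contained; all parameters are random placeholders.

```python
import numpy as np

def greedy_decode(enc, d0, W1, W2, v, n_steps):
    """Emit a sequence of input indices by repeatedly pointing.

    The output "vocabulary" is the set of input positions, so its size
    tracks the input length automatically. Note: a real Ptr-Net updates
    the decoder state with an LSTM; copying the chosen encoder state
    here is a toy simplification.
    """
    d, out = d0, []
    for _ in range(n_steps):
        probs = pointer_attention(enc, d, W1, W2, v)
        j = int(np.argmax(probs))   # greedy choice of an input position
        out.append(j)
        d = enc[j]                  # simplified state update (not an LSTM)
    return out

# Toy usage with random parameters: a 5-element input yields indices in 0..4.
rng = np.random.default_rng(0)
n, h = 5, 8
enc = rng.normal(size=(n, h))
W1, W2 = rng.normal(size=(h, h)), rng.normal(size=(h, h))
v, d0 = rng.normal(size=h), rng.normal(size=h)
print(greedy_decode(enc, d0, W1, W2, v, n_steps=n))
```

For tasks like the TSP or convex hulls, a real decoder would also mask positions that have already been emitted so they cannot be selected twice; that bookkeeping is omitted here for brevity.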