Detecting Syntactic Change Using a Neural Part-of-Speech Tagger

We find that a neural network part-of-speech tagger implicitly learns to model syntactic change.

Sequential Neural Networks as Automata

I formally characterize the capacity and memory of several RNN architectures under asymptotic assumptions.

Finding Syntactic Representations in Neural Stacks

Stack-augmented RNNs trained to perform language modeling learn to exploit linguistic structure without explicit syntactic supervision.

Context-Free Transductions with Neural Stacks

This paper analyzes the behavior of stack-augmented recurrent neural networks on context-free transduction tasks.

End-to-End Graph-Based TAG Parsing with Neural Networks

We present a graph-based Tree Adjoining Grammar parser that uses BiLSTMs, highway connections, and character-level CNNs.