A Formal Hierarchy of RNN Architectures

We formally characterize RNN architectures in terms of their capacity.

CORD-19: The COVID-19 Open Research Dataset

I helped out a bit with this large collaboration between AI2 and other institutions.

Detecting Syntactic Change Using a Neural Part-of-Speech Tagger

We find that a neural network part-of-speech tagger implicitly learns to model syntactic change.

Sequential Neural Networks as Automata

I formally characterize the capacity and memory of various RNNs under asymptotic assumptions.

Finding Syntactic Representations in Neural Stacks

Stack-augmented RNNs that are trained to perform language modeling learn to exploit linguistic structure without any supervision.

Context-Free Transductions with Neural Stacks

This paper analyzes the behavior of stack-augmented recurrent neural network models.

End-to-End Graph-Based TAG Parsing with Neural Networks

We present a graph-based Tree Adjoining Grammar parser that uses BiLSTMs, highway connections, and character-level CNNs.

Sense Abstraction: A Unified Framework of Intensionality, Predicate Abstraction, and Alternative Semantics

I propose a unified semantic treatment of several English subordinate clause types.