I am interested in the formal theory of language and computation, with applications spanning NLP and linguistics. Much of my research has focused on analyzing how neural networks learn to represent the structure and meaning of natural language, and where they fall short. I believe that formal theory is an indispensable tool for truly understanding the capabilities of large-scale language models.
I am a Ph.D. student at the Center for Data Science (CDS) at NYU, where I am supported by an NSF Graduate Research Fellowship. I am thankful to have previously worked with Noah Smith, Yoav Goldberg, and Roy Schwartz as a Predoctoral Young Investigator (PYI) on the AllenNLP team at AI2. As an undergrad at Yale, I was a member of CLAY and wrote a thesis with Bob Frank and Dana Angluin, both of whom were inspiring mentors.
Ph.D. at Center for Data Science, 2021-present
New York University
B.Sc. in Computer Science, 2019
B.A. in Linguistics, 2019
Yale University