Modeling Ancient Greek Syntax with an Unsupervised RNNG

November 6, 2020
4:00-5:00 pm ET
Sococo 209, Zoom
Speaker: Sophia Sklaviadis
Host: JP de Ruiter, Greg Crane

Abstract

We aim to train an unsupervised generative language autoencoder with a graphical representation of dependency syntax in the encoder and a transition-based (stack-memory-mediated) representation of dependency syntax in the decoder. This deep latent-variable language-modeling architecture is, to our knowledge, new. We build on the most recent incarnations of the recurrent neural network grammar (RNNG) of Dyer, Kuncoro, et al. 2016. Specifically, we combine Li et al. 2018’s latent dependency decoder with Kim et al. 2019’s approach to variational inference, using an edge-factored CRF as the variational distribution. To model the famously free word order of ancient Greek, we generalize the underlying latent structure of Kim et al. 2019, describing an unsupervised RNNG whose variational distribution is a neural CRF parametrization of an edge-factored dependency parser. The CRF over dependency trees serves as the encoder’s parser, while a stack LSTM serves as the decoder’s transition-based parser.
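A key computation behind an edge-factored dependency CRF like the one described above is the partition function over all (non-projective, single-root) dependency trees, which the Matrix-Tree Theorem reduces to a determinant (Koo et al. 2007). Below is a minimal NumPy sketch of that computation; the function name and interface are illustrative, not the speaker's code.

```python
import numpy as np

def log_partition(root_scores, edge_scores):
    """Log-partition of an edge-factored dependency CRF over
    non-projective, single-root trees via the Matrix-Tree Theorem.

    root_scores: (n,) log-potentials for root -> word m
    edge_scores: (n, n) log-potentials for head h -> modifier m
                 (diagonal is ignored)
    """
    A = np.exp(edge_scores)
    np.fill_diagonal(A, 0.0)            # no self-loops
    r = np.exp(root_scores)
    # Laplacian: column sums of incoming weights on the diagonal,
    # negated edge weights off the diagonal.
    L = np.diag(A.sum(axis=0)) - A
    # Replacing the first row with the root potentials enforces
    # exactly one root edge (Koo et al. 2007 construction).
    L[0, :] = r
    sign, logdet = np.linalg.slogdet(L)
    return logdet

# Sanity check: with all potentials equal to 1 (log-scores 0),
# the partition function counts the n^(n-1) single-root trees.
print(np.exp(log_partition(np.zeros(3), np.zeros((3, 3)))))  # → 9.0
```

Because the partition function is a determinant, it is differentiable, so the same computation supports gradient-based variational inference with the CRF as the proposal distribution.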

Join the meeting in Sococo, Halligan 209. Login: tuftscs.sococo.com

Join Zoom Meeting: https://tufts.zoom.us/j/98610939077

PASSWORD: see colloquia email for password

Dial by your location: +1 646 558 8656 US (New York)

Meeting ID: 986 1093 9077

Passcode: see colloquia email for passcode