Introduction

My research is centered on the study of meaning-in-language: what is meaning, how do languages encode meaning, and how can the grammars that encode meaning be learned from observed utterances? These broad questions encompass both (i) language as the external signal that carries meaning and (ii) cognition as the ultimate producer and interpreter of language.

I am interested in Metaphor Theory and Construction Grammar because these are important intersections of language and cognition. I am interested in corpus-based computational modeling because it supports linguistic theories that are reproducible, falsifiable, and learnable. Such computational modeling involves both (i) the symbolic representation of metaphor and constructions and (ii) the statistical learning of such symbolic representations from corpora of written language.

I am also interested in using these learned symbolic representations to model social and regional dialects. Because language is characterized by externally-conditioned variation, a further criterion for the success of Metaphor Theory and Construction Grammar is their ability to predict and model variation in usage.

As a short introduction to my work, here are two highlights:

Computational Construction Grammar (C2xG)

A variant of Construction Grammar (CxG) with generalizations formed at the level of learning.

[Draft | Data]

[Draft | Data]

C2xG: a Python package for learning, evaluating, and annotating construction grammars from corpora, and for extracting construction-based vector representations.
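The package itself implements a much richer learning pipeline; purely as a toy illustration of what "statistically learning symbolic representations from corpora" can mean, the sketch below (not the C2xG algorithm, and with an invented mini-corpus) scores adjacent word pairs by pointwise mutual information, treating high-scoring pairs as crude candidates for multi-word constructions:

```python
import math
from collections import Counter

def candidate_constructions(corpus, min_pmi=1.0):
    """Toy construction learner: score adjacent word pairs by PMI.

    High-PMI bigrams co-occur more often than chance predicts, a crude
    signal that a word pair behaves as a stored unit (a 'construction').
    """
    sents = [s.lower().split() for s in corpus]
    unigrams = Counter(w for s in sents for w in s)
    bigrams = Counter(p for s in sents for p in zip(s, s[1:]))
    n = sum(unigrams.values())   # total tokens
    m = sum(bigrams.values())    # total bigram tokens
    scored = {}
    for (a, b), f in bigrams.items():
        # PMI = log2( P(a,b) / (P(a) * P(b)) )
        pmi = math.log2((f / m) / ((unigrams[a] / n) * (unigrams[b] / n)))
        if pmi >= min_pmi:
            scored[(a, b)] = round(pmi, 2)
    return scored

corpus = [
    "she gave him a hand",
    "he gave her a hand",
    "they gave us a hand",
]
print(candidate_constructions(corpus))
```

In a realistic setting this pairwise association score would be only a first step; learning slot-and-filler constructions additionally requires abstraction over categories, which is what distinguishes a construction grammar from a bigram list.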

Modeling Metaphor-in-Language

An approach to metaphor, grounded in the analysis of observed language, that supports the gradient identification of metaphoric language in corpora:

[Draft | Data]

[Draft | Data]