RESEARCH PROJECTS
Few-Shot Learning in Language Models
LSTM language models successfully learn the grammatical gender of novel nouns in a few-shot learning paradigm and apply this knowledge in previously unseen contexts. This suggests that, like humans, they are capable of abstract syntactic generalisation and represent grammatical gender as a context-invariant property.
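As a rough, hypothetical illustration of the paradigm (not the experimental code itself), the Python sketch below tests whether, after a few exposure sentences pairing the novel noun "wug" with the feminine article "la", an LSTM language model prefers the feminine over the masculine article for that noun in a new context. The model, vocabulary, and sentences are all placeholders; the model here is randomly initialised, whereas the experiments assume a pretrained LM.

```python
import torch
import torch.nn as nn

# Toy vocabulary; in practice this would be the pretrained LM's full vocabulary.
vocab = ["<eos>", "la", "le", "wug", "est", "arrivée", "hier"]
idx = {w: i for i, w in enumerate(vocab)}

class LSTMLM(nn.Module):
    """Minimal word-level LSTM language model (stand-in for a pretrained LM)."""
    def __init__(self, vocab_size, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.lstm = nn.LSTM(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, vocab_size)

    def forward(self, tokens):
        hidden, _ = self.lstm(self.embed(tokens))
        return self.out(hidden)

def continuation_logprob(model, prefix, continuation):
    """Summed log probability of `continuation` given the running `prefix`."""
    words = prefix + continuation
    tokens = torch.tensor([[idx[w] for w in words]])
    with torch.no_grad():
        logps = torch.log_softmax(model(tokens), dim=-1)
    # The distribution over the word at position p is read from position p - 1.
    return sum(logps[0, p - 1, idx[words[p]]].item()
               for p in range(len(prefix), len(words)))

model = LSTMLM(len(vocab))  # random weights here; the experiments use a trained LM

# Few-shot exposure: the novel noun "wug" occurs with the feminine article "la".
exposure = ["la", "wug", "est", "arrivée", "hier", "<eos>"]

# Unseen context: does the model now prefer "la wug" over "le wug"?
fem = continuation_logprob(model, exposure, ["la", "wug"])
masc = continuation_logprob(model, exposure, ["le", "wug"])
print(f"log P(la wug) = {fem:.2f}, log P(le wug) = {masc:.2f}")
```

With a pretrained model, a consistent preference for the feminine continuation across varied unseen contexts is what would indicate a context-invariant representation of the novel noun's gender.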
Large Language Models and Psycholinguistics
Large language models are not intended as models of human linguistic processing. They are, however, very successful as models of language itself. They matter to psycholinguistics in three ways: as a practical tool, as an illustrative comparison, and, philosophically, as a basis for recasting the relationship between language and thought. Our commentary on Bowers et al. (2023).
Natural Language Processing
Deep neural networks (DNNs) are surprisingly good at learning the rules required for natural language modelling. But do LSTMs really learn abstract grammatical rules as humans do, or do they rely on simple heuristics? We use grammatical gender agreement to study the mechanisms behind LSTMs' linguistic abilities, contributing to the debate on how humans and machines process language (see the sketch after the links below).
EMNLP 2022 (BlackboxNLP) Paper
NMC22 Talk
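One behavioural way to separate rule-following from shallow heuristics (a minimal sketch under assumed sentence materials, not the paper's evaluation code) is to score minimal pairs in which the grammatical rule and an "agree with the most recent noun" heuristic make opposite predictions. The `sentence_logprob` stub below stands in for the LSTM LM's sentence log probability and is only there to keep the sketch runnable.

```python
import random

random.seed(0)

def sentence_logprob(sentence):
    # Placeholder: in practice, return the LSTM LM's summed per-word log
    # probability for the sentence; random values keep the sketch runnable.
    return random.random()

# Minimal pairs: the head noun is feminine, the intervening distractor noun is
# masculine, so the grammatical rule and the "most recent noun" heuristic
# predict opposite adjective forms.
pairs = [
    ("la chemise près du manteau est verte",   # rule: agree with "chemise" (fem)
     "la chemise près du manteau est vert"),   # heuristic: agree with "manteau" (masc)
    ("la porte à côté du mur est ouverte",     # rule: agree with "porte" (fem)
     "la porte à côté du mur est ouvert"),     # heuristic: agree with "mur" (masc)
]

rule_consistent = sum(sentence_logprob(good) > sentence_logprob(bad)
                      for good, bad in pairs)
print(f"rule-consistent choices: {rule_consistent}/{len(pairs)}")
```

A model scoring near ceiling on such attractor pairs behaves as if it tracks the grammatical head rather than the linearly closest noun.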
Adaptive Memory & Reward
The brain preferentially remembers particularly meaningful events. However, we often cannot know whether an event is meaningful at the moment we encounter it, so we need an adaptive memory system that enhances memories of events before or after they become salient. We explore whether such a system is driven by reward.
Neural Correlates of Language
The system underlying the processing of continuous speech is still debated: is it driven by grammar-based hierarchical linguistic structure or by statistical cues? We use EEG data and deep learning methods to address this question.
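As a hedged sketch of one standard analysis for this kind of question (not necessarily this project's exact pipeline), one can regress a word-level EEG response on predictors derived from each hypothesis and compare held-out fit. All data and feature names below are synthetic placeholders.

```python
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic placeholders: one EEG response value per word (e.g. mean amplitude
# in a post-word-onset window at a centro-parietal channel), for n_words words.
n_words = 2000
eeg = rng.normal(size=n_words)

# Word-level predictors for the two hypotheses. In practice these would come
# from a syntactic parser (hierarchical) and an n-gram or neural LM (statistical).
hierarchical = rng.normal(size=(n_words, 2))  # e.g. nodes closed, tree depth
statistical = rng.normal(size=(n_words, 2))   # e.g. surprisal, entropy

def held_out_r2(X, y):
    """Mean cross-validated R^2 of a ridge regression from predictors X to EEG y."""
    return cross_val_score(RidgeCV(alphas=np.logspace(-3, 3, 13)),
                           X, y, cv=5, scoring="r2").mean()

base = held_out_r2(statistical, eeg)
full = held_out_r2(np.hstack([statistical, hierarchical]), eeg)
# If hierarchical structure drives the response, adding its predictors should
# improve held-out fit over the statistical baseline (not here: the data is random).
print(f"statistical only: R^2 = {base:.3f};  + hierarchical: R^2 = {full:.3f}")
```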
Thalamocortical Circuits & Deep Learning
Thalamocortical circuits play an important role in sensory processing, operating in parallel with direct cortico-cortical circuitry. We explore the role of the thalamus within this circuitry using skip connections in deep neural networks.
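To make the analogy concrete, here is a minimal, hypothetical sketch (an illustration, not the project's actual model): a feedforward "cortical" stack plus a "thalamic" pathway that skips the intermediate layers, so downstream layers integrate deeply processed and relatively raw input. Layer sizes and names are arbitrary.

```python
import torch
import torch.nn as nn

class ThalamocorticalNet(nn.Module):
    """Feedforward stack with a skip pathway loosely analogous to a
    thalamic route that bypasses intermediate cortical processing."""
    def __init__(self, in_dim=128, hidden=256, out_dim=10):
        super().__init__()
        self.cortical = nn.Sequential(             # direct cortico-cortical route
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.thalamic = nn.Linear(in_dim, hidden)  # skip ("thalamic") route
        self.readout = nn.Linear(hidden, out_dim)

    def forward(self, x):
        # Downstream processing integrates the deeply transformed signal with
        # the skip pathway's relatively unprocessed copy of the input.
        return self.readout(self.cortical(x) + self.thalamic(x))

net = ThalamocorticalNet()
print(net(torch.randn(4, 128)).shape)  # torch.Size([4, 10])
```

Comparing such a network against an ablated variant without the skip route is one way to ask what the parallel pathway contributes to sensory processing.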