• adaptive memory and tokenization in foundation models (see my NeurIPS 2024 tutorial on dynamic sparsity): I aim to redefine the units of computation in foundation models by adaptively compressing their sequences of hidden representations and memory. This allows models to tokenize raw, modality-agnostic data end-to-end, learning hierarchical abstractions. At the same time, it lays the foundations for permanent model memories and inference-time hyper-scaling.

  • modular deep learning: I am interested in designing neural architectures that route information to specialised modules (e.g., sparse subnetworks). This facilitates systematic generalisation and conditional computation.

  • computational typology: I wish to understand, within a computational framework, how languages vary across the world's cultures. Multimodal models in particular give us a powerful tool to study how linguistic form depends on grounded, embodied representations of meaning and function.

From 2022 to 2026, I was an assistant professor (lecturer) at the University of Edinburgh. Between 2024 and 2025, I spent time as a visiting professor at NVIDIA. Previously, I was a visiting postdoctoral scholar at Stanford University and a postdoctoral fellow in computer science at Mila - Quebec AI Institute in Montreal. In 2021, I obtained a PhD from the University of Cambridge, St John's College. Once upon a time, I studied modern literature at the University of Pavia. Deep in my heart, I am still a humanist: some of my favourite writers are Italo Calvino, Ursula Le Guin, and Lucretius.

My research is currently supported by the ERC and ARIA, as well as gifts and compute from Google DeepMind, NVIDIA, and NatWest. I have also received a Google Research Faculty Award and Best Paper / SAC Highlight Awards at ACL, EMNLP, and RepL4NLP. I am a board member of SIGTYP (the ACL special interest group for computational typology), a Scholar of the European Lab for Learning and Intelligent Systems (ELLIS), and a member of the TACL journal editorial team.

Lab Members

PhD students (main supervisor)

PhD students (co-supervisor)

Alumni

Media