About me
I am a machine learning researcher working on scalable learning paradigms. Since March 2025, I have been at Ndea with François Chollet. Previously, I completed a PhD and postdoc at the University of St. Gallen, advised by Damian Borth and co-advised by Michael W. Mahoney and Xavier Giró-i-Nieto. During my PhD, I was a research intern at Google DeepMind and a visiting researcher at ICSI, Berkeley. My work focuses on learning to generate models, spanning treating neural network weights as data and deep learning–guided program synthesis.
Research Statement
My research focuses on learning to generate abstractions, where the model itself is the output. Currently, I design systems for program synthesis that combine symbolic search with neural network intuition. More broadly, I pursue learning paradigms that merge deep learning with symbolic methods to achieve scalable, efficient learners that generalize across levels of abstraction, from raw data to high-level conceptual structure.
News
2025
- Two papers accepted at NeurIPS 2025
- Blog + talk on “The Hidden Drivers of HRM’s Performance on ARC-AGI” blog talk
- Paper accepted at DMLR: “A Model Zoo on Phase Transitions in Neural Networks” paper
- Invited talk at Cohere on Scalable Weight Space Learning talk
- Joined Ndea in March 2025 as a researcher to work on deep learning–guided program synthesis.
2024
- Our workshop on Weight Space Learning was accepted at ICLR 2025! workshop homepage
- Invited talk at Deep Learning Barcelona Symposium on Weight Space Learning
- Invited talk at TU Eindhoven on Phase Transitions in Neural Networks
- Invited talk at UPenn on Weight Space Learning
- Two papers accepted at ICML 2024:
- “Towards Scalable and Versatile Weight Space Learning” arxiv
- “MD tree: a model-diagnostic tree grown on loss landscape” openreview
- Paper accepted at ICML 2024 GRaM Workshop: “Dirac-Bianconi Graph Neural Networks – Enabling Non-Diffusive Long-Range Graph Predictions”
- Completed my PhD at the University of St. Gallen with honors (summa cum laude).
- Reviewer for International Conference on Machine Learning (ICML) 2024.
- Invited Talk at Dartmouth College: Weight Space Learning: Learning from Populations of Neural Networks, 01/2024.
2023
- Finished a research internship at Google DeepMind, working on data-free methods to mitigate forgetting in LLMs.
- Paper accepted to Chaos: An Interdisciplinary Journal of Nonlinear Science: “Toward dynamic stability assessment of power grid topologies using graph neural networks”, paper
- Paper accepted to NeurIPS 2023 ATTRIB Workshop: “Why do landscape diagnostics matter? Pinpointing the failure mode of generalization” poster paper
- Paper accepted to ICML 2023 HiLD Workshop: “Hyperparameter Tuning using Loss Landscape”, poster
- Invited Talk at Google Algorithms Seminar in Mountain View: Hyper-Representations: Learning from Populations of Neural Networks.
- Received the HSG Impact Award 2023 with Damian Borth for our work on Hyper-Representations, announcement.
- Reviewer for Winter Conference on Applications of Computer Vision (WACV) 2023.
- Reviewer for Conference on Neural Information Processing Systems (NeurIPS) 2023.
- Visiting Scholar with Michael Mahoney at ICSI Berkeley: With Michael, I investigated the weight space and loss surface landscape of Neural Networks. We worked on scaling hyper-representations to large models and diverse architectures, and collaborated with Yaoqing Yang on identifying and utilizing phase transitions in neural networks.
- Paper accepted at ICLR 2023 Workshop on Sparsity in Neural Networks: Sparsified Model Zoo Twins: Investigating Populations of Sparsified Neural Network Models. arxiv
2022
- Google Research Scholar Award for “Hyper-Representations: Learning from Populations of Neural Networks” with PI Damian Borth. announcement, article
- Paper accepted at NeurIPS 2022: Hyper-Representations as Generative Models: Sampling Unseen Neural Network Weights. paper
- Paper accepted at NeurIPS 2022 Track on Datasets and Benchmarks: Model Zoos: A Dataset of Diverse Populations of Neural Networks. paper, modelzoos.cc
- Invited Talk at University of St. Gallen, Deep Learning Lecture Series: Hyper-Representations, 11/2022.
- Paper accepted at NeurIPS 2022 Climate Change AI Workshop: Towards dynamic stability analysis of sustainable power grids using graph neural networks. arxiv
- Paper accepted at ICML 2022 Workshop on Pre-training: Perspectives, Pitfalls, and Paths Forward: Hyper-Representation for Pre-Training and Transfer Learning. arxiv
- Reviewer for Conference on Neural Information Processing Systems (NeurIPS) 2022, Track on Datasets and Benchmarks.
- Paper accepted to New Journal of Physics: Predicting basin stability of power grids using graph neural networks. paper
2021
- Paper accepted at NeurIPS 2021: Self-Supervised Representation Learning on Neural Network Weights for Model Characteristic Prediction. proceedings, arxiv, blog, talk, code, data
- Paper accepted to EGU The Cryosphere: Elements of future snowpack modeling – Part 1: A physical instability arising from the nonlinear coupling of transport and phase changes. paper
- 2019