About me

I am a fourth-year PhD student at the AIML Lab, University of St. Gallen, advised by Damian Borth, with a focus on self-supervised representation learning on neural network weights. In my PhD, I investigate populations of neural networks and identify latent structure in such populations by learning so-called hyper-representations. My original background is in mechanical engineering with a focus on simulation science and automatic control, which I studied at RWTH Aachen University. During my studies, I visited Aalto University in Helsinki on an Erasmus scholarship, as well as Siemens in Shanghai via Shanghai Jiao Tong University as a DAAD PROMOS scholar. Before my PhD, I interned at the Institute for Snow and Avalanche Research with Henning Loewe, working on the coupled simulation of energy and mass transfer in snowpack. Currently, I am a visiting scholar with Michael Mahoney at ICSI in Berkeley, where I collaborate on an IARPA project extending hyper-representations for backdoor detection.

I am driven by curiosity and a fascination for research, which has allowed me to dive into several exciting fields and eventually led me to machine learning to study learning itself. My current research focuses on investigating populations of neural networks and has connections to self-supervised learning as well as transfer, meta-, and few-shot learning. To that end, I use self-supervised learning methods on populations of neural networks to learn abstract, task-invariant representations of models. These representations can then be used for several downstream tasks, such as i) model analysis, ii) generation of new models for abstract knowledge transfer, and iii) model sparsification in representation space. In addition, I am interested in multi-modal learning with data at different levels of abstraction and in machine learning for science.

News

  • I’ll join Google Brain in Mountain View as a research intern over the summer and will keep working on scaling Hyper-Representations for efficient knowledge aggregation.
  • Damian and I were awarded the HSG Impact Award 2023 for our work on Hyper-Representations.
  • Invited Talk at Google Algorithms Seminar in Mountain View: Hyper-Representations: Learning from Populations of Neural Networks.
  • I’m visiting Michael Mahoney in Spring 2023 for a few months to investigate the weight space and loss surface landscape of Neural Networks.
  • Paper accepted at ICLR 2023 Workshop on Sparsity in Neural Networks: Sparsified Model Zoo Twins: Investigating Populations of Sparsified Neural Network Models. arxiv
  • Paper accepted at NeurIPS 2022 Climate Change AI Workshop: Towards dynamic stability analysis of sustainable power grids using graph neural networks. arxiv
  • Paper accepted at NeurIPS 2022 Track on Datasets and Benchmarks: Model Zoos: A Dataset of Diverse Populations of Neural Networks. openreview, www.modelzoos.cc
  • Paper accepted at NeurIPS 2022: Hyper-Representations as Generative Models: Sampling Unseen Neural Network Weights
  • Paper accepted at ICML 2022 Workshop on Pre-training: Perspectives, Pitfalls, and Paths Forward: Hyper-Representation for Pre-Training and Transfer Learning. arxiv
  • Paper accepted to New Journal of Physics: Predicting basin stability of power grids using graph neural networks. paper
  • Google Research Scholar Award for “Hyper-Representations: Learning from Populations of Neural Networks” with PI Damian Borth. announcement, article
  • Paper accepted to EGU The Cryosphere: Elements of future snowpack modeling–Part 1: A physical instability arising from the nonlinear coupling of transport and phase changes. paper
  • Paper accepted at NeurIPS 2021: Self-Supervised Representation Learning on Neural Network Weights for Model Characteristic Prediction. proceedings, arxiv, blog, talk, code, data
  • New preprint: An Investigation of the Weight Space to Monitor the Training Progress of Neural Networks. paper
  • New position: I’ve joined the AIML Lab at the University of St. Gallen as a Researcher and PhD Student!
  • Poster accepted to EGU General Assembly 2019: On water vapor transport in snowpack models: Comparison of existing schemes, numerical requirements and the role of non-local advection. abstract

Reviewing

  • Conference on Neural Information Processing Systems (NeurIPS) 2022, Track on Datasets and Benchmarks
  • Winter Conference on Applications of Computer Vision (WACV) 2023