Shantam Ravan - March 2nd, 2026
Dissertation Title: Probing Magnon Dynamics and Magnetic Order at the Nanoscale Using Nitrogen-Vacancy (NV) Magnetometry
Date and Time: March 2nd, 2026, 4:00 PM
Location: ATL 3332 (https://umd.zoom.us/j/98727754244?pwd=LEj1aaSWc0jkWCOfPx4FQNwbN2tm1D.1)
Meeting ID: 987 2775 4244
Passcode: 545358
Dissertation Committee Chair: Ron Walsworth
Committee:
Prof. Amir Yacoby
Prof. Christopher Jarzynski
Prof. Victor Yakovenko
Prof. John Cumings
Abstract:
We develop and apply nitrogen-vacancy (NV) center quantum sensing techniques in diamond to investigate magnetic excitations and ordering in condensed matter systems. We utilize micrometer-scale, (111)-oriented diamond chips with shallow NV ensembles as a platform for high-resolution quantum microscopy, demonstrating diffraction-limited imaging and deterministic transfer onto diverse materials. We introduce a broadband spectrum analyzer based on qubit dressed states that extends noise magnetometry via a novel AC-Stark-shift protocol, and apply it to measure the magnon spectrum of yttrium iron garnet (YIG). Using spatially resolved NV magnetometry, we then image nonlinear magnon dynamics in YIG, extracting four-magnon interaction strengths and discovering cascaded parametric instabilities that reveal the discrete scattering pathways through which the magnon gas redistributes energy before thermalization. Finally, we image magnetization reversal in a pinwheel artificial spin ice, resolving superferromagnetic domain switching.
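For background (not part of the abstract itself): NV magnetometry converts a local magnetic field into a measurable shift of the NV spin-resonance frequencies. The standard, textbook relation, written here in LaTeX as a minimal illustration, is

```latex
% Zeeman splitting of the NV ground-state spin resonances.
% Standard textbook relation and values, added only for background;
% not a result from this dissertation.
f_{\pm} = D \pm \gamma_{\mathrm{NV}} B_{\parallel},
\qquad D \approx 2.87\ \mathrm{GHz},
\qquad \gamma_{\mathrm{NV}} \approx 28\ \mathrm{GHz/T},
```

where B_parallel is the field projected onto the NV symmetry axis, so measuring f_+ and f_- yields the local field at the sensor.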
Declan Alex Norton - January 13th, 2026
Dissertation Title: Efficient and Generalizable Machine Learning Models for Predicting Complex Dynamics
Date and Time: January 13th, 2026, 2:00 PM
Location: IREAP Large Conference Room (ERF 1207) (https://umd.zoom.us/j/8957596042?pwd=RVNLakJpa3BROGRaQUM3akp3NHJYUT09)
Dissertation Committee Chair: Michelle Girvan
Committee:
Christopher Jarzynski
Yanne Chembo
Brian Hunt
Daniel Lathrop
Abstract:
Machine learning offers effective approaches to modeling dynamical systems solely from observed data. However, without explicit structural priors (built-in assumptions about the underlying dynamics) or additional contextual inputs, even modern high-capacity models that demonstrate impressive generalization typically require large and diverse training datasets, and may still struggle to generalize to aspects of the dynamics that are poorly represented in the training data.
In this dissertation, we first show that reservoir computing—a simple, efficient, and versatile framework for data-driven modeling of dynamical systems—can generalize to unexplored regions of state space without explicit structural priors. Using multistable dynamical systems as a test setting, we demonstrate that reservoir computers trained on trajectories from a single basin of attraction can achieve out-of-domain generalization by capturing system behavior in entirely unobserved basins.
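As a rough illustration of the reservoir-computing framework referred to above (a generic echo-state-network sketch, not the author's implementation; the Lorenz example data and all parameter values are assumptions):

```python
# Minimal echo-state-network (reservoir computer) sketch for data-driven
# forecasting of a dynamical system. Illustration only: the Lorenz example
# and all hyperparameters are assumptions, not taken from the dissertation.
import numpy as np

rng = np.random.default_rng(0)

# --- Example training data: Lorenz system with standard parameters ---
def lorenz_trajectory(n_steps, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x = np.array([1.0, 1.0, 1.0])
    traj = np.empty((n_steps, 3))
    for i in range(n_steps):
        dx = np.array([sigma * (x[1] - x[0]),
                       x[0] * (rho - x[2]) - x[1],
                       x[0] * x[1] - beta * x[2]])
        x = x + dt * dx          # simple Euler step, adequate for illustration
        traj[i] = x
    return traj

data = lorenz_trajectory(6000)
train, test = data[:5000], data[5000:]

# --- Reservoir setup ---
N, d = 500, 3                            # reservoir size, input dimension
leak, rho_spec, ridge = 0.3, 0.9, 1e-6   # leak rate, spectral radius, regularization
W_in = rng.uniform(-0.5, 0.5, (N, d))
W = rng.normal(0, 1, (N, N))
W *= rho_spec / np.max(np.abs(np.linalg.eigvals(W)))   # rescale spectral radius

def step(r, u):
    """Leaky-integrator reservoir update."""
    return (1 - leak) * r + leak * np.tanh(W @ r + W_in @ u)

# --- Drive the reservoir with training data and fit a linear readout ---
r = np.zeros(N)
states = np.empty((len(train) - 1, N))
for t in range(len(train) - 1):
    r = step(r, train[t])
    states[t] = r
targets = train[1:]                      # one-step-ahead prediction targets
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(N),
                        states.T @ targets).T

# --- Autonomous (closed-loop) forecasting ---
u = train[-1]
preds = np.empty_like(test)
for t in range(len(test)):
    r = step(r, u)
    u = W_out @ r
    preds[t] = u

print("mean short-term forecast error:",
      np.linalg.norm(preds[:100] - test[:100], axis=1).mean())
```

Only the linear readout W_out is trained; the random reservoir is left fixed, which is what makes the approach simple and computationally cheap.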
We then consider settings in which the underlying dynamics (governing equations) also differ between the training and test data, a challenging scenario for models both with and without structural priors. We introduce Meta-learning for Tailored Forecasting using Related Time Series (METAFORS), which builds and initializes a model tailored to short time-series data from a target system by leveraging a library of models trained on longer time series from potentially related systems. Without requiring contextual labels, METAFORS reliably predicts both short-term evolution and long-term statistical properties, even when the target and related systems exhibit substantially different behaviors.
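To illustrate the setting METAFORS addresses (and only the setting; the following generic library-selection heuristic is NOT the METAFORS algorithm), one can imagine a library of readouts trained on longer time series from related systems, reusing the reservoir defined in the sketch above:

```python
# Generic "model library" heuristic, shown only to illustrate the problem
# setting; this is NOT the METAFORS algorithm itself. Assumes the reservoir
# (step, N) and numpy import from the sketch above, plus a hypothetical list
# `library_readouts` of W_out matrices trained on longer related time series.
def pick_library_model(short_series, library_readouts):
    """Return the library readout with the lowest one-step prediction error on
    a short time series from the target system (a crude stand-in for tailoring
    and initializing a forecasting model for that target)."""
    best, best_err = None, np.inf
    for W_out_k in library_readouts:
        r = np.zeros(N)
        err = 0.0
        for t in range(len(short_series) - 1):
            r = step(r, short_series[t])
            err += np.linalg.norm(W_out_k @ r - short_series[t + 1])
        if err < best_err:
            best, best_err = W_out_k, err
    return best
```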
Finally, we turn to the problem of learning complex brain dynamics from noisy and dynamically diverse electroencephalography (EEG) recordings. We outline how reservoir computing’s efficiency and versatility may help to address the specific challenges EEG poses to data-driven modeling of dynamical systems. Then, using a configurable whole-brain neural mass model to enable controlled experiments, we show that predicting both the short-term evolution and long-term statistics of a brain-like dynamical system using a single reservoir computer requires much more careful hyperparameter tuning than in previous successful applications. However, injecting noise into the autonomous reservoir system during the prediction stage can enable more robust replication of long-term statistical properties while still offering useful short-term predictions.
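The noise-injection idea in the final paragraph can be sketched by modifying the closed-loop prediction stage of the earlier echo-state-network example: perturb the reservoir state with small Gaussian noise at each autonomous step. The noise amplitude below is an arbitrary placeholder, not a value from the dissertation.

```python
# Noise injection during autonomous (closed-loop) prediction, continuing the
# echo-state-network sketch above. noise_amp is an arbitrary placeholder.
noise_amp = 1e-3
r_n = np.zeros(N)
for t in range(len(train) - 200, len(train)):   # re-synchronize reservoir state
    r_n = step(r_n, train[t])
u = train[-1]
preds_noisy = np.empty_like(test)
for t in range(len(test)):
    r_n = step(r_n, u) + noise_amp * rng.normal(size=N)   # perturb the state
    u = W_out @ r_n
    preds_noisy[t] = u
```

One common rationale (an assumption here, not a claim from the abstract) is that small perturbations keep the autonomous trajectory exploring the attractor rather than settling into an artificially regular orbit, which can help it reproduce long-term statistics.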