
Hi! I am a machine learning scientist with the Machine Intelligence Group at Lawrence Livermore National Laboratory (LLNL).
My CV (pdf, updated Mar’23).
Recent Updates
(3/6) new! Code for our NeurIPS 2022 paper “Δ-UQ” has been published — https://github.com/LLNL/DeltaUQ
(3/6) new! Know Your Space: Inlier and Outlier Construction for Calibrating Medical OOD Detectors accepted as an oral at MIDL 2023!
TLDR: A new method to train calibrated OOD detectors that can flag medical imaging anomalies very effectively, achieved via carefully chosen inlier and outlier augmentations.
(3/6) new! Cross-GAN Auditing: Unsupervised Identification of Attribute Level Similarities and Differences between Pre-trained Generative Models accepted to CVPR 2023!
TLDR: A new technique to *interpretably* compare two (or more) pre-trained GANs.
(3/6) new! Robust Time Series Recovery and Classification Using Test-Time Noise Simulator Networks accepted to ICASSP 2023!
TLDR: A test-time approach for time series classification for improved robustness and recovery under a variety of unknown noise corruptions.
(1/25) new! The Surprising Effectiveness of Deep Orthogonal Procrustes Alignment in Unsupervised Domain Adaptation accepted to IEEE Access, 2023!
TLDR: Revisiting subspace alignment’s (SA) role in unsupervised domain adaptation (UDA), we find SA (paired with a specific training strategy) to be highly competitive with modern UDA approaches.
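For readers unfamiliar with the building block behind this work: the classical orthogonal Procrustes problem finds the rotation R minimizing ||XR − Y||_F, solved in closed form via the SVD of XᵀY. The sketch below shows only that textbook step in numpy (it is not the paper’s full training strategy); all names here are illustrative.

```python
import numpy as np

def procrustes_align(source, target):
    """Closed-form orthogonal Procrustes: R = U V^T, where
    U S V^T is the SVD of source^T @ target."""
    u, _, vt = np.linalg.svd(source.T @ target)
    return u @ vt

# Toy check: features related by a hidden rotation are realigned exactly.
rng = np.random.default_rng(0)
x = rng.standard_normal((100, 5))          # "source-domain" features
r_true, _ = np.linalg.qr(rng.standard_normal((5, 5)))
y = x @ r_true                             # "target-domain" features
r = procrustes_align(x, y)
print(np.allclose(x @ r, y))               # True: alignment recovered
```

In a UDA setting, the aligned features x @ r would then be fed to a classifier trained on the source domain; the closed-form solve makes the alignment step cheap compared to adversarial adaptation methods.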
(11/22) arXiv preprint DOLCE: A Model-Based Probabilistic Diffusion Framework for Limited-Angle CT Reconstruction
TLDR: A specially conditioned diffusion model does remarkably well, and produces useful uncertainties on the challenging, limited-angle CT reconstruction problem.