About me
I am a third-year PhD student in the USC Theory Group, fortunate to be advised by Vatsal Sharan.
My primary research interests are generative modeling and statistical learning theory, and
I am grateful to be supported by an NSF Graduate Research Fellowship.
This summer, I am excited to be joining Pinterest as a machine learning intern in Seattle.
Previously, I worked at Boston College as a research associate and in industry as a quantitative researcher. Before that, I studied math as an undergrad at Harvard.
I'm always happy to connect and talk about ideas in computer science, math, and beyond!
Research
- Resa: Transparent Reasoning Models via SAEs
  with Shangshang Wang, Ömer Faruk Akgül, Enes Burak Bilgin, Ollie Liu, Deqing Fu, and Willie Neiswanger.
  In submission, 2025.
  [arxiv] | [pdf]
- Textual Steering Vectors Can Improve Visual Understanding in Multimodal Large Language Models
  with Woody Gan, Deqing Fu, Ollie Liu, Dani Yogatama, Vatsal Sharan, Robin Jia, and Willie Neiswanger.
  In submission, 2025.
  [arxiv] | [pdf]
- Tina: Tiny Reasoning Models via LoRA
  with Shangshang Wang, Ömer Faruk Akgül, Enes Burak Bilgin, Ollie Liu, and Willie Neiswanger.
  In submission, 2025.
  [arxiv] | [pdf]
- On Agnostic PAC Learning in the Small Error Regime
  with Mikael Møller Høgsgaard and Grigoris Velegkas.
  In submission, 2025.
  [arxiv] | [pdf]
- Local Regularizers Are Not Transductive Learners
  with Sky Jafar and Shaddin Dughmi.
  Conference on Learning Theory (COLT 2025).
  [arxiv] | [pdf]
- Understanding Aggregations of Proper Learners in Multiclass Classification
  with Mikael Møller Høgsgaard and Grigoris Velegkas.
  Algorithmic Learning Theory (ALT 2025).
  [arxiv] | [pdf]
- Proper Learnability and the Role of Unlabeled Data
  with Siddartha Devic, Shaddin Dughmi, Vatsal Sharan, and Shang-Hua Teng.
  Algorithmic Learning Theory (ALT 2025).
  [arxiv] | [pdf]
- Transductive Learning Is Compact
  with Siddartha Devic, Shaddin Dughmi, Vatsal Sharan, and Shang-Hua Teng.
  Neural Information Processing Systems (NeurIPS 2024).
  [arxiv] | [pdf] | [slides]
- Open Problem: Can Local Regularization Learn All Multiclass Problems?
  with Siddartha Devic, Shaddin Dughmi, Vatsal Sharan, and Shang-Hua Teng.
  Open Problem @ Conference on Learning Theory (COLT 2024).
  [pdf] | [slides]
- Regularization and Optimal Multiclass Learning
  with Siddartha Devic, Shaddin Dughmi, Vatsal Sharan, and Shang-Hua Teng.
  Conference on Learning Theory (COLT 2024).
  [arxiv] | [pdf] | [slides]
- Computable PAC Learning of Continuous Features
  with Nate Ackerman, Jieqi Di, Cameron Freer, and Jean-Baptiste Tristan.
  Logic in Computer Science (LICS 2022).
  [pdf]
Thesis
Probability Monads, under the direction of Michael Hopkins.
Teaching
Computer Science I: CSCI 1101 @ BC, Spring 2022. Head TA.
Notes here.
Applied Machine Learning: CSCI 3340.01 @ BC, Fall 2021. Teaching assistant.
Notes here.
Sets, Groups, and Topology: Math 101 @ Harvard, Spring 2020. Course assistant.
Partial notes here.
Real Analysis I: Math 112 @ Harvard, Spring 2019. Course assistant.
Notes here.
Abstract Algebra I: Math 122 @ Harvard, Fall 2018. Course assistant.
Notes here, taken by Vaughan McDonald.
Course Notes
Advanced algorithms: CS 670 @ USC.
Shortest paths, spanning trees, matroids, Fibonacci heaps, dynamic programming, max-flow,
hardness.
Combinatorial analysis: Math 532 @ USC.
(Exponential) generating functions, inclusion and exclusion, Möbius inversion, set and number partitions.
Algebraic geometry: Math 137 @ Harvard.
Algebraic sets, Nullstellensatz, local rings, DVRs. Notes cover the first half of the course.
Commutative algebra: Math 221 @ Harvard.
Localization, Nullstellensatz, Tor, Nakayama, dimension theory.
Category theory: Math 99r @ Harvard.
Functors, natural transformations, Yoneda, (co)limits.