My name is Logan Weber, and I'm a PhD student at MIT, advised by Michael Carbin. The question that drives my research is: what can neural networks learn about programs?
Previously, I was a BS/MS student in the CS department at UW, where I worked in the PLSE and SAMPL groups. I am thankful to have been mentored by Jared Roesch and Tianqi Chen and to have been advised by Zachary Tatlock.
Before I worked in programming languages and machine learning, I worked under Adam Blank as a teaching assistant for discrete mathematics and data structures courses and as a researcher in CS education. Together, we built software to help students learn various CS concepts, including an induction tutor, an interface for constructing formal proofs, and a system for judgement-based grading.
Research Projects
Look here for more detailed descriptions.
- Semantic Program Embeddings, a project to theoretically characterize the difficulty of computing embeddings of programs that preserve semantic equivalences. Check the preprints below for more details!
- µTVM, a compiler-based approach to ML on microcontrollers using the power of TVM.
- Relay, a functional and differentiable intermediate representation for ML.
Publications
Learning to Compile Programs to Neural Networks (arXiv) (PDF) (Poster)
Logan Weber,
Jesse Michel,
Alex Renda,
Michael Carbin
ICML 2024
Relay: A New IR for Machine Learning Frameworks
Jared Roesch,
Steven Lyubomirsky,
Logan Weber,
Josh Pollock,
Marisa Kirisame,
Tianqi Chen,
Zachary Tatlock
MAPL 2018
Preprints
A Theory of Equivalence-Preserving Embeddings
Logan Weber*,
Jesse Michel*,
Alex Renda,
Saman Amarasinghe,
Michael Carbin
A Theory of Semantic Program Embeddings
Jesse Michel*,
Logan Weber*,
Saman Amarasinghe,
Michael Carbin
Relay: A High-Level Compiler for Deep Learning
Jared Roesch,
Steven Lyubomirsky,
Marisa Kirisame,
Logan Weber,
Josh Pollock,
Luis Vega,
Ziheng Jiang,
Tianqi Chen,
Thierry Moreau,
Zachary Tatlock
Miscellaneous Writing
I have a (currently inactive) blog here and some artifacts from course projects and industry collaborations below.
Formal Verification of a Closest Pair Algorithm
Logan Weber
MIT 6.850 (Geometric Computing) Final Project
Towards An Algorithm for Reeb Graph Construction on Constructive Solid Geometry
Logan Weber
MIT 6.838 (Shape Analysis) Final Project
Fast(ish) Algorithms for Integer Programming
Logan Weber*,
Josh Pollock*
MIT 6.854 (Advanced Algorithms) Final Project
TinyML - How TVM is Taming Tiny
Logan Weber,
Andrew Reusch
Post on the TVM, OctoML, and Arm blogs
Living Life On The Low-Power Edge
Logan Weber
UW Master's Thesis
Presentations
A Theory of Equivalence-Preserving Embeddings
Logan Weber,
Jesse Michel
Poster Presentation at NSF Expeditions in Computing Neurosymbolic Meeting (October 2022)
µTVM: TVM on Bare-Metal Devices
Logan Weber
Lightning Talk at TVM Conference 2019
µTVM: Deep Learning on Bare-Metal Devices
Logan Weber
Poster Presentation at ARM Research Summit 2019
µTVM: Deep Learning on the Low-Power Edge
Logan Weber,
Pratyush Patel
Talk at TVM For Fun and Profit Tutorial (FCRC 2019)
Relay: An Extensible Intermediate Language for Deep Learning
Logan Weber,
Josh Pollock,
Marisa Kirisame
Poster Presentation at Paul G. Allen School of Computer Science Research Poster Fair