I am an Assistant Professor in the College of Information and Computer Sciences (CICS) at the University of Massachusetts Amherst, the flagship campus of the UMass system. I received my Ph.D. in Electrical Engineering from North Carolina State University in 2020. I am a member of the Programming Languages and Systems at Massachusetts (PLASMA) lab at UMass.
My research lies at the intersection of Machine Learning and Programming Systems, with an emphasis on improving the speed, scalability, and reliability of Machine Learning through innovations in algorithms and programming systems (e.g., compilers, runtimes). I am also interested in leveraging Machine Learning to improve High Performance Computing and accelerate scientific discovery. My current research focuses on both algorithm- and system-level optimizations of Deep Multi-Task Learning and Graph Machine Learning.
- [Oct. 2022]: I’m excited to share that we have received an Amazon Research Award for our proposal “Groot: A GPU-Resident System for Efficient Graph Machine Learning” at UMass Amherst. Learn more about the program on the website.
- [Sept. 2022]: Our work “AutoMTL: A Programming Framework for Automating Efficient Multi-Task Learning” is accepted to NeurIPS’22. Congratulations to Lijun. The project is open-sourced.
- [Sept. 2022]: Thanks to NSF for supporting our project “Transparently Scaling Graph Neural Network Training to Large-Scale Models and Graphs”.
- [Jul. 2022]: Our work on Fine-Grained Personalized Federated Learning Through Dynamic Routing is accepted to CrossFL’2022 Workshop @MLSys. Congratulations to Kunjal.
- [Jul. 2022]: Our work on Improving Subgraph Representation Learning via Multi-View Augmentation is accepted to AI4Science’22 Workshop @ICML.
- [May 2022]: Our paper “A Tree-Structured Multi-Task Model Recommender” is accepted to AutoML’22. Congratulations to Lijun. The project is open-sourced.
- [May 2022]: Welcome to Qizheng Yang, a new PhD student joining our lab this summer.
- [Mar. 2022]: Thanks to the NVIDIA Academic Hardware Grant Program for supporting our project “Multitasking-Centric Optimization for Deep Learning Applications”.
- [Mar. 2022]: Our paper “Rethinking Hard-Parameter Sharing in Multi-Domain Learning” is accepted to ICME’22. Congratulations to Lijun.
- [Mar. 2022]: Our paper “Enabling Near Real-Time NLU-Driven Natural Language Programming through Dynamic Grammar Graph-Based Translation” is accepted to CGO’22.
- [Mar. 2022]: Our paper “COMET: A Novel Memory-Efficient Deep Learning Training Framework by Using Error-Bounded Lossy Compression” is accepted to VLDB’22.
- [Nov. 2021]: Our collaborative project with Prof. Zhou Lin on “Accelerating Fragment-Based Quantum Chemistry via Machine Learning” received UMass ADVANCE Collaborative Research Seed Grant.
- [Oct. 2021]: Our paper “FreeLunch: Compression-based GPU Memory Management for Convolutional Neural Networks” is accepted to MCHPC’21 Workshop, in conjunction with SC’21.
- [Oct. 2021]: Our paper “Recurrent Neural Networks Meet Context-Free Grammar: Two Birds with One Stone” is accepted to ICDM’21.
- [June 2021]: Our paper “Scalable Graph Neural Network Training: The Case for Sampling” has appeared in the ACM SIGOPS Operating Systems Review.
- [June 2021]: Our paper CoCoPIE is accepted to CACM’21.
- [June 2021]: Our paper NumaPerf is accepted to ICS’21.
- [May 2021]: I have received an Adobe Research Collaboration Grant on developing resource-efficient deep multi-task learning solutions.
- [May 2021]: Our paper “Reuse-Centric K-means Configuration” is accepted to Information Systems. Congratulations to Lijun.
- NCSU Electrical and Computer Engineering Outstanding Dissertation Award, 2020
- IBM PhD Fellowship, 2015-2018