Songtao Lu
Senior Research Scientist
Mathematics and Theoretical Computer Science Department
Thomas J. Watson Research Center & MIT-IBM Watson AI Lab
IBM Research, Yorktown Heights, New York 10598, USA
E-mail: songtaoibm@gmail.com
Google Scholar
About me
I am a senior research scientist at the Thomas J. Watson Research Center, an IBM principal investigator at the MIT-IBM Watson AI Lab, and an IBM PI of the RPI-IBM AI Research Collaboration program. I also serve as senior personnel at the NSF AI Institute for Future Edge Networks and Distributed Intelligence (AI-EDGE).
My research interests broadly lie in:
Foundations of machine learning models
Design of optimization algorithms and corresponding computational complexity analysis
Natural language processing, hierarchical learning, reinforcement learning, and decentralized learning
Recent news
11/08/2024 Two proposals have been funded under the RPI-IBM AI Research Collaboration program.
09/25/2024 Our work on a unified single-loop primal-dual framework for decentralized bilevel optimization has been accepted by NeurIPS 2024.
09/18/2024 I am honored to be selected for Stanford University's 2024 World's Top 2% Scientists List.
09/12/2024 Our paper on graph neural networks with adaptive structures has been accepted by IEEE Journal of Selected Topics in Signal Processing.
08/27/2024 Our paper titled "A stochastic linearized augmented Lagrangian method for decentralized bilevel optimization" has been selected as an Honorable Mention in the IBM Pat Goldberg Memorial competition for best papers!
07/25/2024 Our research on in-context learning has been featured by IBM Research. Read more: IBM and RPI researchers demystify in-context learning in large language models.
07/19/2024 I will serve as a Session Chair (Multiagent Optimization and Games) for ICCOPT 2025.
07/18/2024 I am currently serving as a Program Chair for the NeurIPS 2024 Workshop on Federated Foundation Models.
05/16/2024 I will serve as an Area Chair for AAAI 2025.
05/01/2024 Five papers have been accepted by ICML 2024.
04/03/2024 Our work toward Byzantine-robust decentralized federated learning has been accepted by ACM CCS 2024.
03/27/2024 I will serve as an Area Chair for NeurIPS 2024.
01/16/2024 Our work on min-max optimization for policy evaluation with nonlinear function approximation has been selected as a spotlight presentation at ICLR 2024.
12/13/2023 I am thrilled to have received an IBM Research Accomplishment Award for my contributions to advancing optimization techniques for next-generation distributed intelligence.
12/13/2023 Four papers were accepted by ICASSP 2024.
11/02/2023 It is my great pleasure to have received an IBM Plateau Invention Achievement Award.
08/21/2023 I have been elevated to IEEE Senior Member.
07/03/2023 I am truly honored to receive an IBM Entrepreneur Award.
12/02/2022 Our work on conditional moment alignment for improved generalization in federated learning received the FL-NeurIPS Outstanding Paper Award!
07/29/2022 Our work on distributed adversarial training to robustify deep neural networks at scale received the UAI 2022 Best Paper Runner-Up Award!
Recent Representative Works
SLM: A Smoothed First-Order Lagrangian Method for Structured Constrained Nonconvex Optimization
Songtao Lu
NeurIPS, 2023.
Bilevel Optimization with Coupled Decision-Dependent Distributions
Songtao Lu
ICML, 2023.
A Stochastic Linearized Augmented Lagrangian Method for Decentralized Bilevel Optimization
Songtao Lu, Siliang Zeng, Xiaodong Cui, Mark S. Squillante, Lior Horesh, Brian Kingsbury, Jia Liu, Mingyi Hong
NeurIPS, 2022.
A Single-Loop Gradient Descent and Perturbed Ascent Algorithm for Nonconvex Functional Constrained Optimization
Songtao Lu
ICML, 2022.
Decentralized Policy Gradient Descent Ascent for Safe Multi-Agent Reinforcement Learning
Songtao Lu, Kaiqing Zhang, Tianyi Chen, Tamer Basar, Lior Horesh
AAAI, 2021.
Linearized ADMM Converges to Second-Order Stationary Points for Non-Convex Problems
Songtao Lu, Jason Lee, Meisam Razaviyayn, Mingyi Hong
IEEE Transactions on Signal Processing, 2021.
Finding Second-Order Stationary Points Efficiently in Smooth Nonconvex Linearly Constrained Optimization Problems
Songtao Lu, Meisam Razaviyayn, Bo Yang, Kejun Huang, Mingyi Hong
NeurIPS, 2020.
Hybrid Block Successive Approximation for One-Sided Non-Convex Min-Max Problems: Algorithms and Applications
Songtao Lu, Ioannis Tsaknakis, Mingyi Hong, Yongxin Chen
IEEE Transactions on Signal Processing, 2020.
PA-GD: On the Convergence of Perturbed Alternating Gradient Descent to Second-Order Stationary Points for Structured Nonconvex Optimization
Songtao Lu, Mingyi Hong, Zhengdao Wang
ICML, 2019.