Liam Li
Carnegie Mellon University
Verified email at liamcli.com
Hyperband: A novel bandit-based approach to hyperparameter optimization
L Li, K Jamieson, G DeSalvo, A Rostamizadeh, A Talwalkar
Journal of Machine Learning Research 18 (185), 1-52, 2018
Cited by 3037 · 2018
Random search and reproducibility for neural architecture search
L Li, A Talwalkar
Uncertainty in Artificial Intelligence, 367-377, 2020
Cited by 834 · 2020
A system for massively parallel hyperparameter tuning
L Li, K Jamieson, A Rostamizadeh, E Gonina, J Ben-Tzur, M Hardt, ...
Proceedings of Machine Learning and Systems 2, 230-246, 2020
Cited by 470 · 2020
Hyperband: Bandit-Based Configuration Evaluation for Hyperparameter Optimization
L Li, K Jamieson, G DeSalvo, A Rostamizadeh, A Talwalkar
ICLR, 2017
Cited by 187* · 2017
Massively parallel hyperparameter tuning
L Li, K Jamieson, A Rostamizadeh, K Gonina, M Hardt, B Recht, ...
Cited by 185 · 2018
Federated hyperparameter tuning: Challenges, baselines, and connections to weight-sharing
M Khodak, R Tu, T Li, L Li, MFF Balcan, V Smith, A Talwalkar
Advances in Neural Information Processing Systems 34, 19184-19197, 2021
Cited by 82 · 2021
Geometry-aware gradient algorithms for neural architecture search
L Li, M Khodak, MF Balcan, A Talwalkar
arXiv preprint arXiv:2004.07802, 2020
Cited by 81 · 2020
Cross-modal fine-tuning: Align then refine
J Shen, L Li, LM Dery, C Staten, M Khodak, G Neubig, A Talwalkar
International Conference on Machine Learning, 31030-31056, 2023
Cited by 37 · 2023
On data efficiency of meta-learning
M Al-Shedivat, L Li, E Xing, A Talwalkar
International Conference on Artificial Intelligence and Statistics, 1369-1377, 2021
Cited by 32 · 2021
Rethinking neural operations for diverse tasks
N Roberts, M Khodak, T Dao, L Li, C Ré, A Talwalkar
Advances in Neural Information Processing Systems 34, 15855-15869, 2021
Cited by 30 · 2021
Weight sharing for hyperparameter optimization in federated learning
M Khodak, T Li, L Li, M Balcan, V Smith, A Talwalkar
Int. Workshop on Federated Learning for User Privacy and Data …, 2020
Cited by 16 · 2020
A system for massively parallel hyperparameter tuning
L Li, K Jamieson, A Rostamizadeh, E Gonina, M Hardt, B Recht, ...
arXiv preprint arXiv:1810.05934, 2018
Cited by 14 · 2018
Massively parallel hyperparameter tuning
L Li, K Jamieson, A Rostamizadeh, K Gonina, M Hardt, B Recht, ...
URL: https://openreview.net/forum, 2018
Cited by 10 · 2018
Exploiting reuse in pipeline-aware hyperparameter tuning
L Li, E Sparks, K Jamieson, A Talwalkar
arXiv preprint arXiv:1903.05176, 2019
Cited by 9 · 2019
Random Search and Reproducibility for Neural Architecture Search
L Li, A Talwalkar
arXiv preprint arXiv:1902.07638, 2019
Cited by 5 · 2019
Learning operations for neural PDE solvers
N Roberts, M Khodak, T Dao, L Li, C Ré, A Talwalkar
Proc. ICLR SimDL Workshop, 2021
Cited by 3 · 2021
A simple setting for understanding neural architecture search with weight-sharing
M Khodak, L Li, N Roberts, MF Balcan, A Talwalkar
ICML AutoML Workshop, 2020
Cited by 3 · 2020
Towards Efficient Automated Machine Learning
L Li
Google, 2020
Cited by 2 · 2020
Weight-sharing beyond neural architecture search: Efficient feature map selection and federated hyperparameter tuning
M Khodak, L Li, N Roberts, MF Balcan, A Talwalkar
Proc. 2nd SysML Conf., 2019
Cited by 2 · 2019
On weight-sharing and bilevel optimization in architecture search
M Khodak, L Li, MF Balcan, A Talwalkar
Cited by 1