PhD Defense: New Efficient Algorithms for Nested Machine Learning Problems
IRB-4109
In recent years, machine learning (ML) has achieved remarkable success by training large-scale models on vast datasets. However, building these models involves multiple interdependent tasks, such as data selection, hyperparameter tuning, and model architecture search. Optimizing these tasks jointly often leads to challenging nested objectives, where each task both influences and depends on the others. In this talk, I will start by formalizing nested ML problems as bilevel optimization tasks and presenting efficient algorithms with theoretical guarantees for solving them. I will then extend these ideas to the federated learning setting, examining how the algorithmic designs must be adapted to meet the challenges of that environment.
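As a rough illustration of the nested structure described above, such problems are often written in a generic bilevel form; the sketch below assumes the standard hyperparameter-tuning instance, with illustrative symbols $\lambda$ (hyperparameters, upper level) and $w$ (model weights, lower level) that are not taken from the talk itself:

\[
\min_{\lambda} \; f\bigl(\lambda, w^{*}(\lambda)\bigr)
\quad \text{s.t.} \quad
w^{*}(\lambda) \in \arg\min_{w} \, g(\lambda, w),
\]

where $f$ plays the role of an upper-level objective (e.g., validation loss after tuning) and $g$ a lower-level objective (e.g., training loss for fitting the model given $\lambda$).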