Large-Scale Hyperparameter Optimization (Ongoing)

Status: Research in Progress (Master’s Thesis)
Affiliation: Mila / Concordia University
Supervisor: Prof. Mirco Ravanelli

Project Scope

I am currently researching the intersection of Hyperparameter Optimization (HPO) and Scaling Laws for deep learning models. As models grow, traditional HPO becomes computationally prohibitive, since each candidate configuration requires a full training run. This research aims to develop multi-fidelity-aware strategies, which evaluate candidates on cheap proxies (fewer epochs, smaller models, or data subsets) and promote only the most promising ones to the full budget, spending training resources efficiently without compromising final performance.
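
To make "multi-fidelity" concrete, here is a minimal sketch of one classic strategy from this family, successive halving; the learning-rate search space and the evaluate callback are illustrative placeholders, not the methodology under study.

    import random

    def successive_halving(configs, evaluate, min_budget=1, eta=3):
        # configs: candidate hyperparameter dicts.
        # evaluate(config, budget): hypothetical callback that trains
        # for `budget` epochs and returns validation loss (lower is better).
        budget = min_budget
        while len(configs) > 1:
            # Score every surviving candidate at the current, cheap fidelity.
            scored = sorted(configs, key=lambda c: evaluate(c, budget))
            # Keep the best 1/eta fraction, then raise the training budget.
            configs = scored[:max(1, len(configs) // eta)]
            budget *= eta
        return configs[0]

    # Illustrative usage: 27 random log-uniform learning rates.
    space = [{"lr": 10 ** random.uniform(-5, -1)} for _ in range(27)]

With eta = 3 and 27 candidates, only a handful of configurations ever reach the full training budget, which is what makes this family of methods attractive at scale.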

Technical Focus

  • Infrastructure: Orchestrating large-scale experiment sweeps on Compute Canada clusters using Slurm (a minimal submission sketch follows this list).
  • Engineering: Designing reproducible training pipelines in PyTorch and SpeechBrain.
  • Objective: Characterizing system-level constraints in order to improve training efficiency for vision and speech models.
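
As an illustration of the orchestration layer, the sketch below uses submitit, a Python wrapper around Slurm's sbatch, to fan a sweep out as one cluster job per configuration. The partition name, resource requests, and train stub are assumptions for the example and do not reflect the actual thesis pipeline.

    import submitit

    def train(config):
        # Stub standing in for a PyTorch/SpeechBrain training run;
        # the real pipeline is part of the ongoing (confidential) work.
        return 0.0  # placeholder validation metric

    configs = [{"lr": 1e-3}, {"lr": 3e-4}, {"lr": 1e-4}]  # illustrative sweep

    executor = submitit.AutoExecutor(folder="slurm_logs")
    executor.update_parameters(
        timeout_min=240,
        slurm_partition="gpu",  # assumption: partition names vary by cluster
        gpus_per_node=1,
        cpus_per_task=8,
        mem_gb=32,
    )
    # Submit one Slurm job per configuration; job.result() blocks
    # until that job finishes and returns the function's output.
    jobs = [executor.submit(train, cfg) for cfg in configs]
    results = [job.result() for job in jobs]

Keeping the sweep definition in Python rather than in hand-written sbatch scripts makes the experiment grid easier to version and reproduce.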

(Due to the confidential nature of ongoing research, specific methodologies and results will be shared upon publication.)