Tuesday, March 03, 2020. 12:00 PM. NSH 3305
Dravyansh Sharma -- Learning piecewise Lipschitz functions in changing environments
Abstract: When using machine learning algorithms, one often needs to tune hyperparameters to optimize performance on a given data instance. For many problems, including clustering, a small tweak to the parameters can cause a cascade of changes in the algorithm's behavior, so the algorithm's performance is a discontinuous function of the parameters. Optimization in the presence of sharp (non-Lipschitz) changes that are unpredictable in both their timing and their magnitude is a challenging and largely unexplored problem of great significance.
We consider the class of piecewise Lipschitz functions, the most general online setting considered in the literature for this problem. To capture changing environments, we study the 'shifting regret', which allows for a finite number of environment shifts at unknown times. We provide a shifting regret bound for well-dispersed functions, where dispersion roughly quantifies the rate at which discontinuities appear in the utility functions in expectation. Our near-tight lower bounds further show that dispersion is both necessary and sufficient for low regret. We empirically demonstrate a key application of our algorithms to online clustering problems on popular benchmarks.
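The shifting-regret setting described above can be illustrated with a standard fixed-share exponential-weights learner run over a discretized parameter grid. This is a simplified sketch, not the paper's algorithm (which handles the continuous domain directly under dispersion); the utility function, the location of the environment shift, and all parameter values below are illustrative assumptions.

```python
import math

def utility(x, t):
    # Hypothetical piecewise Lipschitz utility with a single
    # environment shift at round 50: the high-reward region jumps
    # from x >= 0.7 to x <= 0.3, so the utility is discontinuous
    # in x and the best parameter changes over time.
    if t < 50:
        return 1.0 if x >= 0.7 else 0.2 * x
    return 1.0 if x <= 0.3 else 0.2 * (1 - x)

def fixed_share_ew(grid, T, eta=0.5, alpha=0.05):
    """Exponential weights over a discretized parameter grid with a
    fixed-share mixing step, a classic way to obtain shifting-regret
    guarantees against a comparator that may change a bounded number
    of times."""
    n = len(grid)
    w = [1.0 / n] * n
    total = 0.0
    for t in range(T):
        # play the current weight distribution; collect its
        # expected utility for this round
        total += sum(wj * utility(x, t) for wj, x in zip(w, grid))
        # multiplicative update on the observed utilities
        w = [wj * math.exp(eta * utility(x, t)) for wj, x in zip(w, grid)]
        s = sum(w)
        # fixed-share step: mix a little uniform mass back in, so
        # the learner can re-adapt quickly after an environment shift
        w = [(1 - alpha) * wj / s + alpha / n for wj in w]
    return total / T

grid = [k / 100 for k in range(101)]
avg_utility = fixed_share_ew(grid, T=100)
```

Without the fixed-share mixing, the weights would concentrate so heavily on the first phase's optimum that recovery after the shift would be very slow; the uniform mass floor is what keeps the shifting regret small.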
Joint work with Nina Balcan and Travis Dick.
Bio: Dravy is a graduate student at CMU advised by Nina Balcan. He is interested in designing algorithms for machine learning with strong and provable performance guarantees. Previously, he worked with the Speech team at Google and completed his undergraduate studies at IIT Delhi. The work presented in this talk has been accepted for publication at AISTATS 2020, Palermo, and was awarded first prize in the poster competition at YinzOR 2019, Pittsburgh.