ESE Fall Colloquium Seminar – “Alpha-loss: A Tunable Class of Loss Functions for Robust Learning”
December 7, 2021 at 11:00 AM - 12:00 PM
Machine learning has dramatically enhanced the role of automated decision making across a variety of domains. Three ingredients lie at the heart of designing sound ML algorithms: data, learning architectures, and loss functions. In this talk, we focus on loss functions and the role of information theory in understanding the choice of loss functions in learning. We introduce alpha-loss, a parameterized class of loss functions derived by operationally motivating information-theoretic measures. Tuning the parameter alpha from 0 to infinity allows continuous interpolation between known and oft-used losses: log-loss (alpha=1), exponential loss (alpha=1/2), and 0-1 loss (alpha=infinity).
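The interpolation above can be checked numerically. The sketch below assumes the standard parameterization from the alpha-loss literature (the announcement itself does not give a formula): for probability p assigned to the true label, l_alpha(p) = (alpha/(alpha-1)) * (1 - p^((alpha-1)/alpha)) for alpha != 1, with -log(p) as the alpha -> 1 limit and 1 - p as the alpha -> infinity limit.

```python
import numpy as np

def alpha_loss(p, alpha):
    """Alpha-loss of the probability p assigned to the true label.

    Assumed parameterization (not stated in the announcement):
      alpha = 1        -> -log(p)          (log-loss)
      alpha -> infinity -> 1 - p           (soft 0-1 loss)
      otherwise        -> (alpha/(alpha-1)) * (1 - p**((alpha-1)/alpha))
    """
    p = np.asarray(p, dtype=float)
    if np.isclose(alpha, 1.0):
        return -np.log(p)            # log-loss at alpha = 1
    if np.isinf(alpha):
        return 1.0 - p               # limit as alpha -> infinity
    return (alpha / (alpha - 1.0)) * (1.0 - p ** ((alpha - 1.0) / alpha))

# alpha = 1/2 recovers exponential loss: for a logistic model with
# p = sigmoid(z), l_{1/2}(p) = 1/p - 1 = exp(-z).
z = 2.0
p = 1.0 / (1.0 + np.exp(-z))
print(np.isclose(alpha_loss(p, 0.5), np.exp(-z)))    # exponential-loss check
print(np.isclose(alpha_loss(0.5, 1.0), np.log(2)))   # log-loss at p = 1/2
```

The three named losses fall out as special cases of one formula, which is what makes the single tuning knob alpha useful in practice.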
Beginning with the classification properties of alpha-loss and its information-theoretic interpretations, we will focus on a specific model, namely the logistic model, and quantify the optimization landscape of the average loss as viewed through the lens of Strict-Local-Quasi-Convexity. We discuss how different regimes of the parameter alpha enable the practitioner to tune the sensitivity of their algorithm towards two emerging challenges in learning: robustness and fairness. Finally, we comment on ongoing and future work on different applications of alpha-loss, including GANs and boosting.

