The Impact of Multiscale Dataset and a Multi-Rate Gradient Descent Approach

Speaker: Liangchen Liu (UT Austin)
Date: 2024/10/24
Location: MW 154

Abstract: Data embedded in high-dimensional spaces often follows intrinsic low-dimensional structures, but assuming a consistent scale across all directions may be too idealized. Indeed, empirical observations suggest that data distributions tend to exhibit variability in scale, prompting some important questions: how does this multiscale characteristic of data affect machine learning algorithms? And how can we effectively leverage such information for better outcomes?

In this talk, we will examine the effects of multiscale data on machine learning algorithms. I will then introduce a novel gradient descent algorithm that assigns different learning rates according to the multiscale properties of the data. Convergence guarantees and efficiency improvements in real-world scenarios will be presented, suggesting that the multirate scheme may also help explain the success of learning rate schedulers.
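To make the idea concrete, here is a minimal sketch (not the speaker's actual algorithm) of why per-direction learning rates help on multiscale objectives. We assume a toy quadratic loss whose curvature differs by two orders of magnitude across directions; the variable names, scales, and step counts are illustrative assumptions.

```python
import numpy as np

# Assumed toy problem: loss(w) = 0.5 * sum(scales * w**2), with curvature
# that varies across directions, mimicking multiscale data.
scales = np.array([1.0, 1e-2])           # per-direction curvature (assumed)
grad = lambda w: scales * w              # exact gradient of the quadratic

def multirate_gd(w0, lrs, steps=500):
    """Gradient descent with a separate learning rate per direction."""
    w = w0.copy()
    for _ in range(steps):
        w -= lrs * grad(w)
    return w

w0 = np.array([1.0, 1.0])
# A single global rate is capped by the stiffest direction (curvature 1.0),
# so progress along the flat direction (curvature 1e-2) is slow.
w_single = multirate_gd(w0, lrs=np.array([1.0, 1.0]))
# A multirate scheme takes a much larger step along the flat direction,
# so both coordinates converge quickly.
w_multi = multirate_gd(w0, lrs=np.array([1.0, 100.0]))
```

After 500 steps the single-rate iterate still carries visible error in the flat direction, while the multirate iterate is essentially at the minimizer; this gap is the basic phenomenon the talk's algorithm is designed to exploit.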