The Role of Explicit Regularization in Overparameterized Neural Networks


Speaker

Shiyu Liang, University of Illinois at Urbana-Champaign

Time

2021-03-01 10:00:00 ~ 2021-03-01 11:30:00

Location

Tencent Meeting (online; Meeting ID: 661 493 082, Password: 953849)

Host

Shuai Li

Abstract

Overparameterized neural networks have proved remarkably successful in many complex tasks, such as image classification and deep reinforcement learning. In this talk, we will consider the role of explicit regularization in training overparameterized neural networks. Specifically, we consider ReLU networks and show that the landscape of commonly used regularized loss functions has the property that every local minimum has good memorization and regularization performance.
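For concreteness, a minimal sketch (not taken from the talk; names and the choice of squared loss with weight decay are illustrative assumptions) of an explicitly regularized objective for a one-hidden-layer ReLU network:

```python
import numpy as np

def relu(x):
    # elementwise rectified linear unit
    return np.maximum(0.0, x)

def regularized_loss(W1, W2, X, y, lam=1e-3):
    """Squared loss of a one-hidden-layer ReLU network plus an L2
    (weight-decay) penalty -- an example of the kind of explicitly
    regularized objective whose landscape the talk studies."""
    preds = relu(X @ W1) @ W2  # forward pass: ReLU hidden layer, linear output
    data_term = 0.5 * np.mean((preds.ravel() - y) ** 2)
    reg_term = 0.5 * lam * (np.sum(W1 ** 2) + np.sum(W2 ** 2))
    return data_term + reg_term

# Toy usage on random data: an overparameterized hidden layer
# (more hidden units than samples would be typical of the regime studied).
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 4))    # 8 samples, 4 features
y = rng.normal(size=8)
W1 = rng.normal(size=(4, 16))  # 16 hidden units
W2 = rng.normal(size=(16, 1))
loss = regularized_loss(W1, W2, X, y)
```

Here `lam` trades off the data-fitting term against the explicit penalty; the talk's results concern the local minima of objectives of this general shape.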

Bio

Shiyu Liang is a sixth-year Ph.D. student at the University of Illinois at Urbana-Champaign, advised by Professor R. Srikant. He is also pursuing an M.S. in Mathematics. Before joining UIUC, he graduated from Shanghai Jiao Tong University. His research interests include machine learning, optimization, and applied probability.

© John Hopcroft Center for Computer Science, Shanghai Jiao Tong University

Address: Expert Building, Software Building, Shanghai Jiao Tong University, 800 Dongchuan Road, Shanghai
Email: jhc@sjtu.edu.cn  Tel: 021-54740299
Postal code: 200240