Ruoyu Sun, University of Illinois at Urbana-Champaign
Aug 20, 2018, Mon, 10:00-11:30
One of the major challenges of training neural networks is the non-convexity of the loss function, which can lead to many local minima. Due to the recent success of deep learning, it is widely conjectured that the local minima of neural networks may lead to similar training performance, and thus may not be a big issue. In this talk, we discuss the loss surface of neural networks for binary classification. We provide a collection of necessary and sufficient conditions under which the neural network problem has no bad local minima. On the positive side, we prove that no bad local minima exist under a few conditions on the neuron types, the neural-network structure (e.g. skip-like connections), the loss function, and the dataset. While there seem to be quite a few conditions, on the negative side, we provide dozens of counterexamples showing that bad local minima exist when these conditions do not hold. For example, ReLU neurons lead to bad local minima, while increasing and strictly convex neurons (e.g. smooth versions of ReLUs) can eliminate them.
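The ReLU-versus-smooth-neuron contrast can be seen in a minimal sketch. The toy setup below (a single neuron with squared loss on one data point) is an illustrative assumption, not an example from the talk: a "dead" ReLU yields a flat region of bad local minima with zero gradient, while softplus, a smooth, increasing, strictly convex surrogate, keeps a nonzero gradient everywhere.

```python
import math

def relu(z):
    return max(z, 0.0)

def relu_grad(z):
    return 1.0 if z > 0 else 0.0

def softplus(z):
    # Smooth, strictly increasing, strictly convex version of ReLU.
    return math.log1p(math.exp(z))

def sigmoid(z):
    # Derivative of softplus.
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical toy problem: one data point, input x = 1, target y = 1,
# model f(x) = act(w * x), squared loss.
x, y = 1.0, 1.0

def loss_and_grad(w, act, act_grad):
    out = act(w * x)
    loss = (out - y) ** 2
    grad = 2.0 * (out - y) * act_grad(w * x) * x
    return loss, grad

# At w = -1 the ReLU is inactive on the data point: the loss is stuck
# at 1 with zero gradient, even though w = 1 achieves zero loss, so
# every w < 0 is a bad (flat) local minimum.
relu_loss, relu_g = loss_and_grad(-1.0, relu, relu_grad)

# Softplus at the same w = -1 still has a nonzero gradient, so gradient
# descent is not trapped and can move toward the global minimum.
sp_loss, sp_g = loss_and_grad(-1.0, softplus, sigmoid)
```

This matches the flavor of the counterexamples in the abstract: the bad local minimum comes entirely from the kink and flat region of ReLU, and disappears once the activation is strictly increasing.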
Ruoyu Sun is an assistant professor in the Department of Industrial and Enterprise Systems Engineering (ISE) and the Coordinated Science Laboratory (CSL) at the University of Illinois at Urbana-Champaign. Before joining UIUC, he was a visiting research scientist at Facebook AI Research and a postdoctoral researcher at Stanford University. He obtained his PhD in electrical engineering from the University of Minnesota and his B.S. in mathematics from Peking University. He won second place in the INFORMS George Nicholson Student Paper Competition and an honorable mention in the INFORMS Optimization Society Student Paper Competition. His research interests lie in optimization, machine learning, and signal processing, especially large-scale optimization and non-convex optimization for machine learning.