Song Han, Massachusetts Institute of Technology (MIT)


2019-01-07 10:00:00 ~ 2019-01-07 11:30:00


Room 3-412, SEIEE Building


Weinan Zhang, Assistant Professor, John Hopcroft Center for Computer Science

In the post-Moore’s Law era, the amount of computation per unit cost and power is no longer increasing at its historic rate. In the post-ImageNet era, researchers are solving more complicated AI problems using larger data sets, which drives the demand for more computation. This mismatch between the supply of and demand for computation highlights the need to co-design efficient machine learning algorithms and domain-specific hardware architectures. We introduce our recent work on using machine learning to optimize the machine learning system (Hardware-centric AutoML). I’ll describe efficient deep learning accelerators that can take advantage of these efficient algorithms, as well as hardware-efficient video understanding algorithms. I’ll conclude the talk with an outlook on design automation for efficient deep learning computing.
Song Han is an assistant professor in the EECS Department at the Massachusetts Institute of Technology (MIT) and the PI of HAN Lab: Hardware, AI and Neural-nets. He is a member of the MIT Quest for Intelligence. Dr. Han's research focuses on energy-efficient deep learning and domain-specific architectures. He proposed “Deep Compression,” which has widely impacted the industry. He received the best paper award at ICLR’16 and at FPGA’17. Prior to joining MIT, Song Han graduated from Stanford University.

Homepage: https://songhan.mit.edu/
© John Hopcroft Center for Computer Science, Shanghai Jiao Tong University