Hang Qiu, University of Southern California
Jun 20, 2018, Wed, 10:00-11:30
Autonomous vehicle prototypes today come with line-of-sight depth perception sensors like 3D cameras. These 3D sensors are used for improving vehicular safety in autonomous driving, but have fundamentally limited visibility due to occlusions, sensing range, and extreme weather and lighting conditions. To improve visibility and performance, not just for autonomous vehicles but also for Advanced Driver Assistance Systems (ADAS), we explore a capability called Augmented Vehicular Reality (AVR). AVR broadens a vehicle’s visual horizon by enabling it to wirelessly share visual information with other nearby vehicles. Realizing this capability requires novel relative positioning techniques, new perspective transformation methods, approaches that isolate and predict the motion of dynamic objects in order to hide latency, and adaptive transmission strategies that cope with wireless bandwidth variability. We show that AVR is feasible using off-the-shelf wireless technologies, and that it can qualitatively change the decisions made by autonomous vehicle path planning algorithms. Our AVR prototype achieves positioning accuracies within a few percent of car lengths and lane widths, and is optimized to process frames at 30 fps.
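To illustrate the perspective transformation the abstract refers to: once a receiving vehicle knows its relative pose to a sender, the sender's 3D points can be re-expressed in the receiver's coordinate frame as a rigid-body transform. The sketch below is not the AVR implementation; the pose numbers and function name are hypothetical, and it assumes the relative pose is already estimated as a rotation matrix R and translation vector t.

```python
import numpy as np

def to_receiver_frame(points, R, t):
    """Re-express Nx3 sender-frame points in the receiver frame: p' = R p + t."""
    return points @ R.T + t

# Hypothetical relative pose: sender is 5 m ahead of the receiver
# and rotated 90 degrees about the vertical (z) axis.
theta = np.deg2rad(90.0)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([5.0, 0.0, 0.0])

# Two example points seen by the sender (e.g., from an occluded pedestrian).
sender_points = np.array([[1.0, 0.0, 0.0],
                          [0.0, 2.0, 0.0]])
receiver_points = to_receiver_frame(sender_points, R, t)
```

After the transform, the shared points can be merged directly into the receiver's own depth map, which is what lets AVR extend the visual horizon beyond line of sight.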
Hang Qiu is a Ph.D. candidate in the Networked Systems Lab at USC, advised by Prof. Ramesh Govindan. Previously, he was advised by Prof. Xinbing Wang at SJTU, where he received his B.S. in 2013. He is broadly interested in networked intelligence in IoT systems, mobile/edge computing, wireless networking, and crowdsourcing systems. His current research focuses on connected autonomous vehicles and networked video analytics.