Plenary Speakers
Motion Control of EV and Paradigm Shift to Motor/Capacitor/Wireless
Prof. Yoichi Hori
Date: October 15 (Fri.), 10:50-11:50
Place: 2F Ballroom 1
Chair: Prof. Sehoon Oh (DGIST)
The most distinct advantage of electric vehicles is the electric motor’s quick and precise torque generation. I named this technique ‘Motion Control of EV’ and have been demonstrating the basic effectiveness of various proposed methods, such as adhesion control, using experimental EVs that we actually built. On the other hand, ‘Motor’, ‘Capacitor’, and ‘Wireless’ will be the key technologies for future cars, replacing ‘Engine’, ‘Battery’, and ‘Quick charge’. Future cars will be driven by electric motors, but many problems remain in energy supply. Why must electric vehicles be charged while ‘stopped’, in a ‘short time’, and with ‘big energy’, even though the energy form of electricity is completely different from that of gasoline? Super-capacitors and wireless power transfer to EVs in motion will play an important role in the future EV world by drastically reducing the excessive reliance on today’s high-capacity batteries.
Yoichi Hori received his B.S., M.S., and Ph.D. degrees in Electrical Engineering from the University of Tokyo, Tokyo, Japan, in 1978, 1980, and 1983, respectively. In 1983, he joined the Department of Electrical Engineering, The University of Tokyo, as a Research Associate. He later became an Assistant Professor, an Associate Professor, and, in 2000, a Professor at the same university. In 2002, he moved to the Institute of Industrial Science as a Professor in the Information and System Division, and in 2008, to the Department of Advanced Energy, Graduate School of Frontier Sciences, the University of Tokyo. From 1991 to 1992, he was a Visiting Researcher at the University of California at Berkeley. His research fields are control theory and its industrial applications to motion control, mechatronics, robotics, electric vehicles, etc. Recently, he has been interested in wireless power transfer systems. He is a Life Fellow of the IEEE (Institute of Electrical and Electronics Engineers) and a past AdCom member of its Industrial Electronics Society (IES). He is also a Fellow of the IEEJ (Institute of Electrical Engineers of Japan) and of the JSAE (Society of Automotive Engineers of Japan), and a member of the Society of Instrument and Control Engineers, the Robotics Society of Japan, the Japan Society of Mechanical Engineers, and others. He was the President of the Industry Applications Society of the IEEJ, the President of the World Electric Vehicle Association (WEVA), and a Director of the Japan Automobile Research Institute (JARI). He is now the President of the Capacitors Forum, the Chairman of the Motor Technology Symposium of the Japan Management Association (JMA), the Representative Director of the Next Generation Vehicle Promotion Center (NeV), and, since June 2020, the Vice-President of the JSAE. He won the Best Transactions Paper Award from the IEEE Transactions on Industrial Electronics in 1993, 2001, and 2013, the 2000 Best Transactions Paper Award from the IEEJ, and the 2011 Achievement Award of the IEEJ.
Design and Control of Wearable Robots for Physical Human-Robot Interaction
Prof. Marcia O’Malley
Date: October 13 (Wed.), 10:50-11:50
Place: 2F Ballroom 1
Chair: Prof. Jee-Hwan Ryu (KAIST)
Robots are increasingly transitioning from factories to human environments: today we use robots in healthcare, households, and social settings. I’m particularly interested in the potential for improving human performance via wearable robotic devices. Physical interactions between robots and humans offer an opportunity for the human and robot to implicitly communicate. For example, a rehabilitation robot exoskeleton can guide and train human movements, or a wearable haptic device can be used to convey informative tactile cues to the user. As engineers, we must consider the unique design and control constraints that are introduced when we design robots that are to be worn by the human, such as the complex degrees of freedom of human joints, the limitations of our human perceptual capabilities, and the necessity for compliant control algorithms to ensure user safety. This talk will feature recent research from my lab and will highlight these challenges and the unique approaches that we have taken to ensure that the wearable robot and human achieve more together than either can achieve alone.
Marcia O’Malley is the Thomas Michael Panos Family Professor in Mechanical Engineering, Computer Science, and Electrical and Computer Engineering at Rice University, where she directs the MAHI (Mechatronics and Haptic Interfaces) Lab. She received her BS in Mechanical Engineering from Purdue University, and her MS and PhD in Mechanical Engineering from Vanderbilt University. Her research is in the areas of haptics and robotic rehabilitation, with a focus on the design and control of wearable robotic devices for training and rehabilitation. She has won awards for exemplary teaching at Rice, and she was the recipient of both an ONR Young Investigator Award and an NSF CAREER Award. She is a Fellow of both the ASME and the IEEE. Her editorial roles include Associate Editor-in-Chief for the IEEE Transactions and Senior Editor for the ACM Transactions on Human-Robot Interaction. She is the incoming chair of the IEEE Robotics & Automation Society Conference Editorial Board.
Autonomous, Agile Micro Drones: Perception, Learning, and Control
Prof. Davide Scaramuzza
Date: October 14 (Thu.), 16:40-17:40
Place: 2F Ballroom 1
Chair: Prof. Hyun Myung (KAIST)
Autonomous quadrotors will soon play a major role in search-and-rescue and remote-inspection missions, where a fast response is crucial. Quadrotors have the potential to navigate quickly through unstructured environments, enter and exit buildings through narrow gaps, and fly through collapsed buildings. However, their speed and maneuverability are still far from those of birds and human pilots, and human pilots take years to learn the skills needed to fly drones. Autonomous, vision-based agile navigation through unknown indoor environments poses a number of challenges for robotics research in terms of perception, state estimation, planning, and control. In this talk, I will show how the combination of model-based and machine learning methods, united with the power of new, low-latency sensors such as event cameras, allows drones to achieve unprecedented speed and robustness by relying solely on passive cameras, inertial sensors, and onboard computing.
Davide Scaramuzza (Italian) is a Professor of Robotics and Perception in both the Department of Informatics (University of Zurich) and the Department of Neuroinformatics (University of Zurich and ETH Zurich), where he directs the Robotics and Perception Group. His research lies at the intersection of robotics, computer vision, and machine learning, using both standard cameras and event cameras, and is aimed at enabling autonomous, agile navigation of micro drones in search-and-rescue applications. After a PhD at ETH Zurich (with Roland Siegwart) and a postdoc at the University of Pennsylvania (with Vijay Kumar and Kostas Daniilidis), from 2009 to 2012 he led the European project sFly, which introduced the PX4 autopilot and pioneered visual-SLAM-based autonomous navigation of micro drones in GPS-denied environments. From 2015 to 2018, he was part of one of the three teams selected by DARPA for the Fast Lightweight Autonomy (FLA) program, the first to research autonomous, agile navigation of micro drones in GPS-denied environments. In 2018, his team won the IROS 2018 Autonomous Drone Race, and in 2019 it ranked second in the AlphaPilot Drone Racing world championship. For his research contributions to autonomous, vision-based drone navigation and event cameras, he has won prestigious awards, including an ERC Consolidator Grant, the IEEE Robotics and Automation Society Early Career Award, an SNSF-ERC Starting Grant, a Google Research Award, the KUKA Innovation Award, two Qualcomm Innovation Fellowships, the European Young Research Award, the Misha Mahowald Neuromorphic Engineering Award, and several paper awards. He coauthored the book “Introduction to Autonomous Mobile Robots” (published by MIT Press; 10,000 copies sold) and more than 100 papers on robotics and perception published in top-ranked journals (TRO, PAMI, IJCV, IJRR) and conferences (RSS, ICRA, CVPR, ICCV).
He has served as a consultant for the United Nations (UN) International Atomic Energy Agency (IAEA) Fukushima Action Plan on Nuclear Safety, as well as for several drone and computer-vision companies, to which he has also transferred research results. In 2015, he cofounded Zurich-Eye, building visual-inertial navigation solutions for mobile robots, which later became Facebook Zurich. He was also a strategic advisor of Dacuda, an ETH spinoff dedicated to inside-out VR solutions, which later became Magic Leap Zurich. In 2020, he cofounded SUIND, which develops camera-based safety solutions for commercial drones. Many aspects of his research have been prominently featured in wider media, such as The New York Times, BBC News, Discovery Channel, La Repubblica, and Neue Zürcher Zeitung, and in technology-focused media, such as IEEE Spectrum, MIT Technology Review, TechCrunch, Wired, and The Verge.