Plenary Talks

Keynote #1

Opening Keynote: Everyone Feels Haptics (Happy)

Hsiao-Wuen Hon - Managing Director, Microsoft Research Asia, China

March 5, 2012. 11:30 - 12:30.
Session Chair: Hong Tan

Human interfaces are always two-way interactions: receiving feedback from the other end, whether it is a machine or another human being, is always critical to us. While touch interfaces have become pervasive in human-machine interfaces, they still rely predominantly on visual or audio feedback, with directly matching haptic feedback mostly non-existent. In this talk, I would like to provide insights on the potential for practical haptic displays to take natural user interfaces, including touch, gesture, and motion, to the next level. I would also like to present a glimpse of a future natural user interface with haptics, and the gap between the state of the art in haptics technology and what is required to realize this vision.

Speaker

Dr. Hsiao-Wuen Hon is the Managing Director of Microsoft Research Asia, located in Beijing, China. Founded in 1998, Microsoft Research Asia has since become one of the best research centers in the world, called by MIT Technology Review “the hottest computer science research lab in the world.” Dr. Hon oversees the lab’s research activities and collaborations with academia in Asia Pacific.
    An IEEE Fellow and a Distinguished Scientist of Microsoft, Dr. Hon is an internationally recognized expert in speech technology. He serves on the editorial board of Communications of the ACM. Dr. Hon has published more than 100 technical papers in international journals and at conferences. He co-authored the book Spoken Language Processing, which is used as a graduate-level textbook and reference in speech technology at universities all over the world. Dr. Hon holds three dozen patents in several technical areas.
    Dr. Hon has been with Microsoft since 1995. He joined Microsoft Research Asia in 2004 as a Deputy Managing Director, responsible for research in Internet search, speech and natural language, systems, wireless, and networking. In addition, from 2005 to 2007 he founded and managed the Search Technology Center (STC), which developed Microsoft's Internet search product (Bing) in Asia Pacific.
    Prior to joining Microsoft Research Asia, Dr. Hon was a founding member and architect of the Natural Interactive Services Division at Microsoft Corporation. Besides overseeing all architectural and technical aspects of the award-winning Microsoft® Speech Server product (Frost & Sullivan's 2005 Enterprise Infrastructure Product of the Year Award, Speech Technology Magazine’s 2004 Most Innovative Solutions Award, and the VSLive! 2004 Editors Choice Award), the Natural User Interface Platform, and the Microsoft Assistance Platform, he was also responsible for managing and delivering statistical learning technologies and advanced search. Dr. Hon joined Microsoft Research as a senior researcher in 1995 and has been a key contributor to Microsoft's SAPI and speech engine technologies. He previously worked at Apple Computer, where he led research and development for Apple's Chinese Dictation Kit.
    Dr. Hon received his Ph.D. in Computer Science from Carnegie Mellon University and his B.S. in Electrical Engineering from National Taiwan University.

Keynote #2

Invited Session on Haptic Human-Computer Interaction

Desney Tan - Principal Researcher and Manager of Computational User Experiences Group, Microsoft Research, USA
Patrick Baudisch - Professor, Hasso Plattner Institute, Germany
Ivan Poupyrev - Senior Research Scientist, Walt Disney Research, USA

March 6, 2012. 11:00 - 12:30
Session Chairs: Ali Israr, Vincent Levesque, and Karon MacLean

Getting HCI in Touch and Touch in HCI

We have been so successful at making computers smaller, faster, and cheaper that the challenges have now shifted to making them more accessible, in more times and places, to more people, and for more interesting things. This is an explicit move away from thinking about computing as a task unto itself and toward computing as an augmentation that seamlessly empowers us in our everyday lives. The Human-Computer Interaction and Haptics communities are uniquely poised to tackle many of the challenges in this space, including increasing effectiveness and delight in interaction, as well as enabling fundamentally new computing scenarios. In this talk, I will briefly describe our philosophy around creating Natural User Interfaces, situate it in some of our recent projects exploring new input techniques such as muscle sensing, bio-acoustic sensing, and leveraging the body as an electromagnetic antenna, and hopefully start a discussion around how Haptics and Human-Computer Interaction researchers can learn from each other and march forward together.

Speaker

Desney Tan is a Principal Researcher at Microsoft Research, where he manages the Computational User Experiences group in Redmond, Washington, as well as the Human-Computer Interaction group in Beijing, China. He also holds an affiliate faculty appointment in the Department of Computer Science and Engineering at the University of Washington.
    Desney’s research interests include Human-Computer Interaction, Mobile Computing, and Healthcare. He spends large chunks of time applying signal processing and machine learning to recognizing noisy signals, specifically those in or on the human body, and using them in interesting ways. However, he is a schizophrenic researcher and has worked on projects in many other domains.
    Desney received his Bachelor of Science in Computer Engineering from the University of Notre Dame in 1996, after which he spent a couple of years building bridges and blowing things up in the Singapore Armed Forces. He then went on to Carnegie Mellon University, where he worked with Randy Pausch and earned his PhD in Computer Science in 2004.
    Desney was honored as one of MIT Technology Review's 2007 Young Innovators Under 35 for his work on brain-computer interfaces. He was also named one of SciFi Channel's Young Visionaries at TED 2009, as well as one of Forbes' Revolutionaries: Radical Thinkers and their World-Changing Ideas for his work on Whole Body Computing. Among other service roles, he has served as Technical Program Chair for CHI 2008 as well as General Chair for CHI 2011.
 

Gravity + Multi-touch = 3D Tracking

We propose a new approach to tracking users, objects, and activities in a smart room. Unlike traditional approaches that point cameras into the 3D volume of a room, we give all horizontal surfaces touch sensitivity, including chairs, tables, and the floor. Gravity pushes people and objects against these surfaces, causing them to leave imprints, i.e., pressure distributions across the surface. We demonstrate how to decompose these imprints into the object and pose that caused them. The result is a novel tracking method that is less susceptible to occlusion than regular and depth cameras, that recognizes objects using simpler and potentially more reliable algorithms, and that might one day be implemented as part of every room's carpet. We present the evolution of the concept from reconstructing finger posture on touch screens (RidgePad, CHI 2010), to stackable 3D tangibles (Lumino, CHI 2010), to our interactive floor (Multitoe, UIST 2010). We then show our latest 8 m² prototype, a set of active and a set of passive touch-sensitive furniture, as well as the algorithms we created for recognizing objects and poses, some of which are GPU-based.

Speaker

Patrick Baudisch is a professor in Computer Science at Hasso Plattner Institute in Berlin/Potsdam and chair of the Human Computer Interaction Lab. His research focuses on the miniaturization of mobile devices and touch input. Previously, Patrick Baudisch worked as a research scientist in the Adaptive Systems and Interaction Research Group at Microsoft Research and at Xerox PARC and served as an Affiliate Professor in Computer Science at the University of Washington. He holds a PhD in Computer Science from Darmstadt University of Technology, Germany.

 

Haptic Cloud

What is the ultimate tactile and haptic interface? Assuming no technological limitations and a reasonably detailed understanding of the inner mechanisms of human tactile and haptic perception, what would such an ultimate haptic display look and feel like, how could it be controlled, and what could it be used for? Inventing idealized and far-reaching interaction concepts (and I consider haptics to be essentially an interaction technology) has proved extremely useful, as such concepts provide a direction and a goal that can guide the development of technology, inspire researchers and developers, and focus their efforts. The "Dynabook" by Alan Kay and "Ubiquitous Computing" by Mark Weiser are some of the best examples of such idealized yet very practical and useful technological concepts. In this talk I will present the concept of the "Haptic Cloud," which has motivated and guided our research on haptics in recent years. I will discuss the concept itself as well as how we have approximated it by developing a range of recent haptic technology prototypes.

Speaker

Dr. Ivan Poupyrev (twitter: @ipoupyrev) directs the Interaction Technology group at Disney Research's Pittsburgh lab, part of Walt Disney Imagineering, a unit of the Walt Disney Company tasked with dreaming up and developing future technologies for Disney parks, resorts, and cruises.
    Dr. Poupyrev's research focuses on inventing new interactive technologies for the seamless blending of digital and physical properties in devices, everyday objects, and living environments, a direction he broadly refers to as physical computing. His work spans a broad range of research fields, including haptic user interfaces, tangible interfaces, shape-changing and flexible computers, augmented and virtual reality, and spatial 3D interaction. His research has been widely published, has received awards at prestigious academic conferences, has been exhibited worldwide, and has been extensively covered in popular media such as CNN, BBC, the Financial Times, The New York Times, and New Scientist, to name a few. Results of his research have also been released on the market in various consumer products and applications.
    Prior to Disney, Ivan worked as a researcher at Sony Computer Science Laboratories and at the Advanced Telecommunications Research Institute International, both in Japan. He also had a stint as a Visiting Scientist at the Human Interface Technology Laboratory at the University of Washington while working on his Ph.D. dissertation at Hiroshima University, Japan.


Keynote #3

A Haptics Symposium Retrospective: 20 Years

1992 Inaugural Haptics Symposium Co-chairs
J. Edward Colgate - Breed University Professor of Design, Northwestern University, USA
Bernard (Dov) Adelstein - Scientist, NASA Ames Research Center, USA

March 7, 2012. 11:30 - 12:30
Session Chair: Tim Salcudean

The very first “Haptics Symposium” actually went by the name “Issues in the Development of Kinesthetic Displays of Teleoperation and Virtual Environments.”  The word “Haptic” didn’t make it into the name until the next year.  Not only was the most important word absent, but so were RFPs, journals, and commercial markets.  And yet, as we prepare for the 2012 symposium, haptics is a thriving and amazingly diverse field of endeavor.  In this talk, we’ll reflect on the origins of this field, and on its evolution over the past twenty years, as well as the evolution of the Haptics Symposium itself.  We hope to share with you some of the excitement that we’ve felt along the way, and that we continue to feel as we look toward the future of our field.

Speakers

Ed Colgate is the Breed University Professor of Design at Northwestern University. His research interests lie in the areas of haptic interfaces, telemanipulation, prosthetics, and physical human-robot interaction. With his colleague Michael Peshkin, Colgate is the inventor of a class of collaborative robots known as “cobots.” He is the Editor-in-Chief of the IEEE Transactions on Haptics. He also directs the Master of Science in Engineering Design and Innovation program, which combines graduate-level engineering courses with a broad exposure to human-centered design. In addition to his academic pursuits, Colgate is a founder of three companies: Stanley Cobotics, Kinea Design, and Tangible Haptics.

 

Bernard (Dov) Adelstein is a scientist in the Human Systems Integration Division at the NASA Ames Research Center in California where he leads the Human Vibration Laboratory and is a member of the Advanced Controls and Displays Group.  His current research centers on human visual and manual perception and performance under whole-body vibration in aerospace applications and on the perception and management of time delay in multisensory virtual environment and teleoperator systems.  He has a continuing interest in psychophysics and in the kinematic design of haptic interfaces.