Wenqiang (Winston) Chen – Harnessing Vibrations for Intelligent Interactions in the Real World

Thursday, March 31
11:00 am - 12:12 pm

34-401A (Grier Room A)

Abstract:
Vibrations convey rich information about interactions between people, objects, and environments. Exploiting ubiquitous sensors and machine learning, I have developed systems to sense, understand, and broker universal, low-volume vibrational interactions between people and the physical world.
In this presentation, I will focus on using subtle human-body vibrations to translate freestyle finger writing and typing into interactions with wearable computing devices such as smartwatches and smart glasses. The challenges addressed include collecting and labeling a large vibration dataset; filtering human-activity noise out of finger-typing and finger-writing vibration signals through signal processing; designing a novel adversarial neural network to overcome human variations in typing strength, writing style, hand shape, and smartwatch wrist position; and adopting a recurrent neural aligner to enable both continuous and discrete finger-movement recognition.
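As a rough illustration of the signal-processing step mentioned above, the sketch below band-pass filters a wrist accelerometer trace to attenuate slow body motion before any learning-based recognition. The sampling rate, passband, and synthetic data are assumptions for illustration only, not parameters from the speaker's actual systems.

```python
# Hypothetical sketch: isolate finger-typing vibrations from coarse body motion
# in a wrist accelerometer trace using a Butterworth band-pass filter.
# The 500 Hz sampling rate and 20-200 Hz passband are assumed values.
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_vibration(signal, fs=500.0, low_hz=20.0, high_hz=200.0, order=4):
    """Return the signal restricted to the assumed finger-vibration band."""
    nyquist = fs / 2.0
    b, a = butter(order, [low_hz / nyquist, high_hz / nyquist], btype="bandpass")
    return filtfilt(b, a, signal)  # zero-phase filtering preserves event timing

if __name__ == "__main__":
    fs = 500.0
    t = np.arange(0, 2.0, 1.0 / fs)
    # Synthetic trace: slow arm motion (2 Hz) plus a burst of typing vibration (80 Hz).
    trace = 1.0 * np.sin(2 * np.pi * 2 * t) + 0.2 * np.sin(2 * np.pi * 80 * t)
    filtered = bandpass_vibration(trace, fs=fs)
    print("raw RMS:", np.sqrt(np.mean(trace ** 2)))
    print("filtered RMS:", np.sqrt(np.mean(filtered ** 2)))
```

In practice the filtered signal would feed the learned recognition models; this sketch only shows how low-frequency activity noise can be separated from the higher-frequency vibration band of interest.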
I will also briefly discuss capturing vibrations from buildings, robots, and environments for ubiquitous computing applications such as the metaverse, robotic automation, smart health, smart homes, and security and privacy. The mission of this research is to integrate human, cyber, and physical experiences into an intelligent world of vibrational interactions.

Speaker Biography:
Wenqiang (Winston) Chen is a Ph.D. candidate at the University of Virginia working with Professor John Stankovic. His research lies at the intersection of Cyber-Physical Systems (CPS), ubiquitous and mobile computing, and human-computer interaction (HCI). In particular, he specializes in developing Vibration Intelligence (VibInt) systems that perceive and infer information from human bodies, robots, and environments through vibrations. VibInt has been proposed to advance a wide variety of research areas, such as wearable interactions, robotics, smart health, smart homes, and privacy and security. He has published his research in top conferences and journals (e.g., MobiCom, UbiComp, and IEEE Transactions on Mobile Computing), obtained five patents, and won the IEEE SECON 2018 Best Paper Award and the ACM SenSys 2020 Best Demo Award. Winston is also a co-founder of VibInt AI, a startup building wearable devices with VibInt technologies, and his research IP has been used in thousands of commodity devices. You can find out more about him at https://www.cs.virginia.edu/~wc5qd/
