CS Events

Faculty Candidate Talk

Enabling the Communication of Physical Experiences


Monday, February 21, 2022, 10:30am - 12:00pm


Speaker: Jun Nishida

Bio

Jun Nishida is a postdoctoral fellow at the University of Chicago's Computer Science Department, advised by Prof. Pedro Lopes. He received a Ph.D. in Human Informatics from the University of Tsukuba, Japan, in 2019. He is interested in exploring interaction techniques through which people can communicate their physical experiences to support each other, with applications in rehabilitation, education, and design. To this end, he engineers wearable interfaces that share bodily cues across people by means of electrical muscle stimulation, exoskeletons, and virtual/augmented reality systems, along with human-factors research. He has received the ACM UIST Best Paper Award, an ACM CHI Best Paper Honorable Mention Award, a Microsoft Research Asia Fellowship Award, and a Forbes 30 Under 30 Award, among others. He has worked as a Ph.D. fellow at Microsoft Research Asia and as a research assistant at Sony Computer Science Laboratories. | http://junnishida.net

Location : Via Zoom

Event Type: Faculty Candidate Talk

Abstract: While today's tools allow us to communicate effectively with others via video and text, they leave out other critical communication channels, such as bodily cues. These cues are important not only for face-to-face communication but also when communicating forces (muscle tension, movement, etc.), feelings, and emotions. Unfortunately, the current paradigm of user interfaces for communication between two users is rooted only in symbolic and graphical communication, leaving no room for these additional and critical modalities such as touch and forces. This is precisely the research question I tackle in my work: how can we also communicate our physical experiences across people? In this talk, I introduce how I have engineered wearable devices that allow physical experiences to be shared across users, such as between a physician and a patient, including people with neuromuscular diseases and even children. These custom-built user interfaces include virtual reality systems, exoskeletons, and interactive devices based on electrical muscle stimulation. I then investigated how we can extend this concept to support interactive activities, such as product design, through the communication of one's bodily cues. Lastly, I discuss how we can further explore the possibilities enabled by a user interface that communicates more than audio-visual cues, and I outline a roadmap for applying this approach in new territories, such as enabling more empathic communication.

Organization

Rutgers University School of Arts and Sciences

Contact Host: Yongfeng Zhang

Join Zoom Meeting
https://rutgers.zoom.us/j/99885881214?pwd=aTJacWhoS3l2QW14OE5NVzVYdU1Ydz09

Meeting ID: 998 8588 1214
Password: 628295
One tap mobile
+13017158592,,99885881214# US (Washington DC)
+13126266799,,99885881214# US (Chicago)

Join By Phone
+1 301 715 8592 US (Washington DC)
+1 312 626 6799 US (Chicago)
+1 646 558 8656 US (New York)
+1 253 215 8782 US (Tacoma)
+1 346 248 7799 US (Houston)
+1 669 900 9128 US (San Jose)
Meeting ID: 998 8588 1214
Find your local number: https://rutgers.zoom.us/u/anHwNY38A

Join by Skype for Business
https://rutgers.zoom.us/skype/99885881214