Human Interfaces


The WHaT: A Wireless Haptic Texture Sensor

D. K. Pai and P. R. Rizun, ``The WHaT: A Wireless Haptic Texture Sensor,'' in Proceedings of the Eleventh Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Los Angeles, March 22-23, 2003. [pdf 700k]


The Tango whole hand input device

Coming soon...



Multimodal Interfaces

Multimodal interfaces must have their different modalities (graphics, sound, haptics) carefully coordinated so that, for instance, when the user hits an object in a virtual environment, the synthesized visual, auditory, and haptic stimuli are perceived as a single atomic event. We are exploring both the design of such multimodal interfaces and the perceptual constraints involved (inter-modal latency, etc.). An example is the AHI, an Audio/Haptic Interface developed by DiFilippo (a former M.Sc. student), which uses a planar haptic interface based on Hayward's Pantograph together with our own real-time control hardware and software. The device can render contact sounds and forces synchronized to within 1 ms.

D. DiFilippo and D. K. Pai, ``The AHI: An Audio and Haptic Interface for Contact Interactions,'' in Proceedings of UIST'00 (13th Annual ACM Symposium on User Interface Software and Technology), San Diego, California, November 5-8, 2000. [pdf 360k]
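
To make the timing requirement concrete, here is a minimal, illustrative sketch (not the AHI implementation) of a fixed-rate control loop in which the contact sound is triggered in the same 1 kHz tick that the first contact force is commanded, so the audio and haptic onsets stay within one control period of each other. The wall position, stiffness, and probe trajectory are made up for the example.

import numpy as np

HAPTIC_RATE = 1000          # control-loop rate in Hz (assumed)
WALL_POS = 0.0              # virtual wall at x = 0 (m)
STIFFNESS = 2000.0          # contact spring stiffness (N/m), illustrative

def haptic_audio_loop(positions):
    """Process a recorded probe trajectory (one sample per control tick).

    Returns the force commanded at each tick and the tick indices at which
    a contact sound would be triggered.  Scheduling the sound in the same
    tick as the first contact force is what keeps the audio/haptic skew
    within a single 1 ms control period.
    """
    forces, sound_onsets = [], []
    in_contact = False
    for tick, x in enumerate(positions):
        penetration = WALL_POS - x            # > 0 when the probe is inside the wall
        if penetration > 0.0:
            force = STIFFNESS * penetration   # simple penalty (spring) force
            if not in_contact:
                sound_onsets.append(tick)     # new impact: trigger the sound now
                in_contact = True
        else:
            force = 0.0
            in_contact = False
        forces.append(force)
    return np.array(forces), sound_onsets

if __name__ == "__main__":
    t = np.arange(0, 0.2, 1.0 / HAPTIC_RATE)
    probe = 0.01 * np.cos(2 * np.pi * 5.0 * t)   # probe tapping the wall at 5 Hz
    forces, onsets = haptic_audio_loop(probe)
    print("contact sounds triggered at ticks:", onsets)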


Haptic Rendering

Haptic interfaces are devices that engage the kinesthetic sense organs and provide a richer experience of physical contact. I am interested in algorithms for "rendering" contact texture and the forces of deformation. This is demanding because haptic simulation has to be fast enough for stable, low-latency haptic rendering, yet must still perform collision detection with complicated geometric objects and compute the resulting contact forces.

D. James and D. K. Pai, ``A Unified Treatment of Elastostatic and Rigid Contact Simulation for Real Time Haptics,'' accepted for publication in Haptics-e, the Electronic Journal of Haptics Research, January 2001. (revised version coming soon)
D. K. Pai and L.-M. Reissell, ``Haptic Interaction with Multiresolution Image Curves,'' Computers and Graphics, Vol. 21, No. 4, pp. 405-411, July/August 1997. [pdf 360k]
Y. Shi and D. K. Pai, ``Haptic Display of Visual Images,'' in Proceedings of IEEE Virtual Reality Annual International Symposium (VRAIS '97), Albuquerque, NM, pp. 188-191, March 1997. [ps.gz 540k]
J. Siira and D. K. Pai, ``Haptic Textures -- A Stochastic Approach,'' in Proceedings of the IEEE International Conference on Robotics and Automation, Minneapolis, pp. 557-562, April 1996. [ps.gz 200k]
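
As an illustration of the kind of per-tick computation involved, the following sketch computes a penalty-based normal force plus a zero-mean random tangential perturbation while sliding, in the spirit of the stochastic texture approach cited above. The stiffness, noise level, and thresholds are placeholders, not values from the papers.

import numpy as np

rng = np.random.default_rng(0)

STIFFNESS = 1500.0      # normal contact stiffness (N/m), placeholder
TEXTURE_STD = 0.3       # std. dev. of the texture force (N), placeholder

def contact_force(penetration_depth, lateral_speed):
    """Return (normal, tangential) force for one haptic update.

    Normal force: penalty spring on the penetration depth (assumes collision
    detection has already produced the depth).  Tangential force: zero-mean
    Gaussian perturbation applied only while in contact and sliding, which is
    the basic idea behind stochastic haptic textures.
    """
    if penetration_depth <= 0.0:
        return 0.0, 0.0
    normal = STIFFNESS * penetration_depth
    sliding = abs(lateral_speed) > 1e-4
    tangential = rng.normal(0.0, TEXTURE_STD) if sliding else 0.0
    return normal, tangential

if __name__ == "__main__":
    # Probe sliding over a flat surface at 5 cm/s with 2 mm penetration.
    for _ in range(5):
        print(contact_force(0.002, lateral_speed=0.05))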



Auditory Perception of Material

Since our sound models are designed to provide sensory feedback to humans, it is important to evaluate their perceptual relevance. During my sabbatical leave at CMU, I began a collaboration with Prof. R. Klatzky to evaluate our sound synthesis algorithms with human subjects. We found that a shape-invariant auditory-decay parameter is a powerful determinant of the perceived material of an object. Our results also provide an empirical metric of the perceived similarity of contact sounds, as a function of decay and fundamental frequency.

R. L. Klatzky, D. K. Pai, and E. P. Krotkov, ``Perception of Material from Contact Sounds,'' Presence: Teleoperators and Virtual Environments, Vol. 9, No. 4, pp. 399-410, August 2000, The MIT Press. [pdf 500k]
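
The role of the decay parameter is easiest to see in a toy synthesizer: the sketch below builds a struck-object sound as a sum of exponentially decaying sinusoids, where shortening the decay time constant shifts the percept toward wood-like materials and lengthening it toward metal or glass. The partial structure, amplitudes, and decay values are arbitrary illustrations, not the stimuli used in the study.

import numpy as np

SAMPLE_RATE = 44100  # audio sample rate (Hz)

def contact_sound(fundamental_hz, decay_s, duration_s=1.0, n_partials=3):
    """Synthesize a simple struck-object sound as a sum of exponentially
    decaying sinusoids.

    decay_s is the time constant of the amplitude envelope; short decays
    tend to sound like wood or plastic, long decays like metal or glass.
    The harmonic partials and 1/k amplitudes are illustrative choices.
    """
    t = np.arange(int(duration_s * SAMPLE_RATE)) / SAMPLE_RATE
    sound = np.zeros_like(t)
    for k in range(1, n_partials + 1):
        amp = 1.0 / k                      # weaker higher partials
        freq = k * fundamental_hz          # harmonic partial frequencies
        sound += amp * np.exp(-t / decay_s) * np.sin(2.0 * np.pi * freq * t)
    return sound / np.max(np.abs(sound))

if __name__ == "__main__":
    wood_like = contact_sound(400.0, decay_s=0.05)   # fast decay
    metal_like = contact_sound(400.0, decay_s=0.8)   # slow decay
    print(wood_like.shape, metal_like.shape)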



(c) Dinesh Pai 2003