Human physical interaction is inherently multisensory. The Haptic, Auditory, and Visual Environment (HAVEN) is a novel physical space that supports multisensory human interaction and measurement. It consists of a specially constructed and instrumented chamber in which humans can interact with other humans, with physical objects, and with computer simulations. The HAVEN provides multisensory displays (including 3D projection displays, multichannel auditory displays, and haptic devices) as well as sensors for interactive measurement (including a Vicon motion capture system and force sensors). This work was funded by the NSF.