Motion parallax, the visual cue produced by translations of the observer, is crucial for the perception of depth and self-motion.
Therefore, tracking the observer's viewpoint has become essential in immersive virtual reality (VR) systems
(cylindrical screens, CAVEs, head-mounted displays) used, e.g., in the automotive industry (style reviews, architectural design,
ergonomics studies) or in scientific studies of visual perception.
The perception of a stable and rigid world requires that this visual cue be coherent with other, extra-retinal (e.g.
vestibular, kinesthetic) cues signaling ego-motion. Although world stability is never questioned in the real world, rendering
a head-coupled viewpoint in VR can lead to the illusory perception of an unstable environment, unless a
non-unity scale factor is applied to recorded head movements. Moreover, cylindrical screens are usually restricted to static
observers because of the image distortions that arise when rendering for viewpoints away from the sweet spot.
We developed a technique to compensate these non-linear visual distortions in real time, in an industrial VR setup based
on a cylindrical-screen projection system.
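The paper does not detail the compensation algorithm, but the source of the distortion can be illustrated geometrically: the point where a viewing ray strikes a cylindrical screen depends non-linearly on the eye position, so images rendered for the sweet spot are wrong for any other viewpoint. The sketch below (all names and the vertical-axis cylinder model are assumptions, not the authors' implementation) intersects the ray from an arbitrary eye position through a scene point with a cylinder of radius `radius`:

```python
import math

def ray_cylinder_hit(eye, point, radius):
    """Intersect the ray from `eye` through `point` with a vertical
    cylinder x^2 + z^2 = radius^2 centered on the y axis (a simplified
    model of a cylindrical screen). Returns the 3-D hit position."""
    ex, ey, ez = eye
    dx, dy, dz = (point[i] - eye[i] for i in range(3))
    # Quadratic a t^2 + b t + c = 0 in the horizontal (x, z) plane
    a = dx * dx + dz * dz
    b = 2.0 * (ex * dx + ez * dz)
    c = ex * ex + ez * ez - radius * radius
    disc = b * b - 4.0 * a * c
    if a == 0.0 or disc < 0.0:
        return None  # ray parallel to the axis, or misses the screen
    t = (-b + math.sqrt(disc)) / (2.0 * a)  # far root: eye is inside the cylinder
    return (ex + t * dx, ey + t * dy, ez + t * dz)

# The same scene point lands on different screen positions for a
# centered versus a laterally displaced eye, which is why head-coupled
# rendering on cylindrical screens needs per-viewpoint correction.
centered = ray_cylinder_hit((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 2.0)
shifted = ray_cylinder_hit((0.5, 0.0, 0.0), (0.0, 0.0, 1.0), 2.0)
```

Because the mapping is non-linear in the eye position, the correction must be recomputed each frame from the tracked head, which is what makes a real-time implementation non-trivial.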
Additionally, to evaluate the amount of discrepancy between visual and extra-retinal
cues that is tolerated without perceptual distortions, a "motion parallax gain" between the velocity of the observer's head and that of the virtual camera was
introduced into this system. The influence of this artificial gain on the postural stability of free-standing
participants was measured. Results indicate that gains below unity significantly alter postural control, whereas the influence of
higher gains remains limited, suggesting a certain tolerance of observers to these conditions. Parallax gain amplification
is therefore proposed as a possible way to offer users of immersive virtual reality systems a wider exploration
of space.
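For a constant gain, scaling head velocity is equivalent to scaling head displacement about a fixed reference point, so the manipulation can be sketched in a few lines. The function name and the reference-point formulation below are illustrative assumptions, not the authors' implementation:

```python
def apply_parallax_gain(head_pos, ref_pos, gain):
    """Map a tracked head position to a virtual camera position by
    scaling the head displacement about a reference point `ref_pos`.
    gain = 1.0 reproduces natural motion parallax; gain > 1.0 amplifies
    the camera translation relative to the real head movement, letting
    a user visually explore a larger volume than they physically cover."""
    return tuple(r + gain * (h - r) for h, r in zip(head_pos, ref_pos))

# With a gain of 2, a 10 cm lateral head shift moves the virtual
# camera 20 cm from the reference point.
cam = apply_parallax_gain((0.1, 0.0, 0.0), (0.0, 0.0, 0.0), 2.0)
```

Under this formulation, the study's finding reads as: gains below 1.0 (attenuated parallax) disturb postural control, while amplified gains are comparatively well tolerated.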