POV AND PERCEPTUAL FLOW

Gibson used film to study self-motion (as part of a research program on pilot training) and identified motion perspective as a relevant perceptual variable. Gibson (1954, p. 321) states that the analysis of motion perspective for a large portion of the visual field “suggests that the impression of forward movement of the observer can be produced optically without any contribution from the vestibular or muscle sense.” He reports that observers of a moving picture of “a landing field ahead of an airplane” reported an experience of locomotion along a glide path toward a visible spot on the ground, and that even more compelling experiences of locomotion can be induced using the panoramic motion picture. We can experience such locomotion in flight simulators used for pilot training and testing, in first-person flight simulation video games and some films, as well as in other VR and VG applications. The 1968 film 2001: A Space Odyssey² pioneered the technical effects used to create the “Star Gate,” streaming, whirling lights representing space travel.

Technological and cinematic advances allow us to better imitate the perceptual experience of the natural environment. The subjective or point-of-view (POV) camera increases interface transparency because it simulates the spatial component of direct perception, specifying where the viewer is within a spatial array, and POV Self-Motion (POV-SM), common in VR simulations, VGs, and some films, provides the navigational component, the perceptual flow of locomotion (Preston, 1998). With its increased navigational realism, simulating a path of view, POV-SM is associated with the experience of immersion or presence in mediated environments.
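The geometry of this perceptual flow can be made concrete with a short sketch. The Python code below is an illustration only, not drawn from Gibson or Preston; the camera model, speed, and point layout are assumptions chosen to show the characteristic pattern of forward self-motion: image points stream radially away from a single focus of expansion, and nearer points stream faster.

```python
# Minimal sketch (illustrative assumptions, not from the chapter) of the optic-flow
# pattern produced by pure forward self-motion toward the scene.
import numpy as np

def optic_flow(points_xyz, speed, focal_length=1.0):
    """Image positions and flow vectors for pure forward translation.

    points_xyz : (N, 3) array of scene points in camera coordinates
                 (X right, Y up, Z depth straight ahead).
    speed      : forward speed of the observer (depth units per second).
    """
    X, Y, Z = points_xyz.T
    x = focal_length * X / Z          # perspective projection
    y = focal_length * Y / Z
    # Forward motion shrinks depth (dZ/dt = -speed), so each image point drifts
    # away from the focus of expansion at rate x*speed/Z, y*speed/Z:
    u = x * speed / Z
    v = y * speed / Z
    return np.stack([x, y], axis=1), np.stack([u, v], axis=1)

# Example: points on a ground plane ahead of the observer.
ground = np.array([[dx, -1.6, z] for dx in (-2.0, 0.0, 2.0) for z in (5.0, 10.0, 20.0)])
img, flow = optic_flow(ground, speed=3.0)
for (x, y), (u, v) in zip(img, flow):
    print(f"image ({x:+.2f}, {y:+.2f})  flow ({u:+.3f}, {v:+.3f})")
```

In the printed output, the flow vectors all point outward from the image center and are largest for the nearest, most eccentric points, which is the motion-perspective gradient Gibson identified.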

² The 1968 film 2001: A Space Odyssey (screenplay by Stanley Kubrick & Arthur C. Clarke, who also wrote the book). More information is available at the Internet Movie Database (http://www.imdb.com/).


Gibson argues that affordance meanings can be perceived indirectly, via meaningful still and moving pictures, in a way that is part historical (i.e., experience) and part cultural as well as ecological. Our social and individual processes of cognition and awareness are thoroughly mixed; mediated apprehension gets combined and fused with direct apprehension (Gibson, 1976, unpublished, cited by Reed, 1988, p. 307). The ecological self, the self that inhabits its environment, brings together directly and indirectly apprehended information. Therefore, experiences in one’s real-life environment and in one’s simulated environment are expected to affect one another.

In a study focusing on virtual action space (Preston, 2005), participants experienced simulated environments from the “subjective point-of-view,” that is, the viewer saw what would be seen if present in and moving about the space (POV-SM). Therefore, it was expected that factors relevant to real-life action space as well as those occurring in virtual action space would affect participant interpretation and responses.

Eighty-six university students (45 men, 41 women, mean age 20 years) volunteered to view, on a large screen (7.5 feet wide), 10 short POV-SM clips (race car, railroad, sky diver, airplane, luge/sled, dune buggy, wind surfing, stunt plane, fighter plane, and roller coaster). After each clip, participants indicated their emotional responses and degree of motion symptoms. Participants also completed a questionnaire and pre- and post-viewing tests measuring balance and perceptual and spatial abilities. The questionnaire yielded scores on real-life variables including enjoyment of rides, type and amount of sports participation, and typical real-life symptoms (nausea, vision difficulty, balance difficulty). Greater sports participation was associated with more enjoyment of the clips and better post-test balance. It was also related to higher post-test spatial scores. Participants having real-life nausea rated the clips as more arousing and dominant and reported stronger symptoms during viewing. Participants having real-life difficulty with balance reported higher emotional ratings and did more poorly on the perceptual and spatial post-tests. Participants reporting real-life nausea, as well as vision and balance difficulties, performed poorly on the spatial post-test.

The indices of sports participation, real-life symptoms (difficulties with nausea, vision, and balance), and enjoyment of rides taken together provided information about participants’ ability to establish and maintain basic orientation to events in the natural environment, while the post-tests indexed the ability to reorient following viewing of POV-SM media. Participants who are better able to maintain basic orientation in the natural environment may interpret motion symptoms during viewing as a normal and expected part of the experience of motion. They enjoyed the viewing experience more and also performed better on the post-tests, indicating that they were better able to orient to the virtual event and to reorient to real life.

For Gibson, the perceptual systems are functional, actively obtaining information about both the self and the environment. We continuously adjust to changing stimulation. To function in the natural environment, to perceive and to act, the individual must establish and maintain orientation to the environment. Lack of orientation interferes with direct perceptual exploration. We not only actively maintain orientation when experiencing changing optical information, but we also reorient as we move to and from natural and mediated environments. Mou et al. (2004) conducted a study of spatial updating by participants in augmented reality (AR) environments. These are systems that blend computer-generated virtual objects or environments with real environments. Participants performed tasks in mobile AR systems having either an environment-stabilized (ESF) or a body-stabilized (BSF) frame of reference. In the ESF condition (objects remained in place when the person moved, as is typical in the natural environment), participants were able to update the location of objects to perform a spatial task when they rotated their body. The findings also indicated that spatial memory is orientation dependent. In the BSF condition, objects maintained their position relative to the participant’s body (e.g., an object directly in front of the person remained directly in front when the person rotated 90°). Naive users initially used an environment-stabilized frame to perform the spatial task, but after just 2 minutes of exposure, their representation changed to a body-stabilized frame. Our “basic orienting system” incorporates all the perceptual and action systems and enables us to maintain our orientation to all the forces and surfaces around us. This study demonstrates how readily we can do this in virtual space.
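The distinction between the two frames can be illustrated with a small sketch; the coordinate conventions and the single test object below are illustrative assumptions, not Mou et al.’s procedure or stimuli. In an environment-stabilized frame an object keeps its world position, so its body-relative bearing changes when the body turns; in a body-stabilized frame the object is carried along with the body, so its egocentric position never changes.

```python
# Minimal sketch (illustrative assumptions) of environment-stabilized (ESF) vs.
# body-stabilized (BSF) frames of reference for a viewer rotating in place.
import numpy as np

def egocentric(world_xy, heading_rad):
    """World coordinates -> body-centred coordinates for a viewer at the origin."""
    c, s = np.cos(-heading_rad), np.sin(-heading_rad)
    rot = np.array([[c, -s],
                    [s,  c]])
    return rot @ world_xy

obj_world = np.array([0.0, 1.0])        # one metre straight ahead at heading 0

for heading_deg in (0, 90):
    h = np.radians(heading_deg)
    esf = egocentric(obj_world, h)      # ESF: object stays put in the world
    bsf = obj_world                     # BSF: object is carried with the body
    print(f"heading {heading_deg:3d} deg  ESF egocentric {np.round(esf, 2)}  "
          f"BSF egocentric {np.round(bsf, 2)}")
```

After a 90° turn, the ESF object that began straight ahead now lies off to the viewer’s side and must be spatially updated, whereas the BSF object remains directly in front, matching the two conditions described above.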

Reed (1988) states that wherever our eyes look, the world ordinarily does not tilt, swing, or distort; rather, we see an upright world where we are moving or tilting. When moving, we use the wide-angle visual field to help us navigate through a terrain because horizontal cues, linear perspective, and motion flow provide visual cues important to maintaining spatial orientation and posture control (Gibson, 1966, 1979; Previc, 1998). We continuously detect environmental information and changes in that information to update our spatial awareness. Reed (1988) argues that the goal of perception is to obtain clear information out of a sea of potential stimulation; thus perceptual activity is one of selection. In an unfamiliar space, we perceptually explore to identify relevant information to support our actions. In both natural and mediated environments, maintaining our basic orientation (using continuous spatial updating) facilitates good information selection, on which adequate or successful perception depends.