1 Introduction
This paper presents research into the
use of motion tracking technology for real-time, embodied telepresence
and collaboration. The central question underlying it
is, "In what ways can telepresence and collaboration be
enhanced by motion tracking technology in performances and installations?"
Preliminary findings suggest
that motion tracking technology makes it possible for multiple
users to manipulate not only data objects like images, video,
sound, and light but also hardware and equipment, such as computers,
robotic lights, and projectors, with their bodies in a 3D space
across a network. The implications of such enhanced telepresence
may be of interest to those working on digital media projects
where hardware, software, and peripherals must be controlled
in real time by teams working together at a distance, or where
physical computing research is undertaken.
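To make the kind of control just described concrete, the following minimal sketch, in Python, shows how one user's tracked 3D position might be broadcast over a network so that remote software or hardware controllers can respond to it. The message format, address, and mappings are illustrative assumptions, not details of GAMS itself.

# Illustrative sketch only: a tracked 3D position is broadcast as a
# simple UDP message. The address, port, and message format are
# hypothetical, not part of GAMS.
import json
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_position(user_id, x, y, z, host="192.0.2.10", port=9000):
    """Broadcast one user's tracked position so that remote software
    (media players, robotic-light controllers) can respond to it."""
    message = json.dumps({"user": user_id, "pos": [x, y, z]})
    sock.sendto(message.encode("utf-8"), (host, port))

# A receiver might map a performer's x/y to a robotic light's pan and
# tilt, and height (z) to a projector's brightness.
send_position("performer-1", 2.4, 1.1, 0.8)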
The paper begins with a brief explanation
of the evolution of the motion tracking technology called the
GAMS system within the context of motion sensor technologies,
providing along the way examples of projects undertaken at
each stage of its development; it then moves to a detailed
discussion of GAMS' application to two networked performances
where embodied telepresence was used to enhance collaboration.
2 Motion Tracking Technology and the
Evolution of the GAMS System
Motion tracking technology has
been used for surveillance and training, virtual reality experiences,
smart rooms, human-computer interfaces, and health and therapy
programs (O'Sullivan and Igoe xxvii-xxix). In terms of its use
for projects involving real-time interaction in the production
of graphical and media art, it was pioneered by David Rokeby, whose Very Nervous System (VNS) was developed between
1986 and 1990. That system, described as a "third generation
of interactive sound installations," utilized "video
cameras, image processors, computers, synthesizers, and a sound
system to create a space in which the movements of one's body
create sound and/or music" (Rokeby). In effect, the system
made collaboration possible between installation and performer,
resulting in a "state of mutual influence," a condition
Rokeby referred to as "interaction" (Wilson 731). Thus,
at this stage of its development, motion tracking, or motion sensor
technology as it is sometimes called, consisted of non-networked
experiences focused on a single user interacting with computing and
media devices.
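This source does not document VNS's internals, but the description above suggests the general technique: compare successive camera frames and treat the amount of change in each region of the image as a control signal for sound. The following Python sketch illustrates that idea; the grid size and the mapping to sound are assumptions, not Rokeby's implementation.

# Illustrative frame-differencing sketch, not Rokeby's actual code.
# Successive grayscale camera frames are compared, and the mean
# absolute change within each cell of a grid becomes an "activity"
# value that could drive a synthesizer parameter.
import numpy as np

def region_activity(prev_frame, curr_frame, grid=(4, 4)):
    """Return a grid of motion-activity values for two equal-size
    2D grayscale frames (pixel values 0-255)."""
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int))
    h, w = diff.shape
    gh, gw = h // grid[0], w // grid[1]
    return [[diff[r * gh:(r + 1) * gh, c * gw:(c + 1) * gw].mean()
             for c in range(grid[1])]
            for r in range(grid[0])]

# Each cell's activity might be mapped to the loudness or pitch of one
# voice, so that movement in different parts of the space produces
# different sounds.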
In
the late 1980s, influenced in part by Rokeby, engineer-artist
Will Bauer of Acoustic Positioning Research, Inc. (later known as
APR) created the first iteration of his Gesture and Media System,
calling it at the time the Grid Activated Sonar Production, or
GASP.
Figure 1. The ultrasonic system
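The paper does not detail GASP's positioning math here, but ultrasonic systems of this kind commonly locate an emitter worn by the performer through trilateration: the sound's time of flight to several fixed receivers is converted into distances, and the 3D point consistent with those distances is solved for. The Python sketch below works under those assumptions; the receiver layout and constants are illustrative, not GASP specifics.

# Illustrative trilateration sketch: estimate an emitter's 3D position
# from distances to fixed receivers via linear least squares. Not a
# description of GASP's actual implementation.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def trilaterate(receivers, times_of_flight):
    """receivers: n >= 4 known (x, y, z) positions; times_of_flight:
    seconds from the emitter's ping to each receiver."""
    r = np.asarray(receivers, dtype=float)
    d = SPEED_OF_SOUND * np.asarray(times_of_flight, dtype=float)
    # Subtracting the first range equation ||p - r_i||^2 = d_i^2 from
    # the others linearizes the system into A p = b.
    A = 2.0 * (r[1:] - r[0])
    b = (d[0] ** 2 - d[1:] ** 2) + np.sum(r[1:] ** 2, axis=1) - np.sum(r[0] ** 2)
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Example: four receivers at known points in the performance space;
# the times of flight here are made-up, noisy measurements.
print(trilaterate([(0, 0, 0), (5, 0, 0), (0, 5, 0), (0, 0, 3)],
                  [0.012, 0.010, 0.011, 0.009]))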