Thesis Info
- LABS ID
- 00363
- Thesis Title
- I Move In Decades
- Author
- Ellen Pearlman
- E-mail
- ellenluminescense AT gmail.com
- 2nd Author
- 3rd Author
- Degree
- Master of Arts
- Year
- 2011
- Number of Pages
- 120
- University
- University of Calgary
- Thesis Supervisor
- Dr. Kenneth Fields
- Supervisor e-mail
- Other Supervisor(s)
- Dr. James Parker
- Language(s) of Thesis
- English
- Department / Discipline
- Interdisciplinary Graduate Program
- Copyright Ownership
- Ellen Pearlman
- Languages Familiar to Author
- English
- URL where full thesis can be found
- Keywords
- Telematic art, motion capture, data mapping, World Trade Towers, 9/11, visual data, dance
- Abstract: 200-500 words
- I Move In Decades is a work of visual art, gesture, movement, and sonic representation translated into hybridized data combinations and communicated over a high-speed telematic research network. Data mapping creates interactive productions and responsive environments across vast distances, and data describing human motion impacts transdisciplinary processes. This thesis describes a series of experiments leading to a work that uses physical gestural data as a compositional indicator in telematic productions. Theoretical and artistic concepts leading to the interpretation and evaluation of this data in the context of a complex international networked performance are discussed. The technologies of motion capture, data mapping, and visual effects produced in both VDMX and MAX/MSP are shown to be experimentally problematic in multimodal arts practices. Data describing human motion can disrupt creative practices in terms of new combinations, yet it does significantly impact transdisciplinary processes. The technical part of the investigation explored the use of a 12-camera EvArt infra-red motion capture system and body mapping. I translated body motion into triggers for visual effects by recording data from three locations on a dancer's body: the wrist, ankle, and elbow joints. Tracking the range of these movements on a motion capture grid, the X, Y, and Z positions of these joints moving in space were recorded. The Y position was set to trigger visual effects when it reached specific values within a defined range. This data was sent over the network and transformed by a C++ program that converted it into OSC messages. The OSC data was imported into a MAX/MSP/Jitter patch to trigger visual effects in a remixed video. An alternate method sent real-time effects over the network using VDMX.
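
The abstract describes, but does not include, the C++ bridge that converts joint data into OSC messages. The sketch below is a minimal illustration of that idea only, not the thesis code: it assumes the oscpack library, and the host, port, OSC addresses (/dancer/wrist/y, /dancer/wrist/trigger), and threshold values are hypothetical placeholders. It forwards a joint's Y position continuously and fires a discrete trigger message when Y enters a set range, the kind of message a MAX/MSP/Jitter patch could map to a visual effect.

// Minimal sketch of a motion-data-to-OSC bridge (not the thesis code).
// Assumes the oscpack library; host, port, OSC addresses, and thresholds
// are hypothetical placeholders.
#include <iostream>

#include "osc/OscOutboundPacketStream.h"
#include "ip/IpEndpointName.h"
#include "ip/UdpSocket.h"

static const char* kHost = "127.0.0.1"; // machine running the Max patch (placeholder)
static const int   kPort = 7400;        // OSC port the patch listens on (placeholder)
static const int   kBufferSize = 1024;

// Send one OSC message carrying a single float value.
void sendFloat(UdpTransmitSocket& socket, const char* address, float value)
{
    char buffer[kBufferSize];
    osc::OutboundPacketStream p(buffer, kBufferSize);
    p << osc::BeginMessage(address) << value << osc::EndMessage;
    socket.Send(p.Data(), p.Size());
}

int main()
{
    UdpTransmitSocket socket(IpEndpointName(kHost, kPort));

    // Hypothetical trigger range for the Y coordinate on the capture grid.
    const float kLowY  = 0.9f;
    const float kHighY = 1.4f;

    // Stand-in for a stream of wrist Y positions from the motion capture system.
    const float wristY[] = { 0.42f, 0.77f, 1.05f, 1.31f, 1.62f };

    for (float y : wristY)
    {
        // Forward the raw coordinate so the patch can also use it continuously.
        sendFloat(socket, "/dancer/wrist/y", y);

        // When Y enters the trigger range, fire a discrete trigger message
        // that the receiving patch maps to a visual effect.
        if (y >= kLowY && y <= kHighY)
        {
            sendFloat(socket, "/dancer/wrist/trigger", 1.0f);
            std::cout << "trigger fired at y = " << y << std::endl;
        }
    }
    return 0;
}

On the receiving side, a MAX/MSP/Jitter patch could, for example, take these packets in with a udpreceive object and route the /dancer/wrist/trigger messages to whichever video effect is active; the exact patch structure used in the thesis is not specified in the abstract.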