Artist Glyph Graves has once again pushed the boundary between virtual and physical realities. He programmed and scripted his own hack for the Kinect to map his physical body and its movements onto virtual objects in Second Life. Glyph's earlier piece, "Faceted Existence" (see: http://www.youtube.com/watch?v=9q5N1X5Cs30), used 2500 spherical prims to render his face, which loomed large above avatars in the virtual landscape. It was revolutionary as the first use of the Kinect to make art that others could watch, rather than simply using the device as an alternative controller for an avatar.
"Disembodiment," the new performance piece that Glyph debuted on Sunday, December 4, 2011, at InterACT!, the Linden Endowment for the Arts exhibition, extends his previous experimentation: it represents not only his face but now a whole body in spheres. Here he discusses how and why:
Lori Landay (L1Aura Loire): Can you tell us how you use the Kinect controller to manipulate objects in Second Life?
Glyph Graves: For those that don't know what a Kinect is, it's a sensor device that can capture depth and shape data using a combination of infrared and visible light. I take the data and stream it into Second Life, then reconstruct the shapes by positioning prims.
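To give a rough sense of that reconstruction step, here is a minimal illustrative sketch in Python. It is not Glyph's actual code (his pipeline uses C#, PHP and LSL), and every name in it is hypothetical; it just shows one way sampled depth readings could become 3D positions for placing prims.

```python
# Illustrative sketch only (hypothetical names, not the artist's code):
# turn a small depth frame into 3D points that could drive prim placement.

def depth_frame_to_positions(depth, scale=0.1, step=1):
    """Map each sampled depth pixel (row, col, depth) to an (x, y, z) point.

    depth: 2D list of depth readings (e.g. millimetres from the sensor).
    scale: metres per pixel in the reconstructed shape.
    step:  sampling stride, to keep the prim count manageable.
    """
    points = []
    for row in range(0, len(depth), step):
        for col in range(0, len(depth[row]), step):
            d = depth[row][col]
            if d > 0:  # 0 means no reading at this pixel
                points.append((col * scale, row * scale, d * scale))
    return points

frame = [
    [0, 800, 800],
    [800, 810, 0],
]
print(depth_frame_to_positions(frame))  # four valid pixels -> four points
```

The `step` stride matters in practice: a full 640x480 depth frame has far more pixels than a Second Life linkset can hold prims, so some such downsampling is unavoidable.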
For all of us technical geeks, how about some more detailed specs? The rest of you can take a catnap and come back for the answer to the next question!
I first made this in August right after the face piece but never got round to showing it except to a few people.
I used the Microsoft Kinect drivers on a PC. Using the C# programming language, I can access the built-in functions that pull out skeletal data ... that is, the screen positions of the body's joints, like hips, elbows, etc.
I extract this data using my own functions and server code on my PC. From there it goes to my off-world server, then to the receiving program in-world.
The whole process is a loop of programs ... it is triggered from within SL, which sends a request to a PHP program I wrote on my off-world server (there are actually two here). That program then sends a request to my PC to send any data it has (I usually have 5 frames of data at any one time), which it sends back to the PHP program and then into SL. The SL program then moves each prim in the linkset and sends a request for more data. At the same time it takes my RL hand positions and maps them to sets of notes. Thus my body becomes a musical instrument.
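The loop described above can be sketched in Python for readers who want to see the moving parts. This is an illustrative approximation with invented names, assuming a small buffer that the in-world script polls and a simple quantised mapping from hand height to a note set; the real system is split across C#, PHP and LSL.

```python
# Illustrative sketch (hypothetical names, not the artist's C#/PHP/LSL code):
# a buffered frame relay plus the hand-position-to-note mapping described above.

from collections import deque

# Example note set (MIDI note numbers); the actual sets are the artist's choice.
NOTE_SET = [48, 50, 53, 55, 57, 60, 62, 65]

class FrameRelay:
    """Buffers skeleton frames on the PC side; the in-world script polls it."""

    def __init__(self, max_frames=5):       # roughly 5 frames held at a time
        self.buffer = deque(maxlen=max_frames)

    def push(self, frame):                  # called as the sensor produces data
        self.buffer.append(frame)

    def poll(self):                         # called when SL requests more data
        frames = list(self.buffer)
        self.buffer.clear()
        return frames

def hand_to_note(hand_y, notes=NOTE_SET):
    """Map a normalised hand height (0.0 bottom .. 1.0 top) onto a note set."""
    hand_y = min(max(hand_y, 0.0), 1.0)     # clamp out-of-range readings
    index = min(int(hand_y * len(notes)), len(notes) - 1)
    return notes[index]

relay = FrameRelay()
relay.push({"right_hand": (0.2, 0.9, 1.5)})  # one skeleton frame (x, y, z)
frames = relay.poll()
print(hand_to_note(frames[0]["right_hand"][1]))  # prints 65: high hand, high note
```

The bounded buffer reflects a real constraint of this kind of relay: the Kinect produces frames faster than HTTP round trips into Second Life can consume them, so old frames have to be dropped rather than queued indefinitely.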
Several different people have figured out hacks for the Kinect or Wiimote, including other artists at InterACT!, but what you do that is unique is to use the interface to make an art performance that people can watch, separate from using the controllers to move avatars. How did you come up with the idea?
Both the face piece and this one were made in August and September and do something quite different from the triggering of premade animations in avatars, which are the pieces that you are referring to.
The idea of using the Kinect or Wii to trigger pre-made animations in your avatar has never appealed to me. I feel the use of prims allows a more direct and personal representation. It doesn't rely on the presence of a set of pre-made animations that may or may not correspond to what you are actually doing. That was done in SL more than two years ago using mocap software ... Ohio Uni?
What interests you about this kind of art and how does it connect to your other work?
I have played with the idea of merging the physical world and SL to make joint-world pieces for a long time now. I used it in pieces like "Cells" (May 2010) and "Antarctica: an individual existence" (Sept 2010), and this year I had a whole sim installation devoted to the concept, which featured not only "Faceted Existence" but pieces like "Forest of Water": http://www.youtube.com/watch?v=GaEFX38bp8U
"Forest of Water" took real-time data from individual rivers in the States (each tree used the data of a different river, about 35 in all) and, in interaction with avatars moving through the trees, mapped it to the note sets of several different instruments. "Enfolded" was another piece at that installation. It took real-time satellite data of the earth's magnetic field and rotated the prim elements in the sculpture to create a dynamic real-time visual piece.
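The common pattern in both pieces is normalising a live data reading and mapping it onto an artistic parameter. Here is a hedged Python sketch of that idea; the ranges, note sets and function names are invented for illustration and are not taken from the actual installations.

```python
# Illustrative sketch (assumed ranges and names, not the installation code):
# map a live reading onto an artistic parameter, as in "Forest of Water"
# (river flow -> notes) and "Enfolded" (magnetic-field data -> prim rotation).

import math

def scale_reading(value, lo, hi):
    """Normalise a raw feed reading into the 0..1 range, clamping outliers."""
    if hi == lo:
        return 0.0
    return min(max((value - lo) / (hi - lo), 0.0), 1.0)

def flow_to_note(flow_cfs, lo=0.0, hi=5000.0, notes=(60, 62, 64, 67, 69)):
    """Pick a note from a set based on river discharge (cubic feet per second)."""
    t = scale_reading(flow_cfs, lo, hi)
    return notes[min(int(t * len(notes)), len(notes) - 1)]

def field_to_rotation(nanotesla, lo=-500.0, hi=500.0):
    """Turn a magnetic-field reading into a prim rotation angle in radians."""
    return scale_reading(nanotesla, lo, hi) * 2 * math.pi

print(flow_to_note(1200.0))               # prints 62: modest flow, low note
print(round(field_to_rotation(0.0), 3))   # prints 3.142: mid-range, half turn
```

Clamping matters with real feeds: a flooding river or a geomagnetic storm can spike far outside the expected range, and an unclamped mapping would index past the note set or spin prims wildly.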
Thanks very much, Glyph, for explaining your work, and for experimenting with and exploring the boundaries between the virtual and physical worlds with your art. I'm delighted that you performed "Disembodiment" at InterACT!, an exhibition by the Linden Endowment for the Arts showcasing interactive virtual art on LEA4 in Second Life, where your installations "Diversity, A Tapestry Spun" and "Antarctica" can currently be seen and experienced. I can't wait to see what you do next.