The Virtual VeeJeying presentation at Siggraph 2005 was a low-cost, interactive application in which a lighted glove served as the interface for manipulating and mixing media such as videos and still images.
This work combines image-analysis algorithms with real-time computer graphics to illustrate what future human-graphics interfaces could look like.
Innovation: This is a multimodal graphic interface driven by simple, natural body language. The core technical innovation is the connection between a new haptic technique and video-jockey software that lets participants interact with several streams of images in real time. The lighted glove acts as a multimodal device, much like a mouse: with one finger, a participant can select an image element; with two fingers, a participant can move a selected element; and special hand motions change the mode of the interface. For example, turning one's hand switches the interface to a special-effect mode that applies effects to the image elements instead of moving them.
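The glove-as-mouse mapping above can be sketched as a small per-frame decision rule. This is a minimal, hypothetical sketch, not the actual implementation: it assumes the camera tracker delivers, for each frame, the positions of the visible glove lights and an estimated hand-roll angle, and the function name, threshold, and action labels are all illustrative.

```python
# Hypothetical sketch of the glove-as-mouse mapping described above.
# Assumed input per frame: a list of tracked light positions (x, y)
# and an estimated hand-roll angle in degrees (both invented here).

def interpret_frame(lights, roll_degrees, roll_threshold=60.0):
    """Map one tracked frame to an interface action.

    lights: list of (x, y) positions of currently visible glove lights.
    roll_degrees: estimated rotation of the hand around its axis.
    Returns one of: 'idle', 'select', 'move', 'toggle_effect_mode'.
    """
    if abs(roll_degrees) > roll_threshold:
        # Turning one's hand switches the interface mode.
        return 'toggle_effect_mode'
    if len(lights) == 1:
        # One visible finger light selects an image element.
        return 'select'
    if len(lights) >= 2:
        # Two finger lights move the selected element.
        return 'move'
    return 'idle'
```

A real system would also need temporal smoothing so that a light briefly lost by the tracker does not flip the action from 'move' to 'select' between frames.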
Vision: This presentation is part of the trend that merges technology and art. It is an example of what future everyday interfaces could look like. Virtual graphic interfaces should give users a better feel for digital content, and anyone will be able to experience being a VJ by mixing images and videos. The gloves could replace current remote controls for interacting with our environments at home or at work. This application might also be adapted to theme parks, where visitors can be actors in immersive environments.
Use of our system in a show:
A performer (a VJ) or even a speaker on stage first prepares his or her presentation using our authoring tool. Once the presentation is ready, the performer registers the lights of the glove with the system, which automatically shows them in the authoring tool so they can be assigned to special effects.
In the left image, a performer can, for instance, apply a swirl effect with a horizontal movement of a finger, while in the second mode (shown in the right image) the same performer can generate a particle emitter to reveal a video.
In the left image, one can scratch videos by moving a finger horizontally and add a color splitter by moving the same finger vertically.
The image on the right illustrates the ability to grab and move videos in real time from one screen to another. We used this feature to stage an award ceremony at Laval Virtual.
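The per-mode mappings described above (horizontal finger motion triggering one effect, vertical motion another) amount to a small dispatch table. The sketch below is illustrative only: the mode names, effect names, and table layout are assumptions chosen to mirror the examples in the text, not the software's actual configuration.

```python
# Hypothetical sketch of axis-based effect dispatch: within a mode, the
# dominant direction of a finger's motion selects the effect to apply.
# Mode and effect names below are illustrative, echoing the text.

EFFECTS_BY_MODE = {
    'scratch': {'horizontal': 'scratch_video', 'vertical': 'color_splitter'},
    'reveal':  {'horizontal': 'swirl',         'vertical': 'particle_emitter'},
}

def dispatch(mode, dx, dy):
    """Pick an effect from a finger displacement (dx, dy) in the given mode."""
    axis = 'horizontal' if abs(dx) >= abs(dy) else 'vertical'
    return EFFECTS_BY_MODE[mode][axis]
```

For example, in the 'scratch' mode a mostly horizontal displacement such as (5, 1) selects the video-scratching effect, while a mostly vertical one selects the color splitter.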
Our solution can be a universal interface capable of controlling any device (VCRs, lighting controllers, synthesizers, samplers, amplifiers, ...) directly from the stage, using a simple glove fitted with lights. The technician can then be located on stage and become the master of ceremonies.
Special thanks to Sophie Cluet, Dominique Begis, and Laurent Kott from Inria for their faith in this work; to Jean-Francois Fontaine from the National Assembly and Patrick Saint Jean from ACM Siggraph for selecting our work for Siggraph; to Thomas Besson, Paul Cayon, and Francis Maes for their excellent work on the authoring tool; to Malika Ait Gherbi for her support and the fantastic lighted dress presented at the booth; and to Emannuel Cayla, Jo, and Evelyne Lutton for the great fractal-based movies we presented in the "Minority Report"-like mode.