Gesture interaction
A gesture-based human-computer interaction system
This project aimed to create a robust, scalable, gesture-based human-computer interaction system.
A camera and a projector are mounted on the ceiling, and the projector casts an image down onto a screen made from a partially retro-reflective material. The screen is a laminate of polyester, sheer fabric, and a reflective microstructure film (provided by Reflexite Corporation).
The screen is partially retro-reflective, so the camera sees only the projected image and no additional lighting is required. This screen structure allows self-calibration, so the screen does not need to be a fixed distance from the projector, nor does it need to be flat. Any object that intersects the projected image appears black, even white paper. Gestures are detected by thresholding the camera image and then looking for closed regions with a blob detection algorithm. Metrics such as size, shape, and neighboring objects let us differentiate a hand from a business card from a laptop. Retro-reflective tags can even be used as controls or to identify mobile devices.
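As a rough illustration of this pipeline, the sketch below thresholds a grayscale camera frame, extracts blobs as contours, and applies simple size and shape heuristics. It assumes OpenCV 4.x (cv2); the threshold value, area cutoffs, and classification rules are illustrative placeholders, not the project's actual parameters.

import cv2

def detect_objects(frame_gray, thresh_value=60, min_area=500):
    # Objects intersecting the projected image appear dark against the
    # bright retro-reflected screen, so an inverse threshold isolates them.
    _, mask = cv2.threshold(frame_gray, thresh_value, 255, cv2.THRESH_BINARY_INV)
    # Closed regions in the mask become contours (blobs).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [c for c in contours if cv2.contourArea(c) > min_area]

def classify_blob(contour):
    # Crude size/shape heuristics to tell a hand from a card from a laptop.
    area = cv2.contourArea(contour)
    x, y, w, h = cv2.boundingRect(contour)
    extent = area / float(w * h)           # fraction of the bounding box filled
    aspect = max(w, h) / float(min(w, h))  # elongation
    if area > 50000:
        return "laptop"          # very large region
    if extent > 0.85 and aspect > 1.4:
        return "business card"   # small, solid, rectangular
    return "hand"                # irregular outline with concavities

# Usage:
# frame = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)
# for c in detect_objects(frame):
#     print(classify_blob(c))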
Examples of how the system can be used:
A blank sheet of paper is used as a magnifier on maps:
Pinch gestures are used to create multiple simultaneous virtual paintbrushes, with the aspect ratio of each gesture setting the width of its paintbrush (a sketch of this mapping appears after these examples):
And gestures are used as controls, shown here controlling a virtual musical instrument:
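For the paintbrush example, one plausible way to turn a pinch gesture's shape into a brush width is sketched below. It assumes the pinch has already been isolated as a contour (for instance, the closed region that forms when thumb and forefinger meet); the minimum and maximum widths and the linear mapping are illustrative choices, not taken from the project.

import cv2

def brush_width_from_pinch(pinch_contour, min_width=2, max_width=40):
    # Fit a rotated rectangle to the pinch region and read off its elongation.
    (_, _), (w, h), _ = cv2.minAreaRect(pinch_contour)
    if min(w, h) == 0:
        return min_width
    aspect = max(w, h) / min(w, h)   # 1.0 = round pinch, larger = stretched
    # Clamp to an assumed working range of 1..5 and map linearly to width.
    t = (min(aspect, 5.0) - 1.0) / 4.0
    return int(round(min_width + t * (max_width - min_width)))

Under this mapping a near-circular pinch produces a thin stroke and a stretched pinch a wide one; the direction of that relationship is an assumption for the sake of the example.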