
Gesture-based computing

June 15, 2012

“Gesture-based computing is changing the ways that we interact with computers, both physically and mechanically.”

It is already common to interact with a new class of devices entirely by using natural movements and gestures. The Microsoft Surface, Apple’s iOS devices (iPad, iPhone and iPod Touch), and other gesture-based systems accept input in the form of taps, swipes, and other ways of touching. The Nintendo Wii and Microsoft’s Kinect system extend that to hand and arm motions, or body movement. These are the first in a growing array of alternative input devices that allow computers to recognize and interpret natural physical gestures as a means of control. Gesture-based computing allows users to engage in virtual activities with motions and movements similar to what they would use in the real world, manipulating content intuitively. The idea that simple gestures and natural, comfortable motions can be used to control computers is opening the way to a host of input devices that look and feel very different from the keyboard and mouse — and that are increasingly enabling our devices to infer meaning from the movements and gestures we make. Horizon 2012 (adoption horizon 4 to 5 years)
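At their simplest, the taps and swipes these devices accept come down to decisions about how far, and in which direction, a finger travels between touching down and lifting off. A minimal illustrative sketch of that idea in Python (the function name and the pixel threshold are my own assumptions, not any device's actual API):

```python
# Illustrative sketch: distinguish a tap from a swipe by comparing the
# distance and direction between the point where a finger goes down and
# the point where it lifts. The 30-pixel threshold is an assumption.

def classify_gesture(x0, y0, x1, y1, min_swipe_px=30):
    """Return 'tap' or a swipe direction from start/end touch coordinates."""
    dx, dy = x1 - x0, y1 - y0
    # Movements shorter than the threshold on both axes count as a tap.
    if abs(dx) < min_swipe_px and abs(dy) < min_swipe_px:
        return "tap"
    # Otherwise, the dominant axis decides the swipe direction.
    if abs(dx) >= abs(dy):
        return "swipe-right" if dx > 0 else "swipe-left"
    return "swipe-down" if dy > 0 else "swipe-up"
```

So `classify_gesture(10, 10, 12, 11)` yields `"tap"`, while `classify_gesture(10, 100, 200, 105)` yields `"swipe-right"`. Real systems such as the Kinect obviously go far beyond this, tracking whole-body motion with cameras and depth sensors, but the basic pattern of thresholding and classifying movement is the same.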


Thanks in part to the Nintendo Wii, the Apple iPhone, and the iPad, many people now have some immediate experience with gesture-based computing as a means of interacting with a computer. The proliferation of games and devices that incorporate easy and intuitive gestural interactions will certainly continue, bringing with it a new era of user interface design that moves well beyond the keyboard and mouse. While the full realization of the potential of gesture-based computing remains several years away, especially in education, its significance should not be underestimated, especially for a new generation of students accustomed to touching, tapping, swiping, jumping, and moving as a means of engaging with information. Horizon 2011 (adoption horizon 4 to 5 years)


For nearly forty years, the keyboard and mouse have been the primary means to interact with computers. The Nintendo Wii in 2006 and the Apple iPhone in 2007 signaled the beginning of widespread consumer interest in — and acceptance of — interfaces based on natural human gestures. Now, new devices are appearing on the market that take advantage of motions that are easy and intuitive to make, allowing us an unprecedented level of control over the devices around us. Cameras and sensors pick up the movements of our bodies without the need of remotes or handheld tracking tools. The full realization of the potential of gesture-based computing is still several years away, especially for education; but we are moving ever closer to a time when our gestures will speak for us, even to our machines. Horizon 2010 (adoption horizon 4 to 5 years)


Here are some initial thoughts and questions:

  • Gesture-based computing is still on a four-to-five-year time-to-adoption horizon. Why has gesture-based computing stagnated since 2010?
  • How realistic is gesture-based computing for education? The technology has definitely improved over the last couple of years; in particular, the convergence of gesture-sensing technology with voice recognition is remarkable, and devices have become more affordable. But what do the Xbox Kinect or the Nintendo Wii have to do with learning?
  • What are the possible applications for education?
  • Is gesture-based computing a chance for people with disabilities? The Horizon Report describes gesture-based computing as an enabling or assistive technology that already has profound implications for special needs and disabled individuals. But how affordable is that technology for people with disabilities, and how widespread is its use already?

I completely agree with the Horizon Report that “the full realization of the potential of gesture-based computing within higher education will require intensive interdisciplinary collaborations and innovative thinking about the very nature of teaching, learning, and communicating”. I am looking forward to finding out what others think about this topic.

Further reading:

From → #OPCO12
