Over the past 30 years, advances in computer technology have led to giant leaps in memory, speed and connectivity. But one aspect that has failed to keep pace since Xerox and Apple introduced the first graphical user interfaces (GUIs) is the way humans interact with computers. The holy trio of screen, keyboard and mouse has undergone many refinements but no truly disruptive innovation.
Now several viable new methods of interacting with the digital world are being developed that will make recent innovations in touch screen technology look like a mere blip in the arc of IT progress. They are all in the field of what is termed gesture-based computing, where the means of interaction with the PC is based on a “natural” user interface (NUI) that receives commands via intuitive hand movements. Such a concept will be familiar to anyone who has seen the 2002 movie Minority Report, where a gloved Tom Cruise manipulates images on his computer screen using hand gestures (as in the picture, above).
Remarkably, one of the hottest systems in development is the work of John Underkoffler, a former research scientist at MIT, who was the science adviser on the movie. Director Steven Spielberg was keen to ensure that all the technology in the picture was within the realms of possibility, and so allowed Underkoffler to use his work for the movie as the basis of a research project.
Now chief scientist at high-tech start-up Oblong Industries, Underkoffler has developed g-speak, which brings that Minority Report technology to life, and is already finding customers in Fortune 500 companies in sectors that include – among others – financial services, automotive, and logistics and supply-chain management. The application is particularly suited to managing large volumes of data and combining them with geo-spatial information, as well as real-time collaboration on 3-D design and engineering projects.
Rival systems are also being developed at MIT. One of the most promising is from PhD student Pranav Mistry. Called SixthSense, it requires the user to wear a small projector and camera, connected to a pocket-sized computing device.
The result is a highly portable system that completely obviates the need for any further hardware, as any surface can be used as a monitor, keypad or keyboard. Other functions allow data to be (almost literally) “picked up” and “dropped” from one device to another – or even from paper to PC – with natural hand movements, while instinctive gestures can be used to perform tasks such as taking a photograph.
An even simpler system being prototyped at MIT, by grad student Robert Wang and associate professor Jovan Popovic, requires just a normal webcam and PC, and a pair of coloured Lycra gloves.
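The appeal of the coloured-glove approach is that the hard computer-vision problem of finding a hand in a video frame reduces to finding a known colour. The sketch below is a hypothetical illustration of that core idea only, not Wang and Popovic's actual pipeline: it thresholds a frame (here a toy NumPy array) against a target glove colour and returns the centroid of the matching pixels. The function name, tolerance value and toy frame are all invented for illustration.

```python
import numpy as np

def locate_glove(frame, target, tol=30):
    """Return the (row, col) centroid of pixels within `tol` of a target
    RGB colour, or None if nothing matches. A crude stand-in for the
    colour segmentation a glove-tracking system might perform."""
    # Boolean mask: True where every channel is within `tol` of the target
    mask = np.all(np.abs(frame.astype(int) - np.array(target)) <= tol, axis=-1)
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())

# Toy 8x8 "frame": black background with a bright-green 2x2 glove patch
frame = np.zeros((8, 8, 3), dtype=np.uint8)
frame[2:4, 5:7] = (0, 255, 0)
print(locate_glove(frame, target=(0, 255, 0)))  # → (2.5, 5.5)
```

A real system would run this per-frame on live video and match the segmented glove shape against a database of hand poses, but the colour threshold is what makes a plain webcam sufficient.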
All the developers above are confidently predicting widespread adoption by consumers and enterprises in the short to medium term. “We’ve forgotten to invent new interfaces,” says Underkoffler. “It’s been 25 years – can there really only be one interface? There can’t. In five years’ time when you buy a computer, you’ll get this.”