Two new research projects are looking to improve accessibility by using the human body as a source of input. Skinput, a concept developed in partnership by Microsoft and Carnegie Mellon University's Human-Computer Interaction Institute, uses sensors to “listen to” and interpret vibrations when a user taps his or her arm.
The technology can be used alone or combined with a small projector that lets users visualize the tappable areas of the arm, which act as buttons. Because the vibrations differ along the arm due to variations in bone and muscle mass, the system can record these unique signatures, determine which spot was tapped, and perform the associated computer function in real time.
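The core idea, stripped to its essentials, is a supervised classifier: each tap location produces a distinctive vibration signature, and the system matches a new signature against labeled training examples. The sketch below illustrates this with a toy nearest-centroid classifier over invented two-value "features" (amplitude, dominant frequency); the actual Skinput pipeline extracts many more acoustic features from bio-acoustic sensors and uses more sophisticated machine learning.

```python
import math

def centroid(samples):
    """Average each feature across one location's training samples."""
    n = len(samples)
    return [sum(s[i] for s in samples) / n for i in range(len(samples[0]))]

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(tap, centroids):
    """Return the tap location whose centroid is nearest to the observed features."""
    return min(centroids, key=lambda loc: distance(tap, centroids[loc]))

# Invented training data: (amplitude, dominant frequency) per tap location.
training = {
    "wrist":   [[0.9, 120.0], [0.8, 115.0]],
    "forearm": [[0.5, 180.0], [0.6, 175.0]],
    "elbow":   [[0.2, 240.0], [0.3, 235.0]],
}
centroids = {loc: centroid(samples) for loc, samples in training.items()}

print(classify([0.55, 178.0], centroids))  # → forearm
```

Harrison's comment below about retraining fits this picture: as the band shifts on the arm, the stored signatures drift away from what the sensors now measure, so the classifier's training data must be refreshed.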
"Accuracy is already good, in the high 90s percent accuracy for finger input," said project team member Chris Harrison, from Carnegie Mellon's Human-Computer Interaction Institute.
While still early in the prototype stage, the innovative concept opens doors for many future applications. In its current form, Skinput is built as an armband which rests just above the elbow.
"We (the researchers) have worn it for extended periods of time," said Harrison. "But it does occasionally need to be retrained. As we collect more data, and make the machine learning classifiers more robust, this problem will hopefully reduce."
In later stages of development, project researchers hope to miniaturize the device, so that it includes a projector and is about the size of a wristwatch. They hope to develop Skinput into a complete and portable system that could be hooked up to any compatible electronics no matter where a user goes.
One demonstration by two members of the research team shows the system being used, still very much imperfectly, to play Guitar Hero. Another, more technical video demonstrates the science behind the technology.
While only built as a proof-of-concept, the device could easily become a game-changer, revolutionizing the way we interact with computers. A fully detailed project paper is set for release on April 12, 2010.
The second project comes from students at Imperial College London in the UK. There, young researchers have developed an eye-tracking prototype that lets users play Pong using only their eyes. While similar technology already exists, this one is set apart by its inexpensive design.
The students built a set of special glasses containing an infrared light and a webcam that records the movement of one eye. The webcam is linked to a laptop where a computer program syncs the player’s eye movements to the game. A video demonstrates the technology in action.
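The syncing step the students describe can be pictured as a calibrated mapping from the pupil's position in the webcam image to a paddle position on screen. The function below is a minimal sketch of that idea, assuming a prior calibration step has recorded the pupil coordinates for looking at the top and bottom of the play area; the names and values are invented, and a real system would also smooth the signal and handle blinks.

```python
def pupil_to_paddle(pupil_y, cal_top, cal_bottom, screen_height):
    """Linearly map a vertical pupil coordinate (from the webcam image)
    to a paddle position, using calibration points cal_top and cal_bottom."""
    frac = (pupil_y - cal_top) / (cal_bottom - cal_top)
    frac = max(0.0, min(1.0, frac))  # clamp to the playable range
    return frac * screen_height

# With calibration points at pixel rows 100 (looking up) and 200 (looking
# down), a pupil at row 150 puts the paddle halfway down a 480-pixel screen.
print(pupil_to_paddle(150, 100, 200, 480))  # → 240.0
```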
Using strictly off-the-shelf hardware, the device cost students approximately £25 (about $40 USD) to build. Comparatively, professional eye movement tracking systems can cost a thousand times more.
Pong was a simple starting point for the students because of the game's basic nature. However, they expect to adapt their technology to other video games and software, and perhaps even to controlling wheelchairs.
“This game is just an early prototype, but we’re really excited that from our student project we’ve managed to come up with something that could ultimately help people who have really limited movement,” said Ian Beer, a third year undergraduate student working on the project. “It would be fantastic to see lots of people across the world creating new games and applications using our software.”
“Remarkably, our undergraduates have created this piece of neurotechnology using bits of kit that you can buy in a shop, such as webcams,” said Dr. Aldo Faisal, the team’s supervisor from the Department of Computing and the Department of Bioengineering at Imperial College London. “The game that they’ve developed is quite simple, but we think it has enormous potential, particularly because it doesn’t need lots of expensive equipment.”
Looking ahead, Faisal also wants to make the supporting software available widely so others in the field can build compatible applications. “We hope to eventually make the technology available online so anyone can have a go at creating new applications and games with it and we’re optimistic about where this might lead.”
“We hope it could ultimately provide entertainment options for people who have very little movement. In the future, people might be able to blink to turn pages in an electronic book, or switch on their favourite song, with the roll of an eye,” added Faisal.