Game Accessibility

FWD News: Accessibility Through Hands-Free Gaming

A user controlling Eye Mario

Waterloo Labs, a group of young engineers from Texas, has developed the Eye Mario system, which lets users control Nintendo Entertainment System (NES) software using only eye movements. The technology, which can be reproduced by following the instructions on the team's website, reads the player's eye position using a technique known as electrooculography (EOG).

EOG relies on the fact that the eyes are electrically polarized. Changing eye position modifies the electric field around them, which can be picked up by sensors and translated into basic computer signals.
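The build details live on the Waterloo Labs site; purely as an illustration of the signal path described above, the sketch below shows how smoothed voltages from horizontal and vertical EOG channels might be mapped to NES d-pad presses. The channel layout, threshold values, and function names are assumptions for the example, not the team's actual code.

```python
# Hedged sketch: mapping EOG voltages to NES d-pad directions.
# Channel layout, thresholds, and sample values are hypothetical.

from collections import deque
from statistics import mean

DEADBAND_V = 0.15        # ignore drift/noise below this amplitude (assumed value)
WINDOW = 8               # samples in the smoothing window

horizontal = deque(maxlen=WINDOW)   # left/right electrode pair
vertical = deque(maxlen=WINDOW)     # up/down electrode pair

def eog_to_dpad(h_sample: float, v_sample: float) -> set[str]:
    """Translate one pair of EOG samples into NES d-pad button states.

    Shifting gaze left/right or up/down changes the voltage measured on
    the corresponding channel; after smoothing, anything beyond the
    deadband is treated as a deliberate button press.
    """
    horizontal.append(h_sample)
    vertical.append(v_sample)

    h, v = mean(horizontal), mean(vertical)
    pressed = set()
    if h > DEADBAND_V:
        pressed.add("RIGHT")
    elif h < -DEADBAND_V:
        pressed.add("LEFT")
    if v > DEADBAND_V:
        pressed.add("UP")
    elif v < -DEADBAND_V:
        pressed.add("DOWN")
    return pressed

# Example: a sustained rightward glance eventually registers as RIGHT.
for sample in [0.0, 0.1, 0.3, 0.4, 0.4, 0.4, 0.4, 0.4]:
    buttons = eog_to_dpad(sample, 0.0)
print(buttons)  # {'RIGHT'}
```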

While the average person may not have the skills to build their own Eye Mario, Waterloo Labs does demonstrate a viable accessible gaming input method. The system would be particularly useful for individuals living with quadriplegia or other significant motion impairments. Eye Mario could one day be commercialized, though the group mentions no plans to do so at this time. A video demonstration of Eye Mario is also available.

Accessibility is also a popular theme when it comes to the controller-free Kinect system from Microsoft. It was widely reported in mainstream publications like PCMag that the new motion control system could support American Sign Language (ASL). While Microsoft acknowledges including sign language recognition in a recent patent filed in relation to Kinect, it has also confirmed that the system's first-generation hardware would not offer that feature.

“We are excited about the potential of Kinect and its potential to impact gaming and entertainment,” said a spokesperson. “Microsoft files lots of patent applications to protect our intellectual property, not all of which are brought to market right away. Kinect that is shipping this holiday will not support sign language.”

While the hardware was originally capable of reading individual fingers, a decision to reduce the system's processing power and camera resolution in order to lower costs now prevents it. However, the second generation of Kinect may offer this accessibility feature.

Microsoft's Kinect has also been in the spotlight due to problems that surfaced in early demos when players attempted to use the system while seated. The folks over at AbleGamers had the chance to speak with Microsoft during E3 2010 about this accessibility issue.

At the time, a Microsoft spokesperson stated “as new technologies like Kinect for Xbox 360 come to market, we recognize that they won't always immediately work for some people because of the inherent physical nature of the gameplay.”

However, representatives affirmed the company's commitment to improving this aspect and to making Kinect as accessible as possible. “We continue to work on the technology to improve its ability to recognize and render more types of form factors and positions, so that as many people as possible can experience what Kinect has to offer. And as we work on future versions of the technology we hope to learn from the Accessibility community about the scenarios and features that will allow Kinect to appeal to more users.”

Since then, reports have indicated that Microsoft has improved the system's ability to read players' gestures while seated. According to Blitz Games' chief technical officer, Andrew Oliver, updated software algorithm libraries have solved the problem. While results will never be perfect in every case, Oliver adds that game developers can build on the base provided by Microsoft.

“I would need to do a software algorithm that kind of works that out. It's just a bit of image processing. So they've [Microsoft] given you a generic piece, which is actually pretty impressive and covers most cases - certainly all the standing up, and now sitting down. If you want to go further than that, then do it yourself in software,” he said.
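Oliver's point is that Microsoft supplies a generic tracking layer and studios can add their own processing on top of it. As a loose illustration only (this is not the Kinect SDK or Blitz Games' code), the sketch below classifies a tracked player as seated or standing from hypothetical joint positions, the kind of lightweight per-frame check a developer might bolt onto the platform's output.

```python
# Hedged sketch: deciding whether a tracked player is seated.
# The joint layout, names, and threshold are assumptions, not the Kinect SDK.

from dataclasses import dataclass

@dataclass
class Joint:
    x: float  # metres, sensor space
    y: float  # height relative to the sensor
    z: float  # distance from the sensor

def is_seated(joints: dict[str, Joint], ratio_threshold: float = 0.6) -> bool:
    """Guess seated vs. standing from torso-to-leg proportions.

    When a player sits, the hip drops toward knee height, so the
    hip-to-knee drop shrinks relative to the hip-to-head rise.
    """
    head, hip, knee = joints["head"], joints["hip_center"], joints["knee_left"]
    torso_rise = head.y - hip.y
    leg_drop = hip.y - knee.y
    if torso_rise <= 0:
        return False  # likely a tracking glitch; treat as standing
    return (leg_drop / torso_rise) < ratio_threshold

# Example frame: hips nearly level with the knees -> treated as seated.
frame = {
    "head": Joint(0.0, 0.55, 2.0),
    "hip_center": Joint(0.0, -0.05, 2.0),
    "knee_left": Joint(0.1, -0.20, 1.9),
}
print(is_seated(frame))  # True
```

A game could run a check like this each frame and switch to a seated interaction scheme, which mirrors Oliver's suggestion that anything beyond the generic tracking is "a bit of image processing" left to the developer.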