MIT Researchers Unveil Voice-Controlled Robotic Wheelchair

September 26, 2008

Researchers at MIT are developing a voice-controlled wheelchair designed to be easier to use than previous robotic wheelchairs: users simply tell the chair where they want to go. Outdoors, the chair navigates by GPS. Indoors, where GPS is unreliable, it is guided by a “mental map” of its surroundings. Unlike other robotic chairs that rely on lasers or extensive preparation of the environment, MIT’s wheelchair is self-learning. The chair is programmed simply by being taken on a “guided tour,” during which key locations are identified via WiFi.
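
To make the guided-tour idea concrete, here is a minimal sketch in Python of how spoken location names might be paired with WiFi signal fingerprints during a tour and later matched to a voice command. The class and function names, the fingerprint format, and the matching logic are illustrative assumptions for this article, not the actual MIT software.

```python
# Hypothetical sketch of the "guided tour" approach described above: each named
# location is stored with a WiFi fingerprint recorded during the tour, and a
# later voice command is resolved against those stored locations.
from dataclasses import dataclass


@dataclass
class Location:
    name: str                      # spoken name, e.g. "nurses' station"
    fingerprint: dict[str, float]  # WiFi access point ID -> signal strength (dBm)


class TourMap:
    """A minimal 'mental map' built from a guided tour (assumed structure)."""

    def __init__(self) -> None:
        self.locations: list[Location] = []

    def record(self, name: str, wifi_scan: dict[str, float]) -> None:
        """Called during the tour when the guide announces a location."""
        self.locations.append(Location(name, wifi_scan))

    def resolve(self, spoken_destination: str) -> Location | None:
        """Match a voice command ('take me to the kitchen') to a known place."""
        for loc in self.locations:
            if loc.name in spoken_destination.lower():
                return loc
        return None

    def localize(self, wifi_scan: dict[str, float]) -> Location:
        """Estimate the chair's current location from a fresh WiFi scan."""
        def distance(loc: Location) -> float:
            aps = set(loc.fingerprint) | set(wifi_scan)
            return sum(
                (loc.fingerprint.get(ap, -100.0) - wifi_scan.get(ap, -100.0)) ** 2
                for ap in aps
            )
        return min(self.locations, key=distance)


# Example: a two-stop tour, followed by a voice command.
tour = TourMap()
tour.record("kitchen", {"ap-1": -40.0, "ap-2": -70.0})
tour.record("nurses' station", {"ap-1": -75.0, "ap-2": -45.0})

goal = tour.resolve("please take me to the kitchen")
here = tour.localize({"ap-1": -42.0, "ap-2": -68.0})
print(f"currently near: {here.name}, heading to: {goal.name}")
```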

“It's a system that can learn and adapt to the user,” says co-developer Nicholas Roy, assistant professor of aeronautics.

Roy is developing the wheelchair with Seth Teller, professor of computer science and engineering, and Bryan Reimer, a research scientist at MIT's AgeLab. Funding is provided by Nokia and Microsoft.

Teller heads the Robotics, Vision, and Sensor Networks (RVSN) group at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), which is developing machines with "situational awareness" that can "learn these mental maps, in order to help people do what they want to do, or do it for them,” says Teller. Other projects include an autonomous forklift and a location-aware cell phone.

A real-world trial of the chair is underway at The Boston Home in Dorchester, a nursing home where all of the patients use wheelchairs. The researchers hope to add a collision-avoidance system to the wheelchair, as well as mechanical arms to pick up and manipulate items.

Photo / Patrick Gillooly
