Teller and MIT colleagues develop wheelchair that listens

October 27, 2008

As reported by the MIT News Office, September 19, 2008, EECS professor Seth Teller and assistant professor of aeronautics and astronautics Nicholas Roy are developing a new kind of autonomous wheelchair that can learn the layout and named locations of a building, and then take its occupant to a requested place in response to a verbal command.

Teller, who is head of the Robotics, Vision, and Sensor Networks (RVSN) group at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), stated to the MIT News Office, "I'm interested in having robots build and maintain a high-fidelity model of the world."

In fact, the MIT system can learn about its environment in much the same way a person would: by being taken on a guided tour once, with important places identified along the way. For example, as the wheelchair is pushed around a nursing home for the first time, the patient or a caregiver would say, "This is my room," "Here we are in the foyer," or "Nurse's station."
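
To make the guided-tour idea concrete, here is a minimal sketch, in Python, of how spoken labels could be attached to the chair's pose estimate as it is pushed through a building. The class and function names (TourMap, label_current_place, and so on) are hypothetical illustrations, not the interfaces of the MIT system.

from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class Pose:
    x: float        # metres in the chair's map frame
    y: float
    heading: float  # radians

@dataclass
class TourMap:
    """Associates spoken place names with poses recorded during a guided tour."""
    places: Dict[str, Pose] = field(default_factory=dict)

    def label_current_place(self, name: str, pose: Pose) -> None:
        # Called when the speech recognizer hears a phrase such as "this is my room".
        self.places[name.lower()] = pose

    def lookup(self, name: str) -> Optional[Pose]:
        return self.places.get(name.lower())

# During the tour, each spoken label is paired with the chair's current pose estimate.
tour = TourMap()
tour.label_current_place("my room", Pose(4.2, 7.8, 1.57))
tour.label_current_place("cafeteria", Pose(21.0, 3.5, 0.0))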

Just by saying "take me to the cafeteria" or "go to my room," the wheelchair user avoids having to control every twist and turn of the route; the chair simply carries its occupant from one place to another, guided by the map stored in its memory.
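
The command side can be sketched in the same hedged way: the destination name is pulled out of the spoken phrase, resolved against the learned map, and handed to the chair's planner as a goal. The regular expression, the PLACES table, and the drive_to callback below are stand-ins for illustration, not the actual MIT components.

import re
from typing import Optional

# Example place-to-coordinate table of the kind built up during the tour.
PLACES = {
    "my room": (4.2, 7.8),
    "cafeteria": (21.0, 3.5),
}

def destination_from_utterance(utterance: str) -> Optional[str]:
    """Pull the place name out of a 'take me to ...' or 'go to ...' command."""
    match = re.search(r"(?:take me to|go to)\s+(?:the\s+)?(.+)", utterance.lower())
    return match.group(1).strip() if match else None

def handle_command(utterance: str, drive_to) -> bool:
    name = destination_from_utterance(utterance)
    if name is None or name not in PLACES:
        return False            # unknown phrase or place; the chair could ask the user to repeat
    drive_to(PLACES[name])      # hand the goal to the path planner and motor controller
    return True

handle_command("Take me to the cafeteria", drive_to=print)   # prints (21.0, 3.5)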

Teller says the RVSN group is developing machines of various sizes with situational awareness, that is, machines that can "learn these mental maps, in order to help people do what they want to do, or do it for them." Besides the wheelchair, the devices range in scale from a location-aware cellphone all the way up to an industrial forklift that can autonomously transport large loads from place to place outdoors.

This research has been funded by Nokia and Microsoft. Read more:

MIT News Office, Sept. 19, 2008 article: "Robot wheelchair finds its own way: MIT invention responds to user's spoken commands"