Assistive Technologies for Quadriplegics

The goal of this project is to develop a robotic arm system that allows patients with quadriplegia to complete tasks of daily living. The system uses a collaborative control scheme: the robot performs low-level intelligent control (informed by data from its accompanying vision system) while the user issues high-level commands. An example task is picking up a cup with a straw, bringing it to the user to drink, and returning it to the table. The user is presented with a picture of the scene in front of them on a computer screen and selects the object to be grasped by fixating on it; eye tracking registers the gaze point as a click. The computer then performs object recognition and pose estimation to determine the best grasp for the object, and the robotic arm navigates to it. The user can accept or alter the proposed grasp and position using facial movements that correspond to gripper positions and orientations. This project is a collaborative effort that includes the Applied Physics Lab and the Johns Hopkins Hospital.
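Conceptually, the pipeline chains gaze-based object selection, vision-based grasp proposal, and facial-movement corrections before the arm executes. The Python sketch below illustrates that control flow under stated assumptions: every class, function, and command name is a hypothetical placeholder, since the project's actual software interfaces are not described here.

```python
# A minimal sketch of the shared-control pipeline described above.
# All names are hypothetical placeholders, not the project's actual API;
# a real system would wrap the eye tracker, vision stack, and arm controller.

from dataclasses import dataclass
from enum import Enum, auto


class FacialCommand(Enum):
    """Assumed command set mapped from the user's facial movements."""
    ACCEPT = auto()
    ROTATE_GRIPPER = auto()
    SHIFT_LEFT = auto()
    SHIFT_RIGHT = auto()


@dataclass
class Grasp:
    """A candidate grasp: gripper position (meters) and yaw (radians)."""
    x: float
    y: float
    z: float
    yaw: float


def select_object_by_gaze() -> str:
    """Stub for eye tracking: return the label the user fixated on."""
    return "mug"


def estimate_grasp(label: str) -> Grasp:
    """Stub for object recognition + pose estimation proposing a grasp."""
    return Grasp(x=0.4, y=0.1, z=0.05, yaw=0.0)


def refine_grasp(grasp: Grasp, command: FacialCommand) -> Grasp:
    """Apply one facial-movement correction to the proposed grasp."""
    if command == FacialCommand.ROTATE_GRIPPER:
        grasp.yaw += 0.1
    elif command == FacialCommand.SHIFT_LEFT:
        grasp.y += 0.01
    elif command == FacialCommand.SHIFT_RIGHT:
        grasp.y -= 0.01
    return grasp


def run_task(user_inputs: list[FacialCommand]) -> Grasp:
    """User gives high-level commands; the robot owns low-level planning."""
    label = select_object_by_gaze()   # 1. gaze selects the object
    grasp = estimate_grasp(label)     # 2. vision proposes a grasp
    for cmd in user_inputs:           # 3. user accepts or alters the grasp
        if cmd == FacialCommand.ACCEPT:
            break
        grasp = refine_grasp(grasp, cmd)
    return grasp                      # 4. arm controller (not shown) executes


if __name__ == "__main__":
    print(run_task([FacialCommand.ROTATE_GRIPPER, FacialCommand.ACCEPT]))
```

Keeping the robot responsible for low-level planning while the user only accepts or nudges the proposed grasp is what makes the interface tractable given the limited bandwidth of gaze and facial inputs.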

 

[Images: Standard_Facial_Grid, MartyCropped2]

 

 

The video below shows a preliminary version of the desired system. The quadriplegic user can pick up the mug by selecting “Mug” on the screen with his eyes. This version of the program uses pre-programmed trajectories coupled to the on-screen commands.
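A minimal sketch of how such a preliminary version might couple on-screen commands to fixed motions: each label maps to a pre-recorded list of joint waypoints that is replayed verbatim when the user selects it with gaze. The dictionary name, labels, and joint values below are illustrative assumptions, not the project's actual data.

```python
# Hypothetical mapping from on-screen labels to pre-recorded trajectories.
# Each trajectory is a list of joint-angle waypoints (radians) replayed as-is.
PREPROGRAMMED_TRAJECTORIES: dict[str, list[tuple[float, ...]]] = {
    "Mug":   [(0.0, 0.5, -0.3), (0.2, 0.7, -0.1), (0.2, 0.4, 0.0)],
    "Straw": [(0.1, 0.6, -0.2), (0.3, 0.8, 0.0)],
}


def on_gaze_selection(label: str) -> None:
    """Replay the trajectory bound to the selected on-screen command."""
    for waypoint in PREPROGRAMMED_TRAJECTORIES[label]:
        print(f"move joints to {waypoint}")  # stand-in for the arm controller


on_gaze_selection("Mug")
```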


 

People Involved:

  • Faculty: Greg Hager and Dr. Albert Chi
  • Students: Amanda Edwards

Funding Sources:

  • Applied Physics Lab Graduate Bimanual Fellowship
  • NeuroEngineering Training Initiative
