
This information is for reference purposes only. It was current when produced and may now be outdated. Archive material is no longer maintained, and some links may not work.


  • Publication # 14-RA004

Computer reads surgeon's hand gestures to manipulate radiological image without breaking operating room sterility

Patient Safety and Quality of Care

A surgeon is gowned, masked, and gloved for an operation and is looking at a radiographic image of the area of the patient's body requiring surgery. But the surgeon needs to move or enlarge the image to see the problem more clearly. What can he or she do? Using a mouse or computer keyboard, both of which are difficult to rid of infectious bacteria, breaks sterility. This allows the possible transfer of antibiotic-resistant bacteria to the surgeon's gloves, increasing the chance of a healthcare-associated infection.

To prevent this, Juan Pablo Wachs, M.Sc., Ph.D., and his colleagues at Purdue University have developed and tested a computer vision system that recognizes hand signals made by the surgeon and translates them into commands for manipulating the radiographic image display, without any risk of contaminating the sterile gloves. In the project, the researchers asked 10 surgeons to suggest hand and arm motions that would be easy to learn as hands-free commands for manipulating magnetic resonance imaging (MRI) representations of a patient's internal organs and tissues.
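The core idea of such a pipeline can be sketched in a few lines: a recognizer emits a gesture label, and a dispatcher translates it into a display command and applies it to the image view. The sketch below is purely illustrative and is not the study's code; the gesture label names are invented (the article does not describe the actual hand shapes), while the ten commands match those listed in the study.

```python
# Illustrative sketch, not the study's implementation: translate a
# recognized gesture label into an image-display command so the surgeon
# never touches a mouse or keyboard. Gesture label names are hypothetical.

GESTURE_TO_COMMAND = {
    "palm_push": "zoom_in",
    "palm_pull": "zoom_out",
    "twist_cw": "rotate_cw",
    "twist_ccw": "rotate_ccw",
    "swipe_right": "browse_right",
    "swipe_left": "browse_left",
    "swipe_up": "browse_up",
    "swipe_down": "browse_down",
    "raise_hand": "brightness_up",
    "lower_hand": "brightness_down",
}

class ImageView:
    """Minimal display state that the ten commands manipulate."""
    def __init__(self):
        self.zoom = 1.0          # magnification factor
        self.rotation_deg = 0    # clockwise rotation in degrees
        self.x = 0               # horizontal pan offset
        self.y = 0               # vertical pan offset
        self.brightness = 0.5    # normalized to [0, 1]

    def apply(self, command):
        if command == "zoom_in":
            self.zoom *= 1.25
        elif command == "zoom_out":
            self.zoom /= 1.25
        elif command == "rotate_cw":
            self.rotation_deg = (self.rotation_deg + 15) % 360
        elif command == "rotate_ccw":
            self.rotation_deg = (self.rotation_deg - 15) % 360
        elif command == "browse_right":
            self.x += 10
        elif command == "browse_left":
            self.x -= 10
        elif command == "browse_up":
            self.y -= 10
        elif command == "browse_down":
            self.y += 10
        elif command == "brightness_up":
            self.brightness = min(1.0, self.brightness + 0.1)
        elif command == "brightness_down":
            self.brightness = max(0.0, self.brightness - 0.1)

def handle_gesture(view, gesture_label):
    """Dispatch a recognized gesture; unknown labels are ignored."""
    command = GESTURE_TO_COMMAND.get(gesture_label)
    if command is not None:
        view.apply(command)
    return command
```

In a real system the recognizer would feed `handle_gesture` continuously from a camera stream; ignoring unknown labels keeps stray detections from altering the display.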

Dr. Wachs and his colleagues were able to identify a group of 10 fairly intuitive gestures—each suggested by at least 2 surgeons—for commands to enlarge or reduce the magnification of the MRI, rotate it clockwise or counterclockwise, move it right/left or up/down, and increase or decrease the image brightness. The gesture-recognition software was tested by 20 volunteers (12 men, 8 women). The observed mean gesture-recognition accuracy across all 10 commands was 97.2 percent, ranging from 82.5 percent for the "decrease brightness" gesture to 100.0 percent for the "browse right" and "browse down" gestures. Use of contextual information (such as the user's body orientation) reduced the false-positive rate from 20.8 percent to 2.3 percent.
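The contextual gating just described can be illustrated with a simple sketch: a gesture event is accepted only when a contextual cue agrees, here whether the user's torso is oriented toward the display. This is an assumed simplification for illustration, not the study's actual algorithm; the threshold value and event format are invented.

```python
# Illustrative sketch (assumed details, not the study's algorithm):
# gate recognized gestures on a contextual cue -- whether the user's
# body is oriented toward the display -- so incidental hand motions
# are less likely to trigger commands.

def facing_display(torso_angle_deg, tolerance_deg=30.0):
    """True if the torso faces the screen within the tolerance.
    0 degrees means squarely facing the display; the tolerance
    value is an invented example, not from the study."""
    # Normalize any input angle into [-180, 180) before comparing.
    angle = (torso_angle_deg + 180.0) % 360.0 - 180.0
    return abs(angle) <= tolerance_deg

def contextual_filter(events, tolerance_deg=30.0):
    """Keep only gesture events made while facing the display.
    Each event is a (gesture_label, torso_angle_deg) pair."""
    return [label for label, angle in events
            if facing_display(angle, tolerance_deg)]
```

A gesture made while the user is turned away (for example, toward the patient or an assistant) is discarded, which is one plausible way a contextual cue can cut false positives as sharply as the study reports.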

The study was funded by AHRQ (HS19837). More details are in "Hand-gesture-based sterile interface for the operating room using contextual cues for the navigation of radiological images," by Mithun George Jacob, M.S.E., Dr. Wachs, and Rebecca A. Packer, M.S., D.V.M., in the June 2013 Journal of the American Medical Informatics Association 20(e1), pp. e183-e186.


Page last reviewed February 2014
Internet Citation: Computer reads surgeon's hand gestures to manipulate radiological image without breaking operating room sterility: Patient Safety and Quality of Care. February 2014. Agency for Healthcare Research and Quality, Rockville, MD.




AHRQ Advancing Excellence in Health Care