Machine learning enables prosthetic hands to control grip for daily tasks

Holding an egg requires a gentle touch. Squeeze too hard, and you'll make a mess. Opening a water bottle, on the other hand, needs a little more grip strength.

According to the U.S. Centers for Disease Control and Prevention, there are approximately 50,000 new amputations in the United States each year. The loss of a hand can be particularly debilitating, affecting patients' ability to perform everyday tasks. One of the primary challenges with prosthetic hands is tuning grip strength appropriately for the object being handled.

In Nanotechnology and Precision Engineering, published by AIP Publishing, researchers from Guilin University of Electronic Technology in China describe an object identification system for prosthetic hands that guides grip strength decisions in real time.

"We want to free the user from thinking about how to control [an object] and allow them to focus on what they want to do, achieving a truly natural and intuitive interaction."

Hua Li, author 

Pens, cups and bottles, balls, sheet-metal objects such as keys, and fragile objects such as eggs make up more than 90% of the items patients handle in daily life. The researchers measured the grip strength needed to interact with these common items and fed the measurements into a machine learning-based object identification system that uses a small camera placed near the palm of the prosthetic hand.
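To picture the kind of vision-to-force mapping this implies, consider the minimal sketch below. The category names, function name, and force values are assumptions for illustration, not the authors' measurements or implementation: the object class reported by the palm camera's classifier simply indexes a table of pre-measured grip forces.

# Hypothetical lookup from a vision-detected object class to a grip-force target.
# Force values are illustrative placeholders, not the published measurements.
GRIP_FORCE_NEWTONS = {
    "pen": 2.0,
    "cup_or_bottle": 8.0,
    "ball": 5.0,
    "key_or_metal_sheet": 4.0,
    "egg_or_fragile": 1.0,
}

def grip_force_for(object_class: str) -> float:
    """Return a target grip force for the object class seen by the palm camera."""
    return GRIP_FORCE_NEWTONS.get(object_class, 3.0)  # moderate default for unknown objects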

Their system uses an electromyography (EMG) sensor at the user's forearm to determine what the user intends to do with the object at hand.

"An EMG signal can clearly convey the intent to grasp, but it struggles to answer the critical question, how much force is needed? This often requires complex training or user calibration," said Li. "Our approach was to offload that 'how much' question to the vision system."

The group plans to integrate haptic feedback into the system, giving the user an intuitive physical sensation and establishing a two-way communication bridge between the user and the hand through additional EMG signals.

"What we are most looking forward to, and currently focused on, is enabling users with prosthetic hands to seamlessly and reliably perform the fine motor tasks of daily living," said Li. "We hope to see users be able to effortlessly tie their shoelaces or button a shirt, confidently pick up an egg or a glass of water without consciously calculating the force, and naturally peel a piece of fruit or pass a plate to a family member."

Journal reference:

Li, Y., et al. (2026). An intelligent artificial hand with force control based on machine vision. Nanotechnology and Precision Engineering. DOI: 10.1063/5.0253551. https://pubs.aip.org/tu/npe/article/9/1/013009/3377418/An-intelligent-artificial-hand-with-force-control
