UK prosthetic hand sees, thinks, and grips automatically

Newcastle University's bionic hand with vision

“The hand ‘sees’ and reacts in one fluid movement”, said the University.

A 99p webcam is added to an i-limb Ultra hand and Motion Control wrist; interfacing the two is a convolutional neural network running on a laptop.
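A rough idea of that pipeline, sketched in Python. The webcam capture uses OpenCV; `classify_grasp` and `send_grip_command` are hypothetical stand-ins for the team's CNN and the hand's control interface, not the actual Newcastle code.

```python
import cv2  # pip install opencv-python

GRASP_CLASSES = ["palm wrist neutral", "palm wrist pronated", "tripod", "pinch"]

def classify_grasp(frame) -> str:
    """Stand-in for the trained CNN: map a camera frame to a grasp type.

    The real system runs a convolutional network here; this placeholder
    returns a fixed class just so the sketch is runnable.
    """
    return GRASP_CLASSES[2]

def send_grip_command(grasp: str) -> None:
    """Stand-in for the serial interface to the i-limb Ultra hand and wrist."""
    print(f"grip -> {grasp}")

cap = cv2.VideoCapture(0)   # the cheap webcam mounted on the hand
ok, frame = cap.read()      # one quick glance at the object
if ok:
    send_grip_command(classify_grasp(frame))
cap.release()
```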

A number of amputees have already tried it, and now the Newcastle team is working with the Newcastle upon Tyne Hospitals NHS Foundation Trust to offer the hands to patients at Newcastle’s Freeman Hospital.

The current way of operating bionic hands – through myoelectric activity recorded from the skin surface of the stump – takes practice and concentration, and is slow, according to Newcastle engineer Dr Kianoush Nazarpour.

“Responsiveness has been one of the main barriers to artificial limbs. For many amputees the reference point is their healthy arm or leg so prosthetics seem slow and cumbersome in comparison,” he said. “Using computer vision, we have developed a bionic hand which can respond automatically – in fact, just like a real hand, the user can reach out and pick up a cup or a biscuit with nothing more than a quick glance in the right direction.”

Instead, the hand moves itself under the control of a convolutional neural network. Newcastle electronic engineer Ghazal Ghazaei trained it using 72 images each of 500 objects, taken from different angles, under different lighting and against different backgrounds, and linked these with suitable hand grips.

“So the computer isn’t just matching an image, it’s learning to recognise objects and group them according to the grasp type the hand has to perform to successfully pick them up,” she said. “It is this which enables it to accurately assess and pick up an object which it has never seen before.”
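For illustration only, a toy PyTorch model of this idea: training images are labelled with grasp types rather than object identities, so a four-way classifier is learned. The layer sizes and 64-pixel input here are assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class GraspNet(nn.Module):
    """Toy four-class grasp classifier (assumed layout, not the paper's)."""
    def __init__(self, n_grasps: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, n_grasps),  # assumes 64x64 RGB input
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))

# Each image is paired with the grasp label assigned to its object, so the
# network learns grasp categories rather than object identities.
model = GraspNet()
logits = model(torch.randn(1, 3, 64, 64))  # dummy 64x64 camera frame
print(logits.argmax(dim=1))                # predicted grasp class index
```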

Processing, from image capture to sending movement control signals, is “all within a matter of milliseconds and ten times faster than any other limb currently on the market,” said the University.

Four different grasps (see video) are used, illustrated in the sketch after this list:

  • palm wrist neutral (picking up a cup)
  • palm wrist pronated (picking up the TV remote)
  • tripod (thumb and two fingers)
  • pinch (thumb and first finger)
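As a sketch of that labelling, in Python. The object pairings here are hypothetical examples; only the cup and TV remote come from the article, and in the real system the CNN predicts the grasp class directly from the image.

```python
from enum import Enum

class Grasp(Enum):
    PALM_NEUTRAL = "palm wrist neutral"
    PALM_PRONATED = "palm wrist pronated"
    TRIPOD = "tripod"
    PINCH = "pinch"

# Assumed example labels; the ball and coin pairings are illustrative guesses.
EXAMPLE_LABELS = {
    "cup": Grasp.PALM_NEUTRAL,
    "tv_remote": Grasp.PALM_PRONATED,
    "ball": Grasp.TRIPOD,
    "coin": Grasp.PINCH,
}

print(EXAMPLE_LABELS["cup"].value)  # -> "palm wrist neutral"
```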

“The beauty of this system is that it’s much more flexible and the hand is able to pick up novel objects – which is crucial since in everyday life people effortlessly pick up a variety of objects that they have never seen before,” said Nazarpour.

The work was led by biomedical engineers at Newcastle University and funded by the Engineering and Physical Sciences Research Council (EPSRC). Working alongside Newcastle in this project are the universities of Leeds, Essex, Keele, Southampton and Imperial College London.

The seeing hand is an interim solution – the work is part of a larger research project to develop a hand that can sense pressure and temperature and transmit that information to the brain, as well as devices that connect to nerves in the forearm, allowing two-way communication with the brain.

Details of the seeing hand are published as ‘Deep learning-based artificial vision for grasp classification in myoelectric hands’ in the Journal of Neural Engineering.

According to the paper, classification accuracy of the neural network was 85% for known objects and 75% for previously unseen objects. After training, users could pick up and move objects in 88% of attempts.
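Those figures are simple success rates, i.e. correct outcomes divided by total trials. A toy illustration with made-up trial data (only the percentages above come from the paper):

```python
def accuracy(predictions, labels):
    """Fraction of predictions that match the true labels."""
    correct = sum(p == t for p, t in zip(predictions, labels))
    return correct / len(labels)

preds = ["tripod", "pinch", "pinch",  "palm wrist neutral"]
truth = ["tripod", "pinch", "tripod", "palm wrist neutral"]
print(f"{accuracy(preds, truth):.0%}")  # -> 75%
```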

 
