Prosthetic limbs. They’ve been 3D printed, they’ve been automated, and now? Now they can feel, too.
Researchers in Bradley Greger’s Neural Engineering Lab at Arizona State University are creating an interface between nerves and prosthetic limbs to give patients a heightened level of control.
The goal is to create limbs that patients will use as true extensions of themselves. To accomplish this, the team is seeking to establish two-way communication between a user and a new prosthetic limb capable of more than 20 different movements.
It gets even more exciting as virtual reality is integrated into the next phase of the study. The aim is to create a neural code that not only maps brain signals to hand movements, but also creates a real synergy in movement.
…markers will be applied to the patient’s healthy hand to record its movements, and these measurements will then be used to direct a virtual hand that the patient views through an Oculus Rift virtual reality headset. Another team member, Kevin O’Neill, a doctoral student at ASU, is developing technology that lets the patient see what the virtual limb is doing and decodes the neural messages that enable the motion to happen.
Read the full article here!