My favorite Star Wars movie is definitely The Empire Strikes Back, for a lot of reasons, including a very epic (although cliffhanger-y) ending for every main character in the saga. The infamous “No, I am your father” line between Luke and Vader has become legendary by now, and it certainly raises the stakes that Luke had his hand cleanly cut off by a lightsaber not a minute earlier. One would have thought that Luke’s career as a swordsman was over, but, hey, it is the future after all! Just before the end credits, he’s already flexing a bionic one.
Whereas this was science fiction back in the 80s, prosthetics are now improving fast. While the most one could expect 30 years ago was a rigid, plastic-looking hand, today’s technology is pursuing features like realistic-looking artificial skin, mobility for every part of the hand and, ultimately, brain control of the prosthetic.
The main challenges for bionic limbs are precision tasks that need to be carried out with very specific force and finger positioning. In particular, gripping is not easy for a robotic hand: too much force will break or damage whatever we are trying to grab, whereas too little and we won’t be able to hold it at all. Of course, the solution to this problem is feedback: we start gripping with a reasonable strength and increase it gradually depending on the force we perceive in the fingers. In fact, this problem has already been solved for surgical robots, which can press a knife against skin with just enough force not to pierce it, or cut a watermelon in half if they want to.
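To make the feedback idea concrete, here is a minimal sketch of such a grip loop in Python. Everything in it is invented for illustration: the `read_force`/`tighten` functions stand in for whatever sensor and motor API a real prosthetic would expose, and the toy "object stiffness" numbers are arbitrary.

```python
# Hedged sketch of a feedback grip loop: squeeze a little harder on
# each iteration until the fingertip force sensor reports we are
# close enough to the target force. All names and numbers invented.

def grip(read_force, tighten, target_force, step=0.05, tolerance=0.1):
    """Close the fingers gradually until the measured fingertip
    force is within `tolerance` of `target_force`, then stop."""
    applied = 0.0
    while read_force() < target_force - tolerance:
        applied += step      # squeeze a little harder...
        tighten(applied)     # ...then let the sensor catch up
    return applied

# Toy simulation of an object: it only pushes back once the fingers
# have closed 0.2 units, with an invented stiffness of 10.
state = {"applied": 0.0}

def tighten(amount):
    state["applied"] = amount

def read_force():
    return max(0.0, (state["applied"] - 0.2) * 10.0)

final_grip = grip(read_force, tighten, target_force=1.0)
```

The loop never applies more closure than needed: it stops as soon as the sensed force enters the acceptable band, which is exactly why the watermelon survives.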
Most commercial devices, like the i-limb ultra prosthetic hand from Touch Bionics, are operated by behavior-based software: they offer a range of preprogrammed skills to assist in daily tasks, which the user can select as needed.
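Conceptually, behavior-based selection is just a lookup from a grip name to a stored finger configuration. The sketch below illustrates that idea only; the grip names and joint angles are made up and have nothing to do with i-limb’s actual modes.

```python
# Hedged sketch of behavior-based grip selection: a table of
# preprogrammed finger configurations, picked by name. The grips
# and angle values below are hypothetical, purely for illustration.

GRIPS = {
    "pinch": {"thumb": 40, "index": 45, "others": 0},   # thumb + index only
    "power": {"thumb": 60, "index": 70, "others": 70},  # full-hand wrap
    "point": {"thumb": 0,  "index": 0,  "others": 90},  # index extended
}

def select_grip(name):
    """Return the stored finger configuration for a grip name,
    or fail loudly if no such behavior is preprogrammed."""
    try:
        return GRIPS[name]
    except KeyError:
        raise ValueError(f"no preprogrammed grip called {name!r}")
```

The user only ever chooses a name; the low-level motor work stays hidden in the stored configuration.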
The idea is quite simple. Each time we decide to perform some task with our hand, some neurons in our brain fire and our arm muscles release electric impulses. The first signal can be captured via Brain Computer Interfaces, whereas the second can be captured via electromyography, i.e. electrodes placed on (or implanted in) our limbs. Since the captured patterns are very similar whenever we want to perform the same movement, if we capture enough of them, at some point we’ll be able to split them into clusters depending on the intended action and, hence, command an artificial hand to actually do it for us. The hand is controlled by a microcontroller that receives the action to be performed and decomposes it into the sequence of commands to the hand’s different motors required to accomplish it. These sequences are pre-recorded in the microcontroller, which just needs to adjust their parameters to the environment: feedback from the force sensors in the hand is used to control how far we close each finger, how much strength we apply, and so on.
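The clustering step can be sketched with a tiny nearest-centroid classifier: feature vectors recorded while the user performs known actions are averaged into one centroid per action, and a fresh pattern is assigned to the closest centroid. Real systems extract far richer features from the EMG signal; the two-number "patterns" below are toy values invented for illustration.

```python
# Hedged sketch of mapping captured muscle patterns to actions via
# nearest-centroid classification. The feature vectors are invented
# toy numbers, not real electromyography data.
import math

def centroid(samples):
    """Average a list of equal-length feature vectors component-wise."""
    n = len(samples)
    return [sum(col) / n for col in zip(*samples)]

def classify(pattern, centroids):
    """Return the action whose centroid is closest to the pattern."""
    return min(centroids, key=lambda action: math.dist(pattern, centroids[action]))

# Calibration phase: a few labeled recordings per intended action.
training = {
    "open_hand":  [[0.9, 0.10], [0.8, 0.20], [1.0, 0.15]],
    "close_hand": [[0.1, 0.90], [0.2, 0.80], [0.15, 1.0]],
}
centroids = {action: centroid(samples) for action, samples in training.items()}
```

Once an action is identified, the microcontroller would play back the corresponding pre-recorded motor sequence, adjusting it with the force feedback described above.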
There are other commercial prosthetics already on the market, like Bebionic’s arm. It has several buttons for behavior selection, but if you watch the video for a while, you’ll see how it obeys electric impulses from the muscles in the arm (myoelectric impulses) and, after calibration, allows precision tasks like tying shoelaces.