Moving Objects With Your Mind
by Alissa Mallinson
All you have to do is think about it.
Or at least that’s what it would look like to someone watching you use the robotic finger system that PhD student Faye Wu is designing in Professor Harry Asada’s lab.
Wu – who earned an SB and an SM in mechanical engineering from MIT in 2009 and 2012, respectively, and was the first recipient of the Sonin Graduate Fellowship in 2014 – has been a member of Professor Asada’s lab for two years, working on a set of futuristic robotic fingers that look like they just burst through the pages of a graphic novel.
An outside observer would see a user with two hands – and 12 fingers. One hand would likely be inactive due to stroke or other impairment – Wu is developing the robotic hand as a wearable assistive device – while the active hand would have its five natural fingers plus two robotic ones extending from a wrist mount. The robotic fingers are designed to work cooperatively with the five healthy fingers to achieve tasks that would otherwise be impossible with just one hand, such as opening a pill bottle or grabbing a heavy object. Users who are rehabilitating an impaired hand could also wear the robotic fingers on their weaker arm, allowing near-normal motions during physical therapy despite a lack of function.
Of course, the robotic fingers don’t actually read the wearer’s mind, but Wu does expect that they will be able to take implicit commands toward a common goal. Those commands are likely to follow from a set of pre-established triggers for common tasks.
“Human hand motion,” says Wu, “as complex as it is, is really just combinations of a few major movements. We were surprised to see that this actually extends out to robotic fingers too. This discovery made us very happy because it means we can create more natural robotic motion quite simply.”
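The observation Wu describes – that complex hand motion reduces to combinations of a few major movements – is the kind of structure that principal component analysis can uncover. The sketch below is illustrative only (not the lab's actual code) and uses synthetic joint-angle data built from two underlying movement patterns:

```python
# Illustrative sketch, not the lab's code: PCA showing that many hand
# postures can be combinations of a few "synergies." Data is synthetic.
import numpy as np

rng = np.random.default_rng(0)

# Pretend we recorded 500 postures, each described by 14 joint angles.
# Here they are built from just 2 underlying movement patterns plus a
# little noise, mimicking the low-dimensional structure Wu describes.
n_postures, n_joints, n_synergies = 500, 14, 2
true_synergies = rng.normal(size=(n_synergies, n_joints))
weights = rng.normal(size=(n_postures, n_synergies))
postures = weights @ true_synergies + 0.05 * rng.normal(size=(n_postures, n_joints))

# PCA via singular value decomposition of the centered data.
centered = postures - postures.mean(axis=0)
_, singular_values, _ = np.linalg.svd(centered, full_matrices=False)
variance_explained = singular_values**2 / np.sum(singular_values**2)

# The first two components capture nearly all of the motion.
print(f"Top 2 components explain {variance_explained[:2].sum():.1%} of variance")
```

Because the synthetic postures really do come from two patterns, the first two components dominate – the same kind of result that lets a few "major movements" drive natural-looking robotic motion.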
How it Works
The wearer’s alpha hand fits inside a sensor glove, which gathers data on the motion of the fingers as well as the contact forces between the hand and the object. A camera measures the position of the hand relative to the objects in its grip and records the motion of the adjoining arm. The robotic hand, with its own set of sensors, responds to the movement of the alpha hand in order to support and complement it.
For example, if the commanding hand grasps a bottle cap, it would trigger the robotic hand to grasp the bottle body and provide the opposing force needed to twist off the cap. Another trigger might be a drop of the elbow, which the robot is trained to interpret as the sign for “holding” rather than “moving.” Alternatively, the trigger could be something as simple as a tap of the foot or a nod of the head, indicating to the robot that it should prepare for a specific task.
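The trigger scheme described above amounts to a lookup from detected user events to robot behaviors. This is a hypothetical sketch – the event and action names are invented for illustration, not drawn from Wu's system:

```python
# Hypothetical sketch of the trigger idea: a small table mapping detected
# user events to robot-finger behaviors. Names are invented for illustration.
TRIGGERS = {
    "grasp_bottle_cap": "grip_bottle_body",  # brace so the cap can be twisted off
    "elbow_drop": "hold_steady",             # interpreted as "holding", not "moving"
    "foot_tap": "prepare_task",              # explicit cue to get ready
    "head_nod": "prepare_task",
}

def respond(detected_event: str) -> str:
    """Return the robot action for a detected user event, or stay idle."""
    return TRIGGERS.get(detected_event, "idle")

print(respond("grasp_bottle_cap"))  # -> grip_bottle_body
print(respond("wave"))              # -> idle
```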
But, says Wu, the best option is for wearers to simply approach an object or perform a task as they normally would. In this scenario, the robot would detect the user’s intentions from their natural movements alone, which would then trigger the appropriate response. Wu is drawn to the intuitiveness and ease of use of this option, and has narrowed her focus to this type of robotic assistive hand for her thesis.
“An algorithm will be continuously running in the background, deciding if you are simply holding an object, if you are manipulating it, in what way the manipulation is being done, and what future actions you may be taking,” says Wu.
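One way to picture the always-on decision loop Wu describes is a classifier over glove readings. The rule and thresholds below are invented for illustration – a minimal sketch, not Wu's algorithm – using the intuition that still fingers and steady contact forces suggest "holding," while changing forces or moving fingers suggest "manipulating":

```python
# Minimal illustrative sketch (not Wu's algorithm) of a background loop
# that labels intent from sensor-glove readings.
from dataclasses import dataclass

@dataclass
class GloveSample:
    finger_speed: float   # mean joint angular speed, rad/s
    force_change: float   # rate of change of contact force, N/s

def classify_intent(sample: GloveSample,
                    speed_threshold: float = 0.05,
                    force_threshold: float = 0.2) -> str:
    """Label one sensor sample; thresholds are illustrative guesses."""
    still = sample.finger_speed < speed_threshold
    steady = abs(sample.force_change) < force_threshold
    return "holding" if still and steady else "manipulating"

print(classify_intent(GloveSample(finger_speed=0.01, force_change=0.0)))  # holding
print(classify_intent(GloveSample(finger_speed=0.30, force_change=1.5)))  # manipulating
```

A real system would of course run this continuously on streaming data and also predict what future actions the user may take, as Wu notes.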
In addition to the control techniques behind a “mind-reading” third hand, Wu is also interested in its design. The comfort and flexibility of “wearables” are crucial to consumer satisfaction, yet at the same time they need to be sturdy and powerful enough to carry out the required tasks. The solution, says Wu, is likely a combination of soft, compliant robotic elements and more classic materials and actuators.
“I’m thinking of what a real human finger is like,” she says. “It is soft on the outside, and a combination of soft and hard compliances on the inside, with some stiffness to hold up the structure. I imagine that in the future the robotic fingers will have comparable properties. We can even take it one step further, for example, to enable the robotic fingers to tune the stiffness of their surfaces, so if a wearer’s grip is slipping, they could grab a little harder, or if they’re gripping something more fragile, they could grab more softly.”
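The stiffness tuning Wu imagines could be sketched as a simple feedback rule: stiffen the grip when slip is detected, soften it for fragile objects. The rule, step size, and limits below are invented for illustration:

```python
# Hedged sketch of surface-stiffness tuning: an invented proportional rule
# that stiffens on slip and softens for fragile objects.
def adjust_grip(stiffness: float, slip_detected: bool, fragile: bool,
                step: float = 0.1, lo: float = 0.1, hi: float = 1.0) -> float:
    """Nudge a normalized stiffness value up on slip, down for fragile items."""
    if slip_detected:
        stiffness += step   # grip is slipping: grab a little harder
    elif fragile:
        stiffness -= step   # fragile object: grab more softly
    return max(lo, min(hi, stiffness))  # stay within actuator limits

s1 = adjust_grip(0.5, slip_detected=True, fragile=False)   # stiffen toward 0.6
s2 = adjust_grip(s1, slip_detected=False, fragile=True)    # soften back toward 0.5
```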
Wu, who comes from a family of doctors, hopes to begin official testing of her prototype this summer in cooperation with local hospitals, including the Spaulding Rehabilitation Hospital in Boston.