The researchers hypothesized that a similar "biomechanical synergy" may exist not only among the five fingers of the human hand, but also among seven: the five human fingers plus the two robotic ones. To test the hypothesis, Wu wore a glove outfitted with multiple position-recording sensors and attached to her wrist via a light brace. She then scavenged the lab for common objects, such as a box of cookies, a soda bottle, and a football.
Wu grasped each object with her hand, then manually positioned the robotic fingers to support the object. She recorded both hand and robotic joint angles multiple times with various objects, then analyzed the data, and found that every grasp could be explained by a combination of two or three general patterns among all seven fingers.
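The analysis described above, reducing many recorded joint angles to a few shared patterns, is the kind of problem principal component analysis solves. The sketch below is purely illustrative: the data is synthetic, and the joint counts and the use of PCA are assumptions, not details from the study.

```python
# Hypothetical sketch: finding a few "synergy" patterns in grasp data
# with principal component analysis (PCA). Synthetic data stands in for
# the real recordings; the 11-joint layout is an assumption.
import numpy as np

rng = np.random.default_rng(0)

# Assume each grasp is described by 11 joint angles
# (e.g., 9 human-hand angles plus 2 robotic-finger angles).
n_grasps, n_joints = 50, 11

# Generate grasps from 3 underlying patterns plus small noise,
# mimicking the finding that a few patterns explain every grasp.
patterns = rng.normal(size=(3, n_joints))
weights = rng.normal(size=(n_grasps, 3))
angles = weights @ patterns + 0.01 * rng.normal(size=(n_grasps, n_joints))

# PCA via singular value decomposition of the mean-centered data.
centered = angles - angles.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
explained = (s ** 2) / (s ** 2).sum()

# The first three components capture nearly all of the variance,
# i.e., two or three patterns suffice to describe every grasp.
print(explained[:3].sum())
```

In this toy setup the first three components account for essentially all of the variance, which is the shape of result the researchers report for their real recordings.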
The researchers used this information to develop a control algorithm to correlate the postures of the two robotic fingers with those of the five human fingers. Asada explains that the algorithm essentially "teaches" the robot to assume a certain posture that the human expects the robot to take.
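One simple way to realize a correlation between human and robotic postures is a linear map fit to the recorded demonstrations. The sketch below is a hedged illustration of that idea, not the researchers' actual algorithm: the dimensions, the linear model, and all names are assumptions.

```python
# Hypothetical sketch: learn a map from human finger joint angles to
# robotic finger joint angles from demonstration data, then use it to
# predict the robot posture the human "expects". The linear model and
# the 9-in / 2-out dimensions are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic demonstrations: human postures paired with the robotic
# postures that were manually posed to support each object.
true_map = rng.normal(size=(9, 2))        # unknown "ground truth" relation
human_demos = rng.normal(size=(40, 9))    # recorded human joint angles
robot_demos = human_demos @ true_map      # manually posed robot angles

# Fit the posture map by least squares over the demonstrations.
learned_map, *_ = np.linalg.lstsq(human_demos, robot_demos, rcond=None)

def robot_posture(human_angles: np.ndarray) -> np.ndarray:
    """Predict robotic-finger joint angles from a human hand posture."""
    return human_angles @ learned_map

# For a new grasp, the controller reads the human posture and commands
# the corresponding robotic-finger posture.
new_grasp = rng.normal(size=9)
command = robot_posture(new_grasp)
```

Because the demonstrations here are noise-free, the learned map recovers the true relation exactly; with real sensor data, a regularized fit restricted to the dominant synergy patterns would be the natural refinement.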
Bringing robots closer to humans
For now, the robot mimics the grasping of a hand, closing in and spreading apart in response to a human's fingers. But Wu would like to take the robot one step further, controlling not just position, but also force.
"Right now we're looking at posture, but it's not the whole story," Wu says. "There are other things that make a good, stable grasp. With an object that looks small but is heavy, or is slippery, the posture would be the same, but the force would be different, so how would it adapt to that? That's the next thing we'll look at."
Wu also notes that cer
Contact: Sarah McDonnell
Massachusetts Institute of Technology