DURHAM, N.C. – The day may be getting a little closer when robots will perform surgery on patients in dangerous situations or in remote locations, such as on the battlefield or in space, with minimal human guidance.
Engineers at Duke University believe that the results of feasibility studies conducted in their laboratory represent the first concrete steps toward achieving this space age vision of the future. Also, on a more immediate level, the technology developed by the engineers could make certain contemporary medical procedures safer for patients, they said.
For their experiments, the engineers started with a rudimentary tabletop robot whose "eyes" used a novel 3-D ultrasound technology developed in the Duke laboratories. An artificial intelligence program served as the robot's "brain" by taking real-time 3-D information, processing it, and giving the robot specific commands to perform.
"In a number of tasks, the computer was able to direct the robot's actions," said Stephen Smith, director of the Duke University Ultrasound Transducer Group and senior member of the research team. "We believe that this is the first proof-of-concept for this approach. Given that we achieved these early results with a rudimentary robot and a basic artificial intelligence program, the technology will advance to the point where robots can someday operate on people without the guidance of a doctor."
The results of a series of experiments in which the robot system directed catheters inside synthetic blood vessels were published online in the journal IEEE Transactions on Ultrasonics, Ferroelectrics and Frequency Control. A second study, published in April in the journal Ultrasonic Imaging, demonstrated that the autonomous robot system could successfully perform a simulated needle biopsy.
Advances in ultrasound technology have made these latest experiments possible, the researchers said, by generating detailed, 3-D moving images in real time.
Contact: Richard Merritt