Robotics insights through flies' eyes
Date: 7/31/2009

Common and clumsy-looking, the blow fly is a true artist of flight. Suddenly changing direction, standing still in the air, spinning lightning-fast around its own axis, and making precise, pinpoint landings: all these maneuvers are simply a matter of course. Extremely quick eyesight helps to keep it from losing orientation as it races to and fro. Still, how does its tiny brain process the multiplicity of images and signals so rapidly and efficiently?

To get to the bottom of this, members of a Munich-based "excellence cluster" called Cognition for Technical Systems (CoTeSys) have created an unusual research environment: a flight simulator for flies. Here they're investigating what goes on in flies' brains during flight. Their goal is to put similar capabilities into human hands, for example to aid in developing robots that can independently perceive and learn from their surroundings.

A fly's brain enables the unbelievable: the animal negotiates obstacles with ease in rapid flight, reacts in a split second to the hand that would catch it, and navigates unerringly to the smelly delicacies it lives on. Researchers have long known that flies take in many more images per second than humans do. For human eyes, anything more than about 25 discrete images per second merges into continuous movement. A blow fly, on the other hand, can perceive 100 images per second as discrete sense impressions and interpret them quickly enough to steer its movement and precisely determine its position in space.
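
To put those numbers in perspective, here is a minimal back-of-the-envelope sketch in Python; the 2 m/s flight speed is an assumed illustrative value, not a figure from the researchers. At a given speed, the scene shifts only a quarter as far between successive images at 100 images per second as it does at 25:

    def gap_between_frames(frames_per_second: float, speed_m_s: float) -> float:
        """Distance travelled between two successive perceived images, in metres."""
        return speed_m_s / frames_per_second

    FLIGHT_SPEED = 2.0  # m/s -- assumed, illustrative flight speed

    for label, fps in [("human (~25 images/s)", 25), ("blow fly (~100 images/s)", 100)]:
        print(f"{label}: {gap_between_frames(fps, FLIGHT_SPEED) * 100:.0f} cm between images")

    # human (~25 images/s): 8 cm between images
    # blow fly (~100 images/s): 2 cm between images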

Yet the fly's brain is hardly bigger than a pinhead, far too small to enable the fly's feats if it functioned the way the human brain does. It must have a simpler and more efficient way of processing images from the eyes into visual perception, and that is a subject of intense interest for robot builders. Even today, robots have great difficulty perceiving their surroundings through their cameras, and even more difficulty making sense of what they see; merely recognizing obstacles in their own workspace takes too long. So people still need to be protected from their automated helpers, for example by surrounding the machines with safety enclosures. A more direct, supportive collaboration between human and machine is a central research goal of the excellence cluster CoTeSys (Cognition for Technical Systems), a Munich-area research initiative founded by around one hundred scientists and engineers from five universities and institutes.

Within the framework of CoTeSys, brain researchers from the Max Planck Institute of Neurobiology are exploring how flies manage to apprehend their environment and their own movement so efficiently. Under the leadership of neurobiologist Prof. Alexander Borst, they've built a flight simulator for flies. Here, on a wraparound display, the researchers present diverse patterns, movements, and sensory stimuli to blow flies. The insect is held in place by a small mount so that electrodes can register the reactions of its brain cells. In this way the researchers observe and analyze what happens in a fly's brain when the animal whizzes criss-cross through a room.

The first results show one thing very clearly: the way flies process the images from their immobile eyes is completely different from the way the human brain processes visual signals. Movement through space produces so-called "optic flow fields" that definitively characterize specific kinds of motion. In forward motion, for example, objects rush past on the sides while objects ahead appear to grow larger, and near objects appear to move faster than distant ones. The fly's first step is to construct a model of these movements in its tiny brain: the speed and direction with which objects appear to move before its eyes generate, moment by moment, a typical pattern of motion vectors, the flow field. In a second step, this pattern is assessed by the so-called lobula plate, a higher level of the brain's vision center. Only 60 nerve cells in each hemisphere are responsible for this; each reacts with particular intensity when presented with the pattern appropriate to it. For the analysis of the optic flow fields, it's important that motion information from both eyes be brought together. This happens via direct connections between specialized neurons called VS cells. In this way, the fly gets a precise fix on its position and movement.
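
This matched-filter principle lends itself to a compact illustration. The toy model below, in Python with NumPy, is only a sketch of the idea, not the Martinsried group's actual circuit model: each simulated lobula-plate cell stores a template flow field and responds in proportion to how well the momentary optic flow matches it.

    import numpy as np

    # Sample viewing directions on a coarse grid (azimuth x, elevation y).
    x, y = np.meshgrid(np.linspace(-1, 1, 20), np.linspace(-1, 1, 20))

    def expansion_flow():
        """Flow seen in forward motion: vectors radiate outward from the center."""
        return np.stack([x, y])

    def roll_flow():
        """Flow seen when rotating about the viewing axis: vectors circulate."""
        return np.stack([-y, x])

    def cell_response(template, stimulus):
        """Matched-filter response: normalized correlation of two flow fields."""
        t, s = template.ravel(), stimulus.ravel()
        return float(t @ s / (np.linalg.norm(t) * np.linalg.norm(s)))

    roll_cell = roll_flow()  # a cell "tuned" to roll rotation
    print(cell_response(roll_cell, roll_flow()))       # ~1.0: preferred motion
    print(cell_response(roll_cell, expansion_flow()))  # ~0.0: forward flight

Because the rotational and expansion patterns are orthogonal, the roll-tuned cell responds maximally to its preferred self-motion and not at all to forward flight; a small population of such cells, like the 60 per hemisphere in the fly, could thus encode self-motion with very little computation.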

Prof. Borst explains the significance of this investigation: "Thanks to our results, the network of VS cells in the fly's brain responsible for rotational movement is one of the best-understood circuits in the nervous system." Yet these efforts don't end with pure fundamental research. The discoveries of the neuroscientists in Martinsried are of particular interest to the engineers at the chair for guidance and control at the Technische Universität München (TUM), with whom Prof. Borst collaborates closely within the framework of CoTeSys.

Under the leadership of Prof. Martin Buss and Dr. Kolja Kühnlenz, the TUM researchers are working to develop intelligent machines that observe their environment through cameras, learn from what they see, and react appropriately to the current situation. Their long-range aim is intelligent machines that can interact with people directly, effectively, and safely; even in factories, the safety barriers between humans and robots should fall. To that end, simple, fast, and efficient methods for analyzing and interpreting camera images are absolutely essential.

For example, the TUM researchers are developing small flying robots whose position and movement in flight will be controlled by a visual-analysis system modeled on the fly's brain. Another machine, the mobile robot Autonomous City Explorer (ACE), was challenged to find its way from the institute to Marienplatz in the heart of Munich, a distance of about a mile, by stopping passers-by and asking for directions. To do this, ACE had to interpret the gestures of people who pointed the way, and it had to negotiate sidewalks and traffic crossings safely.
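
How a fly-inspired visual front end might feed flight control can be hinted at in a few lines. The fragment below is purely hypothetical; the function name and gain are invented for illustration, and the article does not describe TUM's actual controller. A visually estimated roll rate, such as the matched-filter response sketched earlier, drives a simple proportional counter-command:

    def roll_command(estimated_roll_rate: float, gain: float = 0.8) -> float:
        """Proportional stabilizer (hypothetical): command a torque that
        opposes the roll rate estimated from optic flow."""
        return -gain * estimated_roll_rate

    # A gust rolls the robot and the visual front end reports +0.5 rad/s:
    print(roll_command(0.5))  # -0.4: counter-torque against the disturbance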

Increasingly natural interaction between intelligent machines and humans is unthinkable without efficient image analysis. Insights gained from the flight simulator for flies, through the scientific interplay CoTeSys fosters among researchers from various disciplines, offer an approach that might be simple enough to carry over from one domain to the other: from the insects to the robots.


Contact: Patrick Regan
regan@zv.tum.de
+49 89 289 22743
Technische Universitaet Muenchen
Source: EurekAlert
