The BU team has built a prototype capable of simple detection of objects and open spaces, and preliminary tests show that most people can echolocate a little using the device and improve quickly with practice. The team is now refining the prototype to function in more complex, real-world environments. Morland believes that with enough practice, people should be able to echolocate very well using the device, perhaps even better than they could unassisted, since frequencies above the normal range of human hearing are especially useful for echolocation. (Movies of the device can be found at http://cns.bu.edu/~cjmorlan/research)
Paper 2pUWa6, "What it is like to be a bat: A sonar system for humans" will be presented at 5:20 p.m. on Tuesday, July 1 in room AMPHI BORDEAUX.
9) TAKING AURAL CUES FROM FLIPPER
Dolphins have a keen sonar system that can make fine distinctions between complex targets such as buried mines. But what cues do these animals use for fine target discrimination? Whitlow Au (firstname.lastname@example.org) of the University of Hawaii will present the latest results from a series of human listening experiments using echoes from real targets and a simulated broadband dolphin echo-ranger. The echoes are stretched in time to shift them into the lower frequencies of the human auditory range. He finds that human performance is usually as accurate as the dolphins' when it comes to object discrimination. Participants are then asked to identify which aural cues were most important in enabling them to make those determinations. Th
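The time-stretching trick rests on a simple relationship: playing a recording back N times slower divides every frequency component by N, so ultrasonic sonar echoes can be brought down into the band humans can hear. A minimal sketch of that arithmetic (not the researchers' actual code; the 120 kHz echo component and 50x stretch factor here are hypothetical illustration values):

```python
def stretched_frequency(freq_hz: float, stretch_factor: float) -> float:
    """Playing a recording stretch_factor times slower divides
    every frequency component in it by stretch_factor."""
    return freq_hz / stretch_factor

# A hypothetical 120 kHz echo component, stretched 50x, lands at 2.4 kHz,
# comfortably inside the human auditory range (roughly 20 Hz to 20 kHz).
print(stretched_frequency(120_000, 50))  # 2400.0
```

The same principle works in reverse for the BU device described above: sounds gathered at ultrasonic frequencies must be shifted down before a human listener can use them.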
Contact: Jason Bardi
American Institute of Physics