If we've learned anything from post-apocalyptic movies, it's that computers eventually become self-aware and try to eliminate humans.
BYU engineer Dah-Jye Lee isn't interested in that development, but he has managed to eliminate the need for humans in the field of object recognition. Lee has created an algorithm that can accurately identify objects in images or video sequences without human calibration.
"In most cases, people are in charge of deciding what features to focus on and they then write the algorithm based off that," said Lee, a professor of electrical and computer engineering. "With our algorithm, we give it a set of images and let the computer decide which features are important."
Not only is Lee's genetic algorithm able to set its own parameters, but it also doesn't need to be reset each time a new object is to be recognized; it learns new objects on its own.
Lee likens the idea to teaching a child the difference between dogs and cats. Instead of trying to explain the difference, we show children images of the animals and they learn on their own to distinguish the two. Lee's object recognition does the same thing: Instead of telling the computer what to look at to distinguish between two objects, researchers simply feed it a set of images and it learns on its own.
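To make the idea concrete, here is a minimal, hypothetical sketch of how a genetic algorithm can "decide which features are important" on its own. This is not Lee's ECO-features implementation; the toy dataset, the nearest-centroid fitness function, and all parameters below are illustrative assumptions. The algorithm evolves binary masks over a feature vector, keeping masks that yield higher classification accuracy.

```python
import random

random.seed(0)

N_FEATURES = 10

# Toy dataset: only features 0 and 1 actually separate the two classes;
# the rest are noise the algorithm should learn to ignore.
def make_sample(label):
    x = [random.gauss(0, 1) for _ in range(N_FEATURES)]
    x[0] += 3 * label
    x[1] += 3 * label
    return x, label

data = [make_sample(label) for label in (0, 1) for _ in range(20)]

def fitness(mask):
    """Accuracy of a nearest-centroid classifier using only masked features."""
    if not any(mask):
        return 0.0
    centroids = {}
    for label in (0, 1):
        pts = [x for x, l in data if l == label]
        centroids[label] = [sum(p[i] for p in pts) / len(pts)
                            for i in range(N_FEATURES)]
    correct = 0
    for x, l in data:
        def dist(c):
            return sum(mask[i] * (x[i] - c[i]) ** 2 for i in range(N_FEATURES))
        pred = min((0, 1), key=lambda lab: dist(centroids[lab]))
        correct += (pred == l)
    return correct / len(data)

def evolve(pop_size=20, generations=30, mutation_rate=0.1):
    """Evolve feature masks: selection, one-point crossover, bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # keep the fittest half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N_FEATURES)  # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < mutation_rate) for bit in child]
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
print("best mask:", best, "accuracy:", fitness(best))
```

No human tells the algorithm that features 0 and 1 matter; the population simply drifts toward masks that score well, which is the same hands-off principle, on a far smaller scale, that the article describes.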
In a study published in the December issue of the academic journal Pattern Recognition, Lee and his students demonstrate both the independent ability and the accuracy of their "ECO features" genetic algorithm.
The BYU algorithm tested as well as or better than other top published object recognition algorithms, including those developed by Rob Fergus of NYU and Thomas Serre of Brown University.
Lee and his students fed their object recognition program four image datasets from Caltech (motorbikes, faces, airplanes and cars) and achieved 100 percent accurate recognition on every dataset. The other published well-performing object recognition system
Contact: Todd Hollingshead, Brigham Young University