"Computer scientists normally focus on the computational aspects of a problem, but the HR issues of working with crowds can be just as challenging," says Krzysztof Gajos, Assistant Professor of Computer Science at SEAS and the students' adviser.
PlateMate works in coordination with Amazon Mechanical Turk, a system originally intended to help improve product listings on Amazon.com. Turkers, as the crowd workers call themselves, receive a few cents for each puzzle-like task they complete.
PlateMate divides nutrition analysis into several iterative tasks, asking groups of Turkers to distinguish between foods in the photo, identify what they are, and estimate quantities. The nutrition totals for the meal are then automatically calculated.
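The staged workflow described above can be sketched in code. This is a hypothetical illustration, not PlateMate's actual implementation: the stage functions (tag_foods, identify_food, measure_portion) and the calorie table are invented stand-ins for what would really be Mechanical Turk tasks and a nutrition database.

```python
# Illustrative sketch of a PlateMate-style pipeline. All names and data
# here are assumptions; the real system posts each stage to Mechanical Turk.

def tag_foods(photo):
    # Stage 1: crowd workers mark each distinct food in the photo.
    return ["region_1", "region_2"]

def identify_food(region):
    # Stage 2: crowd workers name the food in one marked region.
    return {"region_1": "cheeseburger", "region_2": "fries"}[region]

def measure_portion(region, food):
    # Stage 3: crowd workers estimate the quantity, in servings.
    return {"cheeseburger": 1.0, "fries": 1.5}[food]

# Placeholder per-serving calorie table; a real system would query a
# nutrition database instead.
CALORIES = {"cheeseburger": 300, "fries": 365}

def analyze(photo):
    """Run the three crowd stages, then total the calories automatically."""
    total = 0.0
    for region in tag_foods(photo):
        food = identify_food(region)
        servings = measure_portion(region, food)
        total += servings * CALORIES[food]
    return total

print(analyze("meal.jpg"))  # → 847.5
```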
The researchers did encounter some common-sense problems with sending photographs to strangers without any context. A latte made with whole milk looks no different from one made with skim milk, a fast-food burger might pack in more calories than one cooked at home, and a close-up photo of a bag of chips could indicate either a sample-sized snack or a late-night binge on a bag meant to serve 12.
Early tests also identified some cultural limitations. Overseas Turkers routinely identified a burger bun with ketchup as a muffin with jam.
Even after restricting the tests to American workers, Noronha and Hysen discovered that portions of chicken were repeatedly being labeled "chicken feet." The puzzling result drew their attention to another significant and common problem in crowdsourcing: worker laziness. "Chicken feet" was simply the first option in a list of chicken-related foods, so lazy Turkers were clicking it and moving on to the next task.
Noronha and Hysen solved these problems by designing simple, clearly defined tasks, along with algorithms that compare several answers and select the best one. They also provided warnings about common errors.
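The answer-comparison step can be illustrated with a simple plurality vote. This is only a sketch of the general technique; PlateMate's actual aggregation logic is not described here, and the example data is invented.

```python
from collections import Counter

def best_answer(answers):
    """Pick the most common response among several workers' answers.

    A plurality vote over redundant answers: a single lazy or mistaken
    worker is outvoted by the rest, which is the basic idea behind
    comparing several crowd answers and selecting the best one.
    """
    winner, _count = Counter(answers).most_common(1)[0]
    return winner

# Three workers identify the same food; the lazy "chicken feet"
# first-option click is outvoted.
print(best_answer(["chicken breast", "chicken feet", "chicken breast"]))
# → chicken breast
```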
Contact: Caroline Perry