It’s ALMOST like the problem isn’t just technical: Contractors target homeless people and BET awards in quest for more non-white faces

As anyone following the state of facial recognition and other automated identification systems knows, these systems suffer from bias problems: some have trouble recognizing facial features or detecting motion if a user has darker skin. The answer, we’re told, is obvious: make the databases bigger. Yet this amazing New York Daily News article by Ginger Adams Otis and Nancy Dillon describes a near-cartoonish level of deception that occurs when a company tries to deal with the problem.

Google wanted to improve the facial recognition system for its new Pixel 4, so it hired a contractor whose “teams were dispatched to target homeless people in Atlanta, unsuspecting students on college campuses around the U.S. and attendees of the BET Awards festivities in Los Angeles, among other places.”

One former temp worker described the tactics to the Daily News:

“It was a lot of basically sensory overloading the person into getting it done as quickly as possible and distracting them as much as possible so they didn’t even really have time to realize what was going on,” he said.

“Basically distract them and say anything. ‘Just go ahead and hit the next button. Don’t even worry about that.’ That kind of stuff. Really just move it along. ‘Let’s go. Hit all the next buttons,’” the former temp said, snapping his fingers for emphasis.

It’s almost… almost as if the problem isn’t just with the technology, but somehow runs deeper.
