Posted on January 22, 2010

Are Face-Detection Cameras Racist?

Adam Rose, Time, January 22, 2010

When Joz Wang and her brother bought their mom a Nikon Coolpix S630 digital camera for Mother’s Day last year, they discovered what seemed to be a malfunction. Every time they took a portrait of each other smiling, a message flashed across the screen asking, “Did someone blink?” No one had. “I thought the camera was broken!” Wang, 33, recalls. But when her brother posed with his eyes open so wide that he looked “bug-eyed,” the messages stopped.

Wang, a Taiwanese-American strategy consultant who goes by the Web handle “jozjozjoz,” thought it was funny that the camera had difficulties figuring out when her family had their eyes open. So she posted a photo of the blink warning on her blog under the title, “Racist Camera! No, I did not blink. . . . I’m just Asian!” {snip}

Nikon isn’t the only big brand whose consumer cameras have displayed an occasional–though clearly unintentional–bias toward Caucasian faces. Face detection, which is one of the latest “intelligent” technologies to trickle down to consumer cameras, is supposed to make photography more convenient. Some cameras with face detection are designed to warn you when someone blinks; others are programmed to automatically take a picture when somebody smiles–a feature that, theoretically, makes the whole problem of timing your shot to catch the brief glimpse of a grin obsolete. Face detection has also found its way into computer webcams, where it can track a person’s face during a video conference or enable face-recognition software to prevent unauthorized access.

{snip}

Indeed, just last month, a white employee at an RV dealership in Texas posted a YouTube video showing a black co-worker trying to get the built-in webcam on an HP Pavilion laptop to detect his face and track his movements. The camera zoomed in on the white employee and panned to follow her, but whenever the black employee came into the frame, the webcam stopped dead in its tracks. “I think my blackness is interfering with the computer’s ability to follow me,” the black employee jokingly concludes in the video. “Hewlett-Packard computers are racist.”

{snip}

{snip} TIME tested two of Sony’s latest Cyber-shot models with face detection (the DSC-TX1 and DSC-WX1) and found they, too, had a tendency to ignore camera subjects with dark complexions.

But why? It’s not necessarily the programmers’ fault. It comes down to the fact that the software is only as good as its algorithms, the mathematical rules used to determine what a face is. There are two ways to create them: by hard-coding a list of rules for the computer to follow when looking for a face, or by showing it a sample set of hundreds, if not thousands, of images and letting it figure out what the ones with faces have in common. In this way, the computer creates its own list of rules, which programmers then tweak. You might think that the more images–and the more diverse the images–a computer is fed, the better the system would get, but sometimes the opposite is true. New images can begin to generate rules that contradict the old ones. “If you have a set of 95 images and it recognizes 90 of those, and you feed it five more, you might gain five, but lose three,” says Vincent Hubert, a software engineer at Montreal-based Simbioz, a tech company that is developing futuristic hand-gesture technology like the kind seen in Minority Report. It’s the same kind of problem speech-recognition software faces in handling unusual accents.
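
One widely available example of the second, trained-from-examples approach is the Viola-Jones (Haar cascade) detector that ships with the open-source OpenCV library; its classifier file was itself built from thousands of labeled face images. The sketch below is illustrative only, not the code inside any of the cameras mentioned here, and the image file name is a placeholder.

```python
# A minimal sketch of the "learned rules" approach the article describes,
# using OpenCV's pre-trained Haar cascade face detector. Nothing here is
# Nikon's, Sony's, or HP's actual code; "group_photo.jpg" is a placeholder.
import cv2

# Load a classifier whose rules were learned from a labeled set of face images.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

image = cv2.imread("group_photo.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# The detector scans the image at several scales and reports regions that
# match the patterns it learned during training.
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
print(f"Found {len(faces)} face(s)")

# If the training set under-represented certain faces or lighting conditions,
# some faces in the frame may simply never show up in this list.
```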

And just as the software is only as good as its code and the hardware it lives in, it’s also only as good as the light it’s got to work with. {snip} That’s one reason why a person watching the YouTube video can easily make out the black employee’s face, while the computer can’t. “A racially inclusive training set won’t help if the larger platform is not capable of seeing those details,” says Steve Russell, founder and chairman of 3VR, which creates face recognition for security cameras.
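
As a rough illustration of Russell’s point (this is not code from 3VR or from the article), one way to see how little a detector has to work with in an underexposed part of a frame is to measure the pixel contrast there; the file name and region coordinates below are invented.

```python
# Crude proxy for "how much facial detail can the detector actually see":
# the standard deviation of brightness inside a region. A face lost in shadow
# typically scores far lower than a well-lit one.
import cv2
import numpy as np

frame = cv2.imread("webcam_frame.jpg", cv2.IMREAD_GRAYSCALE)  # placeholder file

def region_contrast(img, x, y, w, h):
    """Standard deviation of pixel brightness inside a rectangular region."""
    patch = img[y:y + h, x:x + w]
    return float(np.std(patch))

# Hypothetical regions: one well-lit face, one face in shadow.
print("well-lit face:", region_contrast(frame, 100, 80, 120, 120))
print("shadowed face:", region_contrast(frame, 320, 80, 120, 120))
```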

The blink problem Wang complained about has less to do with lighting than with the plain fact that her Nikon was incapable of distinguishing her naturally narrow eyes from half-closed ones. An eye might be only a few pixels wide, and a camera that downsamples the image can’t see the necessary level of detail. So a trade-off has to be made: either the blink warning tends to miss half blinks, or it tends to trigger on narrow eyes. {snip}
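
A toy sketch can make that trade-off concrete. Assume, purely for illustration, that the camera reduces each detected eye to a single measured height in pixels and flags a blink below a fixed cutoff; the numbers and the threshold are invented, not Nikon’s.

```python
# A toy model of the trade-off described above: reduce each detected eye to
# one "openness" number (apparent eye height in pixels, an invented measure)
# and flag a blink below a fixed cutoff. None of this is Nikon's algorithm.
BLINK_THRESHOLD_PX = 4  # hypothetical cutoff

def blink_warning(eye_height_px: int) -> bool:
    """Return True if the camera would flash 'Did someone blink?'."""
    return eye_height_px < BLINK_THRESHOLD_PX

half_blink = 3        # an eye caught half closed
narrow_open_eye = 3   # a naturally narrow eye, fully open

# Downsampled to a few pixels, the two cases measure the same, so they land
# on the same side of any single threshold.
print(blink_warning(half_blink))       # True: a correct warning
print(blink_warning(narrow_open_eye))  # True: a false warning

# Lowering BLINK_THRESHOLD_PX misses half blinks; raising it fires on narrow
# open eyes. That is the trade-off the camera maker has to choose.
```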

{snip}