Google's temps told to find homeless and people with darker skin for Face unlock data

Back in July, Google released a video confirming that the new Pixel 4 series will be equipped with a more secure Face unlock feature. About a week before that video was disseminated, word got out about a project the company had embarked on relating to a facial recognition system. To test and improve the more advanced Face unlock, Google sent contractors to major cities with heavily disguised Pixel 4 handsets; in exchange for a $5 Amazon or Starbucks gift card, the contractors collected selfies and videos of random people. But according to the New York Daily News, this seemingly innocuous program had a racial undertone.

Apparently, current facial recognition technology has had problems identifying users with darker skin. To make sure that this does not become an issue with the Face unlock system that will debut on the Pixel 4 line, and with future iterations of the feature, the database that Google has been creating targeted homeless people in Atlanta, attendees of the BET Awards show in L.A., and students on college campuses around the U.S. This information comes from Daily News sources who worked on the project.
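The accuracy gap the report alludes to can be made concrete with a toy calculation: when a test set is dominated by one group, a single headline accuracy number can hide much worse performance on an under-represented group, which is why teams measure accuracy per group. This is an illustrative sketch only; the function, groups, and numbers below are invented for the example and are not Google's data or code.

```python
# Illustrative only: per-group accuracy vs. overall accuracy on a
# skewed test set. All names and figures here are made up.

def per_group_accuracy(results):
    """results: list of (group, correct) pairs -> {group: accuracy}."""
    totals, hits = {}, {}
    for group, correct in results:
        totals[group] = totals.get(group, 0) + 1
        hits[group] = hits.get(group, 0) + int(correct)
    return {g: hits[g] / totals[g] for g in totals}

# 100 samples from one group, only 10 from the other:
results = (
    [("lighter", True)] * 95 + [("lighter", False)] * 5 +  # 95% accurate
    [("darker", True)] * 6 + [("darker", False)] * 4       # 60% accurate
)

overall = sum(correct for _, correct in results) / len(results)
by_group = per_group_accuracy(results)
# overall is about 0.92, yet accuracy on the "darker" group is 0.60 --
# the skewed mix hides the failure mode a balanced dataset would expose.
```

The point of the sketch is simply that a representative dataset is a prerequisite for even detecting this kind of disparity, which is the stated rationale behind the collection drive.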

The temps collecting the data were told to focus on the homeless, college students and people with darker skin

The report noted that the temps who collected the database for Google were known as TVCs, an acronym indicating that these people were temps, vendors or contractors. The temps were paid through a third-party staffing company called Randstad, and they indicated that Google used misleading methods to obtain the images. One of the TVCs told the newspaper: "We were told not to tell (people) that it was video, even though it would say on the screen that a video was taken. If the person were to look at that screen after the task had been completed, and say, 'Oh, was it taking a video?'... we were instructed to say, 'Oh it's not really.'"

The temps were taught to rush their subjects through the survey questions and to walk away if anyone got too suspicious. They were also told to target the homeless, who were considered less likely to approach the media about the program, and college students, who live on tight budgets and would be interested in the $5 Starbucks card. As one of the TVCs said, "They (Google) were very aware of all the ways you could incentivize a person and really hone in on the context of the person to make it almost irresistible." Another temp said, "I feel like they wanted us to prey on the weak."

And here is what the spiel sounded like to those lending their faces and personal information to the project. Kelly Yam, 18, recalled the pitch used on her: "They just said, 'Do you want to enter a survey? We'll give you a Starbucks gift card.' We had to follow a dot with our nose. I did the action of holding the phone up to my face. Maybe next time I'll be more aware. I was mostly tempted by the gift card. They said it was a survey and we thought they were students. I don't think I even realized there was a consent form."

The agreement form that the subjects were supposed to sign noted that Google could keep the image of their face and any other information "as long as needed to fulfill the purposes which is expected to be about five years." It also noted that Google could aggregate the research data, which would make the participants anonymous. However, under the agreement, "there is no limit to how long or in what manner Google may retain, use or share the aggregate data." The agreement adds that Google can "retain, use or share non-personally identifying or aggregate data without limitation for any purpose."



1. Dr.Phil

Posts: 2435; Member since: Feb 14, 2011

1. While I acknowledge it looks bad that Google targeted darker-skinned people, I understand the reasoning behind it. They wanted to ensure the technology works for everybody. If they hadn't done this kind of testing and the Pixel 4 was released with reviewers finding out that it doesn't work as well for darker-skinned individuals, I think there would have been even MORE of a backlash. 2. I don't agree with the tactics they used. When I read that they rushed people through the consent process, that's never OK. Give people a chance to understand what they are consenting to, and even tell them it's for the new Pixel product. Otherwise you're being shady. 3. College kids are so dumb it hurts. One of the students said: "I know he said Android. I didn't hear anything about Google."

8. TBomb

Posts: 1563; Member since: Dec 28, 2012

Gotta agree with all 3 points. 1. The homeless "target" was possibly for faces with lots of bushy facial hair and people who don't have great skin. But I agree that it needed to be done, and hopefully it doesn't get blown out of proportion too much. 2. Rushing people is bad... it could bite them some day. But the temps probably didn't care/realize what they were doing, and it probably wasn't a "Google decision". 3. College kids are some of the dumbest out there. They definitely don't think at all nowadays.

2. JustJosh00

Posts: 1; Member since: Sep 24, 2019

Which would have caused more backlash: getting the data needed to improve their phone simply by taking pictures of black people, or face unlock working better for white people than for black people? People should stop getting so uptight when they see the word "racial".

5. JamesW

Posts: 24; Member since: Jun 13, 2013

Or maybe they should stop being closed-minded and remember that darker skin doesn't equate to black. There are so many races in the world besides black and white. A darker hue does not determine race.

4. KingSam

Posts: 1466; Member since: Mar 13, 2016

The worst part is them intentionally exploiting the vulnerable. How hard is it to disclose what you are doing and state the intent behind it? I'm sure many people would be fine with it, because the reason itself is good. But Google is always shady with data collection.

7. Cyberchum

Posts: 1093; Member since: Oct 24, 2012

Yeah, it's lacking an important element—informed consent. That's actually the bad, worse and worst part, though.

9. AlienKiss

Posts: 198; Member since: May 21, 2019

My phone's cameras are all covered with stickers. This face recognition system is nothing but a gimmick to make people feel safe, but primarily to build a huge database. What's scary is that I read just yesterday that China wants to implement a law that would make navigating the internet impossible without face recognition.

10. lyndon420

Posts: 6823; Member since: Jul 11, 2012

Is it wrong to use the terms 'liberal' or 'leftist' in the PA comment sections?

11. jeffpom

Posts: 67; Member since: Dec 11, 2016

Dear Google: Honesty is the best policy. Sneaky tactics are just that - sneaky - and they will bite you in the end. Openness could have gone far: "We are aiming our technology at EVERYONE. We want it to work smoothly and be secure for ALL. While our current technology works great for those who have lighter skin, we are finding it does not work as well for those who have darker skin. We would like you to help us make our phones better. In return, we will provide you with a $5 gift card." Shame on you, Google.
