Facebook reportedly tested a facial recognition app
It would be interesting to see how many people flat-out don't trust Facebook. This is the company that was fined $5 billion by the Federal Trade Commission (FTC) for failing to adhere to a consent decree it signed back in 2011. Under the terms of that decree, Facebook was barred from using member profiles without subscribers' express consent. In 2015-2016, Aleksandr Kogan, a Russian-American professor at Cambridge University, collected profiles through an app he developed, ostensibly for research purposes. But Kogan sold as many as 87 million user profiles to a company called Cambridge Analytica, which was hired by the Trump campaign to turn the data into usable information.
As recently as the first day of this year, an organization called Privacy International issued a report claiming that certain Android apps sent users' personal information to Facebook. The social media site allegedly received this personal data even if the user did not have a Facebook account.
Facebook has had an issue with biometric data before
What brings up Facebook's apparent inability to keep members' private data private is a report from Business Insider stating that between 2015 and 2016 the company developed a facial recognition app for its own employees. The app was never released to consumers and has since been discontinued. The frightening thing about the system is that, according to one source, it could identify any Facebook member as long as enough data about that member was available. The app was in the early stages of development, according to the report. Facebook employees with the app installed on their phones could point the camera at a person, and seconds later the display would show that person's name and Facebook profile photo.
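Facebook has never published how the internal app worked, but tools like this typically reduce a face to a numeric feature vector (an "embedding") and compare it against stored templates. The sketch below is purely illustrative, assuming a hypothetical `identify` lookup and toy three-dimensional vectors standing in for real facial embeddings:

```python
import math

def cosine_similarity(a, b):
    # Angle-based similarity between two feature vectors: 1.0 = identical direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def identify(query_embedding, templates, threshold=0.8):
    """Return the name of the best-matching enrolled user, or None if no
    stored template is similar enough to the query embedding."""
    best_name, best_score = None, threshold
    for name, template in templates.items():
        score = cosine_similarity(query_embedding, template)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy "templates" standing in for facial feature vectors on file.
templates = {
    "alice": [0.9, 0.1, 0.2],
    "bob":   [0.1, 0.9, 0.3],
}
query = [0.88, 0.12, 0.21]  # embedding computed from a camera frame
print(identify(query, templates))  # best match: "alice"
```

A real system would compute embeddings with a trained neural network and search millions of templates with an approximate nearest-neighbor index, but the matching step reduces to the same idea: find the stored vector closest to the one captured by the camera.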
Last year, a lawsuit against Facebook was certified as a class action, meaning the named plaintiffs could sue on behalf of a larger group of similarly affected users. The plaintiffs claimed that Facebook was using facial recognition on their photos without permission. Since 2010, the company had been collecting facial templates based on users' physical characteristics in order to show members' names in photographs. But the plaintiffs say this violates the 2008 Illinois Biometric Information Privacy Act, which prohibits companies from collecting and storing biometric data without permission. Facebook's defense is that facial templates do not count as biometric data. The feature remains in the app, and when someone "tags" a Facebook subscriber in a photo, it links back to that subscriber's Facebook profile. Tagging suggestions used to be enabled by default, but users must now opt in.
Facebook is also reportedly developing its own AI assistant similar to Google Assistant, Siri and Alexa. The company would use it for its Portal line of smart displays; currently, Portal devices rely on Amazon's Alexa digital helper. Back in 2015, Facebook added such a feature to its Messenger app, which it called "M." While "M" used AI to answer certain questions, those it couldn't handle were routed to a call center staffed by humans. In January 2018, Facebook shut the feature down.
Facebook is allegedly working on an AI assistant for its Portal smart display
Meanwhile, Facebook is one of four tech firms (along with Apple, Google, and Amazon) being investigated by the House of Representatives' Judiciary Committee for possible antitrust violations. Just this past week, the committee released the four firms' written responses to questions it had put to each of them. Facebook admitted in its reply that it dropped certain apps from its developer platform if they competed with Facebook's own features. As an example, the company admitted that it dropped Vine, Twitter's now-defunct app for creating six-second video loops, on the grounds that Vine copied its News Feed. Committee members also wanted to know the "exact circumstances" behind Facebook's decision to drop apps like Phhhoto, MessageMe, Voxer, and Stackla. The company said only that it "will restrict apps that violate its policies."
If Facebook cannot be trusted with personal and biometric data, we should be breathing a sigh of relief that it stopped developing the aforementioned facial recognition app. Or did it? Can we believe Facebook when it says it is no longer developing such a tool?