GCHQ. Behind this abbreviation for Government Communications Headquarters is the United Kingdom's signals intelligence agency, much like the NSA in the United States, one of the three-letter organizations tasked with keeping their country safe from adversaries foreign and domestic.
Those adversaries, apparently, now include chat apps like Apple's iMessage or WhatsApp, which are encrypted so that your communications stay between you and the people you are talking to. According to a recent profile of the GCHQ in the Financial Times, the agency is working on an AI project (what else) to decide on the best access points for gathering data.
Taking a lead from the way DeepMind taught its algorithms what winning looked like, thereby making the computer’s chess tactics less predictable and more human-like, GCHQ hopes that eventually the system will learn the most productive places to harvest communications metadata (the location, time and type of message rather than the actual content, which is commonly protected by end-to-end encryption).
That's all fine and dandy, and it sounds like the brainiacs at GCHQ are trying to balance individual freedoms with national security by employing machine learning and other modern methods that don't infringe on a person's right to privacy.
It's a glowing report on the GCHQ's methods for the digital age, though, by a reporter whose access depends on the goodwill of that same agency. In reality, Apple, Google, Facebook with WhatsApp, and a number of privacy organizations have just penned an open letter against the implementation of GCHQ's so-called "ghost protocol," proposed in November by its codebreaking chief. In a nutshell, the "ghosting" tack would let government security officials join chat sessions as silent observers, with no notification to the other parties involved.
While the GCHQ argues that this is no different from the digital "crocodile clips" it uses to eavesdrop on regular communication services, it would deal a devastating blow to the public's confidence in privacy-oriented ones like iMessage or WhatsApp if the government were cc-ed on a supposedly encrypted conversation.
First, it would require service providers to surreptitiously inject a new public key into a conversation in response to a government demand. This would turn a two-way conversation into a group chat where the government is the additional participant, or add a secret government participant to an existing group chat.
Second, in order to ensure the government is added to the conversation in secret, GCHQ’s proposal would require messaging apps, service providers, and operating systems to change their software so that it would 1) change the encryption schemes used, and/or 2) mislead users by suppressing the notifications that routinely appear when a new communicant joins a chat.
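The two objections above can be illustrated with a toy sketch. This is not real cryptography and not any actual app's code; the `Party` and `Chat` classes and all names are hypothetical stand-ins. The point is structural: in end-to-end encrypted chat, the client encrypts one copy of each message per member key, so a server that silently injects one extra public key and suppresses the "joined" notification gives an unseen observer full read access.

```python
# Toy illustration (NOT real cryptography) of the "ghost" proposal:
# inject an extra public key into the member list and suppress the
# notification that would normally announce a new participant.
import secrets

class Party:
    """A chat participant; a random token stands in for a real public key."""
    def __init__(self, name):
        self.name = name
        self.public_key = secrets.token_hex(8)  # the part the server sees

class Chat:
    def __init__(self):
        self.members = []        # keys every message gets encrypted to
        self.notifications = []  # what the users actually see

    def add_member(self, party, notify=True):
        self.members.append(party)
        if notify:               # the ghost proposal flips this to False
            self.notifications.append(f"{party.name} joined the chat")

    def send(self, plaintext):
        # In a real E2E app the client encrypts one copy per member key,
        # so every key in self.members can read the message.
        return {m.public_key: f"enc[{plaintext}]" for m in self.members}

alice, bob, ghost = Party("Alice"), Party("Bob"), Party("GCHQ")

chat = Chat()
chat.add_member(alice)
chat.add_member(bob)
chat.add_member(ghost, notify=False)  # silently injected, no alert shown

copies = chat.send("hello")
print(ghost.public_key in copies)  # the ghost receives a readable copy
print(chat.notifications)          # only Alice's and Bob's joins appear
```

The sketch shows why the letter's signatories call this a break in end-to-end encryption rather than a neutral wiretap: nothing in the message content changes, but the trust assumption that the visible member list equals the set of readers is gone.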
Apple didn't budge when the FBI wanted a backdoor to its messaging encryption a few years back, and being a signatory to the "ghost protocol" letter means that it is not planning to budge now, either. Granted, Apple is not immune to privacy mishaps itself, like the recent FaceTime group video chat conundrum, but at least those weren't purposeful. The UK's GCHQ chat ghosting proposal, however, would change that dynamic, and it drew a parallel rebuttal from the American Civil Liberties Union (ACLU) as well:
The Ghost proposal institutionalizes a significantly worse user interface failure than Monday’s FaceTime flaw. With the FaceTime bug, the vulnerable user at least gets an alert about an incoming call to know that something is happening, even if the user interface is misrepresenting the situation and violating the user’s expectations. With the Ghost proposal, the user has no way of even knowing that something is happening that violates their expectations.