Who, what, why? A guide to understanding the Facebook encryption debate if you have lost the plot
Everyone is talking about Facebook's end-to-end encryption plans and the response from the US, UK and Australian governments. Feeling lost? Here is what you need to know.
What's Facebook trying to do?
First, let's be clear: Facebook has many faults when it comes to privacy. It has also suffered a number of security failures recently. See here for one example.
In response to these successive failures to protect your privacy, Facebook announced in its 'pivot to privacy' that it would follow the example set by other messaging apps and introduce 'end-to-end encryption'. This is already built into WhatsApp (and Signal, and Wire, and soon Skype), and Facebook announced it would extend it to Messenger.
We like this for a variety of reasons, not least because it means that some of what you do on Facebook, namely the content of your communications, will finally not be accessible to Facebook to exploit for advertising and other purposes. In fact, it is long overdue.
It's also quite hard to do well. Have no doubt: security is hard, and delivering meaningful protection here will require some of the greatest minds at Facebook.
At the moment who has access to my Messenger communications?
Facebook, and the users who are the recipients. Facebook also generates metadata about these communications, though when we have asked them to detail this, they have refused to say what data is generated and who may have access to it.
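To make the distinction concrete: with end-to-end encryption, the keys needed to read a message live only on the users' devices, so the service in the middle relays ciphertext it cannot decrypt. Below is a minimal sketch of that idea in Python using the PyNaCl library. It illustrates the concept only; it is not how Messenger or WhatsApp are actually built (they use the Signal protocol, which adds key exchange, ratcheting and much more).

```python
# Minimal end-to-end encryption sketch using PyNaCl (https://pynacl.readthedocs.io).
# Illustrative only: real messengers use the Signal protocol, not raw NaCl boxes.
from nacl.public import PrivateKey, Box

# Each user generates a key pair on their own device; the private key never
# leaves that device, so the platform relaying the message cannot decrypt it.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
alice_box = Box(alice_key, bob_key.public_key)
ciphertext = alice_box.encrypt(b"Meet at noon")

# The server in the middle only ever sees (and can only store or forward) this.
print(ciphertext.hex())

# Bob decrypts on his own device with his private key and Alice's public key.
bob_box = Box(bob_key, alice_key.public_key)
print(bob_box.decrypt(ciphertext))  # b'Meet at noon'
```

Note that even in this toy example, the relay still learns metadata: who is talking to whom, when, and how much. That is the kind of data Facebook would continue to generate under its plans.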
Why is Facebook doing this?
A noble way of putting it is this: Facebook is trying to rebuild your confidence in their products. A less noble way of putting it: Facebook is trying to catch up with others in industry.
As we've seen repeatedly, foreign governments and nefarious actors want access to your data, your accounts and your communications. This helps them exploit you, perpetrate fraud and even undermine democratic processes. Facebook has been a key target.
By using encryption to protect at least some of your data from abuse, Facebook thinks that people will increasingly trust it with their time, attention and data.
What on earth does the UK-US-Australia agreement have to do with this?
Very little. The agreement is the only real news being announced today -- the commentary about the letter to Facebook is pure political positioning.
Previously, when conducting an investigation, UK police agencies would have to ask the U.S. Government to approach Facebook and other US companies. This is because US law on police surveillance is remarkably protective of the rights of Americans and those who reside in the US, a protection which (until now) has been extended to anyone whose data was stored in the US. Governments across the world, despite their often much more permissive legal regimes, complained that it was unfair that it took so long to get access to data.
One solution would have been for governments across the world, including democracies like the UK, to raise their own protections to the standard set by the US. Governments didn't like the sound of that. Even if they had done so, the reality was that the U.S. Government was also very slow at providing this cooperation, mostly because it didn't want to fund it adequately. Rather than make it work well, the U.S. Government walked away from the responsibility.
So instead the U.S. Government passed the CLOUD Act, which allows these other governments to go directly to US-based companies and ask for data under their weaker legal protections, so long as the US government decides those protections meet a certain minimum standard.
But don't the police need to be able to find child abusers and terrorists?
Absolutely they do and they should have all the lawful powers to do so.
First, Governments should ensure that these powers are granted to law enforcement agencies while upholding human rights and constitutional protections. But Governments often fail to do even these basic things.
Second, there are many other ways to get access to data about people. In fact, Facebook produces loads of data about billions of people, including non-Facebook users, and this data can be shared for ongoing investigations, under lawful and rights-respecting procedures.
Third, increasing the security of our data, and of the services that billions of people use, is itself a key protection of human rights. If the last few years have taught us anything, it's that there are powerful institutions that want to harm us and our democracies. They want access to our data and communications in order to do that harm, so protecting us from them requires protecting our data.
Facebook has only just started down this path and so much more is required.