… or why it’s useless to have the most secure crypto system in the world, when using non-free and untrustworthy tools and libraries to implement it.
tl;dr: There is a “backdoor” in Signal nobody cares about, only Google can use it.
This blog post was updated several times to add additional explanations and make it easier to understand for people who are not deeply familiar with the Android platform. The changes are highlighted accordingly.
Recently, there was a huge discussion about a possible backdoor in the crypto code of WhatsApp, which finally led Moxie Marlinspike, the developer of the Signal crypto messenger, to an interesting statement.
We believe that WhatsApp remains a great choice for users concerned with the privacy of their message content.
So, what you are telling me is that Signal is essentially useless, because WhatsApp is just as good? Then why not discontinue Signal and join forces with WhatsApp?
If you use Signal, you probably would not want that: Signal is an open-source messenger, well audited and recommended by major security and privacy experts. WhatsApp, however, is closed-source and owned by Facebook.
If you think the same about Signal, I have to tell you that you're wrong. Well, apart from the point about Signal being recommended by several widely known people: while Signal is indeed open-source, it is still not free software, because it uses an external library that is not open. So every audit that works on the source code will fail to find security issues and backdoors in that externally included code – and this is where I want to start looking into Signal.
Google Play Services
Google Play Services consists of numerous features, and thus the library that apps may include is split into multiple parts. To stay with the Signal example, it includes the parts for using Google Cloud Messaging, showing a Google Map, and asking the user for their current location (ref). All three work differently, and obviously all of them leak certain data to Google, usually via a TLS-encrypted connection that does not use pinned certificates. For the further analysis, we consider Google to be the evil party (which they might become due to US law enforcement) and assume that every Signal user runs Signal on a default Android system (as shipped with a device).
EDIT: In a previous version of this blog post, the following two sections were in reversed order. People seem to stop reading when they hear about something they already know, even if new points might come later, so I decided to reorder…
Maps
After selecting a location to share, Signal shows a small map with a pin on the selected location. Once the map has loaded, a screenshot of it is sent as an image to the other side, together with a string describing the location.
The relevant thing here is the displaying of the map. This is done by embedding a MapView in the conversation screen and showing it whenever there is something to show, which means the MapView is already initialized when you open a conversation in Signal. The critical point is that the MapView as used by Signal is just a wrapper that loads the actual MapView by including code from the Google Play Services binary (that is, code outside the apk file you meant to use). This code is included by calling the createPackageContext method with the flags CONTEXT_INCLUDE_CODE and CONTEXT_IGNORE_SECURITY. The latter is required because the Android system would otherwise refuse to load code from untrusted sources (for a good reason). The loaded code is then executed in the Signal process, which has access to the Signal history database and the crypto keys.
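The Android APIs involved can't be run outside the platform, but the underlying risk is easy to demonstrate by analogy. The Python sketch below (all names made up for illustration) shows a host process loading a "plugin" from an external file and executing it in-process, the moral equivalent of Signal pulling MapView code out of the Play Services binary via createPackageContext: once loaded, nothing separates the plugin from the host's private state.

```python
import importlib.util
import os
import tempfile

# State belonging to the "host app" (a stand-in for Signal's chat database).
SECRET_HISTORY = ["hello", "world"]

# Write an externally supplied "plugin" to disk (a stand-in for the Play
# Services binary, which only Google controls and can update silently).
plugin_src = """
def run(host_state):
    # The plugin runs inside the host process: nothing stops it from
    # reading (or exfiltrating) the host's private data.
    return list(host_state)
"""
path = os.path.join(tempfile.mkdtemp(), "plugin.py")
with open(path, "w") as f:
    f.write(plugin_src)

# Load and execute the external code in-process (conceptually what
# createPackageContext(..., CONTEXT_INCLUDE_CODE | CONTEXT_IGNORE_SECURITY)
# plus a class load does on Android).
spec = importlib.util.spec_from_file_location("plugin", path)
plugin = importlib.util.module_from_spec(spec)
spec.loader.exec_module(plugin)

print(plugin.run(SECRET_HISTORY))  # prints ['hello', 'world']
```

Whoever controls the external file controls what `run` does on the next load, and the host cannot audit that ahead of time.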
The Google Play Services binary can easily be updated in the background through the Google Play Store, even targeted at single users, and the updated code becomes active inside Signal the next time you use it. Can it get worse? Yes. An apk update would be detectable by the user, but Google Play Services uses a dynamic module loading system (called Chimera and/or Dynamite) that seems to be capable of replacing the Maps implementation from a file not installed to the system, as long as it's signed by Google. If it is possible for Google to push an update only for this module and remove it later, it might be possible for them to inject code into the Signal client that uploads your complete local chat history unencrypted and afterwards removes all signs of its existence.
What does “seems to be capable” mean? Well, it's hard to determine exactly. The relevant binary is highly obfuscated and thus hard to understand. Maybe someone wants to spend their time on this, but remember that it can be changed again in the next release…
Google Cloud Messaging and other METADATA
Signal uses Google Cloud Messaging (GCM) to notify the client about incoming messages. The packets sent from Signal's servers to devices via GCM contain no valuable data (no text, not even encrypted strings, only a friendly ping). However, they are metadata. What kind of metadata is in there? It is just the information that there is a new incoming message, so what's so wrong with it?
Well, you need to consider that Google knows a lot about everyone. The usual Android user grants Google access to their contact database, their list of installed apps, when which app is started, their current location, the text they write using the device's virtual keyboard (when using Gboard), and even more. For most of this data, current implementations share only a certain amount with Google's servers and keep some of the analysis on the device. However, if Google is the attacker, all of this belongs to them, and depending on your privacy settings, quite a lot of it is already available to them.
So, when using Gboard, Google already knows the content of the messages shared, but not the recipient. And this is where the GCM data can be used. It shouldn't be too hard to correlate the fact that one user is sending a message via Signal with another user receiving it. Sending is easily detected from the app being active on one device (information Google collects for statistical reasons). With Gboard active, Google even knows the exact moment the message is sent (the enter key is pressed). Of course there are millions of Signal users, with different users likely sending messages at approximately the same time, but that's no problem if you use multiple messages to be sure, and there is another bunch of information further restricting the search space: both users likely have the phone number of the other one in their phone's contact list, and Google is usually well aware of the contents of your contact list.
EDIT: Even when not using Gboard, the issue of GCM metadata + contact list + app in foreground is still present and thus makes it possible to gather who is chatting with whom at what time. Without GCM, this data would be available to the Signal server operator only; by using GCM, Google gets it as well.
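The correlation attack described above can be sketched with toy data. In the attack scenario, the send-side timestamps would come from Google's own telemetry (app in foreground, Gboard enter key) and the receive-side timestamps from GCM delivery logs; all names and numbers below are made up for illustration. Repeated co-occurrence within a small time window quickly singles out a sender/receiver pair:

```python
from itertools import product

# Toy event logs (unix timestamps in seconds).
send_events = {
    "alice": [100, 250, 400],
    "carol": [130, 900],
}
gcm_wakeups = {
    "bob":  [101, 251, 401],   # woken right after each of alice's sends
    "dave": [131],
}

WINDOW = 3  # seconds: a wake-up this close after a send counts as a match

def correlation_score(sends, wakeups, window=WINDOW):
    """Fraction of send events followed by a wake-up within `window`."""
    hits = sum(any(0 <= w - s <= window for w in wakeups) for s in sends)
    return hits / len(sends)

# Score every (sender, receiver) pair and pick the strongest link.
scores = {
    (s, r): correlation_score(send_events[s], gcm_wakeups[r])
    for s, r in product(send_events, gcm_wakeups)
}
best = max(scores, key=scores.get)
print(best, scores[best])  # ('alice', 'bob') 1.0
```

With millions of users a single message is ambiguous, but the score converges after a handful of messages, and the contact-list overlap mentioned above shrinks the candidate set before any timing analysis even starts.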
You can hardly call this thing a backdoor? Maybe not, but metadata can still kill you.
Backdoors everywhere (?)
The same details also apply to every app that uses the parts of Google Play Services mentioned here. I don't know all the crypto messengers, but WhatsApp seems to be affected by exactly the same issues, and probably some other messengers are too.
But is this really a backdoor? The feature that allows Google to update libraries without the user's consent is not really a backdoor, but more of a security feature. Of course, the fact that it's not possible to disable it is bad, and maybe it should not execute its code in the sandbox of the app, but rather in a separate sandbox. But I bet that when Google set up this system, they had no intention of adding a backdoor, and neither did Signal or WhatsApp. They just wanted to make it user-friendly – but unfortunately, user-friendly and secure still don't come in combination.
But Signal is forward secret, you can't read old messages
The forward secrecy feature only applies at the transport level. At both ends (in group chats, all ends), the messages are available in plain text in the form of a history. If you or the other(s) don't wipe your chat history regularly, your history is still attackable. But this is how users want it, because it's more user-friendly.
EDIT: This is not a huge problem if the operating system ensures that the data of other apps is not easily reachable, which Android does. And this is the reason why the aforementioned backdoor is in fact important: without it, Google would not be able to read your messages; with it, they can.
Did you know that you can have forward secrecy in TLS as well? This means that if you have properly TLS-encrypted links in a highly federated network (using a protocol like XMPP), you can have forward secrecy as well.
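As a small illustration (a generic sketch, not any particular server's configuration): a TLS endpoint gets forward secrecy by restricting itself to ephemeral key exchanges, so session keys are thrown away after use and recorded traffic cannot be decrypted later even if the server's long-term key leaks. In Python's ssl module that looks roughly like:

```python
import ssl

# Client context restricted to forward-secret key exchanges: ECDHE/DHE
# negotiate a fresh ephemeral key per session. (TLS 1.3 cipher suites,
# which OpenSSL keeps enabled separately, are forward secret by design.)
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.set_ciphers("ECDHE+AESGCM:DHE+AESGCM")

for cipher in ctx.get_ciphers():
    name = cipher["name"]
    # Every remaining suite is either TLS 1.3 ("TLS_*") or uses (EC)DHE.
    assert name.startswith("TLS_") or "DHE" in name
```

An XMPP server whose TLS is configured along these lines already gives you forward secrecy on every federated link, independent of any end-to-end layer.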
This is not primarily about a specific problem. The problem is that you just can’t trust a messenger that is – even partially – non-free software. Signal is, WhatsApp is, Facebook Messenger is, Google Allo is. Deal with it!
You can of course also fix the Signal app. LibreSignal was a fork of Signal that did not use any Google services and thus did not suffer from the problems mentioned, but releases were halted following WhisperSystems' request to do so.
Or you just switch to a better system. As mentioned, XMPP can be used with forward secrecy. XMPP is a federated system where users can set up their own servers and write their own clients. It's about moving trust away from large US companies and towards people next to you that you actually can trust. If you trust your and the other side's XMPP server providers as much as you trust Google, and the servers are properly set up, you can just use XMPP over TLS even without OTR/PGP/OMEMO (which are all crypto systems on top of XMPP) and reach the same level of secrecy as with Signal. Set up your very own XMPP server and use a Google-free Android device, and you can communicate even more secretly than with Signal. Fantastic, isn't it?
No, I’m not going to make this an ad about microG.