What are the reasons to use Signal over Telegram?
-
It's impossible to verify what code their server is running.
Signal has posted multiple times about their use of SGX Secure Enclaves and how you can use Remote Attestation techniques to verify a subset of the code that’s running on their server, which directly contradicts your claim. (It doesn’t contradict the claim that you cannot verify all the code their server is running, though.) Have you looked into that? What issues did you find with it?
I posted a comment here going into more detail about it, but I haven’t personally confirmed that it’s feasible.
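For anyone unfamiliar with how that verification is supposed to work, here’s the gist of it as I understand it, heavily simplified into Java (real SGX attestation involves quotes signed through Intel’s attestation infrastructure, and all the names here are mine, not Signal’s):

```java
// Highly simplified sketch of remote attestation: before trusting the
// server's enclave, the client checks that the enclave's reported code
// measurement matches the hash of the published, reproducible build.
// The measurement value and signature check are placeholders.
import java.util.Arrays;

public class AttestationGist {
    // Measurement (e.g., MRENCLAVE) of the open-source enclave build
    // that auditors compiled and hashed; a placeholder value here.
    private static final byte[] EXPECTED_MEASUREMENT = new byte[32];

    static boolean trustEnclave(byte[] reportedMeasurement, boolean quoteSignatureValid) {
        // quoteSignatureValid stands in for verifying the hardware-rooted
        // signature over the report; without it, the measurement could be forged.
        return quoteSignatureValid
                && Arrays.equals(reportedMeasurement, EXPECTED_MEASUREMENT);
    }
}
```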
-
Nice try, FBI.
Wouldn’t “NSA” or “CIA” be more appropriate here?
Well, if my PIN is four numbers, that'll make it so hard to crack. /s
If you’re using a 4-digit PIN, then that’s on you. The blog post I shared covers that explicitly: “However, there’s a limit to how slow things can get without affecting legitimate client performance, and some user-chosen passwords may be so weak that no feasible amount of “key-stretching” will prevent brute force attacks” and later, “However, it would allow an attacker with access to the service to run an “offline” brute force attack. Users with a BIP39 passphrase (as above) would be safe against such a brute force, but even with an expensive KDF like Argon2, users who prefer a more memorable passphrase might not be, depending on the amount of money the attacker wants to spend on the attack.”
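To make that concrete, here’s a toy version of the offline brute force those passages describe, using Bouncy Castle’s Argon2 implementation (this is just an illustration of the math, not Signal’s actual Secure Value Recovery code; the parameters are arbitrary examples). No matter how expensive the KDF is, a 4-digit PIN has only 10,000 candidates:

```java
// Toy offline brute force of a 4-digit PIN stretched with Argon2id.
// Assumes the Bouncy Castle library is on the classpath.
import java.util.Arrays;
import org.bouncycastle.crypto.generators.Argon2BytesGenerator;
import org.bouncycastle.crypto.params.Argon2Parameters;

public class PinBruteForce {
    static byte[] stretch(String pin, byte[] salt) {
        Argon2Parameters params = new Argon2Parameters.Builder(Argon2Parameters.ARGON2_id)
                .withSalt(salt)
                .withMemoryAsKB(64 * 1024) // expensive settings slow each guess, but...
                .withIterations(3)
                .withParallelism(1)
                .build();
        Argon2BytesGenerator gen = new Argon2BytesGenerator();
        gen.init(params);
        byte[] out = new byte[32];
        gen.generateBytes(pin.toCharArray(), out);
        return out;
    }

    public static void main(String[] args) {
        byte[] salt = new byte[16];            // assume the attacker has the salt
        byte[] target = stretch("4217", salt); // ...and the stolen derived key

        // ...a 4-digit PIN still has only 10^4 candidates. Even at one
        // Argon2 evaluation per second, that's under three hours.
        for (int i = 0; i < 10_000; i++) {
            String guess = String.format("%04d", i);
            if (Arrays.equals(stretch(guess, salt), target)) {
                System.out.println("PIN recovered: " + guess);
                break;
            }
        }
    }
}
```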
If you can't show hard evidence that everything is offline locally, no keys stored in the cloud, then it's just not secure.
If you can’t share a reputable source backing up that claim, along with a definition of what “secure” means, then your claim that “it’s just not secure” isn’t worth the bits taken to store the text in your comment.
You haven’t even specified your threat model.
BTW, "keys" when talking about encryption is the keys used to encrypt and decrypt,
Are you being earnest here? First, even if we were just talking about encryption, the question of what’s being encrypted is relevant. Second, we weren’t just talking about encryption. Here’s your complete comment, for reference:
I have read that it is self hostable (but I haven’t digged into it) but as it’s not a federating service so not better than other alternative out there.
Also read that the keys are stored locally but also somehow stored in the cloud (??), which makes it all completely worthless if it is true.
That said, the three letter agencies can probably get in any android/apple phones if they want to, like I’m not forgetting the oh so convenient “bug” heartbleed…
Just so you know, “keys” are used for a number of purposes in Signal (and for software applications in general) and not all of those purposes involve encryption. Many keys are used for verification/authentication.
Assuming you were being earnest: I recommend that you take some courses on encryption and cybersecurity, because you have some clear misconceptions. Specifically, I recommend that you start with Cryptography I (by Stanford, hosted on Coursera; see also Stanford’s page for the course, which contains a link to the free textbook). Its follow-up, Crypto II, isn’t available on Coursera, but I believe that this 8-hour-long YouTube video contains several of the lectures from it. Alternatively, Berkeley’s Zero Knowledge Proofs course would be a good follow-up, and basically everything (except the quizzes) appears to be freely available online.
it wouldn't be very interesting to encrypt them, because now you have another set of keys you have to deal with.
The link I shared with you has 6 keys (stretched_key, auth_key, c1, c2, master_key, and application_key) in a single code block. By encrypting the master key (used to derive application keys, such as the one that encrypts social graph information) with a user-derived, stretched key, Signal can offer an optional feature: the ability to recover that encrypted information if a user’s device is lost, stolen, wiped, etc., though of course message history is out of scope.
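Roughly, that derivation looks like the following (a sketch in Java, using the JDK’s HMAC and Bouncy Castle’s Argon2; I’m paraphrasing the post’s code block from memory, so treat the labels and exact construction as approximate rather than as Signal’s source):

```java
// Sketch of a key hierarchy like the one in the linked post: everything
// below the stretched key is derived, so only the master key needs to be
// backed up, encrypted under a key the user can re-derive from their PIN.
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import org.bouncycastle.crypto.generators.Argon2BytesGenerator;
import org.bouncycastle.crypto.params.Argon2Parameters;

public class KeyHierarchy {
    static byte[] hmac(byte[] key, String label) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(key, "HmacSHA256"));
        return mac.doFinal(label.getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) throws Exception {
        SecureRandom rng = new SecureRandom();
        byte[] salt = new byte[16];
        rng.nextBytes(salt);

        // stretched_key = Argon2(user PIN or passphrase)
        Argon2BytesGenerator argon2 = new Argon2BytesGenerator();
        argon2.init(new Argon2Parameters.Builder(Argon2Parameters.ARGON2_id)
                .withSalt(salt).withMemoryAsKB(64 * 1024)
                .withIterations(3).withParallelism(1).build());
        byte[] stretchedKey = new byte[32];
        argon2.generateBytes("user passphrase".toCharArray(), stretchedKey);

        // Two independent keys come from the stretched key: one to
        // authenticate to the service, one to protect the master key,
        // so knowing one doesn't reveal the other.
        byte[] authKey = hmac(stretchedKey, "Auth Key");
        byte[] c1 = hmac(stretchedKey, "Master Key Encryption");

        // c2 is random, so the master key can't be derived from the PIN alone.
        byte[] c2 = new byte[32];
        rng.nextBytes(c2);
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(c1, "HmacSHA256"));
        byte[] masterKey = mac.doFinal(c2);

        // Application keys (e.g., for social graph data) hang off the master key.
        byte[] applicationKey = hmac(masterKey, "Social Graph Encryption");
    }
}
```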
Full disk encryption also uses multiple keys in a similar way. Take LUKS, for example. Your drive is encrypted with a master key. You recover the master key by decrypting one of the key slots with a key derived from its corresponding passphrase. (Source: section 4.3 in the LUKS1 On-Disk Format Specification; I don't believe this basic behavior was changed in LUKS2.)
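Here’s that pattern as a minimal, JDK-only sketch. It shows the shape of the design (random master key; passphrase-derived key-encryption key; the master key stored encrypted in a “slot”), not LUKS’s actual on-disk format, which per the spec also runs the slot material through an anti-forensic splitter:

```java
// Minimal keyslot pattern: data is encrypted under a random master key;
// the master key is stored encrypted under a passphrase-derived key.
// Changing the passphrase only re-encrypts the slot, not the data.
import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.GCMParameterSpec;
import javax.crypto.spec.PBEKeySpec;
import javax.crypto.spec.SecretKeySpec;

public class KeySlot {
    public static void main(String[] args) throws Exception {
        SecureRandom rng = new SecureRandom();

        byte[] masterKey = new byte[32]; // this is what encrypts the drive/data
        rng.nextBytes(masterKey);

        // Derive a key-encryption key (KEK) from the user's passphrase.
        byte[] salt = new byte[16];
        rng.nextBytes(salt);
        SecretKeyFactory kdf = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256");
        byte[] kek = kdf.generateSecret(new PBEKeySpec(
                "correct horse battery staple".toCharArray(),
                salt, 600_000, 256)).getEncoded();

        // The "keyslot": the master key, encrypted under the KEK.
        byte[] iv = new byte[12];
        rng.nextBytes(iv);
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(kek, "AES"),
                new GCMParameterSpec(128, iv));
        byte[] slot = cipher.doFinal(masterKey);

        // Unlocking: re-derive the KEK from the passphrase and decrypt the slot.
        cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(kek, "AES"),
                new GCMParameterSpec(128, iv));
        byte[] recovered = cipher.doFinal(slot); // == masterKey
    }
}
```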
-
They have to know who the message needs to go to, granted. But they don’t have to know who the message comes from, which is why the sealed sender technique works. The recipient verifies the message via previously exchanged keys if they’ve communicated with that correspondent before; otherwise it arrives as a new message request.
So I don’t see how they can build social graphs if they don’t know who the sender of each message is; they can only plot recipients, which is not enough.
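Conceptually, the envelope looks something like this (field names are mine for illustration, not Signal’s actual wire format):

```java
// The server-visible part of a sealed-sender message names only the
// recipient; the sender's identity travels inside the ciphertext.
record SealedEnvelope(
        String recipientId,    // the server must see this to route the message
        byte[] ciphertext) {}  // contains {sender identity, certificate, message}, encrypted

// On delivery, the recipient decrypts the ciphertext, reads the sender's
// identity from inside it, and checks it against previously exchanged
// keys; a sender it has never seen before shows up as a message request.
```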
-
You need to identify yourself to receive your messages, you send and receive messages from the same IP address, and there are typically not many, if any, other Signal users sharing that IP address. So the cryptography of "sealed sender" is just for show: the metadata privacy remains dependent on Signal keeping their promise not to correlate your receiving identity with the identities of the people you're sending to. If you assume that they'll keep that promise, then the sealed sender cryptography provides no benefit; if they don't keep the promise, sealed sender doesn't really help. They outsource the keeping of their promises to Amazon, btw (a major intelligence contractor).
-
Just in case sealed sender was actually making it inconvenient for the server to know who is talking to whom... Signal silently falls back to "unsealed sender" messages if the server returns a 401 when the client tries to send "sealed sender" messages, which the server actually does sometimes. As the current lead dev of Signal-for-Android explains: "Sealed sender is not a guarantee, but rather a best-effort sort of thing," so "I don't think notifying the user of a unsealed send fallback is necessary".
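In other words, the client-side logic amounts to something like this (the names here are hypothetical, not Signal-Android's actual classes; I'm just restating the behavior described above):

```java
// Sketch of the silent fallback: if the server rejects a sealed-sender
// send with a 401, retry as a normal, identified send without telling
// the user.
interface Transport {
    int sendSealed(byte[] envelope);   // unauthenticated; sender hidden. Returns HTTP status.
    int sendUnsealed(byte[] envelope); // authenticated; server learns the sender.
}

final class MessageSender {
    private final Transport transport;

    MessageSender(Transport transport) {
        this.transport = transport;
    }

    int send(byte[] envelope) {
        int status = transport.sendSealed(envelope);
        if (status == 401) {
            // Server refused the sealed send; silently fall back.
            // Per the quoted maintainer comment, no notification is shown.
            status = transport.sendUnsealed(envelope);
        }
        return status;
    }
}
```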
Given the above, don't you think the fact that they've gone to the trouble of building sealed sender at all, which causes many people to espouse the belief you just did (that their cryptographic design renders them incapable of learning the social graph, not to mention which edges in the graph are most active, and when), puts them rather squarely in "doth protest too much" territory?
-
If I share an IP with 100 million other Signal users and I send a sealed sender message, how does Signal distinguish between me and the other 100 million users?
If you shared an IP with 100 million other Signal users, that would be pretty amazing, given that there are fewer than 50 million users total. And don't you think the majority of Signal users share an IP with zero other Signal users most of the time? And that when they do share an IP, it is with a relatively small number of other users? Anyway, even if the point of sealed sender was only to provide sender anonymity within the anonymity set of people who share your IP, that would still provide some useful metadata protection. But also: it can be silently disabled by the server. See my other comment here.
-
With Signal, the keys used to encrypt your messages are on your device and are never sent to the company.
Neither Signal, nor anyone who hacks them, nor any government that attacks them can read your messages. This has been proven in court.
With Telegram, the keys used to encrypt your messages are on their servers.
Telegram, anyone who hacks them, and any government that attacks them can read all of your messages. This has also been proven in court.
-
Just so you know, “keys” are used for a number of purposes in Signal (and for software applications in general) and not all of those purposes involve encryption. Many keys are used for verification/authentication.
And it's I who should take a course in encryption and cybersecurity.
ROFL
Good to see you have your study material at hand, though. And yes, cryptography is complicated, but you'll get the hang of it eventually.
-
And it's I who should take a course in encryption and cybersecurity.
Yes. I was trying to be nice, but you’re clearly completely ignorant and misinformed when it comes to information security. Given that you self-described as a “cryptography nerd,” it’s honestly embarrassing.
But since you’ve doubled down on being rude just because I pointed out that you don’t know what you’re talking about, it’s unlikely you’ll ever learn enough about the topic to have a productive conversation anyway.
Have fun protecting your ignorance.
-
If I share an IP with 100 million other Signal users
That's already not very likely, but ignoring IP, you're the only one with your SSL keys. As part of authentication, you are identified. All the information about your device is transmitted. Then you stop identifying yourself in future messages, but your SSL keys tie your messages together. They are discarded once the message is decrypted by the server, so your messages should in theory be anonymised in the case of a leak to a third party. That seems to be what sealed sender is designed for, but it isn't what I'm concerned about.
daniel sent a user an image...
Right, but it's not other users I'm scared of. Signal also has my exit node.
What you’re describing is (not) alarming (...) Signal’s security team wrote.
I mean, if strangers can find my city on the secret chat app, I find that quite alarming. The example isn't that coarse, and Signal, being a centralised platform with 100% locked-down, strict access, could well defend users against this.
What do you mean when you say “conversation” here?
When their keys are refreshed; I don't know how often. I meant a conversation as people understand it, not first-time contact. My quick internet search says that the maximum age for profile keys is 30 days, but I would imagine in practice it's more often.
Even if we trust Signal, with Sealed Sender, without any sort of random delay in message delivery, a nation-state level adversary could observe inbound and outbound network activity and derive high confidence information about who’s contacting whom.
That is true, but it's no reason to cut Signal slack. If either party is in another country or on a VPN, that's a mitigating factor against monitoring the whole network. But if Signal is sharing their data with that adversary, then the VPN or different-country factor has been defeated.
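For anyone wondering what that kind of attack actually involves, here's a toy version; no cryptography gets broken, the observer just pairs flows into the server with flows out of it by timestamp:

```java
// Toy traffic-correlation attack: given timestamped flows captured on
// either side of the server, pair each inbound flow with outbound flows
// that leave within a short window.
import java.util.List;

public class Correlate {
    record Flow(String ip, long millis) {}

    public static void main(String[] args) {
        List<Flow> inbound = List.of(new Flow("203.0.113.5", 1_000));
        List<Flow> outbound = List.of(
                new Flow("198.51.100.7", 1_040),  // 40 ms later: likely the recipient
                new Flow("192.0.2.99", 9_000));   // too late to be related

        long windowMillis = 200; // plausible delivery latency without random delays
        for (Flow in : inbound) {
            for (Flow out : outbound) {
                long dt = out.millis() - in.millis();
                if (dt >= 0 && dt <= windowMillis) {
                    System.out.println(in.ip() + " likely messaged " + out.ip());
                }
            }
        }
    }
}
```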
Here’s the blog post from 2017
I appreciate the blog post and information. I don't trust them to only run the published server code. It's too juicy of a honeypot.
I don't have any comment on SGX here. It's one of those things where there are so many moving parts and so much secret information, and so much you have to understand and trust, that it basically becomes impossible to verify, or even to put trust in someone who claims to have verified it. Sometimes that's an inappropriate position, but I think it's fine here: Signal doesn't offer me anything, so I have no reason to put that much effort into understanding what can be verified with SGX.
And thanks for the audits archive.
-
you’re the only one with your SSL keys. As part of authentication, you are identified. All the information about your device is transmitted. Then you stop identifying yourself in future messages, but your SSL keys tie your messages together. They are discarded once the message is decrypted by the server, so your messages should in theory be anonymised in the case of a leak to a third party. That seems to be what sealed sender is designed for, but it isn’t what I’m concerned about.
Why do you think that Signal uses SSL client keys or that it transmits unique information about your device? Do you have a source for that or is it just an assumption?
-
No, that's just an assumption, but it's very standard. They do use SSL, though; this is the code for it: https://github.com/signalapp/Signal-Android/blob/main/app/src/main/java/org/conscrypt/ConscryptSignal.java
That doesn't confirm they send anything extra about your device; that's an assumption as well.
-
I’m familiar with SSL in the context of webdev, where SSL (well, TLS) is standard, but there the standard only uses server certificates. Even as a best practice, consumer use cases for client certificates, where each client has a unique certificate, are extremely rare. In an app, I would assume that’s equally true, but that shared client certificates are plausible: every install from Google Play uses the same certificate, possibly rotated from version to version, and likewise with other platforms, like the App Store, the APK you can download from their site, F-Droid (if they were on it), and releases of other apps that use the same servers, like Molly. Other platforms might share the same key or have different keys, but in either case, the keys are shared among millions of users.
I’m not sure Signal does have a client certificate, but I believe they do have a shared API access key that isn’t part of the source code, and which they (at least previously) prohibited FOSS forks from using (while refusing to grant them their own key).
That said, I reviewed that code. I’m not a big fan of Java and I’m not familiar with the Android APIs, but I’m familiar with TLS connections in webdev, the terms are pretty similar across languages, and I did work in Java for about five years. I didn’t see anything in that file that makes me think client certificates are being generated or used. Can you elaborate on what I’m missing?
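For reference, this is what using a client certificate would even look like in Java: the client only presents one if a KeyManager holding a certificate is explicitly wired into the SSLContext, and with the default context nothing of the sort happens (the paths and passwords below are hypothetical):

```java
// Default TLS vs. client-certificate TLS in Java. With the default
// SSLContext, the handshake uses a server certificate only, which is
// the standard webdev setup described above.
import java.io.FileInputStream;
import java.security.KeyStore;
import javax.net.ssl.KeyManagerFactory;
import javax.net.ssl.SSLContext;
import javax.net.ssl.SSLSocketFactory;

public class ClientCertExample {
    public static void main(String[] args) throws Exception {
        // Default: no client certificate is ever sent.
        SSLSocketFactory plain = SSLContext.getDefault().getSocketFactory();

        // Presenting a client certificate requires explicit setup like this:
        KeyStore ks = KeyStore.getInstance("PKCS12");
        try (FileInputStream in = new FileInputStream("client-cert.p12")) {
            ks.load(in, "changeit".toCharArray());
        }
        KeyManagerFactory kmf =
                KeyManagerFactory.getInstance(KeyManagerFactory.getDefaultAlgorithm());
        kmf.init(ks, "changeit".toCharArray());

        SSLContext ctx = SSLContext.getInstance("TLS");
        ctx.init(kmf.getKeyManagers(), null, null); // null: default trust managers
        SSLSocketFactory withClientCert = ctx.getSocketFactory();
    }
}
```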