What are the reasons to use Signal over Telegram?
-
[email protected] replied to [email protected]
In Telegram, you never have to expose your phone number. If you like walking into traps, then of course you can.
But you can make a minimal effort to not be a degenerate and avoid this obvious, easily avoidable trap.
How to avoid exposing your phone number:
Make a group called "i'm not a complete utter idiot". Whenever you have a friend who wants to connect, make a group invite link, send it to them, and have them join. After joining, have them send a message in the group. Just "Hi". Nothing more. Less is more. Look for that message and click on the person's name. You are now connected. Send them a personal message: "Hi!".
You can also add them as a contact without sharing your phone number.
Your friend will probably be a degenerate and expose their phone number. Teach them how to go into settings to always hide it.
Try not to call them a degenerate, degenerates hate that.
Also try not to think of them as a degenerate, they will already know that and be proud of it and not understand why you don't share their enthusiasm.
So control your thoughts.
-
[email protected] replied to [email protected]
You are right. But you're just not a fun person capable of seeing the humor in this.
Everyone is listing the features of both and not choosing wrong on purpose!
I like sending all my conversations to Russia rather than the US. Or both of them.
As long as I have someone wasting their time trying to snap out of their sleepy deer-in-headlights stupor after listening to a coder talk about coding.
I also love doing this on Facebook Messenger.
Everyone worries about censorship. One thing that is never censored is a coder talking about coding. Cuz the DEI hire's head explodes after one second of listening to that.
Try it! It's magical.
-
[email protected] replied to [email protected]
Signal pretends not to.
I prefer Telegram's honesty.
We are Telegram and we are here to help. And to make it more fun, we will send all your communications to Russia for a change.
Oh man! Where do I sign up? /nosarc
-
[email protected] replied to [email protected]
This "1 + 1 = 2" logic is boring. It's like trying to escape out of a wet paper bag over and over again. Whatever your "1 + 1 = 2" logic is, there is another guy who can drive a bus straight through it. Every single time. A year from now you will find out you were completely mistaken and just repeating nonsense. Every freak'n time.
Just for once, do the wrong thing. Make the wrong choice on purpose.
Instead of seeing never-ending red flags, today see purple flags. And tomorrow, orange. Cuz why do flags always have to be red?
You can be right or you can have fun.
Do the wrong thing sometimes. Live a little.
-
[email protected] replied to [email protected]
The real world, with non-tech people, needs solutions that are easy, fast, and as close to foolproof as possible.
Nope. Grandma gets a smartphone.
Meaning they are hopeless and it's impossible for them to emulate a techie.
It's a fool's errand.
Just stop trying to pretend Grandma is something more than completely unimportant and forgettable and hopeless and more likely than not merely a pest.
I'm so tired of entertaining Grandmas.
-
[email protected] replied to [email protected]
I was sold on threats and coercion. Let's do more of that.
-
[email protected] replied to [email protected]
Hopefully you aren't driving any buses while you're this high.
It's not never-ending red flags. In fact, I see lots of green flags from Signal. Telegram, though, that's a different story.
-
[email protected] replied to [email protected]
Then talk about coding. Non-techies curl up into a ball and die slightly inside as they run for the exits.
Highest form of encryption possible.
Try it
-
[email protected] replied to [email protected]
You are right, but
we like doing the wrong thing over and over again. And being surprised, each and every time, when it turns out to be wrong. Never picking up on the simple repeating pattern.
1111111111111 what's the next number ... errrr Signal! That's it, you got it. Good job.
Embrace the idiocracy!
This is why Telegram is awesome.
Eventually you will come around and realize how hopeless humanity is and embrace that it is well beyond hope.
And then you will have a larger network and enjoy each and every one of them.
-
[email protected] replied to [email protected]
That's a neat trick, thanks for sharing
-
[email protected] replied to [email protected]
Also read that the keys are stored locally but also somehow stored in the cloud (??),
Which keys? Are they always stored, or are they only stored under certain conditions? Are they encrypted as well? End-to-end encrypted?
which makes it all completely worthless if it is true.
It doesn’t, because what you described above could be fine or could have huge security ramifications. As it is, my guess is that you’re talking about how Signal supports secure value recovery. In that case:
- The key is used to encrypt your contacts, profile name, group avatars, social graph, etc., but not your messages.
- Your key is only uploaded to the cloud if you have a recovery PIN or passphrase.
- Your key is encrypted with your PIN or passphrase, using techniques (key stretching, storage in server-side secure enclaves) that make it more difficult to brute-force.
The main criticism of this is that you can’t opt out of it without opting out of the Registration Lock, that it necessarily uses the same PIN or passphrase, and that, particularly because it isn’t clear that your PIN/passphrase is used for encryption, users are less likely to use more secure passphrases here.
But even without the extra steps that we can’t 100% confirm, like the use of the Secure Enclave on servers and so on, this is e2ee, able to be opted out by the user, not able to be used to recover past messages, and not able to be used to decrypt future messages.
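To make the key-stretching idea concrete, here’s a rough sketch in Python of PIN-based key wrapping. This is not Signal’s actual SVR implementation (which also relies on server-side secure enclaves and its own key-derivation choices); the PIN value, iteration count, and libraries below are purely illustrative.

```python
import base64
import hashlib
import os

from cryptography.fernet import Fernet  # pip install cryptography


def stretch_pin(pin: str, salt: bytes) -> bytes:
    # Key stretching: a large PBKDF2 iteration count makes brute-forcing a short PIN costlier.
    raw = hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 1_000_000)
    return base64.urlsafe_b64encode(raw)  # Fernet expects a urlsafe-base64 32-byte key


# The recovery key that protects contacts/profile data (not message content).
master_key = Fernet.generate_key()

# Wrap (encrypt) the recovery key under a key derived from the user's PIN.
salt = os.urandom(16)
wrapped = Fernet(stretch_pin("7294", salt)).encrypt(master_key)

# Only the wrapped blob and the salt would ever leave the device.
# Recovery on a new device: re-derive the wrapping key from the PIN and unwrap.
assert Fernet(stretch_pin("7294", salt)).decrypt(wrapped) == master_key
```

Note that even with stretching, a four-digit PIN is only 10,000 guesses, which is why the server-side rate limiting inside the enclave (if you trust it) matters so much here, and why a longer passphrase is the safer choice.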
-
[email protected] replied to [email protected]
Message history won’t be fully fixed. It can’t be without storing message backups in some cloud somewhere (whether it’s to iCloud, Google Drive, Dropbox, or Signal’s servers) and Signal omits its message history from system backups on iOS and Android.
iOS users are completely incapable of backing up their message history in the event of their phone being lost, stolen, or broken. This omission isn’t justified in any way, as far as I’m aware; I don’t know of any technical reason why following the exact same process as on Android wouldn’t work.
Android users are able to back up locally via Signal, but that isn’t on by default, can’t be automated, needs to be backed up separately, requires you to record a 30 digit code to decrypt it, and has limitations on when it can be used for a restore (can’t restore on iOS, for example). See https://support.signal.org/hc/en-us/articles/360007059752-Backup-and-Restore-Messages for more details.
Message history on linked devices - meaning iPads and desktop computers - is being improved, but it still won’t mean that a user who loses or trades in their phone as they get a new phone will be able to simply restore their phone from a system backup and restore their Signal message history. And even that isn’t anywhere near as easy as on Telegram, where a user can just log in with their password and restore their message history, no backup needed.
It’s great that they’re improving the experience for linked devices, but right now that doesn’t actually help if you lose, break, or trade in your phone. Maybe they’ll later allow users to restore to a phone from a linked device or support backups on iPhones, but right now the situation with message history isn’t just an unfriendly UX, but one that is explicitly and intentionally unreliable for a huge portion of Signal’s user-base.
-
[email protected] replied to [email protected]
You're welcome. Use it in good health. And please excuse my colorful prose.
There are many, many comments about Telegram bleeding the phone number, and only one comment saying that doesn't have to be the case.
-
[email protected] replied to [email protected]
I'm a milk tea addict. Carry around cinnamon and nutmeg. And hang out on GitHub.
These are horrible vices. But no excuse for having divergent opinions.
Telegram is fine.
Signal will be gone tomorrow and you'll lose your network. Moving networks from one platform to another is impossible. So we end up creating new networks.
Currently I'm making a network of Python coders I've collaborated with. The communication medium is neither consistent nor ideal.
Hate email with a passion. So of course most of the communication is going over plain-text email. Tried pushing for communication on plain-text Mastodon.
-
[email protected] replied to [email protected]
Okay. But this method doesn't address that the service doesn't need the message to include the sender to know who the sender is. The sender ('s unique device) can with 100% accuracy be appended to the message by the server after it's received. Even if we trust them on the parts that require trust, the setup as described by the blog doesn't do anything to prevent social graphs from being derived, since the sender is identified at the start of every conversation.
If we trust them not to store any logs (unverifiable), then this method means they can't precisely know how long a conversation was or how many messages were exchanged. But you can still know precisely when and how many messages both participants received, there's just a chance that they're talking to multiple people. Though if we're trusting them not to store logs (unverifiable), then there shouldn't be any data to cross reference to begin with. So if we can't trust them, then why are we trusting them not to take note of the sender?
The upside is that if the message is leaked to a third party, there's less info in it now. I'm ignoring the GitHub link, not because I don't appreciate you finding it, but because I take the blog post to be the mission statement for the code, and the blog doesn't promise a system that comprehensively hides the sender's identity. I trust their code to do what is described.
-
[email protected] replied to [email protected]
Despite being US-American and from Zuckerberg, it's an incredibly horrible app. I would not touch this shit with a 10 m pole. It might be E2E, but can I verify this in the source? Oh right.
-
[email protected] replied to [email protected]
I actually always deemed that a mark of quality. If those shitbags use tgram, there's a reason.
Sadly, it's not really great for the app itself. So he had to do something about it. IMHO it's the best compromise he could make, other than just staying "the bad guy".
-
[email protected] replied to [email protected]
Nice try FBI.
Well, if my PIN is four numbers, that'll make it so hard to crack. /s
If you can't show hard evidence that everything is offline locally, no keys stored in the cloud, then it's just not secure.
BTW, "keys" when talking about encryption is the keys used to encrypt and decrypt, it wouldn't be very interesting to encrypt them, because now you have another set of keys you have to deal with.
-
[email protected] replied to [email protected]
The sender ('s unique device) can with 100% accuracy be appended to the message by the server after it's received.
How?
If I share an IP with 100 million other Signal users and I send a sealed sender message, how does Signal distinguish between me and the other 100 million users? My sender certificate is encrypted and only able to be decrypted by the recipient.
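For intuition, here’s a toy sketch of the "sealed box" idea, using Python with PyNaCl rather than Signal’s actual envelope format or libraries: the sender certificate travels encrypted to the recipient’s public key, so a server relaying the ciphertext has no sender key material to inspect.

```python
from nacl.public import PrivateKey, SealedBox  # pip install pynacl

# Recipient's long-term keypair; only the public half is known to senders.
recipient = PrivateKey.generate()

# The sender seals their certificate (and message) to the recipient's public key.
# The resulting envelope carries no sender identity the relaying server could read.
envelope = SealedBox(recipient.public_key).encrypt(b"sender certificate || message")

# Only the recipient's private key can open the envelope.
plaintext = SealedBox(recipient).decrypt(envelope)
assert plaintext == b"sender certificate || message"
```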
If I’m the only user with my IP address, then sure, Signal could identify me. I can use a VPN or similar technology if I’m concerned about this, of course. Signal doesn’t consider obscuring IPs to be in scope for their mission - there was a recent Cloudflare vulnerability that impacted Signal where they mentioned this. From https://www.404media.co/cloudflare-issue-can-leak-chat-app-users-broad-location/
404 Media asked daniel to demonstrate the issue by learning the location of multiple Signal users with their consent. In one case, daniel sent a user an image. Soon after, daniel sent a link to a Google Maps page showing the city the user was likely in.
…
404 Media first asked Signal for comment in early December. The organization did not provide a statement in time for publication, but daniel shared their response to his bug report.
“What you're describing (observing cache hits and misses) is a generic property of how Content Distribution Networks function. Signal's use of CDNs is neither unique nor alarming, and also doesn't impact Signal's end-to-end encryption. CDNs are utilized by every popular application and website on the internet, and they are essential for high-performance and reliability while serving a global audience,” Signal’s security team wrote.
“There is already a large body of existing work that explores this topic in detail, but if someone needs to completely obscure their network location (especially at a level as coarse and imprecise as the example that appears in your video) a VPN is absolutely necessary. That functionality falls outside of Signal's scope. Signal protects the privacy of your messages and calls, but it has never attempted to fully replicate the set of network-layer anonymity features that projects like Wireguard, Tor, and other open-source VPN software can provide,” it added.
I saw a post about this recently on Lemmy (and Reddit), so there’s probably more discussion there.
since the sender is identified at the start of every conversation.
What do you mean when you say “conversation” here? Do you mean when you first access a user’s profile key, which is required to send a sealed sender message to them if they haven’t enabled “Allow From Anyone” in their settings? If so, then yes, the sender’s identity when requesting the contact would necessarily be exposed. If the recipient has that option enabled, that’s not necessarily true, but I don’t know for sure.
Even if we trust Signal, with Sealed Sender and without any sort of random delay in message delivery, a nation-state-level adversary could observe inbound and outbound network activity and derive high-confidence information about who's contacting whom.
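As a toy illustration of that traffic-analysis risk (made-up timestamps and accounts, not anything Signal actually logs): without randomized delays, pairing each observed upload with deliveries that follow within a tight window already suggests likely (sender, recipient) edges.

```python
from datetime import datetime, timedelta

# Hypothetical observed network events: (timestamp, account).
outbound = [(datetime(2025, 1, 1, 12, 0, 0, 120_000), "alice")]
inbound = [
    (datetime(2025, 1, 1, 12, 0, 0, 310_000), "bob"),
    (datetime(2025, 1, 1, 12, 5, 3), "carol"),
]

# Pair each upload with any delivery that follows within a short window.
window = timedelta(milliseconds=500)
edges = [
    (sender, recipient)
    for t_out, sender in outbound
    for t_in, recipient in inbound
    if timedelta(0) <= t_in - t_out <= window
]

print(edges)  # [('alice', 'bob')] -- a probable social-graph edge, no message content needed
```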
All of that said, my understanding is that contact discovery is a bigger vulnerability than Sealed Sender if we don’t trust Signal’s servers. Here’s the blog post from 2017 where Moxie describes their approach. (See also this blog post where they talk about improvements to “Oblivious RAM,” though it doesn’t have more information on SGX.) He basically said “This solution isn’t great if you don’t trust that the servers are running verified code.”
This method of contact discovery isn’t ideal because of these shortcomings, but at the very least the Signal service’s design does not depend on knowledge of a user’s social graph in order to function. This has meant that if you trust the Signal service to be running the published server source code, then the Signal service has no durable knowledge of a user’s social graph if it is hacked or subpoenaed.
He then continued on to describe their use of SGX and remote attestation over a network, which was touched on in the Sealed Sender post. Specifically:
Modern Intel chips support a feature called Software Guard Extensions (SGX). SGX allows applications to provision a “secure enclave” that is isolated from the host operating system and kernel, similar to technologies like ARM’s TrustZone. SGX enclaves also support a feature called remote attestation. Remote attestation provides a cryptographic guarantee of the code that is running in a remote enclave over a network.
Later in that blog post, Moxie says “The enclave code builds reproducibly, so anyone can verify that the published source code corresponds to the MRENCLAVE value of the remote enclave.” But how do we actually perform this remote attestation? And is it as secure and reliable as Signal attests?
In the docs for the “auditee” application, the Examples page provides some additional information and describes how to use their tool to verify the MRENCLAVE value. Note that they also say that the tool is a work in progress and shouldn’t be trusted. The Intel SGX documentation likely has information as well, but most of the links that I found were dead, so I didn’t investigate further.
A blog post titled Enhancing trust for SGX enclaves raised some concerns with SGX’s current implementation, specifically mentioning Signal’s usage, and suggested (and implemented) some improvements.
I haven’t personally verified the MRENCLAVE values for any of Signal’s services and I’m not aware of anyone who has (successfully, at least), but I also haven’t seen any security experts stating that the technology is unsound or doesn’t actually do what’s claimed.
Finally, I recommend you check out https://community.signalusers.org/t/overview-of-third-party-security-audits/13243 - some of the issues noted there involve the social graph and at least one involves Sealed Sender specifically (though the link is dead; I didn’t check to see if the Internet Archive has a backup).
-
[email protected] replied to [email protected]
I think it was this video: https://www.youtube.com/watch?v=A8ZXDiQLH9I