Which Messaging App is Truly Secure?

Published Oct 25, 2023, 4:00 AM

Robert sits down with Cooper Quintin and Caroline Sinders to discuss their Tech Policy Press report on encrypted messaging apps, and which are really secure.


Welcome back to It Could Happen Here. I am Robert Evans, and this is a podcast about things falling apart. Sometimes it's about how to make things not fall apart, and other times it's more about enduring it. Today is more on the endurance side of things, and we're talking about a subject that we get a lot of requests about here. We discussed this a year or so ago with one of our guests, the great Carl Pasarta. We're talking about security culture, and particularly the aspect of security culture that involves digital devices and how to communicate with your friends, affinity groups, whomever, via your phone, essentially, or your computer. This is a thing where there's a huge amount of disinformation as to which apps are safe. What does it actually mean to say that an app is encrypted? How far does encryption get you? What sort of cultural things come alongside the actual, like, physical reality of the security of the device in order to make a comprehensive security profile? We're gonna be talking about all that today and hopefully giving you some good advice on what you can trust, because I am the furthest thing in the world from a technical expert. We have two actual experts with us today. Caroline Sinders and Cooper Quintin have both recently published a paper, alongside several other authors (Leila Wagner, Tim Bernard, Ami Mehta, and Justin Hendrix), called "What Is Secure? An Analysis of Popular Messaging Apps." It's basically going over what the actual level of security is with a number of things like Telegram, you know, Telegram's private messaging system, Facebook Messenger, Apple Messages, or iMessage I guess it's called, and obviously Signal. And kind of as a spoiler, Signal is your best bet, but that also isn't where you should end, right? I think we want to also talk about why and to what extent that's the case. But anyway, I'm going to turn things over to Caroline and Cooper now, because I have talked enough about this. Hey guys, welcome to the show.

Hey, Robert, thanks so much for having us on.

Yeah, ah, yeah, thank you so much. A big fan of the podcast, so it's always lovely, really lovely to be here.

Yeah, thank you so much.

Yeah, it's really lovely to have you both again. Listeners, if you want to take a look at their paper, just google "What Is Secure? An Analysis of Popular Messaging Apps." You'll find the Tech Policy Press has a summary of it that's pretty quick. The full paper is eighty-six pages or so; I also recommend reading that, but if you want to give the summary a skim before you continue, that might help. But I kind of wanted to start by asking you guys: what is it that makes Signal a good option for people? Right? Because I think most folks, you describe it as sort of security folklore, right, the stuff that you hear about security from your friends, and if you're not a technical person, you kind of just trust what the folks around you are saying. And that was sort of how I got into Signal. Right, I'm not a technical person, but people I knew and trusted who were, were like, this is your best option.

Yeah, thank you so much.

That's such a good question, and I think Cooper and I probably have similar but also like very different answers to it.

Cooper, I can go first if you want.

One of the things I love about Signal is it's just really easy to use. It's end-to-end encrypted, it's a messaging app. There's not a lot of stuff on it, but you can do a lot with it, so you can do video calls, you can send actually pretty large files like PDFs.

You can have drag and drop stuff.

It's such a low threshold for use for users, because it is a messaging app, but it does so many different kinds of things. But then, related to that, it's also actually quite minimal. So in the paper, which everyone should read and we'll probably get into this later, different apps like Telegram or Facebook's Messenger app, for example, have

This thing we've been calling feature bloat.

They are messaging services that actually feel a bit more like social networks if you look at the amount of stuff that's on there, and by stuff, I don't just mean stickers. I mean, if you look, there are all these sort of specific and strange settings you can use to have all different kinds of messages and all different kinds of privacy settings, and not all privacy settings are really, really great. Because Telegram and Facebook Messenger are not encrypted by default, some of those settings can actually make you feel more secure when you're not. So kind of the beauty of Signal is that out of the box, it's incredibly secure. It's end-to-end encrypted, they're not holding any data about you. I believe the only data they hold is when a phone number or a profile has downloaded Signal, like when you've signed up. But again, it's incredibly easy to use. And another thing is, you know, if this was a few years ago, we'd have been looking at Wire, for example.

One of the nice things about Signal.

And this might be controversial to some, is that it does follow modern design patterns and standards. So if you're using an iOS or Android version, there are buttons in places where you expect them to be. Signal is not perfectly designed, but it is quite usable. Yeah, so for me, that's kind of what makes it really wonderful.

Yeah, it's definitely, as much as I love it, and it's my standard messaging app, I do every now and then run into the thing where my friends will call me through Signal, which is great if you need a call to be secure, but it's not nearly as good, like it drops a lot more often than a regular phone call, and I'm like, we're just trying to meet at the movie theater. It's okay if the NSA knows, right? Like I've

Definitely had that with friends where I'm like, yeah, we're just calling to talk about, like, your dog.

It's probably fine.

Yeah, the FBI can have this stuff.

Yeah, please send, please send dog pics through all messaging apps.

You know, but on that note, writing usable software that is also secure is really hard, right? And like, as a, I'm not a cryptographer, but as somebody cryptographer-adjacent, we got that wrong for a long time, right? Before Signal, you know, the sort of most used encryption methods were probably PGP email, which is a method for encrypting email, and Off-the-Record chats, and neither of those ever got to the sort of level of user base that Signal, and certainly WhatsApp, have, right? And that's largely because they were pretty much unusable. Like, PGP is almost entirely unusable, even by cryptography professionals, right, even by computer security professionals like ourselves. OTR chat, total pain in the butt, right, just a real nightmare to use. So, like, Signal, there are still some rough edges, and we talk about some of those in our paper. But overall, I think the big innovation they've had is just remembering that what people want to do on a chat app is not encrypt things. What people want to do on a chat app is they want to chat, right? And the second that the security sort of gets in the way of that, people will stop using it and go find something that's more usable. And it seems like that's been Signal's sort of guiding star, you know, doing the most secure thing that you can while still being fun and usable to actually just chat on, right? And I think that has served them quite well.

Yeah, I think it's so important. One of the things that contributes to good overall security is setting yourself up for success, which means setting yourself up with a system that can function well if you're lazy, which is one of the nice things about Signal: you don't have to worry about opting in and out and selecting a bunch of stuff. It's pretty safe, especially for a normal person's uses, right out of the box, which is huge. And kind of in the same line as that is the fact that, because Signal doesn't store metadata, you're not relying upon them being, like, committed anti-state actors or whatever, because they don't have access to the thing that, for example, Facebook will hand over to the cops if the cops just breathe in their direction.

Yeah, that's exactly right, and that is the other really cool thing about Signal. You know, as Caroline said, the only data that Signal gives over in response to a subpoena is the time that the phone number signed up for a Signal account and the last time it connected to the Signal server. And the reason we know that is because Signal publishes transparency reports with the full text and full response of any subpoena that they get, so we can actually just see in the responses that all they've given over is these two pieces of information, because that's all they have, and they've done some pretty clever things to make that be the

Case, right? And that's actually so different from how other companies are, I think, reporting on either subpoenas or any kind of

Weight that law enforcement puts on them.

So for our report, I don't remember how much it's mentioned in the report actually, but we did go through and look at Apple, Meta, and I think Google, like, in their own transparency reports, to try to get a sense of how that would stack up in comparison to Signal's. I think in some cases it's saying, like, they received some kind of notification, but nothing really clear or specific on what they received from law enforcement or government, but rather just that they received one. And so that's also the really great thing about Signal: you are getting all of this information that you're not getting from other companies or platforms.

Yeah, you know, I wanted to stay on the same subject and go back to, we kind of opened this introducing the concept that y'all introduced me to, I guess I was aware of this but not the terminology, security folklore. And I wanted to chat a little bit about kind of the most recent example of this, something a lot of folks have probably been wondering about since we started talking about Signal, which is that roughly a week before y'all and I sat down to talk about this, a kind of viral info meme started coming through that was like, Signal has a zero day exploit, which is basically a hole that a hacker has found in an app or program that can expose you, and you have to turn off link previews, right, which is, when someone sends you a link to an article in Signal, you get a little preview, not dissimilar to how it is elsewhere. And I think, to be fair, just based on my very limited knowledge, when I think about what the potential holes in Signal are, I don't think it's unreasonable to be concerned about that specific feature. But that warning was not what it kind of seemed to be, basically, or not as accurate as I think a lot of people took it as being.

I don't know.

I'll turn it over to you guys. I think that's the next thing I want to talk about.

I'll turn it over to Cooper, who, uh, you have a lot of feels about that.

I have so many feelings about this. I was working on this all weekend.

So this, yeah, so this copypasta, I'm calling this, like, the Signal copypasta, yeah, which is a term from, you know, 4chan and other horrible internet places, but some

Media audience is probably Internet enough.

Yeah, I'm gonna guess a good half of the people listening at least got that message.

Yeah, yeah. And it's like, first of all, if you had a zero day in Signal, which is an exploit for Signal that has not been patched by the vendor, so you can actively exploit it, there are no people in the world who would choose to quietly leak this over vague Signal texts. There are two types of people. One, you know, people like us, that would bring this to Signal immediately and get them to patch it to protect the millions of high-risk users that use Signal, or two, the type of people that would go sell this exploit to some horrible company that would, you know, sell it to Saudi Arabia or something and use it to kill activists. Right? And there's no in between. There's nobody that is going to quietly leak this just for fun with vague details. Right? So this message set off red flags immediately. And, like, to be clear, I really do not like link previews, and in our paper we discuss some of the issues that we have with link previews. You know, we think that they can leak some information about your chats to the owner of the website, right? We think it's kind of a large attack surface. It's not super necessary.

Would you actually mind explaining to the audience a little bit about what we found when looking at link previews?

Yeah. So, the way that link previews work on Signal and on WhatsApp is that when you send a link to somebody, the Signal app or WhatsApp goes and fetches the web page for that link, right? It goes and downloads the content of that link, and there are some special HTML tags that describe, you know, sort of what the page is about, what the title of the page is, and an image for the page. And it gets those tags and it puts them all together in this little package and then sends that all as part of the Signal message. So when you put a link in Signal, your phone actually goes out and gets that web page, and it gets that web page with what's called a user agent, which is a piece of text that's attached to the request that identifies it as being a request from Signal, and from your IP address. Right? So when you put a link in, the owner of that website, whoever has the logs for that website, can know that somebody at your IP address is using Signal and sending this link over Signal. What our concern is, is that if that

Link is unique, then anybody else who visits that link can be inferred to be somebody that you are talking with over.

Signal, right? And so this can be an interesting source of intelligence for website owners, especially for big websites that can easily generate unique links with tracking parameters at the end of the URL, right? Like when you share an Instagram post and at the end it's like question mark, igshid equals, you know, a long string of numbers and letters, right, or a Twitter post where, you know, t equals a long string of letters and numbers. Right? That makes a unique link, and then anybody who visits that same link can be determined to be somebody that you're speaking with over Signal, and also WhatsApp, and so

For that reason, we think that Signal and WhatsApp should turn link previews off by default, because we think that's an unnecessary information leak. Signal and WhatsApp's pushback on that is that link previews are a core feature that people demand, and if they were to turn off link previews by default, they're worried that people would leave the platform for less secure platforms like Telegram.
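
To make the mechanism Cooper is describing a little more concrete, here is a minimal sketch of what a link-preview fetch can look like: the sender's device requests the page, announces itself with a User-Agent string, and pulls the Open Graph tags (title, description, image) to build the preview bundle. This is an illustration only, not Signal's or WhatsApp's actual code; the User-Agent value and the example URL with its igshid-style tracking parameter are hypothetical, and the sketch assumes the third-party requests and beautifulsoup4 libraries are installed.

```python
# Illustrative sketch of a link-preview fetch; not Signal's or WhatsApp's real implementation.
import requests
from bs4 import BeautifulSoup

def fetch_link_preview(url: str) -> dict:
    """Fetch a page the way a messenger might, and pull Open Graph preview tags."""
    # The request leaves the sender's IP in the site's logs, and the User-Agent
    # (a made-up value here) can tell the site which app made the request.
    headers = {"User-Agent": "ExampleMessenger/1.0 (LinkPreview)"}
    response = requests.get(url, headers=headers, timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")

    def og(prop: str):
        tag = soup.find("meta", property=f"og:{prop}")
        return tag["content"] if tag and tag.has_attr("content") else None

    # Title, description, and image get bundled into the preview that is sent
    # along with the message itself.
    return {"title": og("title"), "description": og("description"), "image": og("image")}

# A unique tracking parameter (e.g. ?igshid=... or ?t=...) makes the URL itself
# identifying: anyone else who later visits the exact same URL can be linked to
# the person it was shared with.
print(fetch_link_preview("https://example.com/post?igshid=abc123xyz"))
```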

Yeah.

I mean, I don't want to tell them their business, because I'm sure they have data on this, but I've never thought about link previews as being a thing that I needed.

It's, like, yeah, I think it's one of those things. And, you know, we haven't necessarily done extensive general design research on this, right? Like, we haven't surveyed three thousand people in the US. We haven't had a Pew Research survey across countries and been like, what are your thoughts on link previews?

But I would.

Probably argue, because it is included in so much of modern messaging

Apps that we now assume it's like a core feature.

One thing I will give Signal, that I think is amazing, that other apps don't do, and this is true of WhatsApp: pretty much every feature except for encryption, there's something you can toggle or turn off. Right? So, like, link previews were already available for people to turn off on Signal. WhatsApp does not allow that, and it seems like they're making no moves to allow that feature to be optional, to turn on or off.

But that is I will say.

One of the things that's really lovely about Signal, that is so different from modern design and modern big tech platforms, and just platforms in general, is that a lot of features are optional, whereas, you know, WhatsApp and Meta's sort of stance on design is that a lot of things are not optional, that those are things users would want: why would we make foundational elements like link previews optional? And you're just, like, gesturing wildly, but, you know, it's like, well, you don't know what people want. And I mean, what's the harm in turning off some of these things?

Right?

You know, like, maybe people don't want to receive GIFs. I don't know, maybe they don't want to receive stickers. Why don't you let them have that option? What's the harm that could happen?

Yeah?

Yeah, yeah, I couldn't agree more.

Yeah. Two things I want to say. One is, and first we should acknowledge this, it turns out that there was no zero day, there was no vulnerability. Yeah, this was absolutely just something that spread virally out of nowhere. I'd be really interested to find out what the origin of this copypasta was, but I haven't been able to.

Curious about that as well.

Because I was in another group thread that was like, we really need outside auditors to look at these.

And I was like, we have a whole report that we wrote that did look at this.

Speaking of outside auditors, I gotta pause you guys just a second, because it is time for an ad break, so please spend your money and then come back to learn more. Ah, and we're back. Okay, sorry about that. Cooper, Caroline, you may continue as you were.

The other thing I was going to say is that the idea that anybody would leave WhatsApp because they stopped having link previews is completely preposterous to me, like clownish. WhatsApp has over two billion users. They are, you know, in a position to set the standard for what people expect from a messaging app, and so they could do things like turn on disappearing messages by default and change that culture. They could do things like turn off link previews by default and change that culture. Like, they could do these things, and they would not lose enough users to even notice or care about it.

Right.

Yeah, they are the only people in the world in the position to decide what the culture should be, and this is what they've decided the culture should be.

Totally.

I hate to break it to you, but if WhatsApp just got rid of link previews, I'm just throwing my whole phone into the garbage can, getting rid of it.

Just tossing it. Back to a landline.

Yeah, I'm just.

Gonna yeet it into a river. I feel like I don't need this anymore. Actually, I'm going back to carrier pigeons.

That's how far back I'm going to go.

I mean, that does kind of lead into the next thing I wanted to talk about, which is sort of the other wing from security folklore, which is security nihilism. And yeah, this comes up if you do try to engage somewhat with the technology, or if you wind up in the position I think most lay people are in, where, you know, maybe you have some friends who know more, or maybe you have some friends who think they know more, and you get all these conflicting things, like, this is safe, no it's not, you can't trust Signal, the feds could be running Signal, all this kind of stuff. And to be fair, the feds have run security-based services before. It's not that I believe that's happening with Signal, but I understand where paranoia like that can enter into people's calculus, especially if you're not technically knowledgeable, and that can lead to this sort of state of security nihilism where you're just like, you can't communicate at all online, there's no way to do it securely. And obviously there's no perfect, right, you never have that, but you don't have one hundred percent with talking in person to somebody either. Right? There are individuals in prison right now because somebody they loved and trusted ratted on them. There are no one hundred percents in this world. But that doesn't mean nihilism is the right response to trying to figure out how to set up your communication standards with people, right?

Totally. I mean, I think the approach we took, because throughout this report we were also teaching workshops to reproductive justice activists across the US, in states where abortion is banned. I'm from Louisiana, I live half the year there, abortion is banned there, and we were also working with journalists in India. So a big thing for us was also teaching threat modeling and different kinds of what Matt Mitchell, a security trainer and expert, calls digital hygiene. And so a lot of this was recognizing that there were certain practices we were picking up on, particularly with folks we were working with. So, like, a lot of reproductive justice activists we were working with are new to security, they're new to technology, they don't have a background in tech, and generally, you know, the American South, the American Deep South, is super overlooked in terms of tech policy, in terms of just, I think, a general focus when people are talking about tech or tech literacy or tech activism, and that is leaving really massive gaps in knowledge for people. And so, you know, when we were working on this, security folklore and security nihilism were both actually very

Almost like I won't say, like a pendulum, but they were very connected. And so some of that was.

People hearing things like, oh, I should put my phone in a microwave when I'm having a very sensitive conversation, right? And so that's where some of that security folklore is coming in. It is something that is technically safe, but it's not the thing you necessarily, like, totally need to do in that moment. And with security nihilism, what it kind of came down to, and this is stuff we've seen with other groups and other circumstances, a great example are, you know, Palestinian activists and journalists, let's say, who are facing the threat of all different kinds of governmental censorship and surveillance, sort of saying, when there's this large threat sort of hanging over us, and there's also physical surveillance, and this is true for a lot of journalists in other countries like India as well, for example, you know, should everything go through Signal, or does it really matter?

Like does it really matter?

And this is also something, again, we saw with some reproductive justice activists as well, where it's like, if everything is being monitored, what's safe?

Like can I send stuff? Like can I even use Google?

And part of this was, you know, by teaching privacy and security workshops, by teaching things like threat modeling, which is a framework for just assessing what

Are what are threats?

Like, what are all the potential threats you could face, and kind of mapping them from the most minor to the most major, and what you can do about that (a rough sketch of that kind of mapping follows this exchange). That's a way to try to combat security nihilism. But I think an approach Cooper and I are also really fond of is thinking of this like safer sex. There are all different kinds of things you can do that are mitigations, that are actually incredibly helpful, and we can't look at it as a binary of safe or not safe. It's actually much more of a gradient. But, you know, the folklore and the nihilism, I think, come from a very similar place, which is we're asking people, like, society is kind of asking or demanding that people be experts in something that's really hard. I am a fairly technical person, and even there are some things that I find hard to sort of wrap my head around, and I've been working in privacy and security for quite a while. And I think, you know, it's also really hard when you think about these apps as a brand-new person. So, like, one of the things that popped up a lot in our research is, why should we trust Signal? And that's actually a great question. Like, what about Signal, in its interface and its design,

Would cause you to trust it? Like, some people were like, it's a nonprofit, that's great, but I don't know what that means. I'm like, that's actually a fantastic question. Like, what does that mean? Right?

Like, why should you trust this? You've heard through the grapevine that you should. And I think these are kind of all the things that people are dealing with, because if you take a step back and just look at software, or any different kinds of software generally, why should you trust that it's safe and secure when there have been so many different kinds of leaks or breaches or things breaking? Right? Yeah. So these are, I think, really closely tied. But I think a big thing for us is trying to combat that security nihilism whenever we can. There are things you can do. I don't want to say no matter how great the threat, but I believe, no matter how great the threat, there is stuff you can do.
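
Caroline's description of threat modeling, mapping your potential threats from most minor to most major and pairing each with what you can do about it, can be made concrete with a small sketch. This is purely illustrative and is not a tool from the report; the adversaries, scores, and mitigations below are hypothetical examples under the general framework she describes.

```python
# A minimal, illustrative threat-modeling sketch; not a tool from the report.
# The threats, scores, and mitigations are hypothetical examples.
from dataclasses import dataclass

@dataclass
class Threat:
    adversary: str   # who might act against you
    action: str      # what they could plausibly do
    likelihood: int  # 1 (unlikely) .. 5 (very likely) for *your* situation
    impact: int      # 1 (minor) .. 5 (severe)
    mitigation: str  # a concrete step that raises the attacker's cost

threats = [
    Threat("Nosy person nearby", "reads messages over your shoulder", 4, 2,
           "be aware of your surroundings; use disappearing messages"),
    Threat("Local police with your phone", "compel a biometric unlock at a protest or border", 3, 4,
           "turn off biometric unlock; use a strong passcode"),
    Threat("Platform handing over data", "server-side chat logs given to law enforcement", 3, 5,
           "use an end-to-end encrypted app that keeps minimal metadata"),
    Threat("Untrustworthy contact", "screenshots or forwards what you said", 2, 5,
           "keep the most sensitive details off the phone entirely"),
]

# Rank from most to least pressing so effort goes where it matters, rather than
# defaulting to "everything is monitored, so nothing helps."
for t in sorted(threats, key=lambda t: t.likelihood * t.impact, reverse=True):
    print(f"[{t.likelihood * t.impact:>2}] {t.adversary}: {t.action} -> {t.mitigation}")
```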

No matter how great the threat is, there's stuff that you can do to make it more difficult and more expensive for that person to attack you. Right? Like, we all lock the doors to our house, for the most part, or, you know, we all do things to protect ourselves like that, that aren't foolproof, right? Somebody can always break a window to get into your house. Somebody can find other ways to get into your house. But locking the door makes it so that somebody has to do the noisy thing of breaking a window, right? It makes it so that somebody has to spend more time and effort, and take more risk of getting caught, in getting into your house. Right? And when you layer these protections, right, the idea, you know, is that you're making it harder, you're making there be more friction to piercing your security.

Yeah, I think that's a really good point, and the concept of friction, you know, this is something I've talked about. Not that these are exactly the same things, although they're not wildly different, when it comes to how insurgents win insurgencies, right? It's not by carrying out these sort of great battlefield victories that sweep the enemy from the field. It's by friction, right, which wears down both the culture and the kind of readiness of the opponent until they simply bounce, which is a pretty durable and effective strategy if you can keep it up. It's this matter of, there's no sweeping, sudden, ninety-minute three-act win here. It's more a matter of, the more difficult, the more expensive you make it, the more you hold on to, and the more all of us hold on to. Right? That's the other benefit: even if you are the most law-abiding person in the world, like myself, having these security measures in place means that you're kind of contributing to the overall immune system of a kind of community of people who don't want the NSA listening to their shit.

Yeah, exactly, exactly. And the friction thing is also exactly what Signal does, right? The threat model for Signal is stopping the NSA or other global adversaries from listening to all communications as they travel over the internet, right? When you can do that, when you can listen to everybody's conversations as they travel over the internet, it's really cheap to spy on anybody. When you're encrypting that communication, then the NSA or whatever other global adversary has to go actually hack your phone, right? They have to target you specifically, they have to burn resources and, you know, burn weapons, right, zero days, to get access to your phone. And that's a lot more costly, it's a lot more noisy, it's a much higher risk of them getting caught. So it introduces huge friction in that area.

Go ahead, okay, go ahead, go.

Ahead. I'd say, and I think the sort of comparison to asymmetric warfare is exactly spot on, because none of us are ever going to have the money that the NSA or Mossad has. None of us are ever going to have the total technical acumen that the NSA or Mossad has, right? So we have to kind of fight, in terms of encryption, a guerrilla war, right, and we have to make things so expensive and so annoying for them that it's not

Worth it. Totally. And just to sort of build on that.

One of the things I love about Signal is, while they're creating friction for our adversaries, it's actually so frictionless to use as a user. And I think that's one of the things I find just continually impressive about it. I don't want this to turn into, like,

We're all himbos for Signal. Look, we probably are.

But, like, that's one of the things, as researchers, Cooper and I always have to be like, we're not paid by Signal at all, but this is in fact one of the best things you can use. But again, one of the things I think is amazing is that it is so easy to use, and it really is designed for, and I'm using the term usability as a design term, meaning that they're thinking about a common user, including those with lower digital literacy or those that have never used any kind of security tool, and so they're hitting a specific threshold of usability for things to be understandable. And again, that's incredibly hard to do well, and they are doing it quite well. I would argue it's very easy and sort of seamless for people to make the jump from WhatsApp, or, if you're on Android, from Google Messages, sorry Google, or, if you're on an iPhone, from Messages, to Signal. It might look slightly different, it might feel a lot more blue, it might feel a lot more black, depending on how yours is set up, but for the most part, a lot of the features are kind of where you expect them to be, and it's not at all difficult to get it up and running, which is not something, as Cooper said earlier,

We could say about things like PGP.

Yeah, I wanted to move on to talking about other apps and their security, or lack of it, and I think we should start probably by talking about Telegram, because that's probably close to the top of the list of things people use for secure communications that are not nearly as secure as they think. So, yeah, I wanted to chat with you about why that is, and specifically, one of the things that is frustrating about Telegram is they have, like, a secret chat or private chat, like they have a couple of different options that don't necessarily mean what they sound like they mean to most people.

Yeah, so that's actually one thing our report found. So private chat and secret chat are in fact.

The same thing.

They're just called slightly different things in the app, which, again, for those listening

That don't have a background in design, that's bad design.

That's not professional, that is a mistake. There's no reason for a feature to have two different names inside of your software. And so I don't know if that's an oversight on their part, I'm assuming so, but those two things correlate to the same feature, and so they should actually be called the same thing. But then even further, that being said, what does private mean to a user?

What does secret mean?

You know, Facebook Messenger they call their encrypted message secure or no, they also call it secret.

Sorry, they also call it secret. But does that mean security? Does that mean encrypted?

And so that's one of the weird things where it's like, you know, I think by using a very sort of normalized or culturally, almost, like, emotional name like private, it makes something seem like it's actually quite safe, when in fact it's not. And there's a variety of reasons why Telegram is not a very secure app, that I will let Cooper

Talk about more.

Yeah, I would never advise anybody to have a chat over Telegram if they are concerned about the privacy of that chat. So we were talking about friction, and the fact that end-to-end encrypted chats are not the default in Telegram creates a friction for users to have an actually secure chat. Right? You have to go remember to turn it

On, and you can only turn it on individually, per conversation. It's not like an overall setting on Telegram or Facebook Messenger; you have to go select the specific conversation, conversation by conversation. And another thing our paper gets into is how those chats also don't look very different. They look almost identical to a normal chat. So for low-vision users, or anyone with any kind of disability, especially a vision-related disability, it's nearly impossible to recognize which kind of chat you're using if you're looking at

The chat logs.

Yeah. Outside of that, in terms of things that may not be options right now: for basically everyone listening, Signal is a perfectly viable option. But it's not impossible that, for example, you might wind up in a country where, even if there's not a specific law against it, there is a precedent established that if you have Signal on your phone, it can at least be used as a justification for charges, as evidence of what you were planning to use it for. Like, you know, with Atlanta, people are getting charges because they had a lawyer's name written on their arm, right, and the state is saying, well, that's evidence that they were planning to commit a crime. You know, that doesn't mean that convictions will go through and that kind of thing, but it may be a reason why Signal might not be an option, or, say, something comes out about it that makes it seem less secure. What are other good, or acceptable, options? And I know when we're talking about this, these are often options that require more input and work from the user in order to maximize their potential security, but I do think it's good to let people know what else is out there.

Yeah, so when Signal isn't an option, WhatsApp is actually not a bad option. So WhatsApp is owned by Meta, which is, you know, not ideal. But WhatsApp actually uses the same encryption protocol as Signal, so, like, under the hood, the way that the math works to hide your messages from the NSA is exactly the same, right, and they've implemented it well. You know, there are a few more steps, a few more precautions that you need to take with WhatsApp, making sure that your chats aren't backed up being the main one. But WhatsApp is certainly good enough, right, if your chat networks aren't using Signal, if you're in a country where you can't use Signal. Right? Like, WhatsApp has two billion users. You can use WhatsApp almost anywhere in the world, and it's ubiquitous enough that it's not going to mark you as somebody with something to hide, right? And I don't want to discount WhatsApp. Right? Getting two billion people to have end-to-end encrypted messaging by default, basically overnight, was a major coup. That was world changing, right, and they really do deserve applause for that. Obviously, you know, I think partly because of their scale, partly because they're owned by Meta, right, they haven't taken all of these same steps. They do have more metadata on their servers than Signal does. Right? But if that's your option, that is a fine option.

Yeah, I think that's really good to know, particularly since options are always more secure than not having any kind of a backup plan.

Totally. And if people are even slightly nervous about WhatsApp, one of the great things is they do have disappearing messages. The downside is the fastest disappearing message setting is twenty-four hours. But that's something you still have, and that is an amazing feature.

Yeah, and that kind of gets into what kind of stuff you can do in order to maximize the value of features like that. Like, for example, if you're coming back into the country, or a country, and your phone gets confiscated by customs or whatever, because security services have some sort of eye on you for whatever reason: if you've got, you know, thumbprint login or face login, they're going to get into that phone, right, and your twenty-four-hour delete thing may not have taken care of everything. If you've got a complicated eight-digit password and no biometrics enabled, maybe, depending on where you are and whatnot, that'll keep your phone locked long enough for those messages to get deleted, right? Like, it's all about maximizing the chances that something

Like that helps. Yeah, exactly. We definitely recommend that people turn on disappearing messages. I think that's just a good, sensible default to have. We also definitely recommend that if you're going to be in a situation where there's a higher likelihood of you interacting with law enforcement, if you're crossing a border, if you're going to a protest, turn off the biometric unlock on your phone. Certainly, especially in the US, the case law isn't settled, but there are a lot of state courts that have decided that police can force you to unlock your phone with your biometrics, and that that's totally fine. So in the US context this is a good idea; in any context, I think it's a good idea, if you're at heightened risk, to turn it off totally.

I mean, one thing we're also a big fan of is figuring out, too, and this is again where threat modeling is so key, is this a circumstance where you need your phone? Another thing that you can always do, if you are nervous about traveling across the border, is you can delete Signal and reinstall it, and everything is gone. You can delete WhatsApp temporarily while you're crossing a border so it's not on your phone. You know, there are things like that you can do. If you feel comfortable wiping your phone, that's something you can also do. These are all different things, and I think this is one of the things our report, I don't remember how much we get into it, but it's something that at least we've been thinking about. Cooper and I run a little lab called Convocation, and one of the things we've been thinking about there is also, how do we instill sort of better holistic practices, where we understand that a phone is just one component of our safety, and so secure messaging, encrypted messaging, is one component of that safety. So what are other things we can do?

And some of that can.

Be, you know, wiping your phone if traveling, if that makes sense for you, or if that's a thing that makes you feel safer, or removing certain apps and then, you know, reinstalling them later.

Yeah, yeah, and it really is holistic. Right? Like, a thing that people need to keep in mind is that, you know, disappearing messages can't stop an untrustworthy conversation partner, right? If my conversation partner is untrustworthy, they can take screenshots of the messages, right, they can go snitch to law enforcement about what I've told them. Right? Encrypted messaging, disappearing messages, these are not panaceas. Right? You still have to keep up all of your other aspects of security as well, right? So don't entirely rely on these technologies to save you. You have to also trust the people you're working with and build these layers of security.

Yeah, it's true. I mean, Cooper, you could leak all of my secrets right now on this podcast.

That too, what a gentleman.

And that is the other thing, right, where, when it comes to what is secure, one thing to remember is that with Signal, for all the good things about it, nothing at all about that app stops the recipient of a message from you from taking a screen grab, or just handing their phone over to their friendly local federal agent, right? Which is, always, you know, we don't want to be, I'm not trying to be a security nihilist here. I think, you know, there's no replacing communication over phones in many situations. But if you are, for example, going to be transferring a bunch of Plan B pills in an area where that is prosecutable, that probably shouldn't go on your phone in that language. Right? Perhaps, you know, you could come up with a clever code word or whatever. But, you know, security is, like you said, holistic. You should not be looking at it as just, well, the app is secure,

So that's enough.

I mean, one thing I also want people to sort of think about too, because that's a really great point, Robert, is that we do all different kinds of things every day in our lives that could, you know, endanger us. Like, a lot of the work I do is with people facing all different kinds of online harassment. So, like, falling in love, for example, is a dangerous thing to do. You could have your heart broken, or that person could hurt you. Learning how to trust people, you know, crossing the street, deciding to jaywalk, right, all different things we do sort of

Every day actually can expose us to harm.

And so one thing for people listening to keep in mind is that it's the same when we have conversations. And I think a way to avoid nihilism is just to remember that every day we are sort of going out there and actually being incredibly brave just by living our everyday lives, by deciding to be in community and have friendships and have relationships. And in my case, I love jaywalking, and no one around me does, and that's my choice. And I have not yet gotten hit by a car jaywalking.

I think it's good to look at this the same way. There's a concept that the military has sort of developed when talking about how not to die when you're in a gunfight or something. It's called the survivability onion, right? And I think it's extremely useful both if you're talking about, like, well, I'm going to a protest and there will be violence there, you know, should I wear armor, et cetera, but it's also just really useful with any kind of security. And the onion, it's envisioned as an onion because the largest outside chunk of it is don't be seen; then don't be acquired, which means somebody actually getting you in their sights; don't be hit, which means being behind cover or something. And then the very internal part of it is, like, have some sort of armor in case you are shot.

But if the

Armor is useful, the majority of the onion has already failed.

Right.

If encryption is useful, that is not a dissimilar sort of situation. Right? So there's a degree of canniness that is super helpful in thinking about, like, what is visible about me? If I'm doing something where I know that I have to be extra concerned about the state seeing it, what is visible about me from the outside, you know?

Totally. I mean, I think that's an amazing thing to think about. Like, where are you sending a text message? Are you in a place in which someone can lean over? Like, I'm the nosiest motherfucker, and all the time I'm constantly looking around being like, what's that person watching on an airplane, or, like, if someone is sitting next to me scrolling. So, like, you wouldn't want to send a sensitive text message next to me, because I'd be like, that's interesting fodder.

That's the kind of thing I'd have to text to Cooper later, you know.

And so I think it's important to think about that. Like, who's around you? How are you describing something? Do you know the person you're messaging? If you're in a group message, do you know everybody there? Like, do you trust all of them?

You know?

And if you're ever nervous, this is, I guess, the upside to in-person conversations: you can have, you know, a phone call or an in-person conversation with someone. Right? If you're really not sure, or you don't feel comfortable even sending something over Signal, that might be the time to be like, hey, do you want to meet up and get a coffee, and then, you know, try to find a discreet place to have a conversation.

Yeah, yeah. I do want to roll to ads real quick, one second, and I think Cooper had something to say, and we'll continue, but first, products. Ah, we're back. Cooper, you look like you had something to add.

On that, nothing particularly serious, just that I think that's really good advice from the military, and it absolutely justifies the nine hundred billion dollars.

Yeah, I'm glad they put together a fucking graphic. I wonder how many billions of dollars that cost.

I could make a graphic for hundreds of millions of dollars.

Yeah, if anybody wants to fund us for hundreds of millions, we will do it for less now, a year, hundreds of millions.

We have so many good T-shirt ideas and sticker ideas, y'all, like so many good ones, so many unhinged ones that the world needs to see.

Yeah, I mean, I do. I guess just because of the amount of time I've spent thinking about this stuff from my old job, there are a couple of concepts from military planning I think about in this context, and one of them that I also think is relevant to what we're talking about with friction is the concept of an OODA loop, right, which is how you win in combat against an opponent: by disrupting this thing called the OODA loop. And the OODA loop is how an adversary carries out actions in a conflict like this, right, and the steps you have to go through are observe, orient, decide, and act. And if you can disrupt any stage of that, you can stop them from taking actions, right, which stops them from being able to harm you. And good security is going to impact all three of those things, right? It's going to stop them from being able to see you sometimes. If they can see you, stuff like, you know, we were just talking earlier about link previews, right, and how that can kind of expose who you're in communication with, potentially; well, that could allow the state to orient themselves to you and to your friends, right? And obviously stuff like locking down your devices and not having unnecessary info online can stop them from being able to decide, you know, what you're doing and how they should respond to that. And I think that's also good if you're thinking, if you're not just somebody who is concerned about your security like most people are, because it's good to have some security, but if you're actually dealing with the state or a corporation as an adversary in some way, it can be useful to think about your security culture in those terms.

Yeah. Absolutely, I think that's absolutely right. And I think it points to, you know, that we should understand what the mode of thinking of our adversaries is, right? If your adversary is the NSA, right, which is probably actually not the case for most people in the US, like, for most US activists the NSA is not actually your biggest adversary, right? Your biggest adversary is going to be local police, right? Your biggest adversary is going to be, you know, somebody like your abusive partner, right? And this is why threat modeling is important, because you need to really think through, like, okay, wait, am I actually worried about protecting myself from the NSA, or am I more worried about, you know, the racist police officer that drives down my street every day? Right? And yeah, probably it's the latter. And so you can take a lot more useful actions, right? And, you know, you can break that OODA loop for him once you know what it actually is. Right? Yeah, if you're defending yourself against the NSA, you're gonna leave yourself wide open to the actual threat. Yeah.

Totally. And I think a great example, and I don't mean to be, like, quote-unquote subtweeting somebody here, but I've known a couple of folks like this: if you're super paranoid, you're not putting anything online, you're only talking with your close friends, you use a dumb phone, you have burners, but you also drive around with a shitload of weed in your car in a state where that's illegal. It's like, well, your threat modeling is not great in that situation, right? Or, like, I do all that, but I carry an illegal handgun with me wherever I go. It's like, well, that may be more of a threat than your phone.

My partner the other day was like, what if I got a dumb phone? I was like, what if I divorced you? Like, what if?

They were like what do you mean?

And I was like, well, I'm going to be the one using all the maps for both of us, yeah, and having to google all the dumb shit you want to google. That doesn't make sense, I'm now your weakest link, like, go fuck yourself. But also I was like, I'm absolutely not going to be your Google Maps bitch, I'm not doing that. But I mean, I think also, you know, to both of y'all's points, to get serious again for a second: my threat model, for example, might be similar or slightly different, maybe slightly less serious, than Cooper's. But, you know, some of the journalists in India we were working with have quite a high threat model, right? Like, yeah, the Indian police force are very much like the NSA, they're very talented, they have a lot of money and tech at their disposal. And that might be different for some of the activists we're working with, let's say in Louisiana or Texas, right, but the difference is, we're still talking about, I would argue, two brutal police forces that just have different means at their disposal. So, like, the Louisiana police are a group you should totally be worried about. They might not be able to hack your phone, but maybe eventually they could.

But there are obviously other things to worry about with them.

But, you know, in the context of some of the folks we are working with in the South, like reproductive justice activists, some of the things that are probably much more serious in terms of your threat model would be, like, a nurse, for someone who, let's say, is miscarrying or has sought an abortion. And this is something Kate Bertash from the Digital Defense Fund, a friend of ours, has talked about, where the people that are supposed to take care of you might be the ones that are actually your biggest threat, right, the ones that have heard you say something or that you've confided in, for example. And that is kind of a horrifying thing to think about.

But that is, that is a thing you.

Have to threat model, right, is is can I trust this person? How am I describing?

You know? What's happening?

Yeah?

Yeah, absolutely. Well, did y'all have anything else you wanted to make sure to get into in this conversation? There's so much more in the great paper you helped co-author, "What Is Secure? An Analysis of Popular Messaging Apps," at Tech Policy Press. But yeah, is there anything else y'all wanted to really make sure you hit before we roll out?

Yeah. Please don't use Telegram, for a variety of reasons, but also, it's very unclear how they respond to any law enforcement or government. They don't say anything, and it's kind of impossible to reach anyone that works there. Please don't use Facebook Messenger, other than maybe for sending memes. There's a lot of really gross surveillance capitalism inside of Facebook Messenger that the paper gets into. But effectively, Meta is building this weird, sprawling infrastructure inside of Facebook Messenger to try to link Facebook and Instagram.

And one of the things we noticed is.

That, like, if you've blocked someone on Instagram or muted them, but you haven't blocked or muted them on Facebook, then your stories, like, all those stories are still coming across in Messenger, so you can still see content from someone, because it's linking both of those profiles. So, you know, you could see how, if you're taking, like, an online harassment lens, why that's really bad, that's really harmful, and could be potentially, you know, upsetting and triggering for folks.

Yeah, I'll add that I think the major thing I want people to think about is that encryption really does work, and it works really well. And we can see that because a lot of countries right now are trying to pass laws that either weaken or ban encryption, right? And in fact, the UK did just pass such a law, the Online Safety Bill in the UK. And so it's really important that we push back against these laws and fight back against these laws in whatever way we can, right? And I'm not coming at

This as somebody who's a big believer in, you know, incrementalism and in working with governments, but I still think that it's really important to, you know,

Educate folks and push back against these laws and try to not let these pass, because these will be really bad for all of us.

Totally. And not to defend the Online Safety Bill, because I would never do that, I'll go to my grave not speaking highly of it, only speaking critically. At least the pushback from encryption experts and encryption supporters, like Meredith Whittaker, president of Signal, did lead to lawmakers in the UK, for example, admitting that there's no sort of feasible, safe way to build a back door, right? And that is, I think, also a win: because of so much pushback, because of so much research, because of so much criticism that security and privacy folks, people that are pro-encryption, gave, we were able to walk back that part. And I do think that's a big deal, even if there are other issues with that bill, because I think it also sends a signal, pun intended, to other governments as well, and I think that's incredibly important. But yeah, I would also say, just use Signal whenever you can.

But yeah, yeah, well, all right, folks, that is going to be it for us here at It Could Happen Here. Thank you all for listening, and thank you, Cooper and Caroline, for coming on.

Thank you for having us, yeah.

And thank you for having us.

You can find us on social media for now, I guess until it all.

Lights on fire.

Yeah, whichever one you want to trust.

I'm CooperQ on most social medias: Bluesky, Mastodon, Shitter.

Yeah, I'm Caroline Sinders, my first name, last name. Our lab is Convocation Research and Design, Record Labs on Twitter at the moment.

Hopefully we'll be getting on Bluesky very soon.

Yeah. Yeah, probably get back on there more.

Now.

Twitter has gotten remarkably worse, which, you know, back in the day on the old Something Awful forums, there was a thread in one of the debate forums about this very right-wing site called Free Republic, which is like one of the earliest reservoirs of what became Trumpism, and the tagline for the thread, just kind of watching these people, was, there is always more and it is always worse. And boy, goddamn, if that hasn't been a continually accurate statement about the whole of social media.

Right now, isn't it amazing to watch someone just light forty billion dollars on fire?

Yeah, just, like, yeah, totally. It's like the nihilist in me being like, wow, Comrade Musk really taking some hits at capitalism here.

It Could Happen Here is a production of Cool Zone Media. For more podcasts from Cool Zone Media, visit our website coolzonemedia dot com or check us out on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts. You can find sources for It Could Happen Here, updated monthly, at coolzonemedia dot com slash sources. Thanks for listening.
