Week in Tech: 23andBankruptcy

Published Mar 28, 2025, 9:00 AM

What even is a crypto mixer? This week in the News Roundup, Oz and Karah dig into potential Slack-enabled corporate espionage, the recall of a Kim K-beloved product and the group chat that broke the internet. On TechSupport, The Washington Post’s technology columnist Geoffrey Fowler discusses 23andMe’s financial woes and what it means for the genetic data of the roughly 15 million people who bought DNA testing kits from the company.

Welcome to Tech Stuff, a production of iHeart Podcasts and Kaleidoscope. I'm Oz Woloshyn, and today Karah Preiss and I will bring you the headlines this week, including a bipartisan bill that could change the future of the Internet. Then, in our Tech Support segment, we'll talk to Geoffrey Fowler of The Washington Post about the 23andMe bankruptcy and what that might mean for your genetic data. All of that on the Week in Tech. It's Friday, March twenty-eighth. Well, I'm very, very excited to finally be able to say it: welcome back, Karah Preiss.

Hello, it's good to be back, Oz Woloshyn.

You're armed with a document dossier.

I am, I have papers. I created a dossier of all the tech stories that I want to bring up. And then, of course you've already reported what happens with me. But no, some of them, some of them are true to my own weirdness and obsession. One of them is Kim Kardashian's photoshoot for Perfect magazine, which came out a couple of weeks ago.

Well, tell me about that. I remember, Kim Kardashian, you and I were entertained last November by her sort of thirst-trap photos with the Tesla robot. But is this a development?

So if you're Kim, you're gonna pose, and if you're Kim, you're gonna pose with the hottest stuff. Right? What's the hottest stuff? The Cybertruck and a Tesla robot. And those poses are not exactly of the Doge variety. They are sultry. But no, you know, there's actually a more serious story here. The Associated Press reports that Tesla has recalled nearly all Cybertrucks, more than forty-six thousand in fact, and the National Highway Traffic Safety Administration warned that panels on either side of the truck's windshield are in danger of detaching and creating a road hazard. It's not funny, but, I mean, when you see it...

It's funny because it's ironic: it's supposed to look like you're living in the future, but it's falling apart.

It's RoboCop but with, like, an old Hyundai vibe. And so the issue is actually the adhesive: it's vulnerable to environmental embrittlement, according to the NHTSA.

I thought that was your phrase.

So drivers can get the panel replaced by Tesla for free, actually, which, I mean, can you imagine going to do that?

Fool's errand. Yeah, I mean, Tesla shares are down forty-two percent this year, and I can't imagine this story will help. On the subject of corporate drama, I was drawn to this story this week of tech-enabled business espionage. Here's the headline from TechCrunch: Rippling sues Deel, Deel denies all legal wrongdoing, and Slack is the main witness.

Can you explain to me how Slack is a witness? Actually, Slack is a witness to a lot of our bacchanalia for this show, so Slack as a witness actually makes sense to me.

So yes, I mean, this is kind of a story about who's watching you on your work software or computer. But Rippling and Deel are HR startups that offer payroll and other HR resources, and they're rivals. This is the Pepsi and Coke of HR software.

Well, that's the sexiest sentence.

Rippling recently announced a lawsuit against Deel, a fifty-page complaint that alleges racketeering, unfair competition, misappropriation of trade secrets, and more. And here's the real intrigue: the lawsuit centers around an employee that Rippling alleges acted as a spy for Deel.

The worst thing you could be at Coke is a Pepsi spy, and vice versa. But then again, I'm like, who cares about trade secrets at Rippling and Deel? But I guess it's the context.

These are twelve- and thirteen-billion-dollar HR companies. They're almost exactly the same size, they have a very similar offering, and so, yeah, I think the stakes are high. And, you know, there's been a kind of public relations war going on. In parallel, Rippling created a game on their website called the Snake Game. You of course remember Snake.

I mean, I just downloaded a block puzzle game again, which is the closest I can get to Nokia.

Well, it's not the closest you can get, because you can now play Snake on Rippling's website. Here are the instructions: Deel often claims to be a one-stop solution for all your global payroll needs, but their customers pay the price for gaps beneath the surface. Play this game to find the difference between Deel's claims and the reality of their product.

This is the pettiest. This is pettier than some of the stuff that my friends did in college.

But I'm sure you want to play the game, right, Yeah.

So it won't be interesting to the listener to listen to us play. But just, like, imagine playing Nokia Snake where you gobble up all of the alleged Deel falsehoods that are at play. But the lawsuit was filed by Rippling.

That's right. And the reason it fascinates me, as I mentioned, is because it's kind of about privacy, especially privacy as an employee, right, Like, if you think you're alone on your work computer, you're wrong, man.

Well, especially if you're a spy.

Absolutely. According to Rippling's own lawyers, the company keeps a detailed log of what people do on Slack, like when an employee accesses Slack channels, or conducts searches on Slack, or opens a document on Slack. All those things are logged. And so the lawsuit contends that, according to this logged activity, the Rippling employee who is allegedly a spy for Deel started looking at content associated with the word Deel at a much higher rate beginning in November, including perusing sales-related Slack channels that weren't necessary for his job in payroll operations.
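
To make that kind of log analysis concrete, here's a minimal, purely illustrative Python sketch. It is not Rippling's actual tooling and it does not use the real Slack audit API; the event records and field names are hypothetical. It just counts how often one user's logged searches mention a keyword, month by month, which is the sort of tally that would make a November spike stand out.

```python
from collections import Counter
from datetime import datetime

# Hypothetical audit-log records; real ones would come from an
# enterprise audit export, not a hard-coded list like this.
events = [
    {"user": "employee_x", "action": "search", "query": "payroll run",  "ts": "2024-09-03T09:40:00"},
    {"user": "employee_x", "action": "search", "query": "deel pricing", "ts": "2024-10-14T10:02:00"},
    {"user": "employee_x", "action": "search", "query": "deel",         "ts": "2024-11-05T16:21:00"},
    {"user": "employee_x", "action": "search", "query": "deel churn",   "ts": "2024-11-12T11:08:00"},
    {"user": "employee_x", "action": "search", "query": "deel deals",   "ts": "2024-11-20T14:55:00"},
]

def monthly_keyword_searches(events, user, keyword):
    """Count logged searches by `user` that mention `keyword`, grouped by month."""
    counts = Counter()
    for event in events:
        if (event["user"] == user
                and event["action"] == "search"
                and keyword in event["query"].lower()):
            month = datetime.fromisoformat(event["ts"]).strftime("%Y-%m")
            counts[month] += 1
    return counts

if __name__ == "__main__":
    for month, n in sorted(monthly_keyword_searches(events, "employee_x", "deel").items()):
        print(month, n)  # a jump in 2024-11 is what an investigator would notice
```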

But when they say looking, what are we looking at?

Well, I'll give you a clear example of this. So this morning, when you Slacked me and said, how can I find the login details for all of our subscriptions, what I did was search for login in Slack, and that's how I found it. So you used me as a human search function, but there actually is a search function.

They're like, Oz has been trying to log in...

Quite a bit recently. So that's exactly right. So this dude was typing Deel into the Rippling corporate Slack and seeing what he could find, allegedly.

Fascinating. And so Slack was basically, I mean, Slack wasn't doing anything. But, like, when people went looking for what this guy was doing, Slack was like, here's what this guy's doing.

People didn't just go looking, they actually created a honeytrap. Explain. So Rippling created a Slack channel and started a rumor that it had a bunch of embarrassing information about Deel in it, and of course, allegedly, this employee headed straight for that Slack channel.

And the spy employee searched for this channel, and that was picked up by the Slack logs.

That's according to the lawsuit, anyway. It also claims that when the alleged spy was asked by court order to hand over his phone, he escaped to the bathroom, locked the door behind him, and possibly even attempted to flush the phone down the toilet.

As someone who's had anxiety attacks, at first I hear, well, honey, that phone ain't going down the toilet. And then I'm like, you know what, I would flush a phone down the toilet, especially if it was a Nokia phone. That thing goes right down. Not my iPhone.

I know.

You have to imagine, fishing your wet phone out of the toilet after failing to flush it is not...

The way I have fished phones out of a toilet, I'd be a millionaire if you paid me for it.

Next, I have a story that doesn't exactly debunk the claim that crypto is the Wild West. The story is about something called a crypto mixer. Can you guess what that is, Karah?

Like something between the world's saddest job fair and the saddest prom.

That's what I thought too, but then I asked Google Gemini. And here's how Gemini explains it: a crypto mixer, also known as a tumbler, aggregates cryptocurrencies from multiple users, mixes them, and redistributes them randomly, making it harder to trace the origin and ownership of the funds. So, of course, there's a public ledger for cryptocurrency, the blockchain, but after the mixing process, the coins are redistributed back to the original depositors in a random manner.
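
To make the mixing idea concrete, here's a toy Python sketch, purely illustrative and not any real mixer's code: equal-sized deposits go into a shared pool, the payout order is shuffled, and each depositor is paid back at a fresh address, so anyone reading the public ledger sees deposits in and equal payouts out with no obvious pairing between them.

```python
import random
import secrets

# Toy illustration of a crypto mixer ("tumbler"), not any real mixer's code.
# Mixers typically require equal-sized deposits so amounts can't be used
# to link an incoming coin to an outgoing one.
FIXED_AMOUNT = 1.0

def mix(deposits):
    """deposits: list of (source_address, fresh_payout_address) pairs.

    Returns payouts from the shared pool in shuffled order, so the
    on-chain record shows N deposits in and N equal payouts out with
    no obvious pairing between them.
    """
    payout_addresses = [payout for _, payout in deposits]
    random.shuffle(payout_addresses)
    return [(addr, FIXED_AMOUNT) for addr in payout_addresses]

if __name__ == "__main__":
    users = [(f"src_{i}", f"fresh_{secrets.token_hex(4)}") for i in range(5)]
    for addr, amount in mix(users):
        print(f"pool -> {addr}: {amount}")
```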

It's like if everybody bought a bottle of vodka and you made a huge batch of martinis, and then you were like, can I have my drink? You get your drink, and you're like, I don't know if that was from my bottle or somebody else's vodka bottle.

I think that is very, very well put, and I'm sober. The metaphor the founders of this particular mixer chose, however, was the tornado, hence the name Tornado Cash.

That sounds like a celebrity baby name.

Well, wait till we get to the alleged perpetrator. In twenty twenty-two, the Justice Department alleged hackers, including the Lazarus Group, which is allegedly run by the North Korean government, laundered billions of dollars in stolen assets using Tornado Cash. So the US took action and sanctioned the company.

So nobody in the US could use Tornado Cash anymore.

Yeah, that was the aim. That was one of the consequences of the sanctions. Another was criminal charges against two founders of Tornado Cash. The Wall Street Journal recently spoke to one of the founders, whose name is, get this, Roman Storm.

I can't. These are Kardashian children's names.

That's right, Roman Storm, the founder of Tornado Cash. But you know, all jokes aside, he was actually arrested at gunpoint after federal agents stormed his home in twenty twenty-three to arrest him for his involvement in Tornado Cash. He's now out on bail and says he's not guilty of charges of money laundering and sanctions violations. In this week's Wall Street Journal article, he maintains that the software he created is neutral and has both good and bad use cases.

But it's my understanding that the whole point of the blockchain is to create a ledger that traces back the original transaction. So what could possibly be a good use of crypto mixing?

Well, Storm said it's financial privacy. So an example he used, actually, was that you could donate crypto to assist Ukraine in its war effort without identifying yourself, for example, to Russian authorities.

So that actually makes a lot of sense.

There is some good news for Storm, we should say: the US actually lifted sanctions against Tornado Cash, and while Storm is still awaiting trial, his allies seem to be encouraged by the Trump administration's desire to make the US the crypto capital of the world.

Unbelievable. I love that story, Oz. The next story that I want to talk to you about is one that I have not shut up about since, probably, we started doing Sleepwalkers.

Seven years ago, seven years ago.

And you know what, a lot of people in the US government have not shut up about it over the last seven years either. But this is the first time in my research that I've felt like, okay, there's actually going to be some movement on this, because bipartisan members of the US government are getting involved. So, Section two thirty sounds extremely boring, and I bring it up at parties and people look at me like, shut up. But if I want to kind of plagiarize the way that Section two thirty has been described by those who care about it, it is the twenty-six words that created the Internet, the twenty-six words being the following: no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider. What does that mean to you?

Doesn't mean that much to me on the face of it. But I know that those twenty-six words have been some of the most consequential in the history of technology legislation, and arguably of our lifetimes. They were an addition to the Communications Decency Act that was passed in the mid-nineties, and those twenty-six words that created the Internet basically meant that Internet platforms were not legally liable for third-party content, like content created by users that lives on their platforms. When the law was written, it was basically designed to allow these Internet platforms to grow and to host content without fear of liability.

And did that succeed?

Boy did that work.

And it was also related to free speech concerns, which is still something that, you know, people are talking about today, which is this idea of, like, how much should the internet be moderated?

Right?

And so, ultimately, as you were saying, Section two thirty allowed companies to put less effort into regulating their content because they're not treated like publishers.

Publishers obviously are constantly subject to lawsuits for saying stuff which is defamatory or for publishing content which is extreme, whereas websites and social media platforms are immune.

Right. And in the nineties, nobody could have anticipated how this law would fuel the explosion of what we now know as social media. The story here is that thirty years later, there is now some bipartisan support for modifying Section two thirty. Two senators, one being Lindsey Graham, a Republican from South Carolina, and the other Dick Durbin, a Democrat from Illinois, plan on introducing a bill to set an expiration date for Section two thirty, and that date is January first, twenty twenty-seven. Mark your calendar: less than two years from now. But The Information reported that the lawmakers don't actually want to repeal Section two thirty outright. It's really more of an invitation to the tech companies to negotiate, which is something that they have been unwilling to do without an ultimatum thus far.

I think one of the interesting things about this story is that the first place it was reported was The Information, which is like the go-to source for the whole tech industry.

And I think what that signals is that big tech is actually watching very closely. And one of the things that I've noticed is just this increased anger towards tech companies in recent years from both sides of the aisle. You know, there's the growing concern for the safety of kids online and frustrations about failed attempts to regulate big tech again on both sides of the aisle, which may bode well for this yet to be introduced Bill.

It's definitely a story we'll be following up on over the coming months. But right now, let's take a quick break. We'll be back with a few more headlines, and then we'll hear from Geoffrey Fowler of The Washington Post about what 23andMe's bankruptcy filing could mean for genetic data. Stay with us. Welcome back. We have a few more quick headlines for you this week before we dive into our conversation with Geoffrey Fowler about 23andMe. Casey Newton wrote this week in his newsletter about two new studies, one from the MIT Media Lab and one actually from OpenAI itself, which suggest that heavy chatbot use is correlated with loneliness and reduced socialization. Now, of course, as Casey Newton and the studies both point out, correlation is not causation. There's always a possibility that loneliness is what propels someone to seek an emotional bond with a bot. But the studies do point at a potent danger, which is that engaging with compelling chatbots might pull people away from connections with other humans, which may in turn drive more loneliness and more dependence on a computer companion.

This is the digital chicken and egg. Am I lonely, seeking a chatbot? Or is the chatbot so hot that I can't stay away?

That's it.

There's another story, which Semafor reported on, that an off-Broadway theater in New York is offering live AI-powered translations for the play Perfect Crime. Theatergoers can scan a QR code and choose from sixty languages. The actors wear microphones that feed their voices directly into the translation system, so no side conversations or audience noises get accidentally picked up by the AI-powered translation service. This is an incredible use of AI, which is like, you could be a tourist from anywhere and see this show and still understand it.

Finally, obviously, we can't pass the headlines by without a nod to this week's...

Dumbest Pentagon fiasco.

Hashtag SignalGate. Just a few weeks ago we had Meredith Whittaker, who runs Signal, the encrypted messaging app, on Tech Stuff, and Signal is absolutely the flavor of the week, because Jeffrey Goldberg, the editor of The Atlantic, was accidentally included in a Signal group with JD Vance and Pete Hegseth to discuss highly sensitive battle plans for Yemen, which are now absolutely all over the Internet. The lesson here is that it turns out encryption doesn't work if you invite journalists into your group chat. The story broke the internet. Per Axios, hashtag SignalGate is the most interacted-with story of the year to date. The Atlantic story, which is titled The Trump Administration Accidentally Texted Me Its War Plans, is approaching five hundred thousand interactions on social media, which is more than the second- and third-place stories of the year combined. For your reference, those two are Meet the World War II veteran that recently celebrated his hundredth birthday, from the venerable news source 11Alive, and Elon Musk's net worth dropped twenty-nine billion dollars in one day as Tesla stock tanks, from Business Insider. So lots of emojis in government communication, many more than one would suspect: fire, flames. So for our next segment, we're going to explore what happens when a company that you've given your personal data to hits a crisis. Now, companies file for bankruptcy all the time, from major retailers and restaurant chains to tech startups and bitcoin exchanges, but they usually don't own the DNA sequences of millions of people. Such is the case, Karah, with 23andMe.

Yeah. They have my... oh boy, do they have my data?

So you were one of the people in the twenty-tens, when this was the hottest game in town, who couldn't resist.

I did it in twenty twenty-one, because it was a thing that somebody bought me as a gift, and I kind of sat on it for a little bit.

You're buying the dip as a user, not as an investor.

One thousand percent. And I, I don't know, I just looked at it. I'm obsessed with my family history, and to get a little bit more serious, you know, it's like I have a lot of dead relatives and I don't have the ability to ask them, you know, who was in our extended family. And so I decided to take matters into my own hands. And, you know, lo and behold, found out that I'm incredibly Jewish, which doesn't surprise me, and that's it, but that was still exciting. I struggle with something that I think is really at the core of this bankruptcy filing and how it affects people, which is: I love to give my data away. I don't care. I want it now. I want Wi-Fi at LaGuardia, I want Wi-Fi in the middle of the street, and I'm going to give you my password, I'm going to give you my email address. I also report on a technology podcast, where I know that that's a terrible thing to do. And such is the case with 23andMe, where everyone was like, Karah, don't do that, just don't do it. Talk to your family members and don't do it.

Now, can I ask you? One of the things that 23andMe was trying to do to kind of make its business model more robust than just providing a one-off test that you literally never needed to repeat, was to build in these social aspects to the platform. Did you connect with any distant, previously uncontacted relatives?

Yes, and we figured out we were related. You know, it was not a profound thing. And look, I know that for a lot of people, what happened was people found out they had, like, a brother or another family. I think it had a profound effect on a lot of people. It did not have a profound effect on me.

So my sister is pretty interested in genealogy, family trees, et cetera, Karah. And she, to her enormous credit, said to my father, would you mind if I do 23andMe? And he said, I would really prefer it if you didn't, because I don't want to know if there's a serial killer in the family. We both raised our eyebrows at this, and of course the question arose: was there a suspect in mind?

You're like, who, exactly?

So she didn't do it, and you know how paranoid I am. Of course I didn't do it either. But fifteen million people did. And many of those people who know about the bankruptcy, I'm sure, are concerned about what's going to happen to their personal data. Here to help us understand is Geoffrey Fowler, who writes a column for The Washington Post all about the user experience of technology, the good and the bad. And what I particularly like about his writing is that he really centers the user in his column. It's tech journalism, but it's practical tech journalism: what does this mean for me? How can I make things better? What should I know? And so he wrote a column this week that caught both of our eyes.

Geoffrey, welcome to Tech Stuff.

Hello, hello. So you wrote a story this week, and the headline was Delete your DNA from 23andMe right now, which was quite a captivating headline. Could you just start off by giving us a little bit of the background on 23andMe, and then why you wrote this story?

I wrote it because fifteen million people around the world spat into a little vial and sent their DNA to a company in Silicon Valley with the promise of unlocking all sorts of things about our ethnicity, our background, our family tree, and our health. But it turns out that the company that we trusted with that information, that really, really precious information, was a terrible business, and it declared bankruptcy late Sunday night. And if there's one thing I know, it's that you do not want a company that is bankrupt to be responsible for protecting your very precious information. And that's the situation we find ourselves in. So the moment I heard about it, I was like, we need to press publish on this story right now. And it seems like a lot of people agreed, because so many people have been trying to delete their data from 23andMe that the site has been down for large periods of this week.

And yet you, like me, could not resist getting a 23andMe account. What were your reasons for getting an account originally?

You know, I think it sort of tracks the arc of a lot of our relationships with technology and with data over the last twenty years. Look, this was the two thousands for me. There was so much possibility out there, right? And you know, in two thousand whatever, when I did my 23andMe test, what I could imagine was what the company was promising, which was just a fascinating idea, right? That together, if they were able to gather enough DNA data, they could use the power of technology to unlock all kinds of secrets about the human body. They could already, even in those early days, tell you curious things about yourself. Like, I remember one piece of information they gave me is that I have a gene that makes me likely to have wet earwax.

Wow. I feel so seen. I was always wondering, is it a thing? It is.

It's very validating.

Yeah. And so while the wet earwax thing is not going to change the world, they maybe could learn things about cancer, or all kinds of mysteries about the human body. That was, and remains, a pretty exciting idea.

All of these things are true about 23andMe. And as a user, I also experienced, you know, some interesting information and connections with people that I didn't even know were distant cousins. But as you said earlier, this is no longer a viable company. Can you talk a little bit about why it went bankrupt and how it started to run into trouble?

So I'll preface this by saying, like, I've not been tracking the financial fortunes of 23andMe over the years, but I know at a high level they've tried a lot of different ideas to make the business work. Because it turns out that when you ask people to spend roughly one hundred bucks to send you their DNA, they only want to do that once, right? So 23andMe had this problem, which was, once people did the test, you couldn't get any more money out of them. So they had to figure out other ways to keep going as a business, to make some money out of the data that they had. And I think the big idea that they had was that it was going to be useful for developing drugs, and they made a big partnership with GSK, and GSK took a stake in the company, and that went on for several years, but it wasn't producing results at a scale that was helping them, and I guess the costs associated with it were just gigantic. They've then tried other things. They tried, I think most recently, selling GLP-1s as kind of an at-home health kind of service, like a pivot to health, and it just didn't work. Around twenty twenty-one they went public and they were valued at six billion dollars, roughly, and then as of Sunday night, when I checked, it was around fifty million dollars. I have no idea what it would be now that it's in bankruptcy court. Probably less, probably less. And along the way they had one other thing that we should definitely talk about, which was they had a big hacking attack.

Yeah, I wanted to ask you about that, because I think, you know, you sort of pointed out that there was a time in the two thousands where we just couldn't wait to give our data to all comers and see what kind of surprising results we got back. That changed, which presumably became a headwind for 23andMe. But there were also some specific issues they had that may have put customers off.

Yeah, and I think the hacking attack is a big one. My memory of it is that they had a problem with users that were reusing passwords, or using passwords that had been otherwise compromised on the internet. This is a common thing. To all our listeners: if you do not already have a password manager and use it to get distinct passwords on every single site and app you use, do that now, because that is a big risk. But it turns out a lot of 23andMe users fit in that bucket, and folks were able to get into a whole bunch of accounts and then even offered to sell some of what they learned online. I think it was mostly information about, like, family trees. I don't think they got people's DNA samples, but still, it was enough to really, really spook people. Because at the core of it, I think people are, and even back then were, pretty nervous about the idea of spitting into a vial, and any breach of that trust by 23andMe was just killer.
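
As a concrete illustration of that advice, here's a minimal Python sketch, using only the standard library, of generating a distinct random password per site. This is not how any particular password manager is implemented; a real one also encrypts and stores what it generates and fills it in for you. The site names below are just placeholders.

```python
import secrets
import string

# Characters to draw from; a real password manager lets you tune this per site's rules.
ALPHABET = string.ascii_letters + string.digits + "!@#$%^&*-_"

def generate_password(length: int = 20) -> str:
    """Return a cryptographically random password of the given length."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

if __name__ == "__main__":
    # One distinct password per site, so one breached or reused password
    # can't be replayed (credential stuffing) against your other accounts.
    for site in ["example-dna-site.test", "example-email.test"]:
        print(site, generate_password())
```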

After the break, we're going to get Karah to finally delete her account. Stay with us. So, Geoffrey, what happens if you don't delete your data from 23andMe? Like, what would happen then?

If you don't delete your data from 23andMe, it is now essentially up for sale. So the company has said that it is in bankruptcy court, which means it has to either find a buyer for the whole company or sell it off for parts. And the most valuable asset they have is the DNA data of, presumably, still millions of people. Who is going to buy that? We don't know. There's lots of speculation. Maybe insurance companies.

Insurance? Well, we have some laws in the US that protect people from...

having their genetic data used to keep them from getting things like healthcare coverage. But it's not an airtight law, and it doesn't apply to other kinds of insurance. And again, the data can also be bought by somebody, and we have no idea what they're going to do with it, and they may not try to do anything with it for a long time. Because that's the thing about data, and I think that is the key takeaway lesson for everybody from this: you can think you know what's going to happen to it at one time, and then in the future it has a totally different use. And that applies to our genetic data, but also to all kinds of things about our lives.

Can anybody delete their 23andMe data? Because I think I read in the piece there's a California law that makes this particularly easy to do, or at least possible to do. But what about other people, elsewhere in the US or elsewhere in the world?

I'm glad you asked this question. So we've been pretty sort of gloomy so far in this conversation. But there's a glimmer of good news here, so let's talk about that.

I love it.

Starting back in twenty eighteen, California passed a law that said you, dear consumer, user, citizen, have some privacy rights, and among those is the right to delete data that companies collect about you. And then a couple of years later, actually during COVID, California passed another law that everybody was so distracted by COVID that we forgot it even existed, which is a genetic information privacy law. It goes even deeper and says you not only have the right to delete your data from a company, you also have the special right to delete genetic data. You have the right to tell them to destroy your sample, and you have the right to tell them, withdraw me from any research that's going on. So hooray, California, for this law. Other states, seeing that the federal government in the US was doing absolutely nothing to protect our data privacy rights, copied it, and I am now happy to say there are about twenty states that have versions of this law that require giving people the right to delete their data. I have a little map that I keep, which gets updated, which is like my little sign of hope in the dark, dark days that regulation can work and we can have some help. So the reality is the majority of Americans are now covered by these state laws, and so companies, including 23andMe, basically treat all Americans the same way now, which is to say, yes, we will delete your data. Now, a lot of people have been writing to me saying, well, how do we know they're really deleting it? It boils down to: if they don't, they could get into big trouble.

But who's "they"? I mean, is this the receivers, the creditors, the stakeholders of a bankrupt company? I mean, who do you go after in the event they don't take this duty seriously through the bankruptcy process?

It's true that, like, they have very little left to lose, you might argue, but there is a spotlight on them because they're going through this process, right? And so there's going to be a judge involved. And to be fair to 23andMe, they have already said, look, we are going to handle your data the same way we always have throughout this process, and we're going to try to look for a buyer that will uphold the same uses. But the truth of it is, because America has no real laws that cover this kind of data, all the new buyer would really have to do is update a privacy policy, give you notice of that, and then again give you that chance to delete it if you don't want to be a part of that. And the truth is most Americans are not paying enough attention. Even if I do another Washington Post headline, but this time in all capital letters, you know, a lot of people wouldn't pay attention to that.

So you may have started a genetic bank run. We don't know how many people are actually trying to delete their data, certainly enough to crash the website, but is it a majority, as far as we know?

I asked 23andMe yesterday, and they wouldn't say.

You've convinced an undisclosed number of people to delete 23andMe. I'm gonna be honest, Geoffrey. I read your article. I didn't do anything, and it's because I'm lazy. I mean, there's no good reason. You wrote an incredibly compelling article. I went on Instagram instead. Like, there's no good reason.

But how do you actually do it?

Okay, you want to do it together? Right now?

Yeah? Yeah, hold on one second. Can we get your phone?

Yeah?

Okay, so I'm opening it, of course, like, I'm opening it with my face. So tell me how to delete. I'm now in the app.

It says, hey, Karah. So go up to, I think it's on the upper-right corner.

Okay, So I'm there. God, I'm gonna miss my cousins. Keep on.

Once you tap on your little profile, you can look for Settings.

It's giving me a chance to pay for updated information.

Okay. Once you're in Settings, keep scrolling all the way down, and towards the very end you'll see an area called 23andMe data. Got it?

Got it.

Yep?

Okay, then click view.

Well, there's an access your data, or a delete your data.

Oh, you want to delete?

We were, like, shook. We were already at a fork in the road there. Geoffrey, thank you.

I wasn't sure.

It will give you a chance to download some of the data, including everything from their report about your health and your family.

And report summary. Here we go.

And then after you get that stuff, okay, you're going to scroll down and you're going to click delete. Now, a couple of things to mention about this while you're clicking on things. When you ask to delete, they also do two more things. One, they also delete your specimen, if you'd left them the physical vial that you sent them in the mail. And two, they withdraw you from any health studies that you might have opted into.

I'm sorry, I just want to share one thing, Geoffrey, and this has nothing to do with our podcast. You and I, Geoffrey, are both likely to have wet earwax.

Yeah, let me tell you, if you ever live in a really moist place, that earwax is going to sneak up on you.

I'm deleting.

Here we go, permanently delete. Oh gosh, here we go. I'm like, in this house, we delete 23andMe. And then it says profile data, so, delete data. So here's what it says: we have received your request to delete your data and have sent an email to the email address linked to your 23andMe account. Please locate this email and follow the steps to confirm your request.

That's right. So you go into your email app, and then there's an email in there, and then you press, do it.

Do it. Permanently delete all records. Permanent.

Do remember, should you decide that you really still want to know about such things, you could swab again. The one thing about your DNA is it's not changing. And that's, in some cases, I guess, a good thing, and sometimes a bad thing.

Yeah, Geoffrey, it just decays. I have two questions for you. The first is: how often, as the technology columnist for The Washington Post, do you have to talk people through how to do banal technology tasks?

We actually have a special thing on our website where it's like, this is the box for us to give people instructions for things. So in fact, if you pull up my column on this and scroll down a little bit, I've got one of those boxes that I was literally reading off of while we were going through this process together. And that's fine. Look, this stuff is hard. These companies can act in evil ways, and it's not your fault. So the whole premise of what I try to do as a tech columnist at The Washington Post is to be on the side of the user and, like, help you fight back when you can, right? And I think that's super important.

And the second is: what do you want the takeaways to be from the 23andMe story?

One, perhaps we shouldn't have all sent our DNA to a Silicon Valley corporation that has Silicon Valley corporation values and ways of doing business. And the bigger one is, like we've been talking about here, that it's really hard to know in any given moment what your data could be useful for at some point in the future. And so the only really reasonable thing to do to protect yourself is to allow as little of it as possible to be collected, which sounds like an insane-person thing to say in our modern economy, when we're literally being watched in every single potential dimension. I once did a piece for the Post where I hacked into my iPhone to watch what it did while I was sleeping at night. Oh wow. And saw it sending data out to, like, hundreds of data brokers and all these sorts of things. It was terrifying. I wasn't even using the phone, but it was, you know, communicating all this personal information about me. So how do we deal with that fact? This runs a little bit in tension with, like, I love technology. I want to use a cool phone, and I have seventy-five connected gadgets in my home that make all sorts of cool things happen. I think the answer is you just have to be vigilant, and you need the help of regulations that have our interests at heart to sort of put boundaries around what these companies can do.

And that's why I'm here to sing the praises of the California privacy protection law, and hopeful that we can maybe get some more. Geoffrey, thank you.

Thank you so much, Geoffrey, and thank you for helping me do something that I should have done many days ago. You bet.

That's it for this week for Tech Stuff. I'm Oz Woloshyn. And I'm Karah Preiss.

This episode was produced by Eliza Dennis and Victoria Dominguez. It was executive produced by me, Oz Woloshyn, and Kate Osborne for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. The engineer is Bihied Fraser, and Kyle Murdoch mixed this episode; he also wrote our theme.

Join us next Wednesday for a very special edition of Tech Stuff: The Story, when we'll share an in-depth conversation with Zak Brown, the CEO of McLaren Racing, from the McLaren Technology Center. So F1 fans, tune in. And please rate, review, and reach out to us at tech stuff podcast at gmail dot com. We really want to hear from you.
