Week in Tech: Can a Woolly Mouse Make a Mammoth?

Published Mar 7, 2025, 10:00 AM

What if AI could read your mind? This week in the News Roundup, Oz and producer Eliza Dennis explore the latest tech investment in the US, Meta's brain-to-text breakthrough, and the creation of the woolly mouse. On Tech Support, 404 Media's Jason Koebler takes us to an AI-generated film festival… spoiler: the tech isn't there yet.

Welcome to TechStuff, a production of iHeart Podcasts and Kaleidoscope. I'm Oz Woloshyn, and today we'll bring you the headlines of the week, including a genetically edited rodent, the woolly mouse. Then, on today's Tech Support segment, we'll talk to 404 Media's Jason Koebler about what the future of AI movies could look like. All of that on the Week in Tech. It's Friday, March seventh. I'm excited to be back in the studio this week with our producer Eliza Dennis.

We're glad to have you Stateside.

Yes, it's felt like I was away for a long time.

I'm wondering if that had something to do with this news cycle though.

Yeah, there's a lot, lot, lot to cover, so should we jump in?

Yeah, absolutely. So.

It was a bit of a confusing week when it comes to chips and semiconductors, and I'll come back to why it was confusing. But Monday saw President Trump hold a press conference with the Taiwan Semiconductor Manufacturing Company, aka TSMC. The clue's in the name: the company manufactures semiconductors, and they produce ninety percent of the world's super advanced semiconductor chips. These are the chips that power AI training models but also devices, and they are basically the backbone of the new global economy. However, the vast majority of the manufacturing takes place in Taiwan, and so many in Washington and beyond have warned that TSMC's dominance in the chip industry could create a national security risk, given that Taiwan is squarely in the bullseye of China's territorial ambitions. But this week, the Taiwanese company pledged to invest one hundred billion dollars in manufacturing chips on US soil.

You know, this is so interesting to me because it comes after multiple announcements over the last couple of months about investments in things like data centers and AI infrastructure. And that was with Stargate, and then Apple actually recently made a pledge to make more products domestically with domestic contractors.

Yes, I think they talked about five hundred billion dollars. But what was really interesting was that as soon as Tuesday, when Trump addressed Congress, he talked about his aggressive desire to dismantle the act that TSMC is actually using in part to fund its semiconductor manufacturing in the US. The CHIPS Act was Biden-era legislation that basically created a platform for manufacturing semiconductor chips in the US. I don't know how to square those two things, but that actually brings us to our next headline, which is a breakthrough in directly interpreting and reading brain waves and converting them to text.

The superpower I want.

Yes, exactly. Well, you may be able to buy it if Meta has anything to do with it, because they announced that, in partnership with the Basque Center on Cognition, Brain and Language in Spain, researchers have been able to decode unspoken language, often reconstructing full sentences directly from brainwaves, without requiring any surgical intervention. This is all stuff which can be measured outside the head.

Yeah, and that's really the breakthrough here, right? Because other research from companies like Neuralink has been extremely invasive, you know, electrodes being implanted into the brain.

Yeah, that's right. And this research is all about kind of putting monitors on the skull or around the head to be able to read brain waves without having to directly hook into the brain, which is obviously much less scary. And there's amazing promise for people with cognitive impairments or brain injuries to be able to convert their thoughts into text and therefore speech. But there are also some concerns, right? The Vox headline was "Meta's brain-to-text tech is here. We are not remotely ready." And of course the big concern here is privacy, if private companies can actually read our thoughts. But there's actually a long way to go before this research leaves the lab. Nonetheless, the experiment was kind of amazing. Thirty-five volunteers sat under magnetic brain imaging scanners and typed on a keyboard. Based on prior training, an AI model was able to predict what they were writing, and Meta's researchers accurately decoded between seventy and eighty percent of what people typed. In other words, with seventy to eighty percent certainty, it could know before I clicked a T that I was about to click the T. And the real promise here is actually that data from this research is beginning to give neuroscientists a path to understanding how abstract thoughts are converted into language by the human brain.
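To make that experimental setup a bit more concrete, here is a minimal sketch in Python of what a next-keystroke decoder looks like in principle. Everything in it is a stand-in: the random "MEG" features, the window shape, and the simple classifier are all invented for illustration, and Meta's actual system used deep neural networks trained on real magnetoencephalography recordings.

```python
# Toy sketch of next-keystroke decoding: classify which key is about to be
# pressed from a window of brain-signal features. The data here is random
# noise, so accuracy stays near chance (~1/26); the point is only to show
# the shape of the task, not Meta's actual model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_events = 2000                  # typing events recorded in the scanner
n_channels, n_samples = 32, 25   # hypothetical sensor grid and time window
n_keys = 26                      # one class per letter

# Each event: a flattened (channels x time) window preceding the keypress.
X = rng.normal(size=(n_events, n_channels * n_samples))
y = rng.integers(0, n_keys, size=n_events)  # the key that was pressed

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

clf = LogisticRegression(max_iter=500).fit(X_train, y_train)

# On real MEG data, the reported decoders reached 70-80% of characters;
# on this noise, the score hovers around chance.
print(f"top-1 accuracy: {clf.score(X_test, y_test):.2f}")
```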

Then I think the other part of this is that we're getting closer and closer to this idea that we can have wearables that do this kind of tech.

Totally. But of course, a wearable headset that can actually read your thoughts and translate them into language is something that, you know, conceivably could change a lot of people's lives. In another kind of science-fiction-becomes-science-fact story, this one is about the woolly mammoth. The headline from NPR was just irresistible: "Hoping to revive mammoths, scientists create woolly mice." And I think one of the scientists said, we knew we could do it, but we didn't know they would be this cute. They're worth a look. But the story is about a company called Colossal Biosciences, and they are, by their own account, the first and only de-extinction company.

Okay, this was a concept I had never heard of until this week.

Yeah, this is one I've been intrigued by for a long time, and I hope we'll be able to cover it on an episode of The Story before too long. But Colossal's website points out that nine hundred and two species are extinct and more than nine thousand two hundred are critically endangered, and their mission is to restore extinct species to preserve biodiversity. It's a little controversial. Some people think there are more efficient ways to do conservation than reviving extinct species. But to that I would say, look at the woolly mouse. Whether or not you think this is the most efficient investment, it is absolutely wild. So picture a mouse with fluffy, orange-tan fur that looks like it got very wet and then got a blow-dry at the salon. You've got the picture.

They are extremely cute.

And the way Colossal made them was by first studying the woolly mammoth genome and then genetically engineering mice, modifying seven key genes to make them more like woolly mammoths. The wool is obviously the most visible element, but there are also some things that are invisible, like the way the mice store fat; their fat metabolism makes them much more able to survive in the cold. And according to Colossal, the plan is to implant woolly-mammoth-esque modified embryos into Asian elephants by twenty twenty-eight. This week was also the Oscars, and we both saw the movie that won Best Live Action Short.

Please tell people about it. It's wonderful.

So it's a Belgian-Dutch coproduction called I'm Not a Robot. What did you make of it?

I was extremely tickled by this premise.

So, for those who haven't seen it, the film was written and directed by Victoria Warmerdam, and it's about a music producer who fails a series of CAPTCHA tests and, in so doing, comes to question whether she's in fact human.

I mean, the minute I knew that we were having a CAPTCHA test as part of the plot of this movie, I was all in. I don't know if you have this feeling, but I hate failing CAPTCHA tests, especially when you have to click "I'm not a robot" and all you have to do is choose squares that show images of street lights or motorcycles or bikes. How can I get that wrong?
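As an aside, the grid-CAPTCHA mechanic is simple to sketch. This is a hypothetical server-side check, not how reCAPTCHA or any real service actually scores you (real systems also weigh behavioral signals, which is partly why honest humans can still fail):

```python
# Hypothetical sketch of a grid-image CAPTCHA check: the server knows
# which tiles contain the target object and compares them to the clicks.
def check_captcha(selected: set[int], labeled: set[int],
                  tolerance: int = 1) -> bool:
    """Pass if the picks match the labeled tiles, allowing a little slack
    for ambiguous tiles (a sliver of a traffic light in a corner)."""
    missed = labeled - selected        # positives the user didn't click
    false_picks = selected - labeled   # clicks on tiles with no target
    return len(missed) <= tolerance and len(false_picks) <= tolerance

# Tiles 2, 5, and 6 contain traffic lights; this user clicked 2 and 5.
print(check_captcha({2, 5}, {2, 5, 6}))   # True: within tolerance
print(check_captcha({0, 1}, {2, 5, 6}))   # False: flagged as suspicious
```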

Yeah. So she's failing the tests again and again, even though it looks like she's doing it right. And then she gets a pop-up with another quiz, and one of the questions is: did your parents die before you met them? And she answers yes. I don't want to spoil the whole plot, and it gets pretty eerie, but it's a fascinating film, well worth a watch. You can actually check it out on the New Yorker website, because they were involved in releasing the film, and on YouTube. And as a tech nerd, I was rooting for them to win Best Live Action Short, and they did.

Yes.

Congratulations, team I'm Not a Robot.

So stick around as well after the break for a look at how AI was used in this year's Oscar-nominated feature films, including The Brutalist, and for a conversation with Jason Koebler about what it's like to attend an AI film festival. Stay with us.

Welcome back. The Oscars were on Sunday, so we're going to stick with movies. Back in twenty twenty-three, the Hollywood writers' strike was this fascinating early example of a very public negotiation over how AI might, could, and even would disrupt and displace human labor. Ultimately, the Writers Guild of America signed an agreement with the Alliance of Motion Picture and Television Producers that generative AI would not reduce or eliminate writers and their pay. But this was not a commitment by the industry not to use generative AI in filmmaking, far from it. In fact, this January, the editor of the triple-Oscar-winning movie The Brutalist told an industry publication that he had used generative AI a few times in post-production. Some of the actors in The Brutalist, namely Felicity Jones and Adrien Brody, performed their roles with a heavy Hungarian accent, and they even had some dialogue in Hungarian. To prepare for the roles, Brody and Jones spent months with a dialect coach to perfect their accents. But as The Brutalist's editor Dávid Jancsó, a native Hungarian speaker, pointed out, English speakers can have a hard time pronouncing certain sounds. In post, he tried to perfect the Hungarian dialogue. First, the team had the actors re-record the lines in the studio. Then they tried having other actors say the lines, but that also didn't sound right. So Jancsó turned to AI. He fed Brody and Jones's voices into the program Respeecher and then, using his own voice, refined certain vowels and letters for accuracy, a process that could have been done without generative AI in an audio editor such as Pro Tools, but Respeecher made the process much more efficient. And of course, Adrien Brody won the Oscar for Best Actor. As USA Today reported, not all viewers were pleased with the news. "Don't think it's too reactionary to say this movie should lose the Academy buzz it was getting," one person posted on X. But the manipulation of vocal tracks is not uncommon in movies. Deadline noted that combinations of vocal tracks were used in performances like Rami Malek's Oscar-winning portrayal of Freddie Mercury, and Respeecher may have been used in another film nominated for Best Picture this year, Emilia Pérez. The rise of generative AI has been remarkably fast in creative industries. But one big question I have is how far could this go, and how soon? And to answer that, we want to turn to our friend Jason Koebler at 404 Media, who not too long ago attended a film festival of AI-generated movies. Jason, welcome back to the show.

Hey, thanks for having me.

Before we get into that film festival you went to, could you just explain how Respeecher works and how it was used in the editing process for The Brutalist?

Yeah. So, Respeecher is an AI voice synthesizer, and it takes training data of an actor's voice and runs it against a large language model. In this case, it would probably be examples of the Hungarian language, et cetera, and it would take Adrien Brody's voice and make it more closely match other examples of the Hungarian language. And it's very interesting because this technology is sort of one of the first native AI technologies that was widely used commercially. Not just Respeecher; another company called ElevenLabs has become really famous too. For example, Eric Adams, the mayor of New York City, did a calling campaign to various communities in New York City where he spoke English, but then ElevenLabs translated his voice into like fifteen different languages. And it's not just a robot voice reading it; it sounds like Eric Adams speaking Mandarin or Eric Adams speaking Hungarian. And so increasingly this is being used in movies, not just Respeecher but also ElevenLabs and other tools like it, and it really is one of the first big commercial uses of generative AI in movies.
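To give a rough feel for the idea (and only the idea), here is a toy Python sketch of "pull the actor's utterance toward a native-speaker reference." Every function in it is an invented stand-in; Respeecher and ElevenLabs use trained neural voice models, not a naive spectral blend like this.

```python
# Toy illustration of accent correction as "nudge features toward a
# native-speaker reference." Purely hypothetical: real voice-conversion
# tools learn this mapping with neural networks.
import numpy as np

def extract_features(audio: np.ndarray) -> np.ndarray:
    """Stand-in feature extractor: magnitude spectrum of the clip."""
    return np.abs(np.fft.rfft(audio))

def nudge(actor: np.ndarray, native: np.ndarray,
          alpha: float = 0.3) -> np.ndarray:
    """Keep most of the performance, but pull target sounds toward the
    native reference; alpha controls how hard we correct."""
    return (1.0 - alpha) * actor + alpha * native

rng = np.random.default_rng(42)
actor_clip = rng.normal(size=16_000)    # 1 second of fake audio @ 16 kHz
native_clip = rng.normal(size=16_000)   # reference pronunciation

corrected = nudge(extract_features(actor_clip),
                  extract_features(native_clip))
print(corrected.shape)  # (8001,) — same feature grid, blended content
```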

To me, it feels like it's not that far away from other post-production tools that have been supercharged by AI, like Descript for podcast editing, or other tools like that.

Yeah, I mean, it's really interesting, because I think that music had this a long time ago, with things like Auto-Tune, and many, many, many popular artists use Auto-Tune, and this is a very similar technology. I mean, it's in the same family of technologies at least. So it just becomes a question of how much post there can be for the human performance to still be there. And I think it's a really open question at this point. If you had asked me a while ago, I would say they're changing the performance in some fundamental way. But everything in a movie is so carefully edited, so carefully shot. They do hundreds of takes for certain scenes and then splice together different takes and cuts. So I think it really is a spectrum of what you are willing to accept if you're in the Academy and need to decide whether someone is worthy of an award for this. I think audiences sort of have to accept it, because it's being done, and it's been done for a long time. And if you start having purity tests about this sort of thing, it's going to be pretty difficult to know which movies to see and which not to see, because, honestly, the only reason we know that this was used at all was because the editor talked about it to the media.

Yeah. And also, I mean, to be fair to Adrien Brody, I doubt that many Academy members would have voted against him on the basis that his Hungarian accent wasn't quite perfect, so I'm not sure that this was the key input to his victory. But what you said about the role of post-production, and what that means vis-à-vis the original product, made me think about this AI-generated film festival that you went to. So, first of all, what made this an AI-generated film festival? How much of the films were AI-generated?

Yeah, so it varied for each movie, but I think that if you walked in off the street, you would say, oh, these films were made with AI. And what I mean by that is each movie had visuals that were clearly AI-generated. A lot of the backgrounds were constantly changing in a way that wouldn't happen if you were using a camera, and a lot of people had faces that were morphing from scene to scene. One thing I will say, though, is that TCL was very clear that all of the scripts were written by humans, all the voices were done by humans, and all of the music was done by humans. The artificial intelligence was limited to the visuals in the different movies.

Can you just take me back to kind of how you got invited and what questions you had going in?

Yeah. So I went to the Chinese Theatre in Hollywood, which is ironically where the Oscars are; it's the same complex. And that theater is owned by TCL, which is a Chinese TV manufacturer, and like a lot of other TV manufacturers at this point, they have their own free streaming TV service if you buy a TCL TV. And TCL is the first company to put fully AI-generated movies on its streaming service. So this was a premiere of five films that were created using generative AI. I had basically been writing about this technology for a while, and they invited me to come watch them.

So, despite the fact that you're perhaps more on the skeptical side, they welcomed you into the film festival.

I was pretty shocked that they invited me, because honestly, I had written about a trailer that they released for an AI-generated film, and I kind of dunked on it. I said it was really terrible. It's called Next Stop Paris, and it was an AI-generated rom-com. And in the YouTube video, the syncing of the audio and the lips is really bad, the characters move incredibly robotically, and it has this very dreamlike quality to it that is very common with AI-generated visuals, where it's not like a cool effect. It's like, wow, this is really distracting, because the background is constantly swirling and changing and things are popping in and out. And after I wrote that article, they still decided to invite me, so I thought that was brave of them.

But what did you think? I mean, what were you kind of expecting going into it?

Going in, I thought that they would be pretty bad, to be totally honest with you, just because of the state of the art at the time. This was back in December, which was only three months ago, but at the time AI video generators were pretty bad, and I didn't think that TCL had access to some proprietary system that we hadn't seen before. I figured that they would be using the state of the art that you can find on the internet, and I think those tools are not very good. So I thought that they would be bad, and they were bad.

Can you describe some of the highlights and the lowlights?

Yeah. I thought that the films themselves just felt pretty rushed. So one of them was called The Slug, and it's about a woman who turns into a slug. She has a disease that turns her into a slug, and it feels like The Substance, which is another, you know, Oscar-nominated film. The visuals on it are wild. Things are just constantly changing: her face is changing, the food is changing. There are a lot of weird screams that happen that are not super well timed with the dialogue. And then there's a scene where the woman takes a bath, and there's a close-up on some bath salts, and the text on that label is like an alien language, because AI has a really bad time generating text. And I guess you can take it with a grain of salt or say, hey, this is early technology. But when you're watching something as a viewer in a movie theater, on this giant screen, and the text is completely not even in English, it's like, wow, it really takes you out of the narrative.

I would say, I mean, it's a weird idea, right? Because you mentioned this is from TCL, the Chinese TV manufacturer, and the assumption would be, like, they don't want you to change the channel, right? They want you to have their channel on kind of in the background so that, you know, your attention is with them and they can sell you ads, whatever it may be. But that's very different to putting hundreds of people in a movie theater and kind of forcing them to watch with full attention, right?

Yeah.

And it's very interesting, because before the movies played, two TCL executives addressed the audience, and the difference between what they were saying and what the filmmakers were saying was striking. The TCL executives were business people, and they were saying, our research shows that almost no one changes the channel once they're watching something like this, like they are usually watching it in the background. And so their hope is that you're just going to be too lazy to change the channel.

Such an inspiring creative brief.

Right, right. And then the other executive said, we're going to use this as part of our targeted advertising strategy, which was pretty dystopian. And then the actual filmmakers came on and said, you know, we put our heart and soul into this, and we think this is the future of the industry. So that was kind of a whiplash situation for me in the audience.

When we come back, more from Jason Koebler about the rapid advances in generative AI video technology and how the state of the art is evolving in real time. Stay with us.

Welcome back to our conversation with Jason Koebler from 404 Media about the AI film festival he recently attended. There was one film, though, a kind of blended documentary and AI film, that you thought was potentially a bit more interesting.

Yeah, I thought it was pretty cool. I mean, it still had a lot of problems, but it was called The Best Day of My Life, and it was a mountaineering documentary where a mountaineer who got trapped in an avalanche is talking directly to the camera, like the actual person is talking directly to the camera recounting his story. And as he is telling his story, they flash to generative AI depictions of what he is saying. And I thought that was kind of interesting, because this is something that happened to the guy. He obviously didn't bring a camera with him at the time, and you were able to sort of see what he was describing.

In a way that was actually viscerally compelling, or in a way that still felt a bit uncanny and jarring?

In a way that made me think that maybe this has potential in the future, but it isn't quite there yet. Because, similarly, there are various scenes in the film, and the guy it's happening to changes in each scene. His face looks different in different scenes. He was under snow because it was an avalanche, and then in the next scene all of the snow had turned to mud, and then it turned back to snow. It similarly took you out of the narrative. But I thought that the idea behind it was pretty interesting, and I could see that being a direction that future documentaries go.

And what was the feeling like in the room? I mean, who else was in the audience? What was the general takeaway from this experience?

It was a mix of people who had worked on these films and people who have a lot of money invested in the idea that this is going to be the next big thing in Hollywood. And so the mood in the theater was one of incredible optimism and excitement. Meanwhile, the films objectively are not good. They're all on YouTube now, and if you go watch them, the comments are brutal. There are not a lot of views on them, and on some of them the comments have even been turned off, because people are like, how could you dare put this on my television. So I did think it was interesting, because it reminded me of things that I had been to in the past for, like, virtual reality or cryptocurrency, things like that. And a lot of people have said generative AI is the new crypto, it's the new metaverse, it's the new virtual reality. And I think with AI there's a lot of snake oil out there, but undeniably companies are leaning into it in a way that's going to affect us and affect workers and affect people in the industry.

It's also interesting where companies fall in terms of how vocal they want to be about how they see the AI future unfolding. Right? Like, obviously, for a Chinese TV manufacturer, alienating Hollywood doesn't really matter that much, whereas full Hollywood studios have to behave very differently.

Yeah, it's super interesting, and that's a great point. Because, as you said, the Writers Guild strike was partially about generative AI in the writers' rooms. A lot of voice actors, going back to Respeecher, voice actors in both the video game world and the animation world, are really worried that AI voices are going to replace their jobs, or that they're going to get less work because AI is going to be used to generate voices for animation and video games. And then, of course, like you said, a lot of companies are laying off their workers in a bunch of industries and then realizing, oh wait, the AI is not good enough to do these jobs yet. And so there's a real tension about it, because fundamentally this is an automation technology. It's designed to replace human labor, or do things that sometimes humans can't do. And I do think that a lot of companies are going to be able to differentiate themselves by saying, we do not use AI, we respect human artists, we don't want to do that. And then some companies are going the total opposite way, like TCL, which has very little original programming and very few relationships in Hollywood. They don't care if they piss off directors and actors and things like that, because they're just trying to make a name for themselves, so they're able to be more aggressive about this.

So I guess, on the one hand, you have TCL and more or less fully AI-generated films. On the other hand, you have The Brutalist, where, you know, at the margins AI was used, and Respeecher was used to do some accent correction. Do you see ultimately a convergence between those two things? Or do you think it will remain that AI is either used in premium productions for optimizing post, shall we say, or, on the other hand, in this kind of wild west of full AI generation, which is a long way off from being consumable?

Yeah, I mean, I do think it's a spectrum, and a slippery slope, if you will. And special effects in general have been incorporating a lot more AI over the last few years. One example that was really interesting to me was when the first deepfakes were sort of invented, maybe five or six years ago, where you can replace someone's face with another face. Star Wars had tried to digitally recreate Carrie Fisher after she had died for one of the Star Wars films, and apparently they spent millions of dollars doing this. And then someone on Reddit, using deepfake technology, was able to do something that was almost indistinguishable from what Lucasfilm had done, like, on their computer at home, for free. And so I do think that we're going to see a lot more of this stuff in films, but you may not even notice it's happening. When they start replacing artists, replacing musicians, replacing actors with AI, I personally think that's a problem, and I think that's when you end up with a lesser product. I don't know. I hope that AI is going to be used to make films better, not to create tons of low-budget, poorly made films that are designed to scratch a specific itch or perform for an algorithm, which we're definitely going to see a lot of.

You're a humanist at heart.

Yeah, yeah.

And I mean, you mentioned that this film festival was a couple of months ago. Has the state of the art changed since then? I was playing around with this Google DeepMind product called Veo 2. At least on a scene-by-scene basis, you can make pretty good photorealistic depictions, but they're only a couple of seconds each. I don't think they've figured out by any means how to stitch them together or maintain continuity. But how is the state of the art evolving?

It's changed a lot in the last three months. A lot of Chinese companies have released video models in just the last couple of weeks. Tencent, which is a massive Chinese company, released a new video model that seems to be better than most publicly released video models. You know, it was sort of immediately used by people to create non-consensual pornography, which is quite upsetting, and is what a lot of people are using these tools for on the internet. But basically, it's like every week there's a new model, and they're constantly leapfrogging each other. So one will be able to generate hands better than another, one will be able to generate faces better than another, one will have better movement when you try to make these people move, or they require less training data, meaning you can make videos based on one input image versus having to feed hours of footage into a model to create something else. And so, you know, these are things that AI nerds spend a lot of time caring about, and I would say that there is a big generational difference between them. But as a consumer of these things, you might not know that this is happening behind the scenes. The short version is basically: it's getting easier to make a generated video, it's getting cheaper to do it, the quality is getting better, and it's changing on like a day-to-day basis at this point.

Jason, thank you so much.

Thank you so much for having me.
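For a sense of what the "one input image" workflow Jason describes looks like from the outside, here is a hypothetical client sketch. The endpoint, fields, and job flow are all invented for illustration; no real provider's API is shown.

```python
# Hypothetical image-to-video client: upload one frame plus a prompt,
# then poll until the clip is rendered. Endpoint and schema are invented.
import time
import requests

API = "https://api.example-video-model.dev/v1"  # not a real service

def generate_clip(image_path: str, prompt: str,
                  poll_secs: float = 5.0) -> bytes:
    with open(image_path, "rb") as f:
        job = requests.post(
            f"{API}/generate",
            files={"image": f},
            data={"prompt": prompt},
            timeout=30,
        ).json()
    while True:  # poll the job until the render finishes
        status = requests.get(f"{API}/jobs/{job['id']}", timeout=30).json()
        if status["state"] == "done":
            return requests.get(status["video_url"], timeout=60).content
        time.sleep(poll_secs)

# Usage (hypothetical): animate a single still into a short clip.
# clip = generate_clip("still.jpg", "slow pan across a snowy ridge")
```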

That's it for this week. For TechStuff, I'm Oz Woloshyn. This episode was produced by Eliza Dennis and Victoria Dominguez. It was executive produced by me, Karah Preiss, and Kate Osborne for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. The Heath Fraser is our engineer. Kyle Murdoch mixed this episode, and he also wrote our theme song. Join us next Wednesday for TechStuff: The Story, when we'll share an in-depth conversation with the neuroscientist David Eagleman about people who develop romantic relationships with AI. Please rate, review, and reach out to us at techstuffpodcast@gmail.com.
