
COPPA, YouTube and a Creepy Cartoon

Published Jul 6, 2022, 7:01 PM

When Kris Straub uploaded a creepy animated short to YouTube in 2018, he indicated that it was for audiences 18 and older. So why did YouTube's AI overrule Straub to say the video is suitable for kids? We look at COPPA and how it has affected YouTube.

Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are ya? So recently, the YouTube content moderation AI made a notable mistake. Artist and writer Kris Straub, whom I once interviewed many years ago for an article about how webcomics work, has made a lot of different stuff over the years, and one of his creations is a video series called Local 58 TV, which is described on the YouTube channel as, quote, "analog horror at 476 megahertz," end quote. The series relies on a lo-fi VHS aesthetic, which has been a popular style in the horror realm for the better part of a decade now, and I'm sure you've seen plenty of examples, such as the game series Five Nights at Freddy's or the V/H/S horror anthology films. There are tons of this kind of throwback to 1980s-era looks and aesthetics in horror.

Anyway, one of the episodes Straub created back in 2018 has the title "Show for Children," and the description on the video reads "not for children." The whole shtick here is that Straub has gone for a beyond-generic title for this short. In fact, in the context of the Local 58 TV universe, the television station Local 58 has "Show for Children" listed on its lineup, and that's just the title; it's kind of a commentary on this sort of very generic, low-budget children's programming. The short itself follows a cute little cartoon skeleton walking through a cemetery at night with the moon overhead as a warbly soundtrack plays in the background. The skeleton comes upon an open grave, looks into it, sees a less cartoonish skeleton inside, and then continues on its way, and gradually the music fades out. We get the sound of wind as this cartoon skeleton looks into other open graves. It sees a weird little skeletal creature in one that mewls like a cat, sees a bottomless hole in another, and apparently goes inside it, walking through what looks like a cave with its arms folded as if the skeleton is freezing. It comes upon another open grave within this cave and lies down inside of it, looking up through the grave to the moon, and then transforms into a more realistic skeleton. And that's the end.

Now, on the whole, it's a pretty tame but creepy short video. But Straub knew what audience he was aiming for, and it wasn't kids. So even though the title read "Show for Children," the description said otherwise. Moreover, Straub went a bit further: he used the content settings in YouTube to tag the cartoon as an 18-and-up video, and that was that, for a while anyway. Flash forward to this week in 2022, when Straub discovered that YouTube's content moderation AI had mysteriously changed this video's setting from 18 and up to being suitable for children. That also means his short could potentially appear in the YouTube Kids app; more on that later. Now, clearly that's not what Kris Straub wanted. He went on Twitter to reveal this problem, including screenshots of some of the issues. He had intentionally flagged his video as being for mature audiences, and the YouTube moderation AI reversed that setting without Straub's input or consent. Moreover, Straub found that he couldn't change it back. His ability to switch the video away from being flagged as suitable for all kids had been grayed out.
He could not switch it to 18 and up again, so his only option was to file an appeal with YouTube, which he did. Now, obviously there are a lot of questions that pop up because of this incident. Why would an automated content moderation tool be able to reverse an age-gated restriction on a YouTube video? It makes sense if the moderation algorithm detects that a video with no age restriction should have one, right? If a video detection system determines that a video contains some objectionable material, it makes sense that it could flag that video and change the setting so it's no longer listed as appropriate for all ages. You could understand that maybe the content creator could say, "Oh no, this is a mistake. Review it. You'll see that this content is in fact appropriate for everyone," and maybe get that decision reversed. You would think that would be the only way this kind of AI could make an error, right? It would be overprotective, rather than taking a video that had been flagged as 18 and up and reversing it to suitable for all ages. So what is going on here?

Well, to get to the bottom of all that, we're gonna have to talk about YouTube's shoddy history with content for kids, which goes back pretty much to the founding of YouTube. We're also gonna have to talk about the Children's Online Privacy Protection Act, also known as COPPA, C-O-P-P-A; in fact, that's going to be the meat of most of this episode. And we'll talk about the ridiculous tendency for lawmakers to deem anything that's animated as intended for kids. That seems to be kind of a go-to, which you would think by now people would realize is just not the case. In fact, and I'm going to talk about this again later, animation has a very long history of being a medium for different age groups. I mean, there are old cartoons that were never meant to be shown to kids. And we're also going to talk about the business of online video, because that plays a big part in it too.

So let's start with COPPA, because that actually predates the launch of YouTube. I'm sure a lot of people heard about COPPA around 2019 and 2020, because that's when it was really having a huge impact on YouTube. But in fact, COPPA was initially passed in 1998 and took effect on April 21, 2000, so that's five years before YouTube came out. COPPA was intended to protect children online using various measures, including restrictions on how sites could harvest and use personal data, as well as requiring sites to seek verifiable consent from a parent or guardian before a child was allowed to use them. The law specifically protects kids under the age of 13. So if you've ever wondered why so many social network sites require users to be 13 or older, this is why: creating a service that complies with COPPA is really darn hard to do. It requires a lot of oversight. Just having a verifiable way for parents to consent is tough on its own, right? How can you verify that it was in fact a guardian or parent who gave consent for a child to access that material? If you've ever encountered a site that asked you if you were of the appropriate age, you've probably noticed that all it takes is clicking the "yes" button, or the "I'm over 18" button, or whatever it might be. That's not verifiable; that's just taking your word for it. Anyone could click an affirmative answer even if they didn't meet the criteria.
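To make that concrete, here's a toy Python sketch of what a typical self-reported, click-through age gate boils down to. This is purely illustrative, not any real site's implementation, and the function and field names are invented for the example; the contrast with the placeholder "verifiable consent" function is just to show how little the click-through version actually proves.

```python
# A toy illustration of a self-reported, click-through age gate.
# Nothing here is verifiable: the site simply trusts whatever the
# visitor claims, which is exactly the problem described above.

def click_through_age_gate(claims_to_be_18_or_older: bool) -> bool:
    """Grant access based only on the visitor's own claim."""
    return claims_to_be_18_or_older


def has_verifiable_parental_consent(child_account_id: str) -> bool:
    """Hypothetical placeholder for a real COPPA-style verification flow,
    e.g. a signed consent form, a small card transaction, or an ID check
    handled by the parent -- things a single 'yes' button cannot prove."""
    raise NotImplementedError("Requires an out-of-band verification step")


if __name__ == "__main__":
    # Any visitor, of any age, can simply click 'yes' and get in.
    print(click_through_age_gate(claims_to_be_18_or_older=True))  # True
```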
There's nothing stopping them from lying. So building an age gate that verifies answers, or is at least capable of verifying answers, is way more difficult. It also means you're potentially reducing the amount of traffic to that site or service. So a lot of companies just opt out of it entirely and say, hey, our stuff is not meant for kids under 13, so don't even try. That doesn't mean kids can't try, and it doesn't mean those sites prevent kids from trying, but they're at least taking on the appearance of not being intended for younger children, in an effort not to be lumped in with things covered under COPPA. You could have a very long discussion about that. It's almost like security theater: giving the appearance of propriety, in this case, without actually being proper. We could have a week-long discussion about that, but just keep it in the back of your mind.

So COPPA specifically focuses on sites and services that are directed to children. Moreover, the courts have interpreted COPPA to treat user-generated channels, user-generated content, as if the person who created the content owned and operated the site it was on. In other words, if you made a channel that a reasonable person would interpret as having content directed to children, then your channel would be subject to COPPA, and you could be treated as if you owned and operated the site, though the site itself could also be facing its own issues, which YouTube has faced numerous times in the past.

Now let's talk about YouTube in particular. As I'm sure we all know, YouTube, and Google in general, makes most of its revenue through selling ads against content, and we probably all know that the most valuable advertisements online are targeted advertisements. Those just have a higher chance of making an impact on a consumer. Here's a very simple example: if you happen to know that the person who's visiting your channel and watching a particular video has been shopping for shoes recently, serving up an ad about shoes could have a huge payoff. The person might respond to that ad, and that ad becomes more valuable. But to target ads, you have to learn about the audience first. You can't target an audience with advertising unless you know more about them. You need to know things like where they live, because you don't want to serve them ads for products that aren't available in their area. You need to know what they like and what their habits indicate about them. You might also want to know things like how old they are and what gender they identify with, that kind of thing. Now, it's one thing to track personal data that belongs to adults; it's an entirely different matter when we're talking about children. That's when COPPA comes into play, and that's when you really need that verifiable adult consent.

Okay, we're gonna take a quick break. When we come back, we're gonna talk more about COPPA and YouTube.

Okay, so let's talk about COPPA and YouTube. While a channel creator might make stuff directed to children, that channel creator isn't necessarily fully responsible for tracking the data of the audience. You can get some insights on your audience through the creator tools on YouTube.
When you actually own a channel and you're uploading videos, you get insights into your audience to some degree, but it's not like you have all the tools available to make use of that information the way YouTube does. YouTube really takes the reins on that; it's the platform's responsibility to secure advertising against the content. It's better qualified to do it and able to do it on a much larger scale, so you aren't fully responsible for that. However, if you have comments enabled on your video, people might be leaving information that reveals personal data about themselves, like their name or where they're from, and you could be keeping track of that. So it wouldn't necessarily just be YouTube; you could be tracking that too. And if a content creator has enabled ads against their content, if they're trying to monetize their videos, that raises a ton of red flags when you view it in light of COPPA.

But hey, what determines whether a site or service or channel is directed to children in the first place? What is it that makes us say, yes, this channel is meant to be shown to children? Well, there's no cookie-cutter answer for that. Ultimately, if it came down to it, the FTC, the Federal Trade Commission, would have to rule on whether something is actually directed to children, using various criteria, and those criteria can include things like the subject matter involved. If the video channel is all about home appraisals, chances are the FTC is gonna say, well, clearly this isn't directed to children, even if you're using cartoons to illustrate how home appraisals work. The subject matter alone is not going to appeal to children; it's not directed to them. It's not really a question. But the visual style of the content does matter, and that includes whether or not there's any use of animation. Animation is one of the criteria. So again, I know my animation fans out there all feel this way: there's tons of animation that's not meant for children. It's not necessarily incredibly offensive or anything, but animation is like any other medium, right? There's animation out there for different audiences, including audiences that don't include children. However, animation is one of the criteria the FTC might look at when determining whether a channel is directed to children or not. The type of music being used in a video can also be a factor. If the music is very jaunty and fun and there are animated characters, those could be elements the FTC takes into consideration to say, well, even if you don't intend for this to be seen by children, it's designed in a way that appeals to children. So it gets kind of gray and vague. If your videos have the presence of children in them, or people posing as children in them, that can also be a factor. There are a lot of different elements at play here.

But hang on a minute, I hear you say, isn't this too broad a brush? And yeah, each of these taken by itself would definitely be too broad a brush. If only one of these boxes had to be checked for the FTC to say, well, clearly this channel, or this website, or this service is directed to children, that would be ludicrous. It would be way too much overreach, and so you would have to dig further down.
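As a thought experiment, here's a rough Python sketch of what a naive, multi-factor "directed to children" heuristic might look like if you turned surface-level signals like the ones above into code. To be clear, this is my own illustrative example, not the FTC's actual test and not YouTube's classifier; the signal names and weights are invented. It mainly shows why any single factor, like "uses animation," is far too blunt on its own, and why even a weighted combination can land in an ambiguous middle ground.

```python
from dataclasses import dataclass


@dataclass
class VideoSignals:
    """Invented, illustrative signals loosely modeled on the kinds of
    factors discussed above (subject matter, animation, music, cast)."""
    childlike_subject_matter: bool
    uses_animation: bool
    jaunty_childlike_music: bool
    features_child_characters: bool
    creator_declared_audience: str  # e.g. "18+", "all ages", "kids"


def naive_directed_to_children_score(v: VideoSignals) -> float:
    """Sum weighted surface signals. Weights are arbitrary, for illustration."""
    score = 0.0
    score += 0.4 if v.childlike_subject_matter else 0.0
    score += 0.2 if v.uses_animation else 0.0
    score += 0.2 if v.jaunty_childlike_music else 0.0
    score += 0.2 if v.features_child_characters else 0.0
    return score


# Something like Straub's short trips a couple of appearance-only signals
# (a cartoon character, jaunty intro music) even though the creator
# explicitly declared it 18+ -- which is how a naive classifier that
# ignores intent and context can get the answer wrong.
short = VideoSignals(
    childlike_subject_matter=False,
    uses_animation=True,
    jaunty_childlike_music=True,
    features_child_characters=False,
    creator_declared_audience="18+",
)
print(naive_directed_to_children_score(short))  # 0.4 -- ambiguous middle ground
```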
There have to be multiple factors here, and it has to be reasonable to assume that the material itself was specifically created to appeal to children and attract children to it. So this is not a new thing. This has been an issue for ages, and for some cycles of content it's more prevalent than others. You can look at stuff created over time that has a retro appeal or a nostalgic feel to it where, on one hand, you could say, oh, I can see how this could be considered directed to children. I'm thinking of things like Robot Chicken, a series where toys and animated figures are used to tell jokes that are not meant for kids. The material in those shows is not meant for kids; it's meant to appeal to an age group of people who grew up playing with action figures and the like. But the fact that it's animated and uses toys, those are things the FTC might look at and say, you know, it's hitting some of the criteria we consider for material directed to children. So there's a real sticky situation there. And you can see how, using some of the criteria the FTC relies on to determine whether something is in fact directed to children, a short like Kris Straub's might get lumped in there, right? It had a cartoon skeleton, and it had some fun music in the beginning. Never mind that the short is sort of a dark parody of those kinds of cartoons, and is in fact leveraging that retro aesthetic to convey an unsettling and creepy story. You could argue that just because something looks similar to material produced for children, it should qualify as intended for children. I don't think that's a good argument. I don't think you can argue that just because something resembles something else, that makes it the same thing. I think that's a dangerous argument to make. But you can start to see where YouTube's AI started to make mistakes.

Now, if you're someone like Straub, there's a very good reason you would want your video not to be flagged as appropriate for all ages if in fact it isn't. See, if the FTC determines that your service or site or channel is directed to children but is violating COPPA on some level, you could face a maximum fine of more than $40,000 per violation. That's a heck of a fine if you're a content creator, right? You could get hit with that because your video was deemed inappropriate for children and yet designated as appropriate for children. That explains why Straub is so frustrated that YouTube overruled his own designation that the video is meant for older audiences. Straub could be on the hook for a large fine if a complaint were brought against his channel.

And now let's talk about YouTube's own lousy history of curating content for children. It is no surprise that there is a lot of content on YouTube that's unsuitable for kids. I would argue there's a lot of content on YouTube that's unsuitable for anyone; I see examples of it all the time. But we're gonna put that aside for now, because that's just "grumpy old man yells at cloud" material. So YouTube launched back in 2005, and it did not take very long for folks to find stuff that was not appropriate for kids and yet appeared to be aimed at kids. That has pretty much been part of YouTube's landscape throughout its entire history, but for a very long time that issue was largely ignored, or at least not talked about very much.
In 2015, Google released an app for Android and iOS called YouTube Kids. The app was meant to give parents more control over what their kids could watch on devices like tablets, smartphones, and smart TVs, and the content was curated. The idea was that only content suitable for children under the age of 13 would make it through a first-pass filter into YouTube Kids, so that parents could be sure that whatever their kids were watching on this particular version of YouTube would not be inappropriate. On top of that, parents would have additional parental controls so they could restrict things further if they wanted to.

The curated-content issue really came under the spotlight in 2017 with the onset of what was later called Elsagate. That name references the character Elsa in Disney's Frozen, and "gate" is shorthand for scandal; people used Elsa because that character figured prominently in tons of videos being created for YouTube and YouTube Kids. Some of those videos were live action, in fact a lot of them were, with just someone in an Elsa costume and wig, and some of the shorts were animated. Sometimes the animation was actually not terrible, but most of it was super cheap and limited and shoddy, and it was like a fire hose of content had just been dumped on YouTube all at once. The consequences of that would end up being pretty serious, both for YouTube and for content creators. I'll explain more after we come back from this quick break.

So, some of the videos uploaded during Elsagate were just weird but otherwise mostly innocent; they just had a bunch of people in cheap costumes running around doing weird sketches, often without any dialogue at all. But other videos included material that was really unsuitable for kids, including sexually suggestive content, violence, and other disturbing content, you know, cruelty, that kind of thing. And the videos would smash together tons of popular characters, and there was no question whatsoever that the people making these videos weren't bothering to actually license those characters. So there was a lot of infringing on intellectual property going on here, at a truly absurd level. You would have videos with characters like Spider-Man talking to a pregnant Elsa while you see the Joker running around in the background. It was just a smorgasbord of popular characters that kids would recognize. And like I said, some of these videos were just bizarre, with no real rhyme or reason to them; they had background music, but they wouldn't have dialogue. Obviously, leaning on popular children's characters while not using words meant that those videos could become popular all around the world. There was no language barrier to overcome, so you didn't have to worry if the kids in whatever region couldn't understand, say, English or French, because there was no speaking or language at all. And many of the videos became more bizarre, not necessarily out of a desire to warp kids' minds, but rather driven by data. See, content creators were paying attention to topics that were popping up in search terms and to videos that were getting popular. So if topics were starting to gain in popularity, you would start seeing videos that somehow tied into those search terms.
It was kind of like the old days of the web, where people would hide a massive text dump on a web page so that search engines would pick up on that page and index it, even if the page itself had nothing to do with the search term. Buried somewhere on the page was that term, but the page itself had nothing to do with it. Like I said, there have been channels on YouTube creating this kind of weird content directed at kids for years, and you might wonder why. Again, if you can monetize your content, your goal is to get as many views on as many videos as possible. You want that view count skyrocketing. Kids are a great audience for making that happen, because kids tend to fixate on content. They like to experience the same thing over and over; it becomes kind of like a ritual. I'm pretty sure every parent out there is familiar with the experience of having to read the same bedtime story for the hundredth time to their kids, including doing all the voices. For me, when I was a kid, it was The Monster at the End of This Book, and then it was Hamlet. I progressed pretty quickly. And you might think I'm joking, but I actually once found a cassette tape labeled "For Jonathan" that my dad made for me, which included him reading Hamlet and doing different voices for the characters.

Anyway, kids would rewatch videos, and that would drive up the watch count. That in turn would prompt YouTube's recommendation algorithm to promote the video to other people, the idea being that if something is gaining popularity, then with a little help it could go totally viral. And because YouTube makes most of its revenue through advertising, and advertisers will pay more for videos with more views, keeping the viral machine going is what pays the bills. Plus, it's proven to keep people on the platform. The other big goal of YouTube is to keep you there as long as it possibly can, just like any other online platform that depends heavily on ads for its revenue. Facebook does the same thing, right? The idea being, let's eliminate the desire to go anywhere else and just keep people here. So these channels started to turn out content at a crazy pace, and kids would watch and rewatch the videos, and YouTube would continue to promote them.

In 2016, just one year after YouTube Kids launched, The Guardian ran a story about one of the channels creating bizarre and sometimes disturbing content directed toward children. American news outlets would begin to cover the story in 2017. Early on it was the tech press covering it; by the end of the year you had mainstream newspapers and magazines covering it, and they started listing out the dozens of channels known for blasting out odd and sometimes unsettling videos, all marketed to children, most of which were popping up on YouTube Kids, and many of which you would say were not appropriate for that. The media attention cast a pretty critical light on YouTube, asking how a platform could allow this sort of content to proliferate, particularly in the parts set aside specifically to curate appropriate content for children, and YouTube didn't really have any good answers for that. Now, one answer is that YouTube has way too much content uploaded to it per minute for humans to keep track of it all.
Back in 2017, YouTube claimed that users were uploading around 300 hours' worth of video every single minute of the day. So let's say you're hired by YouTube to review uploaded content to make sure it doesn't violate any rules. You watch a ten-minute video, and by the end of that video, 3,000 hours' worth of material has been added to YouTube. There's just no way for humans to keep up with that kind of content fire hose. You could employ tens of thousands of people and you still wouldn't get through all the material going up every single minute of the day. So one reason this stuff was getting past YouTube is that there was just too much of it. YouTube depends heavily on users following the rules, kind of an honor system, and on users flagging content that violates the rules, so it becomes a self-policing community, and then YouTube employees, or more likely contract workers, can review the flagged videos and determine whether they actually violate any policies. Companies that hold a lot of intellectual property, like the various music labels and movie and TV studios out there, tend to be a lot more proactive in seeking out videos that violate their IP, and they are known for flagging those videos to get strikes against the channel, or they will use an option that demonetizes the video for the channel owner but directs all monetization to them, because they own the IP. But while there are search algorithms that can seek out video and audio that violates copyright, it gets way harder when you're talking about a bunch of people in cheap costumes posing as licensed characters in an unlicensed video. That's a lot harder to detect with a regular search algorithm.

YouTube did respond to the issue, however. The company moved to demonetize videos that were deemed offensive or controversial and that were directed to children. It also shut down lots of channels, around 50 at one point, and it turned off comments on thousands of videos, because people recognized that comments sections could include opportunities to prey upon children; they could include sexual predators, for example, trying to reach out and lure children. So YouTube said, all right, we're going to discontinue comments sections on any videos determined to be directed to children, as a safety measure.

However, that still wasn't enough. In 2019, the New York Attorney General's office and the FTC sued YouTube and Google for violating COPPA. The lawsuit accused YouTube of collecting the personal data of children without first getting parental consent, and said this came from children-directed channels posting videos and YouTube monetizing those videos by pairing them with targeted advertising. So this wasn't necessarily stuff showing up on YouTube Kids, although some of it was, but rather YouTube in general. The argument was: these are channels generating videos meant for children, which means that children are overwhelmingly the audience watching them, and you haven't put any protections in place to prevent the personal data of these children from being harvested and exploited. Google would eventually settle out of court for a whopping $170 million. Now, in the grand scheme of things for Google, anyway, that's not that much money.
But I can assure you that stakeholders would much rather see that $170 million stay with the company than get siphoned off to pay fines. The settlement also required YouTube to commit to, quote, "develop, implement, and maintain a system that permits channel owners to identify their child-directed content on the YouTube platform so that YouTube can ensure it is complying with COPPA," end quote. That's from the FTC itself. And so YouTube made a renewed effort to stave off future COPPA violations. It was in 2019 and early 2020 that YouTube really began to push harder to force channels with children-directed content to be compliant with COPPA. One big change was that channels determined to be children-directed would no longer be able to run targeted ads against their videos. Those videos wouldn't have a comment section either, and some other community features would similarly be eliminated or turned off. This had the consequence of severely restricting ad revenue for those types of channels, again because targeted ads are so valuable compared to other types of advertising. YouTube creators would have to designate their channels as being for kids or not for kids. From a monetary standpoint, choosing "not for kids" made the most sense, right, because that's when you could run targeted advertising. But if YouTube determined that the channel was in fact directed to children, then YouTube might override the user's choice. You can't just say this cartoon about lollipops playing games with peppermint candies isn't meant for kids just so you can have targeted ads run against your video.

Now, this brings us back to Kris Straub. His video is clearly not meant for children. If you watch it, you would agree; not that it's particularly disturbing, it's a little dark, but it's not super gruesome or anything, but it's meant to play on adult sensibilities. And this is where I really have issues with YouTube's AI. I think it is terribly irresponsible to allow an automated system to determine that content is suitable for all ages. If you run it the other way, that makes sense: if the AI determines that a video previously set as suitable for all ages isn't, then changing that video so it's age restricted makes sense. The content creator can appeal the decision, but more importantly, a potentially offensive or disturbing piece of content gets removed from circulation among kids, which I think is the most important element. But to take a video that's been expressly flagged by the creator as being inappropriate for children and then flip that switch without the creator's consent, that seems irresponsible to me. Now, I can see some justification to review Straub's video. I can see why the AI might flag it to say, someone check this out and make sure that, in fact, it should be listed as 18 and above. The video does feature an appealing, cute cartoon skeleton character, at least cute as far as skeletons go, and early on it has some fairly jaunty music playing, so you could see how it might at first glance look like content directed to children. But if you watch it all the way through, you realize, okay, this is not meant for kids, and Straub took steps to show that: he designated that video as not being for kids. So to have an automated system reverse this decision, and moreover to prevent Straub from being able to reset the age restriction, that's not a good look for YouTube.
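To illustrate the asymmetry being argued for here, below is a small, hypothetical Python sketch of an audience-rating policy in which automation is only allowed to make a video more restricted, never less, with any relaxation left for human review. This is not YouTube's actual system, and the enum values and function names are invented; it's just one way to encode the "automation can tighten but never loosen" idea.

```python
from enum import IntEnum


class AudienceRating(IntEnum):
    """Higher value = more restricted audience. Values are illustrative."""
    ALL_AGES = 0
    NOT_FOR_KIDS = 1     # excluded from kids' surfaces, but not age-gated
    AGE_18_PLUS = 2      # age-restricted


def apply_automated_decision(current: AudienceRating,
                             proposed: AudienceRating) -> AudienceRating:
    """Automation may only tighten a rating. A loosening change (e.g.
    flipping an 18+ video to all-ages) is not applied automatically."""
    if proposed >= current:
        return proposed  # tightening or no change: allowed
    # The proposal would loosen the restriction; keep the current rating.
    # In a real system, this branch would open a human-review ticket.
    return current


# A creator-set 18+ rating cannot be silently relaxed by the model:
creator_setting = AudienceRating.AGE_18_PLUS
model_proposal = AudienceRating.ALL_AGES
print(apply_automated_decision(creator_setting, model_proposal).name)  # AGE_18_PLUS
```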
Early in 2020, there was a pretty big kerfuffle in multiple YouTube communities about the impact of COPPA and YouTube's new policies. Channels that covered topics like animation, video games, and toys came under particularly close scrutiny, as some of those channels might have been borderline cases. Some of them were clearly directed to children, and a lot of them had kids as hosts, but a lot of the channels weren't directed to children. They were clearly not meant for kids; they were expressly directed to older audiences. And it still speaks to a lot of false preconceptions held by lawmakers, who typically are, let's say, a little out of touch, which is a way of saying many of them are old and don't really have a deeper appreciation for this. Those false preconceptions include thinking that collectibles, video games, and animation are all expressly the domain of children, and they're not, and they haven't been. In the case of animation, that's never been the case; it's never been just for kids, but that frequently is how it gets viewed.

The kerfuffle on YouTube has died down a little bit since then. Obviously we've got Kris Straub's recent incident, but as a whole topic it kind of died down, and there are some reasons for that. We had a global pandemic, which really took over headlines. The issues with YouTube and COPPA were starting to come to a head in early 2020, but by March that seemed like a quaint worry compared to other things going on in the world. Plus, since then, we've had some truly enormous events, ranging from political insurrections to war in Eastern Europe and more. But we're still seeing the effects of YouTube's shift. On the one hand, you can understand why the platform is taking such drastic steps. It's protecting itself from future litigation, and it's trying to fulfill the obligation it has to the FTC from the settlement in that COPPA case. You can also understand that this is a monumental task when you consider how much content is joining YouTube every single minute of the day; now we're talking about more than 500 hours per minute. But you can also see how an automated system can make changes that unfortunately have the opposite of the intended effect. It can end up flagging content that is expressly not for kids as being suitable for kids, and that is clearly not something YouTube wants to promote. That's just asking for another COPPA case down the road. And if it's something like this, where you've got the documentation, where Straub can show, "I flagged this as inappropriate for kids when I uploaded it, it was never meant to be shown to kids, YouTube overruled me and then prevented me from fixing it," that's a really bad look for YouTube.

Anyway, I hope you learned something in this episode. Obviously we could talk a lot more about COPPA and the unintended consequences of that legislation. Again, I think the legislation itself came from a good place, but as we frequently see, when we talk about the intent of a law and then look at what happens when we actually enforce that law, there can be a disconnect. I do also think we should see a lot more data privacy protections in place for everyone, not just for kids. For kids it's absolutely crucial, but I would love to see it become more of a thing for everybody. Europe has certainly made great strides toward that.
In America, we're starting to see at least some discussion around it. I don't know that it's ever going to turn into actionable items, but here's hoping. And that's it, that's all I've got, so we're gonna wrap up this episode. If you have any suggestions for future episodes of TechStuff, there are a couple of different ways you can reach out to me. One is that you can send me a message on the iHeartRadio app; it's free to download. You just navigate over to the TechStuff page, and you'll see there's a little microphone icon. If you click on that, you can leave a voice message up to 30 seconds in length. Send it my way, and let me know if you want me to use the audio in an upcoming episode, because that would be fun to do. But I only do opt in, I don't do opt out. And if that's not your style, you can also reach out to me on Twitter. The handle for the show is TechStuffHSW, and I'll talk to you again really soon.

TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.
