As Meta fights a high‑stakes Los Angeles trial alleging Instagram’s design harms children, the company has simultaneously launched a new tool that alerts parents when teens repeatedly search for suicide or self‑harm terms. The feature arrives amid mounting scrutiny and claims that platforms intentionally foster addiction in young users. Some observers view the timing as damage control, while others argue it reflects a genuine safety upgrade rather than an admission of wrongdoing. The alerts roll out next week, including in the U.S., as Meta continues defending itself in court.
Please Like, Comment and Follow 'Broeske & Musson' on all platforms:
---
The ‘Broeske & Musson Podcast’ is available on the KMJNOW app, Apple Podcasts, Spotify or wherever else you listen to podcasts.
---
Weekdays 9-11 AM Pacific on News/Talk 580 AM & 105.9 FM KMJ
-
Everything KMJ
KMJNOW App | Podcasts | Facebook | X | Instagram
