As Meta fights a high‑stakes Los Angeles trial alleging Instagram’s design harms children, the company has simultaneously launched a new tool that alerts parents when teens repeatedly search for suicide or self‑harm terms. The feature arrives amid mounting scrutiny and claims that platforms intentionally foster addiction in young users. Some observers view the timing as damage control; others argue it reflects a genuine safety upgrade rather than an admission of wrongdoing. The alerts roll out next week, including in the U.S., as Meta continues to defend itself in court.
Please Like, Comment and Follow 'Broeske & Musson' on all platforms:
---
The ‘Broeske & Musson Podcast’ is available on the KMJNOW app, Apple Podcasts, Spotify or wherever else you listen to podcasts.
---
Weekdays 9-11 AM Pacific on News/Talk 580 AM & 105.9 FM KMJ
-
Everything KMJ
KMJNOW App | Podcasts | Facebook | X | Instagram