The Doomsday Clock hitting 85 seconds to midnight is the kind of fact that lands hard and then disappears into the noise. You file it somewhere between knowing you should care and not knowing what you'd do about it anyway. That gap, it turns out, is the most important variable in the whole equation.
Picture the same tool that mapped every human protein, a Nobel Prize-winning breakthrough built to prevent disease, being used by someone who asked it the wrong question. The safeguards exist. They also have a workaround that takes about as much effort as convincing a chatbot you're an adult. Meanwhile, the early warning systems feeding nuclear decisions are already running on AI, and the officials those systems report to haven't been told. The distance between safe and not safe is shorter than the reassurances suggest.
Governance of this technology is still not happening at the scale it requires. The clock has moved in both directions before, and each time it moved because of what ordinary people decided to do with their attention. That window is still open, but it has never been this narrow.
Topics: doomsday clock, AI bioweapon risk, nuclear early warning systems, existential threats, public engagement
GUEST: Dr. Jon Wolfsthal | https://thebulletin.org/doomsday-clock/
Originally aired on 2026-03-09