AI data privacy is built on promises nobody can verify, and you've already accepted those terms. You pay a monthly subscription to feed your ideas, your business data, your writing into a system that will, when pushed correctly, reveal behavioral patterns across its entire user base. You're building their platform. You're funding their valuation. And the data agreement you clicked through didn't cover any of this.
Mohit Rajhans points to the gap most people miss: the technology is already unrecognizable from 18 months ago, yet it still hallucinates on basic requests. Businesses are attaching full-access bots to their internal systems, while a simple conversation can coax an AI into revealing behavioral data about how it processes input from every user. What exactly is staying private?
Mohit's final word: forget the buzzwords, try the tools yourself, figure out what you can afford to ignore. The harder question is whether the thing you can't afford to ignore is what you've already given away. You're paying for the privilege of building someone else's asset. That's the gym analogy, and it lands.
Topics: AI data privacy risks, agentic AI, ChatGPT data security, AI business tools, AI hallucinations
GUEST: Mohit Rajhans | http://thinkstart.ca
Originally aired on 2026-02-20

Related episodes:
Your Therapist Noticed Your Foot. The Chatbot Wouldn't Have (09:43)
Body Cams: Who Actually Controls the Footage? (09:26)
The Toonie Turned 30 and Transit Raised Its Price the Day Before It Showed Up (09:35)