Professor Nita Farahany reveals to Azeem Azhar the startling advancements in brain-scanning technology and the extraordinary implications this tech has for privacy and humanity.
In five years, we'll legally own our own thoughts. That's the premise of today's conversation. I'm Azeem Azhar. Welcome to the Exponentially podcast. In the past five years, over a billion dollars has been invested in developing new devices that can read or even manipulate our mental states. They can help us relax, learn, or reduce pain, and as they do, they harvest data. So how can we make use of this technology without firms taking advantage of us? Do we need a new fundamental human right, a right to our mental privacy? To discuss the ethics of this emerging neurotechnology, there's no one more qualified than Professor Nita Farahany. Nita, you're a professor of law and philosophy, and you got there by way of genetics and cell biology. Did you have much fun at college?
I did.
I probably had too much fun at college. But you know, I came into college already really enamored with genetics and behavioral sciences, and while I was there, I ended up doing a lot in genetics but also neuroscience and government, an odd pairing between my interests.
Also, how did you show up to college with a deep interest in those quite different subjects?
As a high school student, I had taken a genetics class and found that not only was I really interested in the biology, but I really enjoyed the mathematics, the modeling, but especially the behavioral ties.
I found that fascinating.
But I also was a policy debater in high school, and I was recruited to college for policy debate. I had spent summers as a geeky nerd at debate camps.
I mean, they were fun.
I enjoyed it, but it's a geeky endeavor. And the result was I was very passionate about policy. I was thinking about the implications of policies for society and for humanity, thinking about the broader implications and applications of technology and science. I think I sort of didn't realize it at the time, but I was destined to follow the path that I ultimately did.
You've turned your attention to one of science's remaining frontiers. It's, in a sense, the last bastion of our freedom: the nature of our brain, our cognition, and our mental state. And I wanted to share a story with you. A few years ago, I was sent a headset from a startup that was meant to stimulate your brain to improve your cognitive performance. You put it on your head and it would zap you with some direct current, and, you know, I was too scared to use it.
That's probably a good thing.
I put it back in its box, and I now use it as a shelf for my printer.
It's probably a good thing that you didn't use that one back in the day, because we don't really have a very good understanding of the effects of zapping your brain. So as you think, as you do anything, neurons are firing in your brain. They give off tiny electrical discharges. And when you're doing anything, from feeling to smiling to thinking to doing a math problem in your head, hundreds of thousands of neurons are firing in your brain, giving off concurrent electrical discharges.
And so we can measure those electrical signals. I guess that approach is the EEG. And if anyone is old enough to remember Ghostbusters, Rick Moranis's character gets wired up to one early on. And yes, we can measure those electrical impulses, and also their waveform and whereabouts in the brain they're happening.
That's right.
Now, if you're using consumer-grade electroencephalography, or EEG, you're going to have far fewer electrodes, and so it's really the average across the entire brain rather than picking up from a specific brain region. You might have it in the form of earbuds or headphones or headbands, and they're picking up the averages of those waveforms, and then those averages, through software and advances in artificial intelligence, can then say: okay, this is happy, this is sad, this is paying attention, this is fatigue, this is mind-wandering. So they're kind of big brain states. It's not a very precise kind of mind reading, right?
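To make concrete the kind of pipeline being described, here is a minimal, hypothetical Python sketch: average a multi-electrode EEG recording into standard frequency-band powers, then map those onto coarse states. The band definitions are conventional, but the sampling rate and the decision rule are illustrative assumptions, not any vendor's actual classifier.

```python
# Hypothetical sketch: consumer EEG -> band powers -> coarse "brain state".
# Band ranges are standard; the decision rule is an illustrative assumption.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz

BANDS = {"theta": (4, 8), "alpha": (8, 12), "beta": (13, 30)}

def band_powers(eeg: np.ndarray) -> dict:
    """Average spectral power per band across all electrodes."""
    freqs, psd = welch(eeg, fs=FS, axis=-1)  # psd: (channels, n_freqs)
    mean_psd = psd.mean(axis=0)              # average across electrodes
    return {name: mean_psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

def coarse_state(eeg: np.ndarray) -> str:
    """Toy rule: beta-dominant = focused, theta-dominant = mind-wandering."""
    p = band_powers(eeg)
    if p["beta"] > p["alpha"] and p["beta"] > p["theta"]:
        return "focused"
    if p["theta"] > p["alpha"]:
        return "mind-wandering or drowsy"
    return "relaxed"

# Example: 4 electrodes, 2 seconds of (random) signal.
print(coarse_state(np.random.randn(4, 2 * FS)))
```

In a real product the mapping from band powers to states would be a trained model, but the shape of the computation, averaging a weak multi-channel signal and classifying it into a handful of labels, is the same.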
So those are all emotional states that you could imagine matter to students who are studying, to people who are working in desk-based jobs, or actually in non-desk-based jobs out in the field.
Right.
So already this particular company has partnered with a number of corporations as an enterprise solution, where it's offered as part of a brain wellness program, for example, to allow an employee to train their focus. People are really distracted in today's world, right, and they have so much context switching. You're on your computer, then you quickly go buy something for your kid's birthday party, then you get a notification on social media, and that context switching is costly for your ability to focus.
And it's very hard to judge, in the moment or even after the fact, when you were concentrating well, and whether you were genuinely in a moment of flow or a moment where you'd blocked out the rest of the world. So what you have here is a more objective measure of where you actually are, in the same way that smart watches can really give you a sense of whether you actually did walk the ten thousand steps or not.
I was talking with a company that has used this technology on employees, where they were trying to make managerial-level decisions about work-from-home policies versus work in the office, and they were looking at some of the brain data to see the extent to which people were more distracted or better able to focus in a home-based environment versus in the office environment. Or, you know, you think that you're really a morning person, that that's when you're in your best state of flow, but it turns out your best working hours are like three to five in the afternoon.
What happens to people when they get told that, after years of thinking they were a morning person, they turn out to be a night owl? I mean, how do they feel?
You know?
I think for some people it's helpful, right? Oh, that's why I'm stuck; that insight is so incredibly powerful for me. For other people, it causes self-doubt, a bit of an existential crisis: to what extent can you trust your gut instinct, your so-called inner voice, when the data doesn't stack up or measure up with your own sense of self? And then for others there's just doubt: no, no, the technology must be wrong, because my own perception of self is much more accurate than anything the technology could ever measure.
So these are all consumer grade devices that are in a way leaking into the workplace. But of course there are a whole range of different neurotechnologies, some of which are much more complex and sophisticated used in healthcare, some of which are just not as portable as these devices.
These right now are primarily for entertainment or very low-stakes decision-making. They're not diagnosing a concussion on the football field, or maintaining real-time information for a person with epileptic seizures about what's happening in their brain, or serving diagnostic purposes for Alzheimer's disease or Parkinson's or other kinds of conditions. That's a lot of the other, more sophisticated neurotechnology. And by sophisticated, I just mean it has better signal to noise; it's able to more precisely pinpoint in the brain where something is happening. You know, you have something that gives you clinical-grade information about brain activity or a brain disorder. And, you know, those really run the gamut from electrical stimulation to implanted neurotechnology, where there have been tremendous advances in the past few years: companies that are investing in the ability for somebody who's a paraplegic to be able to navigate their environment, or a person who has lost the ability to communicate to be able to speak their mind by using implanted neurotechnology.
So these are the so-called invasive neurotechnologies. In other words, they're physically in the brain; they probably require you to go into surgery to fit them.
They carry far greater risks, obviously, because they entail surgery. But what you can get from going deep into the brain right now is obviously much more powerful than having to pick up a signal that goes through the scalp and through the skull, where it's degraded and far less of it can be picked up.
And what's the role of what I imagine as the gold standard of all of this, the MRI machine? You lie down, you go into this tunnel with these very powerful magnets. There's a lot of clanking, but then you get these very small slices of your brain and the blood-flow activity within them. And these machines are big: they weigh thousands of pounds, they cost millions of dollars. Is that the gold standard to which we try to reference everything?
Functional magnetic resonance imaging machines can peer deeply into the brain, but they don't have very good what we call temporal resolution, which is to say they're not fast; they're not measuring rapid changes across the brain.
Like high-resolution photographs, yes, rather than a film.
That's right. But the ability to peer deeply into the brain and to be able to see, literally, this is what's happening, or this is what's going wrong, that is the gold standard, at least for diagnosing some things.
It seems that in recent years we've started to close the loop from lots of different ways to read brain states through to how we can then change those brain states. Talk about some of these implants. It seems like some of the technologies that you've been discussing are as much about reading the brain state as they are about imposing a new brain state on us.
That's absolutely right.
So, for example, there was a woman who was suffering from severe depression and she described herself as being at the end of her life, no longer in a life worth living, and neuroscientists and physicians were able to track the specific patterns of brain activity and then use electrical stimulation in the brain to reset that brain activity like a pacemaker for the brain. That enabled her to recover her will to live, to overcome depression, and have a typical range of emotions instead. Wow, that's pretty extraordinary.
That's extraordinary because right now the disciplines of neuroscience and psychology are rather different, yes, and they're separate. And what you've given is an example of using neuroscience to try to tackle something that would have traditionally just been seen as a psychological problem. That's right. So let me summarize where we are. We've got this fantastic science and engineering which is progressing, and it's giving us a new class of neurotechnologies which we can use for diagnostics, for therapeutics, for augmentation. But these are also dual-use technologies. They could be applied in ways that are harmful, perhaps. I remember reading about a school in China, for example, which for a short period of time was putting brain monitors on the kids and allowing the teachers to figure out which ones had drifted off. Now, I think it's a right for a young child to zone out at school. I mean, we all did. What are the other examples that you can think of that show the potential drawbacks of this technology?
I think anytime you're looking at, for example, coerced use of it, that's in the workplace or in a school setting, it's deeply problematic. And that's because not only do you have a right to have your mind wander, but you have a right to think freely, right? The ability to have a dissident thought, or have a creative thought, or even fantasize about the coworker in your office you have a bit of a crush on...
I work on my own.
Well, there you go, no crushes for you.
But you know these flashes of bad thoughts you have for a moment: somebody cuts you off in traffic and a bad thought pops into your head, right? Being able to think freely without fear of having your feelings or thoughts intercepted or manipulated or punished, I think, is critical to human flourishing. In other contexts, it's already being used in these kinds of coercive ways, like in police interrogations, where people's brains are being interrogated to see if they recognize details. Yes, it's already happening. There's a company in the US that has been selling its technology to law enforcement agencies worldwide. That technology has been used in places like the UAE and in Singapore, and is being tested out in Australia, and there have been criminal convictions that have occurred as a result of so-called criminal confessions in response to brain-based interrogations. It's already happening worldwide. Those are, I think, very chilling applications of this technology.
I've read a slightly scary example of a beer company trying to change the way we dream so that we would dream about their beer. I mean, how does that even work?
So neuromarketing has been around for a while. There are certain periods of time, like when you've just woken up from a dream, when you're more suggestible. And by more suggestible, I mean your mind can be changed more easily. And so, in that suggestible state, what some dream researchers were hired to figure out was: could your dreams be used as a time to incubate particular ideas? They call this dream incubation. So this particular beer company had, as the kind of associations with its logo, mountains and streams, and they want you to think of their beer and think of being refreshed.
I'm almost speechless. It really sounds horrible.
Yeah, no, it does sound horrible, and I've gone to the most dystopian place possible with it. So now imagine you're wearing your overnight brain sensors, because there are earbuds that are designed to detect your sleep and to help you monitor your own sleep. You have your, you know, speaker that's listening in your bedroom. They're connected together. At the moment at which you're most suggestible, a soundscape starts to play without you even realizing it. And this is, you know, the newest place that people pitch their advertisements.
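As a purely hypothetical illustration of the pipeline Nita describes, a sleep wearable flags a suggestible moment and a connected speaker plays branded audio, here is what the control loop might look like. Every function and device below is invented for illustration; no real product or API is implied.

```python
# Hypothetical dream-incubation loop: a wearable detects a suggestible sleep
# stage, then a connected speaker plays a brand's soundscape.
# All device functions below are invented stand-ins, not real APIs.
import time

def read_sleep_stage() -> str:
    """Stand-in for a sleep wearable's stage estimate."""
    return "rem"  # placeholder: 'wake', 'light', 'deep', or 'rem'

def play_soundscape(name: str) -> None:
    """Stand-in for a smart-speaker playback call."""
    print(f"playing {name}")

while True:
    if read_sleep_stage() == "rem":         # treated here as 'suggestible'
        play_soundscape("mountain-stream")  # the brand's incubation audio
        break
    time.sleep(30)  # poll the wearable every 30 seconds
```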
That logic is completely inverted. The moment when we're most suggestible is the moment when there should be a "no" from us.
One hundred percent agree, right? This is just the dystopian vision of this: if advertisers are trying to find the moment that you are most likely to develop positive associations with their brands or their products, and there are no regulations that would prevent them from being able to commodify your brain data and to target you with advertisements, then why wouldn't they, right? And the only things that would stop them from doing so are ethical norms that would strongly develop against doing so, and legal rights that would protect people from having that happen to them.
If you could take a person who's suffering from trauma who voluntarily said, yes, I would like for treatments while I'm sleeping to incubate positive memories or associations instead, that's not a bad thing. Done in a controlled research environment with consent, that seems potentially like an incredibly positive application of it. When instead what you have is what seems like gimmicks. But I worry that each instance of the use of neurotechnology in these unexpected associations and unexpected places subtly and very perniciously normalizes people to a future of neural surveillance where even our brains are hacked and tracked.
These technologies are improving exponentially, and there are these combinations as well. So I've been tracking a variety of experiments where scientists have connected the readings from fMRI machines to different types of AI, and they've been able to reconstruct the image that you've been thinking about. You've been thinking about a church, and they can produce a sort of blurry church. And over the last four or five years that church has got sharper and sharper and more precise. And even more recently, scientists have been able to look at fMRI readings and predict the words that somebody was thinking about. Now, of course, an fMRI machine is this huge machine, not that practical. But the point being that once we are able to correlate activity, as we see it through an fMRI machine, to particular sequences of words or pictures, we can then find easier-to-read signals and, without needing to pop you into an fMRI machine, start to extract those thoughts or those pictures.
Yeah.
So there was a recent study that was done that was really a leap, an advance in this area, using the advances in generative AI, and they were able, with a very high degree of accuracy, to reconstruct continuous language from the brain. They were then curious as to whether those same findings could apply to a different portable system called fNIRS, functional near-infrared spectroscopy, which picks up blood-flow changes in the brain just like an fMRI does. Could it apply in that context as well?
So blood flow being the key thing that appears to connect to what we're thinking.
They trained it on fNIRS to see if it worked in that context, and it did.
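As a toy illustration of the decoding idea being discussed, and not the actual study's method, a common approach is to learn a linear map from blood-flow features (fMRI voxels or fNIRS channels) into a word-embedding space, then pick the nearest vocabulary item. The vocabulary, dimensions, and random data below are all illustrative assumptions.

```python
# Toy semantic decoding from blood-flow features (fMRI or fNIRS).
# Not the study's actual pipeline; data and dimensions are illustrative.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

vocab = ["church", "river", "music", "coffee"]       # tiny stand-in vocabulary
embeddings = rng.standard_normal((len(vocab), 300))  # assumed word embeddings

# Training data: brain features recorded while each word was presented.
X_train = rng.standard_normal((200, 1000))                # 200 trials, 1000 features
y_train = embeddings[rng.integers(len(vocab), size=200)]  # target embeddings

decoder = Ridge(alpha=1.0).fit(X_train, y_train)  # linear brain-to-semantics map

def decode_word(brain_features: np.ndarray) -> str:
    """Predict an embedding, then return the nearest vocabulary word."""
    pred = decoder.predict(brain_features[None, :])[0]
    sims = embeddings @ pred / (
        np.linalg.norm(embeddings, axis=1) * np.linalg.norm(pred))
    return vocab[int(np.argmax(sims))]

print(decode_word(rng.standard_normal(1000)))
```

The transfer Nita describes amounts to retraining the same kind of map on fNIRS features instead of fMRI voxels, since both ultimately measure blood flow.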
It sounds like magic. It is a way of peering into the brain using light, a sort of infrared light of the kind that might come out of your TV remote control.
There are others who are working on something like a bike helmet that would have fNIRS and enable us, portably, to pick up potentially even our very thoughts from our brain, which could be decoded using these advances in technology.
And of course, hundreds of millions of us have these smart watches that actually have quite cheap sensors in them and yet can be really accurate for measuring things like blood oxygenation or pulse and all sorts of other biomarkers. So we've got some expertise in taking lower-cost, low-fidelity signals and turning them into more reliable signals.
That's right, and to be able to filter out things like noise. Even the computation, which used to be very difficult to do, you'd need a huge room to do that kind of crunching of the data, can now be done on-device. Literally, with a little device like this together with something like a mobile phone, you can do incredibly sophisticated imaging and decoding at the same time.
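A minimal sketch of the kind of on-device cleanup being described: band-pass filtering a raw signal to strip slow drift and high-frequency noise before any decoding. The cutoffs and sampling rate are assumptions for illustration, not any product's real parameters.

```python
# Minimal sketch: clean a raw biosignal on-device before decoding.
# Cutoffs and sampling rate are illustrative assumptions.
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 256  # assumed sampling rate in Hz

def clean(raw: np.ndarray, low: float = 1.0, high: float = 40.0) -> np.ndarray:
    """Band-pass filter: removes slow drift (<1 Hz) and high-frequency hiss."""
    sos = butter(4, [low, high], btype="bandpass", fs=FS, output="sos")
    return sosfiltfilt(sos, raw)

# One second of noisy signal: a 10 Hz rhythm plus drift and random noise.
t = np.arange(FS) / FS
raw = np.sin(2 * np.pi * 10 * t) + 0.5 * t + 0.3 * np.random.randn(FS)
print(clean(raw)[:5])
```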
How long before there are going to be affordable, portable fNIRS devices that you might see employers buying for their workforce?
That's both affordable for them to buy and makes sense in some ways to distribute to employees. And that's for a couple of applications: things like brain wellness programs to decrease stress levels in employees, or this one I use for focus. You can retrain your brain to have longer periods of focus and increase the activity in your prefrontal cortex, which is a form of cognitive enhancement.
The idea, Nita, of you, with your multiple degrees and two professorships, having even more focus...
It's an unfair advantage.
It is. It is. We have these very powerful technologies that are improving exponentially, and we can see many of the upsides. We can also see how they can be used against us. The thing that strikes me about these technologies in particular is that the brain is that last fortress, and our inner self is the thing that defines us. And yet we're at this turning point where that inner space could become part of some business's private enclave, or some government's Orwellian fantasy. What do we do about it?
So I worry about this deeply, and I worry that that final private space, the space that I think is so critical for self-awareness and for resilience, for the ability to know truth from fiction, for emotional self-reflection and cultivation, is at risk from brain transparency.
You mean the idea that we can just be seen through entirely.
Essentially, right.
I think most people consider having that space of mental reprieve, being able to think about what you're going to say next, with whom you share information, with whom you're vulnerable...
Right.
The way we define intimacy in a lot of relationships is by choosing what emotions, what thoughts, what secrets we want to share with another person. And if all of that can be revealed by decoding your brain activity, if other people can have access to it, whether it's your partner who says, no, no, I want you to prove that you're in love with me, or, you know, your employer who says, I need you to have five hours of focus today...
Or, show me you gave me one hundred percent.
That's right, prove it to me. Or, you know, a lie-detection test by the government, or the way you authenticate yourself at borders is through brain transparency. I think this final space of mental reprieve is at risk.
Your suggestion is cognitive rights. What do you mean by that?
So, I believe that a right to cognitive liberty as an international human right, a right to self-determination over our brains and mental experiences, would give us both a right to access and change our brains if we choose to do so, and also a right from interference with our mental privacy and our freedom of thought. Those can be simply updates to our existing interpretations of rights already in the Universal Declaration of Human Rights: the right to privacy, the right to freedom of thought and belief, and the right to self-expression without undue influence.
You can look to the right to privacy and say: what we need to make explicit is that a right to mental privacy is included within it.
You can look to the right to freedom of thought, which has currently been interpreted much more narrowly to be about freedom of religion and belief, and say, no, no, this is also about interception, manipulation, and punishment of our thoughts. And we have a collective right to self-determination, a political right. And really, if we look at all of the existing rights, we can say an individual right to self-determination is fundamental to every right that is part of the UN Declaration of Human Rights.
So it's not that I think we need new rights; it's that we need a new umbrella concept, which is cognitive liberty, to help us and direct us to update our interpretation of existing rights.
The other side of all of that, of course, is that these are much more powerful and useful technologies for beneficial outcomes. So how do you propose drawing a line between the acceptable and unacceptable uses of these newer technologies.
So I have framed cognitive liberty as a right to and a right from, and I think the right to is critical, because the truth is, we don't treat our brain health and wellness nearly as seriously as we treat the rest of our physical wellbeing. We don't have access to information about our own biases and preferences, even our own cognitive decline over time. Self-insight, I think, is powerful, which is why it's a positive right to access information about your brain, a positive right to be able to use it. Notice me saying "you," right? It's about you being able to access and use the technology, you having control over your own brain data, it not being commodified and misused against you, and a right from other people coercing you to use the technology or taking your brain data to analyze it, to mine it, to share it, and to sell it. It's giving you, and putting you in, the driver's seat of your own brain.
Even if we are in the driver's seat, we do have precedents from other types of therapeutics. I'm thinking about attention deficit disorder therapeutics. I'm thinking about antidepressants, which emerged over the last thirty or forty years and, particularly in Western markets, particularly in the United States, are heavily over-prescribed, because it's easier to deal with some of these issues through a prescription than perhaps figuring out how to tackle those issues. These advanced neurotechnologies will be more sophisticated, more precise, more effective. So why wouldn't they fall foul of the same type of behavior that sees them just being used pervasively, even if we are notionally in the driver's seat?
They could be, right? I mean, I think that's the risk, and there are risks to individuals using them to shortcut their own process of self-discovery and self-awareness and growth and emotional development. It can be misused by others. It can be misused by parents, well intentioned though they may be, to try to redirect the brain activity and brain development of their children. I think in order for us to effectively realize the potential of this technology in society, we have to not just have rights, right, not just have norms, but we need to also be substantially increasing people's awareness of how their brains can be accessed and changed. What does individual cognitive liberty, and cultivating that in individuals, look like? What does it mean to cultivate it in children or in the workplace, or to invest in cognitive liberty for businesses and for corporate development, for product development, for employee wellness?
Given the urgency, with the technologies getting better and better at exponential rates, I'm a bit perplexed about choosing the UN as the route to do this. It's not an organization that in recent years people would say is adept at dealing with change, or indeed effective when it does finally get round to doing it. So is that the right way to approach this first?
It's not the only way, but it's a starting place: to say that we need, worldwide, to recognize as an international human right that we have a right to cognitive liberty, and that that simply directs us to update existing human rights and our interpretation of them. It sets a strong legal norm which doesn't require everybody to come together; it requires the existing oversight bodies, like the Human Rights Committee that oversees the International Covenant on Civil and Political Rights, to update their interpretation and application of existing rights to protect people. Even if we don't succeed, meaning even if the UN takes zero action in this space, that doesn't prevent us in the United States and Europe and other countries from interpreting existing human rights consistently with the idea that there's a right to mental privacy, that freedom of thought is broader, and that we have an individual right to self-determination.
I mean, it strikes me that even if this isn't sufficient, self-regulation could also be very, very valuable, if you're able to persuade the handful, soon to be dozens, of companies making these technologies that cognitive rights matter. That also creates some kind of a buttress for them in the near future.
I think that's right.
I think that could be simple things in product design: when you have multifunctional earbuds, being able to turn off brainwave reading while you're taking a conference call, if you choose to do so. Or, for employers, to recognize the mental privacy of their employees and to say, here's technology you can use to improve your focus; we won't collect the data, we won't use it to make choices about you.
Now, the premise of our conversation is that within five years, our thoughts could be legally protected. How likely do you think this vision could become reality?
Well, I'm an optimist, I would say, and I believe that we have to get this right, and that the only way to get this right is to give people strong rights and protections over their thoughts from being intercepted, manipulated, and punished. Within five years, I think the technology is going to be mature, and it's going to be widely available at scale across society, which is why I believe that, before that happens, we will make moves to recognize a right to cognitive liberty, to enable us to be empowered by, rather than oppressed by, the technology.
Nita Farahany, I don't have to tell you what I'm thinking, but it has been a tremendous pleasure to speak to you.
Well, such a pleasure for me as well. Thank you.
Reflecting on my conversation with Nita, I can't help feeling that the technologies we've discussed will be mainstream even faster than we think. Two hundred million of us wear smart watches, and these collect personal information every second of the day. It's becoming a normal experience, and the exponential improvement in technologies will allow devices like these to monitor us more closely, yes, to the point of tracking our mental states. So it does make sense to protect our minds, our thoughts. But is going via the slow-moving UN the best route, or can we together find a better way? Thanks for listening to the Exponentially podcast. If you enjoy the show, please leave a review or rating; it really does help others find us. The Exponentially podcast is presented by me, Azeem Azhar. The sound designer is Will Horrocks. The research was led by Chloe Ippah, and music was composed by Emily Green and John Zarcone. The show is produced by Frederick Cassella, Maria Garrilov, and me, Azeem Azhar. Special thanks to Sage Bauman, Grocott, and Magnus Henrickson. The executive producers are Andrew Barden, Adam Kamiski, and Kyle Kramer. David Ravella is the managing editor. Exponentially was created by Frederick Cassella and is an E to the pi i plus one Limited production in association with Bloomberg LLC.