Imagine an A.I. Assistant that reads all of your text messages… and turns them into a pile of data points. A human psychology report.
And then it guides you. The assistant can say — “hey, the person you're talking to is introverted. You may want to be a bit delicate when you message them.”
The assistant will tell you the likelihood — down to the percentage — that the person you're texting likes you… in a romantic way.
But you're going to have to give over a lot of your data in exchange. It’s a classic privacy dilemma.
The tech exists. It was created by an entrepreneur named Es Lee and built into an app called Mei. Could AI detect our mood and guide us to communicate better? And what are the ethical issues that come along with tech so personal?
In this episode of First Contact, Laurie and Es talk about what happens when you mix artificial intelligence with raw human emotion.
Learn more about your ad-choices at https://www.iheartpodcastnetwork.com
First Contact with Laurie Segall is a production of Dot Dot Dot Media and iHeartRadio.

Okay, we just got an email from Es, who we're interviewing, or I'm interviewing, tomorrow, and he says: Hi, here's some data we got from the conversation that's exciting. Some are screenshots of the app, which I can demo for you tomorrow, and other data from our system we don't show in the app, to give you a sense of the underlying analytics. Are you ready for the underlying analytics of our connection? Yeah, because I honestly have no idea what type of insights they're able to glean. It's loading, it's taking a while. Built-in suspense. Okay. Think about this for a second and don't judge too quickly. Imagine an AI assistant that sits on top of your text messages. It reads your conversations. It turns your messages into a novel's worth of data points, basically a human psychology report, and then it guides you. The assistant can say, hey, the person you're talking to is a bit more introverted; you might want to be a little bit more delicate when you message them. The assistant will give you a percentage likelihood that the person you're texting with likes you. I'm talking romantic feelings. It'll read hundreds of data points to help you shape a better, more personalized conversation based on what it picks up in the nuances of your language: the pauses between texts, the emojis you use, whether the words you're using are positive or negative. So would you use this type of technology? Well, if so, you're going to have to give over a lot of your data in exchange. I know what you're probably thinking: my text messages are pretty personal. What about my privacy? The conversations are anonymized, Es says, and you can delete them whenever you want. But leaks and hacks are the norm, so of course there are ethical boundaries. The tech I'm talking about was built by an entrepreneur named Es Lee. It's an app called Mei, an AI assistant that you can download that, when unleashed, can tell a lot about us, including our mental state. So in this episode, you're going to hear me talk a lot about depression, love, all the raw human elements that this tech can discover. So what does an algorithm say about our messy human conversations, about us? Could it help us communicate better with each other, or are we crossing a line? The future is here; we might as well take a deep dive. I'm Laurie Segall, and this is First Contact. Well, welcome to First Contact. This is a show that introduces you to people and tech that change what it means to be human, and I think you're a perfect example of that. Normally I go into my first contact with the founders I bring on the show and how we first met, but we've never met in real life. Thank you for having me. That being said, you might not realize this, but our first contact happened in this room. I interviewed a guy named Shane Mac. Do you know Shane? Yeah, yeah, I met him once. Interesting guy. He's building out bot technology to date on the dating apps, and in the middle of the interview he brought up your company, and he said they're building this really cool technology that can analyze conversations, and they can know if you're interested in the other person by the way you're speaking. I was like, whoa, that's really cool. And so, you know, I was doing research on you, and something stuck out that you said: you talked about how your text messages have body language. I love this idea that we could read our text messages like body language.
It just says so much about us, and I think that's a good way to start with what you're building. Yeah, yeah, I think it is body language in a way, in that I guess there's kind of a new language evolving from text. So thirty years ago, nobody would have been able to look at a phone and say, oh, this person likes you, based on the metadata in the text. Right. So if somebody makes eye contact with you, if they smile, these are all signs that we've learned over time are an indication that this person is interested in you. And now, in the form of text messages, it might be a double text, it might be, you know, a lot of emoji usage or exclamation points. It might be immediately replying to all your text messages. And the idea is that, with what you guys are building, you could actually do analytics on it and really help people understand relationships, and if someone likes you, and much more than just that. It started very basic like that, but you can understand all sorts of things about people. Yeah, yeah, there's an immense number of data points in conversation. So you can imagine, even in the conversation that we're having now: there's been research out there that the number of times a person says "I" versus the other person is an indicator of the power dynamic between the two. So when you add all the metadata along with the content of, you know, what we're saying, there are a lot of things that get revealed just from looking at the data on text messages. So before we dig into the tech and what you're building, can we just take a giant step back? Because normally when I interview folks, I generally have a history with them or I know them, and this is, as we say, my first contact with you. Where are you from? So, I was born in China, I grew up in Boston, and I moved to New York about twelve years ago. I always thought people were really interesting to analyze. I did computer science in college, and then after school I went into finance and covered the insurance industry. The insurance industry? Yeah, that's right. I was working on an insurance startup, and whenever I would tell people what I was working on, it would just be crickets, and then I'd say, well, actually, in my back pocket I'm working on an algorithm that can figure out how much people like you based on your text messages. And you kind of see the eyes light up, the jaws drop, and then I'm like, okay, that's enough validation that I should probably be working on this. In my experience, founders look to solve problems that they have. So what problem did you have that you were trying to solve? So, actually, you kind of saw right through me. I've been in New York for twelve years, and, you know, I had my share of dating experiences. I think I'm okay in real life, but then when it comes to text messages, I'm terrible. Why? I guess it's a common affliction that actually a lot of guys may have: they just don't know how to text, right? So I was always eager just to end the text communication and get to real life, whereas that was probably interpreted as, you know, he doesn't like me or he doesn't care. So I kind of wondered just how many relationships could have been better, or gone further, had I just seen these things. Mm-hm. So you're texting with women and not having luck, and you're frustrated, and you have this background in computer science; I mean, there's an algorithm that can help you communicate better.
Okay, so then what? So, actually, what kind of brought this to fruition was a friend who moved into the city, and he said, dating here sucks. And I was like, why? He goes, so I went out with this girl a couple of days ago, we really hit it off, everything was great, and now all of a sudden she's not responding to my text messages. And he was confused. So I take a look at it, and I kind of flip through his text messages, and I'm like, she likes you. How did you know? So I kind of looked at the body language. I mean, it sounds like, if you don't have luck on text messages, how are you able to analyze these text messages and look at words as data points and actually know? Yeah, the funny thing is, I've found over the years that it is actually impossible to be objective about your own relationships, right? Often it takes a third party to look at it from, you know, a bird's-eye view and say, hey, look, here were all the warning signs; how did you miss all that? And I actually think that's how we deal with a lot of our relationships: we're not able to see objectively. So despite knowing all the pitfalls of texting, when it comes to my own relationships it helps me not at all, but I can see it in other people, and I can be more objective about it, and then with the algorithm I can actually give them data behind it. And all of this is kind of background for an algorithm. This is all before you created Mei. It was Crush, right, the Crush app that you created, and all of these data points went into creating this Crush app. So talk to us about that. Yeah, yeah, I would like to go on record to publicly apologize for Crush. It was downloaded by a hundred and fifty thousand people, and I joked with my team that we should put out a press release to apologize for ruining a hundred and fifty thousand people's days. What? What was so terrible about it? Because, you know, if you're using an algorithm to figure out whether somebody likes you or not, chances are you're not going to be happy with what you find. And what did the algorithm take into account? Talk to us about the technology, and then we'll get into the technology you're using now, which goes well beyond Crush and "does someone like you or not." What did it actually take into account? It took into account the average response times, the length of the text messages, emoji usage, exclamation-point usage, whether they sent you a link or a picture, and then the number of conversations each side initiated. I mean, it is pretty extraordinary if you think about the amount of data we give each other on a daily basis: the time of day we text, the types of words we use, the types of emojis. All of this is an insane data set that creates a personality profile. It's just an extraordinary amount of information about us that you're utilizing, and that advertisers are utilizing, that, when put to use, and I guess you're trying to use it to help us better ourselves and have better conversations, is an extraordinary amount of data that I don't think people quite understand, but that as human beings we decode pretty easily, right?
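As an aside for readers who want to picture it: Es lists the signals Crush took into account, such as average response times, message length, emoji and exclamation-point usage, whether a link or picture was sent, and how many conversations each side initiated. A minimal sketch of extracting those kinds of signals from a message log might look like the Python below. This is purely illustrative, not Crush's or Mei's actual code; the Message fields, the toy emoji set, and the six-hour "new conversation" cutoff are all assumptions.

```python
from dataclasses import dataclass
from datetime import datetime
from statistics import mean

# Hypothetical message record; field names are assumptions, not Mei's schema.
@dataclass
class Message:
    sender: str          # "me" or "them"
    text: str
    sent_at: datetime
    has_link: bool = False
    has_picture: bool = False

EMOJI = set("😀😂😍🎉🔥👍🙏😭")  # toy emoji set, just for illustration

def crush_style_features(messages):
    """Compute the kinds of per-conversation signals Es describes."""
    msgs = sorted(messages, key=lambda m: m.sent_at)
    theirs = [m for m in msgs if m.sender == "them"]

    # Average time (in seconds) they take to reply to one of my messages.
    reply_gaps = [
        (b.sent_at - a.sent_at).total_seconds()
        for a, b in zip(msgs, msgs[1:])
        if a.sender == "me" and b.sender == "them"
    ]

    # Treat a long silence as the start of a new conversation and credit the initiator.
    initiations = {"me": 0, "them": 0}
    for prev, cur in zip([None] + msgs[:-1], msgs):
        if prev is None or (cur.sent_at - prev.sent_at).total_seconds() > 6 * 3600:
            initiations[cur.sender] += 1

    return {
        "avg_reply_seconds": mean(reply_gaps) if reply_gaps else None,
        "avg_their_length": mean(len(m.text) for m in theirs) if theirs else 0,
        "their_emoji_count": sum(ch in EMOJI for m in theirs for ch in m.text),
        "their_exclamations": sum(m.text.count("!") for m in theirs),
        "links_or_pictures": sum(m.has_link or m.has_picture for m in theirs),
        "initiations": initiations,
    }
```

The point is only that each signal Es names reduces to a simple count or average over ordinary message metadata.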
Yeah, yeah. So I tried to back up my text messages, right? When I get a new phone, you get a new app and it loads back your old text messages. But the program I was using took a really long time and actually showed every text message, and I spent three hours watching all my text messages flash before me at half a second each. It's an incredible amount of data. The average person actually texts about a novel's worth of their thoughts a year with their thumbs. Wow. And we'll get into this a little bit later, but I gave you text messages with my co-founder and dear friend Derek. And as we were giving you the text messages, by the way, we only gave you a year or a year and a half's worth, but we have twelve years' worth or something, and it was maybe one of the most horrifying, interesting experiences, watching as they were downloading and looking at them. Just seeing all those words; it's such a personal experience to do that through technology. So I want to get into what you're doing now with Mei. You have moved on; we're not just doing love anymore. We've moved past just an app to help people find out if other people are interested in them. Yeah, yeah, we have moved on from that. And so when did you make the pivot? Talk us through that. So my goal had always been to be the default texting app on the phone. The most used application on our phone is for messaging, right? Yeah. We had instant messaging twenty years ago, and there's been almost no innovation whatsoever on it. And so I go, hey, what if I'm able to build a messaging app that actually has an AI just sitting right on it, analyzing the conversations? It's almost like a guardian angel, you know, looking over your shoulder, being like, hey, you're coming across as rude, or, you know, you're missing this about this person. So I always thought there was the capability for that. Crush was basically the very first step in doing that. I knew that with a lot of data, a lot of amazing AI is possible, and that wasn't possible without first having the data that we got from Crush. So, first of all, Mei. Did I read that you named it after your mom? I did. That's interesting. It was also coincidental in a way, because I was looking for a name for the AI, and I thought messaging could be done so much better. So in a way, Mei also stands for "messaging improved," and maybe one day it'll stand for "me improved," because, you know, we're going to take this data and discover what we can to give you the best information for you to operate as an improved version of yourself. And it means beautiful in Chinese. Yes, it means beautiful in Chinese. You know, I want to get into the algorithm and how it works. I think we've been talking a lot in theory about how this thing works, so let's go hardcore into the technology behind it. What exactly does Mei do? Break it down for us in the most human way you can. Okay. We're basically a replacement for the default texting app on the phone. So if you and I texted after this, with enough messages the AI would be able to pick up certain things about your personality and start giving me advice, or little insights that it picks up about you. Like what? Like, it might pick up that, you know, maybe you're more organized than I am. Well, that's certainly not true; everyone in this room is shaking their heads. Why don't we do another one? Or that you're more, you know, altruistic or empathetic than I am. Okay, that one we can go with.
So the idea is, as we're having these conversations, with enough data going through the system, it starts to actually be able to figure certain things out about yourself and the other person. So, for example, it might try to predict the age or gender of the other person. Or, if there's a history of conversation and it figures out that the other person is using a lot more negative words than they used to, it might say, hey, Laurie seems like she's been a lot more negative than usual; you might want to check in with her. So we went through as many algorithms as possible to figure out, hey, how can we be that little assistant, your little personal relationship assistant, as you go through your day and text. What do you mean when it says negative words? I don't know if folks really, truly understand what that actually means. Yeah, a lot of natural language work has been based on sentiment. Sentiment is like, hey, if you're saying sad, angry, or I'm happy, I'm glad: these are negative and positive words. So take the dumbest algorithm and just say, hey, let me just look at the number of times you used a negative word today, and say, on average you use twenty-two out of five thousand words, but now you're at forty-four, so you're twice as negative as you tend to be. And so we have algorithms that will look at these numbers statistically and find when there's an anomaly, and try to point that out to the user. We've got to take a quick break to hear from our sponsors, but when we come back, we'll talk Mei and mental health. What if an algorithm identified that you were falling into depression, and then, using data, it suggested the best person in your contact list to reach out to? And here's what's surprising: it's not the person you would normally think. More after the break. And what other types of things does it analyze? It still analyzes if someone is into you, but the idea is that it can analyze work relationships, all sorts of different types of relationships. Right, and how? Yes, just give us the specifics, which I think are really interesting, Es, because I don't think people quite understand the data points that go into a lot of this stuff. Right. So with Crush, one of the first things that we would ask is, hey, what type of relationship is this? Because you're going to text somebody you like a little differently from your family and from your friends. So we wanted to separate out our data set, but in doing so we actually ended up with a lot of labels. To do AI, there are really only two things you need: a ton of data, which we had, and labels. So let's just abstract it and say, okay, let's talk about pictures of dogs and cats. If you have hundreds of thousands of pictures of dogs and cats and you go, okay, that's a dog, that's a cat, that's a dog, that's a cat, there are ways to turn each picture into math, and then have the computer find patterns among those pictures and start predicting whether it was a dog or a cat picture that it was seeing.
So in the same way, if you had enough text message conversations where one party says, yeah, that's a person I had a crush on, and that person, that's a family member, or that's my buddy, you're able to do the same thing. With enough data, it starts actually doing this better than people can. There's no way a person could have actually listened to two hundred and fifty thousand conversations, right, or conversations between, you know, that many people. But a machine is able to do that and take every single one of those data points and say, hey, if this person, within the first twenty words, says something like "handsome," then out of a hundred of those, that person had marked that they were romantically interested in the other person. A machine is able to do things and recognize patterns in ways that people can't. And if you think about how a person actually works: if you're young and this is your first romantic relationship, you might not be able to read any of the signs that this person likes you, because you've never seen them before. But you take somebody who's seen enough relationships, and your grandma might actually be the closest thing to an AI that we know, because she's just seen enough information that she can go, oh yeah, I've heard him say that, and in my experience that has not turned out well. Your grandma might be as close to an AI as we've got. That's really interesting. So, I mean, it's pattern recognition, and also based on personality types that you guys define? Is that how you do it? It's the Big Five personality traits. Okay. And so I guess, you know, what's the point of it? It's to help us communicate better. The point of it is to give us the words to say to each other, because we're messy and human and sometimes we just don't know. It's an iteration of you not being able to talk to women, and I don't mean that in a bad way; I mean it in the way that, you know, I sometimes have trouble talking to a family member or something. It just gives us a pattern or a blueprint, through AI, by which to speak to people. Is that, am I reading it right? Well, think about it this way: how often, when you compose a text message, do you think, am I coming across as rude? Well, I think my issue is sometimes I don't think before I text. But it's amazing. For most people, I think, when they text something, they scrutinize it, thinking, what is the other person going to think when they get this message? And I think most of us tend to think the worst, right? But just think about it: every time you texted somebody, if you didn't like somebody, you wouldn't text them. So, for example, if the algorithm starts picking up that the person on the other side is a lot more reserved, it'll say, oh, you know, this person seems like they're a lot more reserved than you; you might want to try a little harder to relate to them. What other kinds of soft nudges will the AI give us? Well, on the flip side, if you're reserved and the other person is a lot more outgoing, it might say, hey, this person is a lot more outgoing than you; you might want to keep it light, you know, keep things light with them.
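For the technically curious: the "data plus labels" recipe Es describes a few exchanges back is ordinary supervised learning. Featurize each labeled conversation, then let a model learn which patterns go with "crush," "family," or "friend," just as with dog and cat photos. Here is a minimal sketch of that recipe, assuming scikit-learn and made-up feature values; it is not Mei's actual pipeline, only an illustration of the idea.

```python
# Minimal supervised-learning sketch of the "data + labels" recipe Es describes.
# Assumes scikit-learn; the features and labels are illustrative, not Mei's real data.
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Each example: per-conversation features plus a user-provided relationship label.
examples = [
    ({"avg_reply_seconds": 45, "their_emoji_count": 12, "their_exclamations": 9}, "crush"),
    ({"avg_reply_seconds": 3600, "their_emoji_count": 0, "their_exclamations": 1}, "family"),
    ({"avg_reply_seconds": 600, "their_emoji_count": 3, "their_exclamations": 2}, "friend"),
    # ... in practice, many thousands of labeled conversations
]

X = [features for features, _ in examples]
y = [label for _, label in examples]

model = make_pipeline(DictVectorizer(), LogisticRegression(max_iter=1000))
model.fit(X, y)

# For a new conversation, the model returns a probability per relationship type,
# which is where a "percent chance they like you" style number could come from.
new_conv = {"avg_reply_seconds": 60, "their_emoji_count": 8, "their_exclamations": 5}
print(dict(zip(model.classes_, model.predict_proba([new_conv])[0].round(2))))
```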
Does the AI ever get it wrong? I mean, AI is so flawed, right, and can get it really wrong sometimes. So what if the AI is giving me advice that's just terrible? Yeah, it's not infallible, but I would argue that it is probably better than your friends. I want to talk a little bit about the mental health aspect of this, because it's putting a lot of weight on Mei. If Mei is looking through a lot of these messages and seeing people at these very vulnerable moments, and I'm sure you must see some crazy stuff, you must really see some people in pain saying some pretty sad things about depression or suicide, what's the responsibility of the AI? Yeah, I mean, that's a good question. I don't think anybody really has a good answer for that. Look, we actually try to steer away from giving people advice as much as possible. But all the feedback that we got from people was like, hey, okay, so you told me this; now what am I supposed to do about it? Right? People saw that there was application here, potentially, outside dating. That was one of the first things that we turned to, when we go, hey, if the AI is able to look at your conversations with everybody, your parents, your girlfriend, your best friend, it's able to understand things about you in a way that really nobody else has the perspective for. You only get to see somebody in a certain dimension, whereas we have a three-sixty view, and we theorized that, hey, maybe we can actually figure out patterns. And so we started looking into conversations, and, you know, for the longest time I was looking for another startup in mental health, or some data set where, and it's kind of morbid to say, but could I get the text message history of somebody who killed themselves? Because we have the analytics tools to comb through the history, and what if we were able to use that to figure out patterns that preceded the actual act? And so I talked to many, many psychologists and startup companies, and the answer was always, we don't know, or, we don't have the data, and even if we did have the data, privacy regulation is keeping us from doing anything. And then it dawned on me: well, we have conversations from a hundred and fifty thousand people. What if we just searched through these messages and tried to look for the phrase "tried to kill myself"? We actually found about ten occurrences of that where it was from the user and not a contact, so we had a lot more information on them, and in those ten they actually included a date. So one of them was, hey, sorry, I've been MIA, but, you know, I've been going through a rough patch, and I actually tried to kill myself two weekends ago, but all I managed to do was down a bottle of pills and send myself to the hospital. So, you know, that person tried. So with that data, we're able to look at the patterns, and some of the things that we found were actually really, really interesting. How do you feel when you're looking at those messages? This is your product, right? How do you feel when you're looking at messages and someone is saying, on this date, I tried to kill myself? I don't really know what the question is other than: as a founder, with your responsibility and your technology, how does that make you feel? I mean, and by the way, we don't know the identity of the people, because we never asked them for their name, right?
I think it was an exercise in empathy that, if everybody had the opportunity to go through, I would suggest it, because going through the interactions of some of these people with everybody in their life, it felt like a soap opera. I did one where I looked at the two months of conversations before that date, and I tried to jot down every person: oh yeah, this number, three-seven-oh-four, that must have been a friend, or that must have been a girlfriend. It was exhausting, because imagine you know that this person is going to go through a hardship a month from now, and you're reading the conversations of them crying out for help to people. It was kind of heartbreaking. And there were some conversations where, you know, a person came in and just started listening to them, checking in with them and saying, hey, how are you doing? I just remember mentally cheering at that point, like, thank God for you. And so, you know, we went through this because I knew that an algorithm might be able to find that person. And I think when people go through tough times, there really is no easy solution. People might say, hey, if somebody is depressed and you find that out, what you should do is point them to one of the many bots out there, or a mental health hotline. I think all these people know that there are bots and mental health hotlines, but they don't reach out. And I actually think what we worked on is a much more complete solution than anything that's out there, because the only people who can actually help you are the people who are important to you, that you talk to, that you care about in your life. What if we could use these algorithms to find the person that we could recognize cared about you the most, but had also been depressive themselves, so they could be empathetic to you? And it doesn't matter if you have Mother Teresa in your phone; if she doesn't reply to your text messages, she's not going to be able to help you. So we needed the algorithms to find, you know, somebody who tries a lot harder to communicate with you than you do with them. And so the algorithm goes through all that and finds the person who might be right for you and suggests that you reach out to that person. That's what you guys do: if you notice that someone is getting depressive or has threatened their own life, Mei will suggest, the AI will suggest, that you reach out to someone, and based on these messages your AI identifies a person who you think would be a good person to talk to. Yes, yes. And the reason we had to go through reading these messages was that we had to back-test the algorithm. So we would go through these messages, and there might be five or six people that they're texting, and we'd go, I think this is the best person to reach out to. So we went through multiple iterations of the algorithm, and then we found the one iteration where the algorithm would pick the exact person that we would have picked. But doesn't that feel like playing God, rolling the dice? I think it's normal to see a person in need and say, hey, I think I could help that person. I feel like it's a responsibility, in a way, then, right? Because I think in society, when you know somebody is in need, a responsibility grows out of that to help them. So I actually do think it is the responsibility of the people who have access to something like this.
Yeah, I mean, I think it's a really interesting ethical conversation, what you guys are sitting on, and I want to get into the privacy. But also, think about Facebook, right: all these social networks can tell if people are becoming depressive or threatening their own lives, and it is, you know, do they contact authorities, do they put up the suicide hotline number? So it is pretty unique that you guys are identifying and suggesting a person based on artificial intelligence and the text message data that you have. It's a choice, right, and no one knows what the right choice is. I think we've seen examples of the big tech firms that actually went and did something. I don't think they have a right to do that; just logically, I don't think they have a right to do that. Whereas what we do is put the power back into that person's hands, and we say, hey, you might not have noticed this, but this person who, you know, checks in with you, a friend that you don't talk to that frequently, we figured out that this was a person who might really understand what you're going through, and then it's your choice to reach out to that person. We don't do anything on your behalf. And I think one of the most interesting things you said there is that you pick a person for someone who's threatening their own life to reach out to, and one of the tidbits you slipped in there is that you guys try to pick out someone who's also been depressive, which is certainly very interesting to me. I'll preface this: ten people is a small amount of data, right? But when I was going through that exercise and I plotted these things on a graph... I think people will say, hey, I don't know what the patterns of suicidal tendencies look like, it must be really hard to figure out. When you see some of these graphs, you're like, oh my God, this is perfectly obvious. So, you know what I mean: the usage of negative words and absolutist words. There's research out there that says when people are depressive, they tend to use these words more. When we plotted these ten people out on a graph, we plotted their negative word usage versus everybody that they're talking to. Just asking, what are a lot of those words? It's just negativity, you know: no, hate, sad, angry. When we plotted out the negative word usage on a graph, between the person and everybody collectively that they talked to, it tends to show a pretty regular wave. And we found that when the user was more negative than they ever were, if the other people they talked to were just as negative, everything was fine. It's when they were the most negative, and there was the biggest gap between how negative they were and the other people, that you started seeing those behaviors. And I think when I went through that exercise, by the time I came to the sixth or seventh graph, I didn't even need to look at the text messages. I just looked at the graph and said, I'm going to look at this date, because I think that day they might have tried something. And I was pretty close on most of them.
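To make the idea concrete: the graphs Es describes boil down to comparing a user's negative-word rate against the collective rate of the people they text, and flagging the days when the user is unusually negative and the gap to their contacts is widest. Below is a simplified Python sketch of that comparison. The word list, data layout, and threshold are illustrative assumptions, not Mei's, and this version only checks the gap rather than the user's own historical peak.

```python
from collections import defaultdict

# Toy lexicon; real systems use much larger sentiment and "absolutist" word lists.
NEGATIVE = {"no", "not", "never", "hate", "sad", "angry", "alone", "tired", "nothing"}

def negative_rate(texts):
    """Fraction of words that are negative across a batch of messages."""
    words = [w.strip(".,!?").lower() for t in texts for w in t.split()]
    return sum(w in NEGATIVE for w in words) / len(words) if words else 0.0

def flag_risky_days(messages, gap_threshold=0.05):
    """messages: list of (day, sender, text) where sender is 'user' or 'contact'.
    Flags days when the user's negativity is far above their contacts' collective rate,
    which is the gap Es says mattered more than negativity on its own."""
    by_day = defaultdict(lambda: {"user": [], "contact": []})
    for day, sender, text in messages:
        by_day[day][sender].append(text)

    flagged = []
    for day, sides in sorted(by_day.items()):
        user_rate = negative_rate(sides["user"])
        contact_rate = negative_rate(sides["contact"])
        # Being negative is fine when everyone in the thread is negative;
        # the worrying signal is a wide gap between the user and their contacts.
        if user_rate - contact_rate > gap_threshold:
            flagged.append((day, round(user_rate, 3), round(contact_rate, 3)))
    return flagged
```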
It's crazy. I mean, it's even crazy to think about the future, like if you could predict, I don't know, and this gets into Minority Report and a lot of stuff. You know, this is used for good, but it could also be used in a pretty scary way. Could you start making some of these predictions: if someone's going to rob a bank, do something bad, harm themselves, hurt someone? You could start making some really interesting predictions based off some of the data that you guys are getting. Although, and we'll get into the privacy because it's a huge part of this, it's anonymized data at the moment. But I'm sure you could make some insane predictions, right? Yeah. Well, we've never tried, but you could theoretically, yes. I think there's the possibility for that. And in the four years I've been working on this, we've really been open to anybody coming along saying, hey, I'm an expert in this, I would love to look into this. I think this data is very powerful, and so long as it's being used by people who intend to use it for public good and for education, I feel fine with that, because none of us really know how this technology will be used. It's almost like a hammer: it could be used to build a house or to tear something down. It's up to the people how they choose to use it. Well, you guys are a small startup, but big companies use big data in all sorts of different ways. This is the national conversation around our data and our privacy. We've got to take another break to hear from our sponsors, but when we come back: I know what you're thinking. Tech that sees all my text messages and figures out these intimate things about my personality and relationships? What about privacy? We'll get into that after the break. I want to get into privacy a little bit, because we've been speaking around it, but it's probably, I would say, the most important part of what you guys are doing. You guys are an AI that sits on top of messages and reads messages. So the elephant in the room is: whoa, what about our privacy? So what is your answer to the question of, you know, how do you protect user data? Yeah, I think it's like security: nothing is foolproof, but you want to try to do your best at every step. We knew text messaging data is probably some of the most intimate data, right? So within Mei, everything is off by default; we don't collect anything. If you want to turn on the AI, we show a pop-up that says, hey, we're collecting these messages; if you're not comfortable with this, don't go any further. We did that with Crush, and we probably turned away half of the people, which is fine; we want people to know exactly what they're getting into. And then, when we collect the data, we don't collect anything more than we have to. So, for example, we don't collect the pictures, because we're not at the stage where we can analyze pictures, so we only take what we need. We don't need your name to help you. The only personally identifiable data that we get is the telephone number, and we have to verify that to set you up on the account; there has to be a way to authenticate you. We take that and actually hash it to an ID that means nothing to anybody, and that was actually a way to protect us against ourselves. What do you mean? Meaning, we can't look in our database, find something, find a user, and then take that ID and try to figure out what telephone number it is. So none of your employees could actually be like, oh, this is a pretty interesting conversation with so-and-so, I'd like to go look it up. They couldn't do that. Right.
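Es doesn't spell out the hashing scheme, but one common way to turn a phone number into an ID that can't be recovered from the database alone is a keyed hash, with the key stored outside the database. The Python below is a sketch of that general idea, purely illustrative rather than Mei's implementation; the environment-variable name and ID length are assumptions.

```python
import hashlib
import hmac
import os

# Secret key kept outside the database (for example in a secrets manager or env var),
# so a database dump alone can't be used to brute-force numbers back out.
PEPPER = os.environ.get("PHONE_HASH_KEY", "dev-only-example-key").encode()

def phone_to_id(phone_number: str) -> str:
    """Derive a stable, opaque account ID from a phone number."""
    normalized = "".join(ch for ch in phone_number if ch.isdigit())
    digest = hmac.new(PEPPER, normalized.encode(), hashlib.sha256).hexdigest()
    return digest[:32]  # store this ID, never the raw number

# The same number always maps to the same ID (so a returning user can be re-verified),
# but the ID alone reveals nothing about the number without the key.
print(phone_to_id("+1 (212) 555-0137"))
```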
And then we go, okay, if you want to delete your data from our database, we make it really easy to do. I know there are a lot of apps out there that make it impossible. We just put the button in the app and say, okay, now you're going to be deleted. So every step along the way we thought, okay, well, you know, we're users of apps too, and we're at a crossroads: which choice do we make? And we always made the one that we thought was for the sake of preserving privacy. What's interesting, though, is when we were thinking about this, okay, well, I wanted you to analyze some of my messages and I don't have an Android. So right now it's available on Android, and on the iOS app you have to upload your WhatsApp messages. Why is that? So on iOS, there are no third-party apps that can access your text messages. Do you think at any point that would change? I'm assuming that's a hindrance to the business, to a degree, right? To a degree, yeah. I think Apple would say this is for the sake of user privacy, but I actually think the better answer is to give people a choice. If you as a user, and I'm telling you exactly what I'm using this data for, if you choose to do so, shouldn't you have the right to? It's your messages. Well, what's interesting, though, and we had this whole debate when we were thinking about what text messages to send over to you: I was like, well, I guess I have to ask someone's permission. Or do I? It goes into this question: do you own your text messages? And there's legal precedent that says, like, if you send me a message, it's public domain, right? Or, you know, it doesn't matter, you don't have any right to it. So you can do this analysis of anyone. I could send over text messages between me and any friend, and they don't have to consent, right? Right, in the US, that is legal precedent. And then overseas, you guys comply? Yeah, yeah, GDPR is the big one in Europe, and we're compliant with that. In Canada they have a little more stringent, I guess, property rights as they apply to text messages. And I saw you said something about how you guys are a small startup and you've never made a cent from data and all this kind of stuff, and you've never sold data. I mean, in the time that I've covered tech, I've heard that before, and I believe you. Right. I believe it. I think the issue is that that could change in a heartbeat, and we've seen that change, so I think there's less trust now. So how do you assure your users, and maybe the problem isn't just small startups, it's really the big companies that we need to be talking about, that that won't change, that their data will continue to be protected as you guys grow? Right. Well, you are correct: we have never sold any data, and we don't have any plans to. But I have asked myself, and even asked my team, the question before: you know, we don't need to now, but what if we were backed against the wall? And it does bring up some very interesting questions. I mean, I am all in favor of regulation that modernizes how we think about property and privacy. Whenever I bring up what we do, people go, wow, there's application here, there's application here. In my mind, I go, there could be so much money made from the things that are necessary; why would we need to turn to something that is nefarious? What do you mean by that? So on the iOS side, we're able to tell you, you know, the probability somebody likes you based on the algorithm, and users pay for that.
So the idea is, if this information can be valuable to somebody, I can sell it to them for what they're willing to pay, as opposed to selling it to somebody else for marketing. Mm. I mean, it does bring up a very good point of how do we monetize this. I think it's important to be clear and to tell your users what you're going to do. So we actually found a solution maybe two years ago: we have a credit system in the app. And what it does is, when you give us your text messages and when you label data for us, we give you credits. And that's kind of a placeholder for, what is the value of this data? I actually don't know, right? So somebody uploads, you know, a bunch of conversations; it contributes to the algorithms, and that might be worth something someday. It contributes to the overall AI and data ecosystem, and that might be worth a lot someday. I don't know what that's worth, but I'm going to give you a little placeholder that says, hey, you gave me some information. We found that to be a way that we can return value to the people who give us their data. So now in the app, anytime it goes, I think you seem like you're a really empathetic person, am I right or wrong, and you, you know, agree or disagree, you get a small credit. And the idea is, none of us know what this data is worth, and maybe, for advertising reasons, maybe this is a system that allows us to say, okay, well, I choose to see an ad; you, company, are making money from me seeing this ad; you can give me a small piece of those economics, and everybody is happy. Right, it's the idea that you take issue with the way companies take data right now: they take your data, we don't know what's happening to it, and then we're advertised to, under the guise that it's free. Wait, okay. Does it say that you're secretly in love with me? Is my predicted age forty-four? Are you serious? That's definitely not my maturity level. Okay, so my predicted age is forty-four. I guess, for our listeners, I'm thirty-four. And it says your predicted age is thirty-six. Whoa, I am thirty-six. How would it know that? I want to get into my personal experience with Mei, because I thought it was super interesting, and it was such a personal experience. What we did, and I have an iPhone, so, just for full disclosure, is we sent you guys the messages to analyze, and I sent you a year's worth of messages with Derek, who's a good friend but also my co-founder, and we thought it'd be interesting to do because we've been working together and we're also friends, to see what your algorithm picked up about us. But, man, it took an hour and a half for my messages to just download, because we downloaded all my messages, which was horrifying, simply horrifying. When you talk about a novel, it's like the Great American Novel minus anything classy or interesting. Did you read through some of them? Yeah. They were really, I mean, it's fascinating to see these snippets of conversations, especially from, like, years ago. You're like, was that me? Unfortunately, I'm like, yes, that was me. You know, so I'm curious to see what your algorithm showed, because I think what you're doing is fascinating; it goes so far beyond just, does someone like me.
It's like, how do you better working relationships, friendships, all those kinds of things. So should we try it? Yeah, yeah, we should dive in. I mean, have you done this interview already knowing my personality traits? No, no, no, I'm not that creepy. Organized, or, yeah, I didn't mean to say creepy-organized. No, I haven't looked at any of these. So I did the same thing and downloaded your conversations onto a new phone of mine. And is this going to show me, or is this also going to show me and Derek? Because if it's going to show Derek, he's in the room, so we might as well just bring him over. Right, he's glaring at me, so that means we should absolutely bring him over. He's so grumpy. I wonder if his personality profile says that. Oh, I guess we will see. So it is able to give insights about both Derek and you. So, I guess, now that you're sitting down, Derek, you can be the first to rate it. So, for our listeners, he's pressing a button. Yes, I'm pressing a button. And it says, you seem passionate about your work. Oh, that's good, that's good. Yeah, I mean, that's helpful, given that we just launched a company. We launched our media company, Dot Dot Dot, so I'm glad you're passionate about it. Thank God. You see that it says agree and disagree, and this is kind of our way of getting better, right? Oh, okay, so I'm going to go with agree on that. So it says that you seem like you're more philosophical than concrete. Let me ponder that. Agree on that. And so what is it doing now? It's taking us through an exercise for Derek; it's trying to understand things about Derek, so that if it figures out there's a big personality difference between the two of you that might cause you to misunderstand each other, it'll try to give advice on how to bridge that communication gap. Okay, sounds useful. Well, Laurie, why are you looking awkward? I don't know how old you are. It got it wrong, by the way. I looked at the data, it got it wrong, and that's good. I'm glad you felt a little awkward looking at me, thinking that that's not how old I was, because that's ten years older than I actually am. Yeah, maybe it's because it's a professional relationship. Do you think it says I have an old soul? That's what we were discussing earlier in the office. Potentially. So, for our listeners, your algorithm thinks that I'm forty-four, and I'm actually thirty-four. So why do you think it picked that up? So, usually you would have a lot more information to go off of. This was just one conversation, so you can imagine if it were done fifty times over. Maybe the way you text Derek is different from the way you text other people. Maybe because we've been more professional over the last year. I bet if we had given it our twelve years, it certainly would not have thought I was forty-four. But then, so I also looked at the data, and it predicted that I'm thirty-six, and that is correct. Oh wow. All right, so your AI favors Derek, which is no big deal. Let's keep going. So, the personality profiles. I actually think I included that when I sent over the information. What did you think about that? It was good, it was pretty spot-on. It says your top personality traits are empathetic, emotionally aware, dutiful, altruistic, and philosophical. I think that's totally spot-on for her. I've always said that Laurie is one of the most empathetic people I know, so the fact that empathy was the first characteristic it listed, I thought, was spot-on.
What did you think about yours? It says that your top five are philosophical, empathetic, energetic, ambitious, and fearless. Yeah, I think so. Maybe I'd like some more of that energy. I like those for him. Yeah. So it will also look at your personalities, and this is just pure math; it just shows you how similar you guys are. That similarity, I mean, that's kind of great, right? How does that compare to other people's similarities? That's pretty good. Pretty well. I mean, here's one thing: we tend to try to empathize and match the way the other person speaks. So just looking at this slice, it kind of shows, you know, whether or not you are these things, this is how you want to come across to the other person. And so, yeah, I mean, that's a good score. It's great. Keep going. Oh, let's see, let's look at the probability that this is a romantic relationship. So we're going to ask: does Derek have a crush on me? So it is a thirty-six percent chance. There's a small chance, but it doesn't seem likely. I think his husband would agree with that. Truthfully, that's very true. But I mean, I would say I have a crush on you in other ways. Aw, thank you. No, I think that's amazing, though. I mean, after looking at thousands of text messages over a year, it's able to determine that basically we're friends. Yeah, nothing more than that. It categorized what we are, because technically we're business partners if you were to look at us on paper, but actually we're really good friends, and that's how it categorized us. Right. Although, I mean, it did say I was ten years older than I am, so, all right, we're going to have to go back and tweak that. What about these others? There were some graphs. So what's in the app, and what's behind the scenes? So those graphs are all things that contribute to the analysis that we give to users, but they might also be information overload for people who aren't looking for it. But, like, because I want to know: could it tell us how I could communicate better with him, or if there were issues that we had recently? Like, it's okay, you know, please help us. Yeah, we're okay; we've been through some stuff together. Well, what's so interesting to me about this technology is that Laurie is someone who I communicate with probably more than anybody, especially at this current phase of building out Dot Dot Dot, but with anybody you can communicate better. And so if this technology can help us learn how to communicate better with each other, and we're already good friends, we're already high-performing business partners, well, then it becomes that much more valuable that this can give us insights about the way we communicate with each other, which is predominantly over text message. Laurie sends probably ten text messages per my one text message. That's just her style. The data shows I send three point six text messages. Is that what it showed? Yeah, per day, whereas he was, you know, half of that, which in this case is not an indicator that I'm not interested in her; it's just simply my style of communication. Yeah, there's actually a tool in there that shows the relationship balance, and that's actually something that was carried over from Crush. What Crush did was not only show you the balance of the relationship, but actually show you how that changed over time.
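The "relationship balance over time" view Es describes, meaning who is carrying the conversation week by week, is easy to picture as a rolling ratio. Here is a toy Python version with the message layout assumed; it is an illustration of the idea, not Crush's or Mei's actual metric.

```python
from collections import Counter
from datetime import timedelta

def weekly_balance(messages):
    """messages: list of (sent_at: datetime, sender: 'me' or 'them').
    Returns {monday_of_week: my share of that week's messages}. A value drifting
    toward 1.0 means I'm carrying the conversation; toward 0.0 means the other
    person is suddenly trying much harder than usual (the make-up-dinner week)."""
    mine = Counter()
    totals = Counter()
    for sent_at, sender in messages:
        week = (sent_at - timedelta(days=sent_at.weekday())).date()  # Monday of that week
        totals[week] += 1
        if sender == "me":
            mine[week] += 1
    return {week: round(mine[week] / totals[week], 2) for week in sorted(totals)}
```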
And I think it's an astute point you make. Okay, you might send a lot more messages than Laurie does, right, and so... I send more messages. Sorry, yes. And people have said before, like, okay, well, that's why Crush doesn't actually work, because I'm just terrible at this. But if you actually look at your relationship balance over time... I've looked at enough of these graphs, and it's almost like you can see the ebb and flow of relationships. It's almost like a dance, in a way, right? We've seen relationships where, actually, my very first employee, a week into the job, wanted to see this in action, so I said I can actually just, you know, look at your relationship with your boyfriend. And we went through it, and I was like, something happened on that day, on that day, on that day, and that day. She was like, oh yeah, yeah, we had a fight on those days. And then I told her, yeah, the relationship balance has never been so in your favor, he's never tried so hard, and she said, yeah, we actually had a fight on Saturday and he's taking me out to dinner on Thursday. That's interesting, though, that you could actually look at the patterns over a long time and see kind of inherently what's happening. I just wonder, is that a lot of power that you want or don't want? And could it be a self-fulfilling prophecy? I don't know. My feeling is, this type of data is already in the hands of people like advertisers, so why can't we as consumers have this data in a way that is actionable and beneficial to ourselves? And actionable. I mean, we all have people in our lives who we communicate with in a way that is either frustrating or causes anxiety, and if we had technology that could help us manage those relationships and that communication better, that would be extremely powerful. But to give you the flip side of it, I read something that a Wired writer wrote, and he was talking about how it's almost like tarot cards. Right. Like, it is big data analytics, and I believe in it, and I've been obsessed with this idea for a while. I was interviewing a guy years ago who does predictive data analytics to determine if something bad is going to happen. He could predict if, like, a suicide bombing or something really terrible was going to happen in an area, and he had a certain level of accuracy, and I think you would probably be like, oh, that makes a lot of sense given what you do. And in the middle of the interview, he's like, I analyzed all your data on Facebook, Instagram, Twitter, everything you've said over the last seven or eight years. And he's like, I predict you're unhappy in your relationship and you're growing unhappy at your job. And I was like, whoa, that's fascinating. When I left my job and that relationship, I called him up. I was like, how did you do it? And he talked a lot about, you know, a lot of the stuff you're talking about: negative and positive words, the time you tweet, the time you post, all this information that's out there. And I've talked to my tech contacts about it, and I think there's a lot there. But then there's also a lot of skepticism, being like, it's also like a tarot card, right? Where you give us some things and we go, oh yeah, that totally makes sense.
And a Wired writer wrote, and you have to respond to this, but he said: whether a text analyzer reveals anything real or not, using one seems to offer a false sense of predictability and a semblance of control over otherwise messy human relationships. Does the emoji mean it's true love? Did the double text ruin the mood? Am I doing this right? The answers, displeasingly, never live in the app. The guidance there is about as useful as a deck of tarot cards. So what do you think? Well, you know, the tarot card and horoscope industry is huge, and whether you believe it or not, people are actually just looking for another data point. It's up to them whether they want to believe it or not. And then one other thing I'll go into: the one thing after Cambridge Analytica and the whole debacle with privacy and Facebook that I think wasn't covered enough was this fine line between microtargeting and manipulation, this idea that we know so much about people and we can tell so much. We've spent this whole time talking about how you can tell so much about people. Now, there is kind of a fine line with this AI being able to microtarget or to manipulate, right? Do you worry about the future and some of those ethical lines? I think every piece of technology is controversial and worrying. I mean, I think the more powerful a technology is, the more controversial it is, and it always comes down to how people use it. I think it's an inevitability, and I think we should just embrace the things that are, try to understand them, talk about them, and see how we can live in the world with this tech, because it's not going away. How will we use it? I'm turning to Derek: how are we using this? If this is, like, version, this is, like, what, one point oh, right? Two point oh? One point oh. You know, what does this look like in ten or fifteen years? How are we using this? I think this technology has the ability to simplify our lives. I think the choice will always still be ours on whether we take its advice or not. Yeah. I mean, I don't know where this is going to go. I just think, if the people who have this information do it, I guess, do it for good reason, that's all we can really ask for. I can see that doesn't seem like a very, you know, gratifying response. No, it's not that it's not gratifying; I think it's optimistic. I think there's a double-edged sword with every technology. The microphone was off, and then Es said something that I thought was fascinating, so I asked him permission to include it in the episode, since technically the interview was over. He said yes. Es says that of the comments Mei gets from users, only one percent are about privacy. Many of his users who have experienced Mei don't mind giving over their data. In fact, one of the most common requests, he says, is to give more. Although you have to remember, Mei's users have shown a willingness to share more than the average person. Es says users ask to just turn on their microphone and record all day, to get a much larger data set to analyze. So could more data points about our conversations give us more feedback that could lead to stronger relationships? And when given the choice, if we saw the power of Mei to help us learn more about each other, to communicate better, is the trade-off worth it?
We often hear these very strong arguments for privacy, but as Es says, the feedback he gets is: I'll give more if you give me more information that's valuable to me. Although, of course, there's a lot of gray area in between. I'll leave you with that. I'm Laurie Segall, and this is First Contact. For more about the guests you hear on First Contact, sign up for our newsletter. Go to firstcontactpodcast.com to subscribe. Follow me, I'm @LaurieSegall on Twitter and Instagram, and the show is @firstcontactpodcast. If you like the show, I want to hear from you: leave us a review on the Apple Podcasts app or wherever you listen, and don't forget to subscribe so you don't miss an episode. First Contact is a production of Dot Dot Dot Media, executive produced by Laurie Segall and Derek Dodge. Original theme music by Xander San. Visit us at firstcontactpodcast.com. First Contact with Laurie Segall is a production of Dot Dot Dot Media and iHeartRadio.