On March 6, 2019, Mark Zuckerberg published a long essay about privacy and social media. Does this indicate a change in philosophy for Facebook or is something else going on?
Get in touch with technology with TechStuff from HowStuffWorks.com. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with How Stuff Works and iHeartRadio, and I love all things tech. And today we're going to dive into a recent story. On March sixth, two thousand nineteen, Facebook co-founder and CEO Mark Zuckerberg posted a very long essay titled "A Privacy-Focused Vision for Social Networking" on Facebook. Of course, the post has prompted a lot of discussion in the tech space as well as the political space over what Zuckerberg actually means, both for Facebook users and for the company itself, and it raises some interesting questions. So today I thought I'd dedicate an episode to Facebook and the concept of privacy, because the two things have long been at odds with one another. You wouldn't necessarily associate Facebook with privacy, and that's part of the problem. Now, before I dive into the essay, and I'm going to quote the essay quite a bit, but don't worry, I'm not reading out the whole three-thousand-six-hundred-word thing. But let me set the stage a little bit. We're gonna dial back the clock to January twenty ten. Facebook had been around for several years, and Michael Arrington of TechCrunch interviewed Mark Zuckerberg in January to talk all about Facebook. And this is what Zuckerberg had to say about the concept of privacy back then, so this is nine years ago: "People have gotten really comfortable not only sharing more information and different kinds, but more openly and with more people. That social norm is just something that's evolved over time. We view it as our role in the system to constantly be innovating and be updating what our system is to reflect what the current social norms are." So essentially, Zuckerberg was saying that things had changed, that cultural values had changed, where things that might have been considered private in the past were now public.
People were eagerly sharing private thoughts on social media platforms, and they were sharing more and more with more people, people who might not be close friends, who might be further out in that person's social network, who might even be strangers. So in that discussion, Zuckerberg was essentially making the claim that Facebook's policies and operations were a reaction to the changing values of the culture at large, that this was happening anyway, so Facebook was changing to cater to it. Now, not everyone agreed with that particular assertion, including yours truly. I also didn't think that this was a totally genuine response. I would argue instead that Facebook is dependent upon people eschewing privacy and that the company has done a great deal to encourage this kind of oversharing behavior. I'm not saying that Facebook is fully responsible for it, merely that the company has greatly encouraged it. I mean, that's what the company is based around. So rather than react to a cultural shift, I would argue Facebook has done a lot to boost or push for that shift. So again, not fully responsible. I wouldn't say that Facebook is the kingmaker of culture, but rather that it saw the opportunity and pushed really hard for that to become the social norm. And the reason for doing that is really, really simple: cash, cash money, y'all. Facebook provides a service to users, right? It's a social media platform we can use, but the way the company makes money is largely through advertising. So you can think of Facebook's product as being your data, assuming that you use Facebook, of course. And your data might include specific facts about your identity, like your home address, your phone number, your name, your age, your occupation, the schools you attended. It might also include information about people you know and your relation to those people.
So it might have the identity of your spouse or family members, and it may even identify them as such, because you can, you know, designate people as having a specific relationship to you. It's valuable to you to have these particular circles, like if you want to message just the people who are in your family, but it's also valuable to Facebook to have that information. Facebook also has a record of all your activities on the platform and the posts you tend to interact with. So as you use Facebook, you generate information. You provide a more complete picture of who you are, what you like, what you value, and more. And what Facebook doesn't outright know from your input and behavior, it might be able to guess at based upon similar behaviors it has seen, so essentially pattern recognition. It might notice that lots of people who happen to like certain types of posts also like other posts, and maybe you haven't liked this other post yet, but because you fall into the first category, chances are you'd also fall into the second one. Now let's shift our attention to advertisers. Advertisers want to get the most for their money, which is, I mean, just easy business, right? Everyone wants that, and this doesn't make advertisers bad people. But they want to get the most bang for their buck. They need to get the word out about products or services, and they want to focus on people who are most likely to respond to those advertisements and make a purchase. It doesn't do you any good to send ads to somebody when those ads have nothing in common with that person. If the person is never going to buy your product or service just because of who they are, it's a waste of time and resources to send that ad to them. So the more information advertisers have about a potential target or customer, the better. Campaigns could even be tailored for specific types of people.
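To make that pattern-recognition idea concrete, here's a toy Python sketch of "people who like the same things you do also liked this." Everything here, the names, the likes, the scoring, is invented for illustration; it is nothing like Facebook's actual systems, which are vastly more sophisticated, but the underlying principle is the same.

```python
from collections import Counter

# Hypothetical like data: who has liked which kinds of posts.
likes = {
    "ann":  {"hiking", "camping", "trail mix"},
    "bob":  {"hiking", "camping", "kayaks"},
    "cara": {"hiking", "beaches"},
}

def guess_interests(user):
    """Rank items liked by users who share at least one like with `user`."""
    mine = likes[user]
    scores = Counter()
    for other, theirs in likes.items():
        if other != user and mine & theirs:   # overlapping tastes
            for item in theirs - mine:        # things `user` hasn't liked yet
                scores[item] += 1
    return [item for item, _ in scores.most_common()]
```

Calling `guess_interests("cara")` surfaces "camping" first, because both of the other users who share a like with her also like camping, even though she has never expressed any interest in it herself. That's the guess-at-what-you-haven't-said-yet trick described above, done with three users instead of two billion.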
You could have a really good advertising company craft different ads, all meant for the same product or service, but intended to target different types of people. So you might see one version of an ad for a product and your buddy sees a totally different version of an ad for that same product, but the two different versions of the ads were selected because of differences between you and your buddy. Maybe you really like hiking and your buddy really likes the beach, and this is, you know, a sports equipment type company. So they send one ad to you that is geared more towards hiking, camping, and that kind of thing, and one to your buddy that's more about beach life kind of stuff. It's both for the same company or service, but each ad has been selected specifically because of differences in your preferences that Facebook is able to keep track of. If that means that either of you is more likely to make a purchase, then that's money well spent for the advertiser. So you start to see how all these little pieces of data add up to become incredibly valuable. And then you think about Facebook being a data broker with all this information about you and all these potential advertisers to work with, and you really see how the money rolls in. So Facebook has a vested interest in people sharing information publicly. The more people share about themselves, the more data Facebook has about them. So from posts about, you know, pets, to pictures of meals, to invitations to parties, to requests for help in seeking out a job, Facebook's paying attention to all of this. And it has on occasion gone even further than that, such as when the site launched the Beacon program back in two thousand seven. That's what allowed companies that were partnering with Facebook to share activities that Facebook users were doing on their sites. So let's say that you are a ticket broker company.
You sell tickets to live events, and you have this Facebook Beacon partnership, and then I, a Facebook user, go to that site and purchase a ticket. The site could then have published that activity to my Facebook feed, whether I wanted it or not, and a lot of people felt like this was a terrible violation of privacy. So maybe you just happened to go and buy tickets to go see My Little Pony on Ice, and maybe you don't need all your friends on Facebook to know that and then to use that information as a way to make fun of you for the next decade, for example. But then Beacon goes and posts your purchase for all to see on your profile. Not that this has happened to anyone I know, and it definitely did not happen to me, but you get the idea. So Facebook is in the information business, the same as Google and lots of other companies that we tend to associate with some other type of business. Right? Most of us think of Facebook as a social network platform. We tend to think of Google as a search engine, and then increasingly other stuff remotely related to search engines. But both companies actually depend heavily on leveraging data about their users to entice advertisers. They really are advertising companies, or companies that host advertising. So it was completely in the interests of Facebook for Zuckerberg to declare back in twenty ten that the notion of privacy was no longer a cultural norm. Whether that was true or not is a matter of debate. And certainly some people appear to have little concern for their privacy, and I'm not passing judgment on them either, but I suspect there are a lot of folks out there who feel otherwise, that privacy still is important and still should play a part in our interactions online and otherwise. And some pretty high-profile incidents have brought the topic into sharp focus.
If you've been listening to my show for a while, you probably heard the episodes about Cambridge Analytica and how the data analytics company relied upon information that was pulled from an app, which left many feeling that they had had their privacy violated. The app was a survey that would pay users a small fee for completing it, and part of the process involved granting the survey app access to the survey taker's Facebook page, so the app could pull down information about the person taking the survey. So far, so good, right? I mean, the person taking the survey is presumably aware of this process and is getting compensated to boot. But then we go a step further. It turned out the app was also pulling down information about the friends connected to the survey takers, so not just the taker him or herself, but that person's friends. The app's permissions allowed the administrator of the app to view the survey taker's friends' profiles as if the app were itself a friend of those people. So let's say you and I are friends on Facebook. Sometimes you make a public post on Facebook and anyone in your feed can read that public post. Anybody. No problem there, right? That makes perfect sense. But sometimes, let's say, you post so that only your friends can see those status updates. The general public never sees them. However, I'm your friend, so I can see those posts. Then I decide I'm gonna go take this survey, and I agree to the app's terms, not knowing all that is entailed with that. And now the app can see your profile as if the app were me, meaning it can get access to all those status updates you made using the friends-only option, and you never agreed to that. You are an unknowing party. You never gave permission to the app or to me; you just had all that information accessed.
Facebook would address that problem after people brought it to the company's attention, and this was actually years before the Cambridge Analytica scandal really broke into public knowledge, but the damage was already done. And once the scandal did break, many people were rightfully upset at Facebook. But that's just one example of how Facebook has been less than perfect when it comes to protecting users' safety. Another recent example came to light in late two thousand eighteen, when Gizmodo published a report that revealed Facebook had done something fairly shady, pretty shady I would say, with users who had opted into two-factor authentication with the service. Now, see, the purpose of two-factor authentication is to make online profiles more secure. That's the whole reason for it, and the idea is that you require users to submit proof of identity from two categories, or factors. The factors we typically look at are things you know, things you own, and things you are. Facebook was looking at the first two factors. The thing you know would be your password. You know your password, so you put it in when you log in. The thing you own would be a cell phone, a cell phone that you have registered with your account. So when you enabled two-factor authentication, logging into Facebook on a new machine would prompt you to provide your password, and then Facebook would send a one-time code to your phone, which you would then enter into Facebook's site. And by proving you had both the password and the phone, you're essentially proving you are who you claim to be. It's supposed to cut down on the chance that someone else might get into your account just by guessing or otherwise getting access to your password. If they don't have your phone, they can't complete the two-factor authentication and they're locked out of your account. But Gizmodo found out Facebook was also using this process to do something else, something not so awesome.
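The two-step login flow described above can be sketched in a few lines of Python. This is a toy illustration only: the user store, function names, and code delivery are all invented for this example, and real systems use vetted standards such as TOTP (RFC 6238) rather than anything hand-rolled.

```python
import hmac
import secrets

# Hypothetical user database; in reality this lives server-side.
USERS = {"alice": {"password": "correct horse", "phone": "+1-555-0100"}}
PENDING_CODES = {}  # one-time codes "texted" out, keyed by username

def start_login(username, password):
    """Factor one: something you know. On success, issue a one-time code."""
    user = USERS.get(username)
    if user is None or not hmac.compare_digest(user["password"], password):
        return None
    code = f"{secrets.randbelow(1_000_000):06d}"  # six-digit one-time code
    PENDING_CODES[username] = code
    return code  # in reality this would be sent by SMS to user["phone"]

def finish_login(username, code):
    """Factor two: something you own (the phone that received the code)."""
    expected = PENDING_CODES.pop(username, None)  # codes are single-use
    return expected is not None and hmac.compare_digest(expected, code)
```

A correct password gets a code issued, entering that code completes the login, and the code can't be reused. Notice the phone number exists here only to deliver the code; using it for anything else, like ad targeting, is exactly the boundary Facebook was accused of crossing.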
The company was using the data from these two-factor authentication phones to target those users with more specific ads. So the system, which was supposed to be about providing users with a sense of confidence that their accounts were secure, was simultaneously being used to generate revenue for Facebook with targeted advertising. And Facebook was doing this by collecting contact information from users' phones to further fill out the links between social contacts who were on Facebook. The Gizmodo report gave a pretty interesting account of this. Kashmir Hill, who wrote the piece, did an experiment by paying for an ad on Facebook that would target a specific phone number, and that phone number belonged to the landline of a computer science professor named Alan Mislove, and this was with Mislove's consent. He was in on this. Now, Mislove had never provided this particular phone number to Facebook. Hill suspected that Facebook had made the association between the phone number and Mislove by combing through the contact information on phones being used for two-factor authentication purposes, because after she placed this ad, within a couple of hours, Mislove saw it. And again, Mislove had never put this particular phone number into his profile on Facebook, and yet within two hours of the ad being placed, it popped up on his Facebook feed. So that meant the company had to get that phone number from somewhere. So that sets the stage for this privacy concern. Next, we'll take a look at what Mark Zuckerberg said about privacy in that essay, and then we'll look into some analysis and criticism of that essay. But first, let's take a quick break. Now, you can find the full essay that Zuckerberg wrote on Facebook, but I'd like to pull some segments to talk about specifically, and we're gonna start with the first two paragraphs after the introduction.
Now, like I said earlier, I'm not going to read out the whole thing; that would take up nearly an entire episode just by itself. But once Zuckerberg is done with his introductory paragraph, he has this to say. Quote. Over the last fifteen years, Facebook and Instagram have helped people connect with friends, communities, and interests in the digital equivalent of a town square. But people increasingly also want to connect privately in the digital equivalent of the living room. As I think about the future of the Internet, I believe a privacy focused communications platform will become even more important than today's open platforms. Privacy gives people the freedom to be themselves and connect more naturally, which is why we build social networks. Today we already see that private messaging, ephemeral stories, and small groups are by far the fastest growing areas of online communication. There are a number of reasons for this. Many people prefer the intimacy of communicating one on one with just a few friends. People are more cautious of having a permanent record of what they've shared, and we all expect to be able to do things like payments privately and securely. End quote. All right, so far, so good. These are not super deep insights, of course; they're mostly common sense. But this is Zuckerberg laying down the foundation for his position moving forward. There's a growing interest in separating out our personal interactions from the public space, and there's not really a point here at which Zuckerberg addresses targeted advertising, which I would argue is one of the big reasons people are interested in moving off of public online communication channels. But then, I would have been surprised to see so candid a statement from him on that subject. The statement also seems to ignore the fact that Facebook may have played a part in making people feel like they need to move to a more private, secure platform.
So, in other words, Zuckerberg is saying, I'm noticing this trend, but he's not saying, we're responsible for kind of creating this desire, just as he wasn't saying that they were responsible for promoting the concept of oversharing in the first place. Zuckerberg goes on to assert that social networking platforms will remain important moving forward and argues that their value in how they connect people together can't be overstated, that they provide a service that is valuable, and therefore they will continue to survive. And he says, quote, I understand that many people don't think Facebook can or would even want to build this kind of privacy focused platform, because frankly we don't currently have a strong reputation for building privacy protective services, and we've historically focused on tools for more open sharing, end quote. Now, there's an allusion there both to the company's privacy-related woes and a sort of tangential nod to its business model without explicitly calling it out. He goes on to say he sees communications shifting to private and encrypted services, and he says, quote, we plan to build this the way we've developed WhatsApp. Focus on the most fundamental and private use case, messaging, make it as secure as possible, and then build more ways for people to interact on top of that, including calls, video chats, groups, stories, businesses, payments, commerce, and ultimately a platform for many other kinds of private services, end quote. And he lists six principles that he says will be important for this new focus to come true. First, that people should be able to have private interactions with clear controls over who can access those interactions. Next, that these methods should incorporate end-to-end encryption to protect the communication from prying eyes. Third is reducing permanence, something made famous by Snapchat.
This is the idea that communications would have sort of an effective self-destruct mechanism after a particular amount of time, presumably enough time for the other party to have seen the message. No permanent record, in other words. Though I should point out that some of the services famous for providing this kind of experience weren't, you know, actually deleting the content right away. So while it sounds like it would be this kind of one-time-only thing, and then the data is gone forever, there's a chance the data is not really gone forever. Next, Zuckerberg stresses the system will have to be safe so that people will entrust Facebook with their data. Then he calls for an interoperable approach, pointing out that lots of people rely on different services to connect with one another. So you might be a whiz on Facebook Messenger, but maybe your best buddy prefers using WhatsApp, and then another friend of yours, a friend in common, is really only using Instagram, and all three of those, by the way, belong to Facebook. Zuckerberg wants a method that could work across apps, as long as they are Facebook's apps. More on that point in just a bit. The last principle Zuckerberg mentions is secure data storage, which ties back into safety, though perhaps Zuckerberg was actually differentiating between a person's physical safety and the security of their data, or maybe the physical safety of data centers. It's not entirely clear in that part. He does later go into data centers in particular, so maybe that's really what he meant. Then Zuckerberg goes on to flesh out these points a little more, and I'm not going to dwell on all of them, because again it would just take too long, but I do want to focus on a few. So under encryption and safety, he has this to say. Quote. There is a growing awareness that the more entities that have access to your data, the more vulnerabilities there are for someone to misuse it or for a cyber attack to expose it.
There is also a growing concern among some that technology may be centralizing power in the hands of governments and companies like ours. And some people worry that our services could access their messages and use them for advertising or in other ways they don't expect. End-to-end encryption is an important tool in developing a privacy focused social network. Encryption is decentralizing. It limits services like ours from seeing the content flowing through them and makes it much harder for anyone else to access your information. End quote. So here we have Zuckerberg pointing out that using data for the purposes of advertising is a concern for some users, and he's also explaining that end-to-end encryption would mean Facebook would be unable to see the content of the messages themselves, making that content off limits for monetization, at least directly. And that sounds pretty good to me on the surface of it. I like the idea of a means of communication that keeps things truly private between parties and isn't used as currency between Facebook and advertisers, so kudos to that idea. But then he turns around and points out that end-to-end encryption also creates opportunities for people to use the communication tools to do bad or harmful things, such as planning a terrorist attack, and he states, quote, we have a responsibility to work with law enforcement and to help prevent these wherever we can, end quote. So this opens up a question that people have asked for ages: how do you balance individual privacy with collective security? Zuckerberg says that Facebook would use other means to detect possible harmful use, including looking for usage patterns among people communicating through this private, end-to-end encrypted service. This is a good reminder that sometimes a spy doesn't even have to be able to read or hear a message to draw some pretty accurate conclusions about what is being said between different parties. Just ask the NSA.
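That usage-pattern point, often called traffic analysis, can be shown with a tiny Python sketch. The log format and names here are hypothetical; the point is just that a who-messaged-whom log, with every message body encrypted or absent entirely, still exposes relationships.

```python
from collections import Counter

# Hypothetical metadata log: (sender, recipient, hour-of-day).
# Note there is no message content here at all.
log = [
    ("ann", "bob", 23), ("ann", "bob", 23), ("bob", "ann", 23),
    ("ann", "bob", 0),  ("cara", "dev", 14),
]

# Who talks to whom most often, without reading a single message:
pairs = Counter(frozenset((s, r)) for s, r, _ in log)
top_pair, count = pairs.most_common(1)[0]
print(sorted(top_pair), count)  # ['ann', 'bob'] 4
```

Four late-night messages between the same two people is already a story, and an analyst never had to decrypt anything to see it. That's the kind of signal Zuckerberg is alluding to when he says Facebook can look for harmful usage patterns even on an encrypted service.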
Actually, you don't have to ask the NSA; they're already listening. Zuckerberg then moves on to talk about interoperability and how, in his vision, users will be able to communicate across Facebook apps through an opt-in feature, eventually even including SMS along with Messenger, WhatsApp, and Instagram. Now, I feel this is a pretty risky move. The more points of connection you have between services, the more potential vulnerabilities there will be in the overall system, and the more chances you've given hackers to find ways to exploit those vulnerabilities. Zuckerberg does acknowledge that interoperability will create challenges that will need to be solved before it can be rolled out, so I don't want to suggest that he's oblivious to this, but it's a concern I have that I think needs further attention. Zuckerberg concludes with a section on data security, pointing out that Facebook doesn't build data centers in countries that have a history of ignoring or violating human rights like privacy and freedom of expression, and he gives a rather broad series of next steps that the company will take to achieve this vision. Now, I've already talked a little bit about some of the concerns I have about this essay. But when we come back, we're going to explore some other takes on it that apply critical thinking to the whole proposal, to see what might really be at the heart of this message. But first, let's take a quick break. I think it's pretty clear that I have some concerns about what Zuckerberg has said, but I am not the only person to express skepticism about this new vision for Facebook. Some people, like Jeff Chester, doubt the sincerity of the message as well. Chester is executive director of the Center for Digital Democracy, a nonprofit organization focused on privacy. The Washington Post quoted Chester's response to Zuckerberg's essay. He said, quote, why does it always sound like we are witnessing a digital version of Groundhog Day?
When Facebook yet again promises, when it's in a crisis, that it will do better, will it actually bring a change to how Facebook continually gathers data on its users in order to drive big profits? End quote. That same Washington Post article, which, by the way, has the title Facebook's Mark Zuckerberg says he'll reorient the company toward encryption and privacy, it's a long title but a great read, cited surveys that indicated people's trust in Facebook has taken a bit of a nosedive over the last couple of years. So it sounds like Chester isn't alone in this skepticism, and his question about whether or not Facebook will actually change its ways is a valid one. Facebook's operating margin has been pretty darn high for the last few years. And for those not inclined toward business speak, a company's operating margin is the ratio of operating income to net sales. So it gives you a measurement of what proportion of a company's revenue, as in the money it's bringing in from doing business before some indirect costs like taxes are taken out, is left over after covering operating costs. You want the percentage to be high; the higher it is, the more money you are keeping from what you earn. Facebook's has been between forty three and forty seven percent, which is pretty darn good. It means that for every dollar Facebook makes, between forty three and forty seven cents of that dollar is operating earnings, so that's profit before taxes. The rest would go to covering the costs of doing business. And it's pretty clear in business, and this is from a non-business-y sort of person, but I think this is a fair thing to say, that making more money and being able to keep more of the money you make is considered a good thing. I'm pretty sure about that one.
A company could spend more to make more, but if the operating margin gets narrower over time, if that percentage begins to drop, you could end up in a situation in which your actual profits, your take-home profits after all costs, are lower than they used to be because your costs have grown so much. So at the end of the day, you could say, yeah, we made more money than we did last year. Like, last year we made fifty million, this year we made a hundred million. But then last year our costs were ten million, and this year our costs were ninety million. Well, that means your take-home profits from the year before were forty million dollars, and your take-home profits from this year are ten million dollars. So yeah, you made more money, but you weren't able to keep as much of it. Right? So that operating margin is a very important thing, especially for investors. So some critics question Zuckerberg's commitment, because what CEO of a publicly traded company is going to come forward and say, I'm gonna pivot here and focus on a new approach that we can't monetize the way we could with our old model? Because it would cost Facebook money, and it would be expensive to make this change to the platform. And if there's not a substantial way to generate revenue, then what you're looking at is the operating margin getting smaller and investors getting angry. So this is really the critics pointing out that Facebook isn't saying they're not going to make money off your data. That's not what Zuckerberg was saying at all. That, by definition, will have to continue, because otherwise Facebook won't be a business anymore. It won't be making money. So keep that in mind when you're hearing these messages. Now, other critics say that pushing this new approach for Facebook would play right into the hands of people who use Facebook to spread misinformation. That's something the company has been struggling with for a while now.
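The arithmetic in that fifty-million versus hundred-million example works out like this (a quick sketch of the operating-margin calculation; the figures are the hypothetical ones from the example, not Facebook's actual numbers):

```python
def operating_margin(revenue, operating_costs):
    """Operating income (revenue minus operating costs) as a share of revenue."""
    return (revenue - operating_costs) / revenue

# Last year: fifty million in revenue, ten million in costs.
last_year = operating_margin(50_000_000, 10_000_000)
# This year: a hundred million in revenue, ninety million in costs.
this_year = operating_margin(100_000_000, 90_000_000)

print(f"{last_year:.0%}")  # 80%
print(f"{this_year:.0%}")  # 10%
```

So revenue doubled while take-home operating income fell from forty million to ten million, which is exactly why investors watch the margin and not just the top line.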
Charges that bad actors have been using Facebook to spread lies and propaganda are familiar to us here in the United States, but the same thing is happening in other countries around the world. So these critics are, interestingly, arguing against Facebook for making things more private. They're not saying Facebook is still going to be making money off of you. What they're saying is that this particular approach to communication is more dangerous. They say that the threat to security is too great. And I find that really interesting, because you've got privacy advocates on one side that say these approaches aren't really addressing privacy problems, and then you have security people on the other side who think that this is pushing toward a more insecure future for online communication and that encrypted private messaging is a dangerous strategy. So two different perspectives, both of which are saying that Facebook is maybe not so wise to pursue this particular approach. But one of the best critical examinations of Zuckerberg's essay I have seen was written by Molly Wood, and it was published by Wired. And in the interest of full disclosure, I've known Molly Wood's work since before I was recording TechStuff, and she and I have chatted several times, so we do know each other. But the funny thing was, I was reading her piece and making notes, and I had fully intended on incorporating those notes into this episode before I had even really noticed that she was the one who wrote the piece. I was reading the article based on the headline, and it was only after I had started making notes that I went to see who wrote it, and then I thought, oh, well, that makes sense. It was Molly Wood. Anyway.
Her article is titled Zuckerberg's Privacy Manifesto Is Actually About Messaging, and she makes the case that what Zuckerberg was really doing was laying out a product development plan to address some pressing issues for the company. Not issues about user privacy or misinformation campaigns or anything like that, but rather issues with growth and usage and retention of customers. She points out that Edison Research found Facebook had around fifteen million fewer users in the United States in two thousand eighteen as compared to two thousand seventeen. And keep in mind, that's an overall loss of fifteen million users even after new members have joined the service. So even with new people coming onto Facebook, the people leaving Facebook mean you have an overall decline in users. That's an ugly warning sign that people are leaving your platform in droves. Moreover, it was the second year in a row that Edison Research measured a drop in users year over year. And the United States, while it's small in population compared to some other countries, is large as far as being a revenue generator for Facebook. So seeing declining numbers in your prime market is really bad business. In addition, Molly cites an article on The Information that pointed out another big problem for Facebook, which is a decline in original sharing. So the people who haven't left Facebook are sharing less about themselves. They might share links to articles or post cartoons or memes or videos or whatever, but they're not sharing information about themselves as much. And since we've already covered how Facebook bases its revenue off of user data, that's a problem for the company. I mean, you can draw some conclusions about what a person likes based on the stuff they share on Facebook, but it's not the same as information about the actual people behind those accounts. So Facebook is facing what could be, in the long run, an existential crisis for the company.
And messaging is where there's a glimmer of hope. Molly points out that in China, a popular service called WeChat brings together messaging, phone calls, and apps, and it's a product where the average user is spending an hour a day on it, and she says, quote, this is almost exactly what Zuckerberg describes wanting to build over the next few years, end quote. And if you remember when I quoted Zuckerberg earlier in this episode, that was pretty much what he was saying. And she goes on to argue that if Facebook can make a super app, one that can accommodate advertising and commerce transactions, with Facebook getting a cut of each transaction since it's facilitating those deals, it could shift its dependence from a social networking site that seems to have passed its peak and move on to a messaging service more in line with the way younger people in particular are communicating with one another. She also keenly points out that the commitment to encryption doesn't mean the company will stop targeted advertising or profiting off of users. Quote, the fact that your individual messages might be encrypted in transit does not in any way prevent Facebook the entity from knowing who your friends are, where you go, what links you click, what apps you use, what you buy, what you pay for and where, what businesses you communicate with, what games you play, and whatever information you might have given to Facebook or Instagram in the past, end quote. That actually touches on another point that I didn't mention before, the one about location services. If you have Facebook installed on a phone and you haven't opted out of location data being incorporated into the service.
Or even if you have, as some studies have shown, you've got yourself an app that's keeping tabs on you wherever you're going and how long you are spending at every location you go to, and potentially who you are there with, because if they also have the Facebook app installed, Facebook is able to correlate all that information. You can bet that would still be a part of the messaging service that Facebook envisions for the future. It's a gold mine of valuable data. Molly Wood concludes with, quote, In fact, nowhere in the more than three thousand words that Zuckerberg published on Wednesday does he say that users will ultimately control their own data or have the option to reduce the amount of data they share with Facebook, or delete their information, or operate anonymously, or pay a subscription fee to reduce or eliminate ad tracking, anything that would represent an actual commitment to privacy other than secure messaging, end quote. Now, I urge you to read her whole piece on Wired, because while I've given some sizeable excerpts, it's best if you read it from start to finish. It's a really good piece, and I pretty much agree with her. Now, this doesn't mean I think Zuckerberg is some sort of evil data tyrant, but he is the CEO of a multibillion dollar global company that's in the business of leveraging user data. The business pressures are enormous, and he's likely making decisions to ensure the health of his company, including how to keep users invested in the company's products before too many people migrate away from Facebook entirely. And again, I think this whole story brings up the necessity for us to employ critical thinking when we come across various pieces of news, to really put it through some difficult questions and ask ourselves, what are the actual motivators here? What are the benefits of the approach that is suggested in this piece?
You know, answering those questions can lead you to some conclusions that might be different than whatever the surface level of that communication happens to be. Now, it could very well be that Zuckerberg is being extremely sincere, maybe there's no ulterior motive there, but I feel pretty confident that the critics are onto something here. I'm curious what you guys think. Are you guys still using Facebook? Oh, that's another thing, just in the interest of full disclosure. Again, I actually planned to step away from Facebook this year after June, I think. Right now I'm in a production that is using Facebook to communicate things like schedules and that kind of stuff, so I actually need to use the service right now just so that I can continue to do my job with this production. However, once that's concluded, I plan on sort of taking a break, not deactivating my account entirely, but not visiting frequently, not really relying upon it, and hoping that my friends will go through the effort of reaching out to me through other means and not just assume that I've disappeared off the face of the earth. I'll report back after that's gone on for a while. But I'm curious what you guys think, and if you have any suggestions for future episodes of tech Stuff, send me a message. The email address is tech stuff at how stuff works dot com, or you can drop me a line on Facebook or Twitter. The handle for both of those is tech Stuff hs W. Pop on over to our website, that's tech Stuff podcast dot com. You'll find a link to our store there. Every purchase you make goes to help the show. We greatly appreciate it. That's all I have for you right now, and I'll talk to you again really soon. For more on this and thousands of other topics, visit how stuff works dot com.