A Nobel Economist's Plan To Tax Digital Ads

Published Oct 30, 2024, 4:00 AM

In this episode, Ed Zitron is joined by Daron Acemoglu, MIT Economist and recent winner of the Nobel Prize in Economics, to talk about his daring - and likely effective - plan to tax all digital advertising revenue over $500m at 50% as well as how we might adjust incentives to bring big tech under control.

PAPER: https://shapingwork.mit.edu/research/the-urgent-need-to-tax-digital-advertising/ 

---

LINKS: https://www.tinyurl.com/betterofflinelinks

Newsletter: https://www.wheresyoured.at/

Reddit: https://www.reddit.com/r/BetterOffline/ 

Discord: chat.wheresyoured.at

Ed's Socials:

https://twitter.com/edzitron

https://www.instagram.com/edzitron

https://bsky.app/profile/zitron.bsky.social

https://www.threads.net/@edzitron


Hello, and welcome to Better Offline. I'm your host, Ed Zitron. Back in March of this year, MIT economist Daron Acemoglu and MIT Sloan's Simon Johnson published a paper called The Urgent Need to Tax Digital Advertising, in which digital advertising revenue would be levied with a flat tax of fifty percent on everything above five hundred million dollars. And today I'm joined by Daron to walk through it. Daron, thank you for coming on.

Thank you, Thanks Ed.

It's my pleasure, so walk me through the idea. It seems pretty simple, but maybe there's more nuance to it.

Well, it's very simple to implement, it's very simple to explain. The question is why, you know, why do something like this? You know, after all, who doesn't love something free? And we're getting a lot of free things because our entire online ecosystem is monetized via digital ads. The issue is, I think this ecosystem creates a number of problems which are set to get worse.

With AI right.

The most obvious of those is something that's been discussed quite a bit over the last few years, which is that monetizing via digital ads means that you are going to encourage platforms to collect more and more data about people, and this creates a much more intrusive, lower privacy, and potentially distorted sort of system.

Right.

The second, perhaps corollary or separate thing, which has started receiving more attention, is that this also means you want to make sure that the eyeballs are glued to the screen, whether it's your large screen or small screen, and that itself might generate a lot of other problems, including perhaps mental health problems, because you try to get people to stay on the platform by inducing strong emotions: envy, jealousy, anger, outrage, and so on. And it sort of creates a very different type of social environment than we are used to, with widespread negative consequences. But I think the one that I'm most worried about is that this reduces competition. It is impossible today for any service by a newcomer to come in and say, okay, you're going to get this from Google, Facebook, Instagram for free, and I'm going to provide higher quality content for which you have to pay a subscription fee or something else. It's especially difficult because newcomers are not going to have the network; there's going to be uncertainty about their quality. So it really cements the system where everything is going to be monetized via digital ads, and that's going to discourage the entry of new products, new services, and especially new technologies. For example, everybody worries about what social media and other online platforms and AI will imply for democracy. Well, something that actually creates more pro-democratic conversations, higher quality content, that's going to be very difficult to get off the ground today.

Right. So how does this tax actually change the incentives, though?

Well, I think the main way in which it changes the incentives is that it makes it possibly more likely that both existing platforms and new platforms will now say, let me experiment with new products that are higher quality, for which I can build a clientele via subscription fees or other things. Because if I get one hundred million dollars via digital ads, half of it is gonna go away. If I can instead get that money via subscription fees, that's good. It's no longer so inferior to doing it via digital ads.

But I think one of the things that concerns me with it, or maybe it's less of a concern and more just the after-effect, is that people are used to not paying for social media.

That's why this needs to be high. You know, if this was a ten percent tax, it wouldn't make so much of a difference. So in some sense, when I mentioned the ecosystem, I was intending to imply by that both what the offering side, the platforms, are doing and what the consumers are expecting. So there's a synergistic relationship between users, consumers, and the firms, and you want to change that relationship. Again, you wouldn't want to do it if the market system was working perfectly and everything was hunky-dory, but I think there's a lot of evidence that's not the case at the moment.

So the incentives would be that they just can't make that much money off of digital ads, so they will have to find new business lines. But could this not potentially kill off the idea of the free social network?

Well, first of all, I think that idea has become excessive. If we scale it back, it wouldn't be so bad. And in some sense, I think the purpose is to scale it back, because once everybody expects everything for free, that does create a race to the bottom in terms of quality, in terms of data protection, in terms of, you know, new technologies that will actually change the face of the kinds of offerings that we get, right?

And I remember in the past, we've had companies try paid social networks and it just hasn't worked.

Well, it didn't work because they were up against free social networks monetized via digital ads, especially at a time when people didn't understand what the costs of these things were. You know, I think even today there are consumers who do not fully recognize the amount of data that's being collected about them. So if you go back ten years, I think both the sort of addictive or quasi-addictive nature of social media, the extensive data collection, and how that data can be used, both, you know, to guide you towards certain products, perhaps to behaviorally manipulate you, or perhaps to charge you higher prices, all of these things were not completely understood.

And I think I get what you mean.

It's like the late-stage model of Instagram and Facebook is so different, because when everything's free, you don't really have a mechanism to control the customer other than just continually tricking them.

I mean, "trick" is probably wrong. There is tricking, but it's not everything. So, you know, when you get an ad, it's useful, you're finding out about products. But what is the trade-off between how much of your private data you want other people to have access to versus getting some of those products? And second, once platforms have access to that data, what is there to stop them from offering some of the manipulative products as well as some of the useful products?

So perhaps the biggest thing, because while prepping for this episode I went and did some math, and what you're suggesting would kill Meta. Now I'm not saying that you're suggesting you should kill Meta. I'm just saying that Meta's net income for the last four quarters was fifty one point three five billion, off of one hundred and thirty billion dollars worth of revenue. So this would make their business model untenable.
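(For reference, here is a rough back-of-envelope version of that math as a Python sketch. It assumes, purely for illustration, that essentially all of Meta's roughly $130 billion in trailing revenue would count as taxable digital ad revenue above the $500 million exemption; the figures are the ones quoted in the conversation, not Meta's official ad-revenue breakdown.)

```python
# Back-of-envelope sketch of the proposed levy, using the figures quoted in the
# conversation and assuming all of the revenue is digital ad revenue.
THRESHOLD = 500e6   # first $500M of ad revenue is exempt, per the proposal
RATE = 0.50         # flat 50% on everything above the threshold

def digital_ad_tax(ad_revenue: float) -> float:
    """Tax owed under a flat 50% levy on ad revenue above $500M."""
    return max(0.0, ad_revenue - THRESHOLD) * RATE

meta_revenue = 130e9        # trailing four quarters, per the conversation
meta_net_income = 51.35e9   # trailing four quarters, per the conversation

tax = digital_ad_tax(meta_revenue)
print(f"Tax owed: ${tax / 1e9:.2f}B")                   # Tax owed: $64.75B
print(f"Exceeds net income? {tax > meta_net_income}")   # Exceeds net income? True
```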

Is that...

Well, first of all, no. I mean, you know, instead of fifty-five billion, if they make, you know, thirty billion, it's not the end of the world. I would be happy to have a company that is worth thirty billion.

As would I.

But the hope is that they would also then offer products and services that are higher quality, that are subscription based. You know, we know people still pay for certain things, like Netflix, although Netflix is also now sort of being forced to move more in that direction. You know, is there a world in which a higher quality Netflix, for which some people pay, is viable, whereas, you know, some lower quality one monetized via ads also coexists? I think those are questions, and I think it sort of opens up the market system to different sorts of forces. So if there is a paid WhatsApp that has better security features, and you don't get these completely unexpected, annoying things that have started popping up in WhatsApp, perhaps that's attractive and some people will be happy to pay ten dollars a year for that.

Kind of feels like the advertising model is a lazy man's business as well, because it doesn't incentivize you to make a better product. It incentivizes a monopoly, right?

And, you know, how did I arrive at this idea? My shtick for the last, you know, two decades at least, and especially with regards to AI and digital technologies, is that these technologies are extremely malleable. We can develop them in different ways. We can have higher quality technologies that increase worker productivity, or we can have rote automation, lazy automation. You can have addictive social media that makes you go into your cocoon and you become uninterested in forming bridges and engaging in democratic citizenry, or we can have platforms that actually encourage different types of communications and the active citizenry that democracy requires. Which one of those will it be? And I think, since we have gone more and more in the automation way and in the sort of toxic environment of social media, a change requires new products, new technologies, and that means a more open system. So the way that I sort of started coming towards this idea was, well, what's the barrier to this? Well, the fact that there are a few big tech companies that can acquire competitors is a barrier, but also the fact that even if they don't acquire you, they pigeonhole you into the same ecosystem in which they exist. And I think that's a big problem.

How do you mean they pigeonhole you, even if they haven't acquired you, as a user?

Because to compete against them, you need to also offer your products for free, which means the only thing you can do is digital advertising. If you need quality data, you can't afford it, you're not going to have an incentive to get it. If you have high quality, niche products, nobody's going to buy them, because everybody is used to freebies and

Everything, right. So it creates a kind of, yeah, race to the bottom. But also there's just no reason they would have

to compete. Like, you couldn't compete with them. We haven't really seen any new social networks in years other than Bluesky and Threads, and Threads is part of the Facebook machine.

And what we have seen, talk about TikTok, is just an amplification of the same business model, the same monetization system, the same sort of weaknesses being exploited as Instagram and Facebook.

And they burned billions and billions of dollars to get there. It wasn't like... people try, and I don't know why people would possibly think this, but they frame ByteDance as some plucky upstart rather than just a massively funded Chinese juggernaut.

That's actually much of why I use the word ecosystem, because this model itself is highly synergistic with companies growing very fast. And why do they grow very fast? Because they want market share, they want to dominate the market, but they also want data. And how is that possible? How can you grow so fast? Well, you get venture capital or Microsoft to bankroll you, as in the case of OpenAI. So you burn through a lot of cash early on in order to get data dominance and market dominance. And again, it's not a pro-competitive picture that's emerging here.

I feel like generative AI is something separate, because with that, well, they don't have a business model yet.

They do not have a business model. But you know what that does? It makes it even more likely that we're gonna just repeat the same sins of earlier social media. Look at what OpenAI is doing. It's burning through a lot of cash in order to acquire market share and data. It's not monetizing through anything. And I think the most likely cash cow idea that's coming up is: we're going to use generative AI for Internet search and we're gonna take over the digital ad revenues. Well, that's more of the same.

And I don't... that's a different conversation.

I don't think that'll work, but nevertheless, I can see the idea.

It's funny, these incentives really have it.

I've never really thought about it before this conversation, but it feels like digital advertising really did harm the tech industry, in that they found something very profitable that got more profitable without making the user's life better at all.

Well, Larry Page and Sergey Brin, when they first formed their company, said, we don't want ads, that's not a good model.

Yeah, it's antithetical to the search problem.

But the moment somebody came and gave them venture capital money, they said, well, we'll do it so that we can actually sell the ads.

Yes, what if you were super rich.

But it doesn't sound like this would kill the digital ads industry, because if...

I don't think we should kill it either. I mean, I think, again, if we created a monolith, a monoculture, that wouldn't be good. If everything online was based on subscription, that wouldn't be a good model either.

So you want a variety. But, well, on a more operational level, I'm guessing that this would also have to have some way of cutting through things, like if people set up subsidiaries to try and dodge the tax. You would have to make sure that there wasn't a way around it.

You know, that's why this is not a sort of graduated tax, which is easier to game. But, you know, five hundred million, it's like a very small amount of money. So you just set a minimum like that, perhaps a little bit more, so that the really small companies are not burdened by it. We know one problem, for example, with the European GDPR is that once you put in a regulation like that, the burden is heavier on smaller companies, and so it's actually a competitive advantage for the large platforms. You don't want to do that.

But I think five hundred million, that would actually make a very healthy market for smaller companies.

Which is great. We want more smaller companies as well.

But, and I think that this would help with the problem a great deal, don't we also need to do something about the algorithms themselves? Because I feel you could do this and it would solve some problems; it would begin incentivizing them in the right direction, theoretically. But you still have this problem that they wouldn't stop the algorithms. In fact, this might encourage them to be more algorithmic.

Absolutely, and that's why I don't think this is a silver bullet. You need a range of policies. But the hope is that at least such a policy would create some push for some platforms and products to have better quality algorithms that don't trap you like that. But, for example, if we're talking about social media, we should also be talking about repeal or relaxation of Section two thirty of the Communications Decency Act, so that, you know, companies that algorithmically boost content cannot then say, well, this is not our speech, this is somebody else's thing that we have nothing to do with. So there are some details there that we have to think about. And I think Section two thirty, which was written in a different age, in the nineteen nineties, is definitely not up to dealing with issues of algorithmic boosting.

And debating Section two thirty aside, because that is also a separate conversation, it does feel like we have just kind of not given tech companies much responsibility for what they're doing.

No, exactly. That's the main reason why I want to have a conversation about Section two thirty, because it can't be optimal that corporations that are arguably the largest and most powerful humanity has ever seen can then wash their hands and say, we're not responsible for anything that happens on our platforms.

Yeah, it feels, with two thirty, and I am not as well read as I should be about it, but it feels as if that responsibility side is the real niggling issue, but also the most thorny, because on some level it makes sense with social media platforms that they should be able to say we're not responsible for all the posts, and if they had to be, they would...

Absolutely. And look, I think free speech is a major issue. I am worried about erosion of free speech, but, you know, I'm also not like a one hundred percent free speech absolutist. I think we have to balance it, and the way that I would suggest is, again, now we're changing topic a little bit to Section two thirty: subject to whatever legal requirements there are on, you know, abusive content, et cetera, if you post something on Instagram or Facebook and the company doesn't algorithmically promote it, then...

It's completely right.

It's there. It's my speech. If my friends find out about it, they can go and look at it. If somebody stumbles on it, that's fine. But algorithmic boosting is like the New York Times putting you on their front page. We can't say the New York Times is putting you on their front page and you can say all sorts of lies and crazy things, and that's just freedom of speech.

Yeah, okay, yeah, now I know what you mean.

Though, if you have a more preferred newspaper, use that example.

No, no, no, but I know what you mean. And it seems like the problem you're saying is that it isn't two thirty itself, it's the fact that the algorithms boost, yeah, these particular things.

Two thirty was fine for the age it was written in, when nobody could have understood the onslaught of algorithmic boosting, promotion, and manipulation that would come.

And it feels like it's an incentive problem. It's almost that platforms should move away from the algorithm model, because amending Section two thirty to just say it should go away kind of doesn't sit right with me. I don't think you should; it would make platforms like WordPress, or blogging platforms, impossible to use. But the idea of what they promote being more intentional...

Absolutely, and actually the rule that I suggested completely protects WordPress. WordPress doesn't promote, right? So this sort of reformed two thirty is completely fine in that respect.

So as far as this fifty percent tax, do we have any kind of historical precedent for something like this being done?

And that's why, you know, am I sure fifty percent is the right level? Absolutely not. Perhaps thirty.

But I think fifty is great.

Thank you.

Well, because the thing is, the way I look at it is these companies are so adept at avoiding responsibility, and they're so unwilling to change their ways and so unwilling to be responsible with what they're doing, that it needs to be like this. And I'm sure that if this actually got anywhere near a government, they would have the world's biggest tantrum.

But I feel like we need stuff.

It needs to be a shock to the system, absolutely.

And I think that... well, actually, maybe that's the question: how practical would this be to actually implement?

Well, it depends on what you mean by that.

Well, I mean, how hard would it be to actually get this into existence, with the government able to levy that tax?

It would be extremely hard because of the lobbying, as you pointed out. But if there was agreement, if the Chinese government wanted to do it, right, they could do it overnight. It's all out there. The digital ad revenue is there, it's measured, you know it all. And the flat tax is very easy to implement. You could do it at source, when advertisers, companies, pay for advertising. You could do it in many different ways, so it's very easy. It's a version of the variety of VAT-like, ad valorem taxes we have under much more complicated situations, when there is much lower quality data about what's going on. You know, many middle income countries have a very complex sales tax or a VAT.
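(To make the "collect it at source" idea concrete, here is one hypothetical way a withholding mechanism could still respect the $500 million exemption. Neither the paper nor the conversation prescribes this exact mechanism, and the AdTaxWithholder class and its numbers are invented purely for illustration.)

```python
# Hypothetical at-source collection sketch: withhold nothing until a platform's
# cumulative ad revenue for the year crosses the $500M exemption, then withhold
# 50% of every dollar above it. Illustration only, not the paper's mechanism.
THRESHOLD = 500e6
RATE = 0.50

class AdTaxWithholder:
    """Tracks one platform's cumulative ad revenue and the tax withheld at source."""

    def __init__(self) -> None:
        self.cumulative_revenue = 0.0

    def withhold(self, payment: float) -> float:
        """Return the tax to withhold from a single advertiser payment."""
        taxable_before = max(0.0, self.cumulative_revenue - THRESHOLD)
        self.cumulative_revenue += payment
        taxable_after = max(0.0, self.cumulative_revenue - THRESHOLD)
        return (taxable_after - taxable_before) * RATE

# Example: a platform that has already booked $499M in ad revenue this year
# sells two more $1M campaigns.
w = AdTaxWithholder()
w.cumulative_revenue = 499e6
print(w.withhold(1e6))  # 0.0      (this payment only reaches the exemption)
print(w.withhold(1e6))  # 500000.0 (fully above the threshold, so half is withheld)
```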

Yeah, and it almost feels like you could have a little fun with it as well. Maybe you could do this fifty percent tax and feed it directly into some sort of national venture fund; it would end up funding the future. And on a larger scale, it feels like governments are just twenty years behind technology, not even putting aside Section two thirty. It feels like we should have an EPA or an FDA for data. It feels like we should have ways of... actually, we don't know how these algorithms work, and it just feels a little crazy to me.

Yeah. Absolutely, But the way I would say it is, I'm not sure that I would say they are twenty years behind in terms of knowledge.

I actually, oh, not knowledge, just legislation.

Legislation, yeah. So, a completely different but related topic is that we actually need an infrastructure for data markets, right? You know, I think everybody says data is going to be one of the most important factors of production for the future, more important than land. Imagine I told you that today land, which is still pretty relevant for many businesses, is up for grabs: you can just go and get whatever piece of land you want and you don't have to pay for it. You know, that tragedy of the commons is well understood from history. It would be disastrous.

Yeah, it's just the wild West.

Yeah, it's the wild West. Nobody produces high quality data, nobody gets compensated for the data that they have, and it encourages, you know, the monetization model that we talked about, where you actually sweep up people's data without their permission or without their understanding and you try to monetize it via digital ads. So a complementary thing is to think about how we can have a system where data producers are encouraged to produce higher quality data, so that perhaps the next version of a large language model learns not from Reddit, but from high quality, domain-relevant expertise.

I think in that case...

I'm not sure I agree, just because large language models getting higher quality data is about how you build them. Higher quality data alone wouldn't be enough; they'd still need more. But I get what you mean, that if these companies were incentivized to actually have good data and provide good services with the data, that would be better. Because that's the thing. I don't love digital targeting, I don't love any of it. But man, if they have all this data, why are all their services so impersonal? They don't feel like they're for us.

And why does Microsoft Word always crash? Yes?

I know, but that one, that one doesn't matter. They have the .doc and .docx model. They have a tiny little kind of monopoly, though there were various antitrust actions that had to change that.

Let me actually push back a little bit on generative AI, okay? Because I think your statement makes sense if you buy in that generative AI is most productively developed in the form of general purpose, human-like chatbots. They will need a huge amount of data, because next-word prediction is very, very inefficient and you need to imitate humans in a variety of circumstances. But imagine that we use generative AI in a very domain specific way. You know, I want to know what drug creates what side effects in conjunction with other drugs, right? Well, for that, I don't need my generative AI tool to communicate with me in, you know, human-like fashion or write Shakespearean sonnets. I just need some very specific domain expertise. But that high quality data is actually not out there.

And I fully agree with that.

And smaller language models, focused language models, do make sense. But I think we are actually agreeing, because this is an incentive problem. The reason OpenAI isn't really focused on that is because it doesn't make that much money.

Well, OpenAI itself is not making a lot of money. They're not, like... no, I think they are not focused on that because, A, if you want to get big very, very fast and collect a lot of data, domain specific models are not going to work, so creating hype around something that sounds very intelligent is a much better tool for doing that. And second, I think the industry is still, in an unhealthy way in my opinion, preoccupied with artificial general intelligence, right, human-like intelligence, even if that's not what we need, even if that's not what's feasible, even if that's got a lot of downsides. So that's why they don't, in my opinion, pay sufficient attention to these domain specific expertise models.

I actually agree. But also, then, that is an economics problem: these companies are incentivized to grow at all costs, to get as big as possible. Is what you're suggesting that they shouldn't?

It is not just an economics problem, but it has an important economic leg. If we did not have venture capital be so important, this model would not have gotten off the ground.

So how do we push back on the VC model? How do we actually make technology work without that model? Because I agree, it's growth at all costs: build as big as possible, then IPO, and everyone gets rich other than the user.

Well, what are the alternatives?

I don't know. I mean, I am not so much of an interventionist that I'm going to say, you know, you should tax VC as well. But I think one thing that encourages the VC grow-at-all-costs model is that we don't have any antitrust. If there was very strong antitrust and implementation, then becoming so big wouldn't be so attractive, and you wouldn't be able to acquire all your competitors in the process as well, which is a very important part of this get-very-big-very-quick model. So I think our big failure of, you know, upholding existing antitrust laws and introducing new antitrust laws appropriate for the digital age has contributed to this problem.

Is there a way of incentivizing VCs? Actually, this is the final question, because you might actually have an answer here. How do we incentivize venture capital and the startup industry to start investing at the earlier stage? Because a big fund just gave back a chunk of their fund because they were not finding as many opportunities in the late stage, and most of the money goes into that late stage. How do we incentivize that?

That's a much harder problem, because even well qualified VCs are not going to have an easy time recognizing a very promising product when it's in the garage stage, right? But this brings in another, you know, set of issues. The fact that we tax capital so lightly contributes to this. Again, we're subsidizing VCs, because if you make your money via VCs, as a sort of return on your capital, you pay very little tax. You know, if you are a tech billionaire, I won't name names, you don't even pay yourself a salary. You keep on borrowing money from venture capital or other specialized financial vehicles and you pay yourself out of that. Everything is capital income. You pay minimal taxes. So our tax system, already a big contributor to inequality, actually also distorts the digital landscape. So one simple thing, again, which is probably even less likely to be implemented in our current polarized environment, is let's tax capital and labor at the same flat rate, or, you know, you can add whatever progressivity you want, but you do not distinguish capital and labor income. If you're going to tax labor income at thirty percent, tax capital at thirty percent.

I agree on that one.

It just feels like it would be great if there was a way of... because you're talking a lot about taxation and such, and I agree that we need those controls. But is there a way to incentivize them to put more money in earlier, make more risky bets? I'm not saying a deduction, because that would be crazy, but it feels like that would also be, in part, in concert with these taxes, a way of getting that money into the early ecosystem.

There may be, but I don't know exactly; we should think more about it. But there is an alternative solution, yet another policy proposal, which is we fund a federal agency on AI that is tasked both with communicating and developing best standards on things like privacy, data, you know, AI standards, security, safety, but also has deep enough pockets that it can play that incubator role, especially for technologies that are deemed to be socially beneficial. So if we have technologies that actually protect end users' privacy and enable users to make better decisions, that's the kind of thing the government should put money in early on. Some of it will go to waste, some of it will go bankrupt, but if a few of them are successful, that's great. And if the alternative is that the VCs are going to develop the technologies that are most manipulative, there is even more reason for putting in this money.

Daron, thank you so much for joining me again. It's been such a pleasure.

Of course, this was my pleasure. Thank you for being interested in these issues and for having such a great conversation.

Thanks so much. Thank you for listening to Better Offline.

The editor and composer of the Better Offline theme song is Matt Osowski. You can check out more of his music and audio projects at mattosowski.com, that's M-A-T-T-O-S-O-W-S-K-I dot com. You can email me at ez@betteroffline.com, or visit betteroffline.com to find more podcast links and, of course, my newsletter. I also really recommend you go to chat.wheresyoured.at to visit the Discord, and go to r/BetterOffline to check out our Reddit.

Thank you so much for listening.

Better Offline is a production of Cool Zone Media. For more from Cool Zone Media, visit our website, coolzonemedia.com, or check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
