Their new paper is “Digital Deceit: The Technologies Behind Precision Propaganda on the Internet.”
On this episode of Recode Decode, hosted by Kara Swisher, New America fellow Dipayan Ghosh and senior adviser Ben Scott talk about their new policy paper, “Digital Deceit: The Technologies Behind Precision Propaganda on the Internet.” They say we need to fundamentally reevaluate how digital platforms collect data on their users, and how advertisers can use that information.
You can listen to the entire interview here or in the audio player above. Below, we’ve also provided a lightly edited complete transcript of their conversation.
If you like this, be sure to subscribe to Recode Decode on Apple Podcasts, Spotify, Pocket Casts, Overcast or wherever you listen to podcasts.
Kara Swisher: Recode Radio presents Recode Decode coming to you from the Vox Media podcast network.
Hi, I’m Kara Swisher, executive editor at Recode. You may know me as a Russian bot, but in my spare time, I talk tech. You’re listening to Recode Decode, a podcast about tech and media’s key players, big ideas and how they’re changing the world we live in. You can find more episodes of Recode Decode on Apple Podcasts, Spotify, Google Play Music, or wherever you listen to podcasts, or just visit Recode.net/podcasts for more.
Today I’m talking to Dipayan Ghosh and Ben Scott. Dipayan is a fellow at the think tank New America and he’s here with me in the studio in Washington, D.C. Ben is also at New America. He’s a senior adviser to its Open Technology Institute, and he’s joining us over the phone. The two of them previously advised the White House under Barack Obama, and the State Department under Hillary Clinton. They’re also the co-authors of a new policy paper that I had tagged on Twitter that is the talk of the town called “Digital Deceit: The Technologies Behind Precision Propaganda on the Internet.”
Dipayan and Ben, welcome to Recode Decode and thank you for coming in after I strafed you on Twitter, but I don’t want to apologize for that. I’m glad I did it. So, we’re going to talk about all of that, but why don’t we start … Talk about where you started, and then we’ll get to how you guys got to this report. Ben, why don’t you go first? And then Dipayan. Talk about your backgrounds and where you met.
Ben Scott: Sure. I spent most of my career in Washington working on technology and telecommunications policy. A little-known fact, I might be the only guy on the planet who staffed both Bernie Sanders and Hillary Clinton over the course of the last 15 years.
How was that? How did you get into it? Are you a techy or you were just interested in tech policy?
BS: I did a PhD in communications and studied the political economy of media markets, and spent enough time in academia to realize that I didn’t want to be an academic. I came to Washington to try to learn politics, and got a job as a young legislative aide with then Congressman Bernie Sanders.
Okay, and then you’ve moved on to the State Department. Is that correct?
BS: Eventually. I worked at a shop called Free Press for many years, working on such little-known issues as net neutrality. And then went into the Obama administration in 2010, and worked on tech and innovation policy at the State Department. When Secretary Clinton left, I also left government. I went to Europe. Spent four years in Germany running a think tank on tech policy over there.
Which they love to talk about in Germany, and they have very different attitudes, obviously. Dipayan, how did you get … And then I want to know how you guys came to do this paper.
Dipayan Ghosh: Yeah, of course. I’m a privacy engineer by training. I did a PhD in electrical engineering, and then moved on to start a postdoc out in California, studying information theory — how you send information from point A to point B in as efficient a manner as possible. That got me interested in privacy issues. I think, really, the pivotal moment for me was the Snowden disclosures.
DG: For many, yes. After that news in 2013, I became very interested in public policy. At the time, I was thinking about different options and …
Where were you working at the time?
DG: I was doing my postdoc.
Postdoc. Oh, so you were working on that. When you say became interested, what does that mean? You’re like, “Oh my god, the government is spying on us.” Shocker.
DG: Well, obviously, at the time, it was a huge public debate. What should the government do? What should companies do? How did we come to this place? And how do we move forward from here? The Obama administration at the time began a serious inquiry into all of those questions. I had the opportunity to come to the Obama White House and work on a lot of these issues. So, I was there for a couple of years helping think about consumer privacy and commercial privacy issues, amongst some other things. Ultimately, we released a major report on big data and privacy. In that report, I think we made a number of recommendations that move the ball forward toward progressive consumer privacy reform. There was a lot of follow-through on all of the recommendations, particularly from the administration itself.
A couple of years later, I transitioned to Facebook. It was an inspiring time to join the company and to try to help think about how it could solve a lot of challenges around user privacy issues. That brings me to my work with Ben at New America.
You were at Facebook for how long?
DG: About a year and a half.
And what did you do there? What was the actual … You worked on privacy issues. What does that mean?
DG: Yeah, I was functionally the company’s privacy and public policy adviser focused on U.S. privacy issues.
U.S. privacy. You were supposed to do what? Be an irritant to them and say … Or go along: “How can we violate privacy more?”
DG: Think strategically within the policy team, which is the team that I worked for, think strategically about how the company can continue to innovate and create amazing products that can win users and win attention while also thinking about user privacy issues and maintaining a commitment toward individual privacy.
Right, right. It’s really hard when your business is predicated on a slot machine of attention, really. It creates a real problem. It’s like being a cigarette company and saying, “We want people to be healthy,” kind of thing, which I think we can talk about. You know I have that attitude, much to Facebook’s chagrin.
So, you two got together at the New America Institute, is that correct? It’s just called New America, correct?
DG: It’s called New America. It used to be …
We’ve heard about the controversies around it. Eric Schmidt’s New America. That’s what we like to call it in Silicon Valley. Sorry, I won’t put that on you.
DG: Well, we do still have a conference room named after him.
Oh, do you? Okay.
DG: It used to be called the New America Foundation. It’s now called New America. Ben should really answer part of this question because he fundamentally helped create this new initiative that I joined.
All right. Ben, why don’t you talk about the new initiative and how it led to this paper, “Digital Deceit: The Technologies Behind Precision Propaganda on the Internet.” And I do like “Digital Deceit.” It sounds like a movie that we should all be watching. But why don’t you talk about how you got to that? You created this group within New America?
BS: I am a part of a team that had the idea that we ought to build new ways of working for think tanks. This old model of white papers for insiders was only going to go so far in this political environment, perhaps not even as far as it once did. We decided, “Hey, there’s an opportunity to do work here at the intersection of technology, policy and practice. Let’s build teams of people who are policy experts, engineers, designers, organizers and product developers. Let’s try to do stuff in addition to just talking about it.”
Like a startup, in a lot of ways. Like a policy startup.
BS: In a lot of ways, yes.
BS: It’s called Public Interest Technology. We have a group of about 15 people that come from all of those different backgrounds. One piece of what we do is technically informed policy analysis that tries to make interventions in policy debates by talking about what’s happening in the market, how the technology influences the development of both public policy issues and consumer benefit or consumer harm, in order to inform debates in the media, in governments and in the industries. This paper is that piece of the public interest technology.
Sure. What you’re trying to do is essentially think tanks, which have been around since the beginning of time. It’s just using new tools to do so in different ways, and actually have solution-based …
Rather than just a pompous paper, everybody has a cocktail party, they discuss, and then it goes in a drawer. Right? Pretty much?
BS: The idea is that the paper is the starting point and not the ending point of the process.
All right. So, with this particular paper, you focus on something people have just recently started to pay attention to, though a lot of us had been talking about this issue: the weaponization of social media, the use of it for propaganda. You may not know this, but I actually was a propaganda studies major at Georgetown and Columbia. I’m very attuned to the idea that propaganda has been used since the beginning of time, from walls in China to Nazi Germany. The most famous is Nazi propaganda, obviously, but it’s been going on forever. This is not a new thing; it’s just that these tools are more sophisticated and, probably, more dangerous.
Your idea behind the paper is that … You guys got together to write this paper. Dipayan, why don’t you explain the paper and what you’re saying in it in your words. I’m not going …
DG: Sure. Fundamentally, we believe, and we write, that there is a fundamental issue in the construct of this whole digital advertising ecosystem. The ecosystem we have created over time, which is the fundamental way the internet works nowadays, and probably the biggest money-making machine on the internet, carries a fundamental flaw: there is an implicit alignment between the interests of the large internet platforms and the advertiser.
That is all fair and well, but when that advertiser becomes, let’s say …
DG: Malevolent. Has a nefarious motivation, like a disinformation agent, we need to segregate that shared goal between the internet platform and the advertiser. And that means that we need to figure out who those disinformation agents are. Our paper discusses all those issues, but it also pulls away from the specific lens on Facebook, Twitter, Google and Russia, and argues …
Which is highly emotional, obviously, as well as political.
As it should be, by the way.
DG: Absolutely, it is. And argues that while that’s the center of this debate, and as it should be, there is a much broader ecosystem here that goes from behavioral data tracking to, certainly, online ad buying, custom audiences and lookalike audiences, but also extends to search engine optimization and social media management platforms. Many of these technologies are going to be supercharged by artificial intelligence as it’s increasingly integrated into the technologies underlying the web.
All right. Ben, give me your thoughts on what you’re trying to achieve here with the paper. To start this discussion, as you said.
BS: We want to shift the frame from a political analysis of how to stop Russians from violating American national security by distorting elections and focus on the large …
Which is a laudable thing to try to stop, but go ahead.
BS: Yeah, I’m glad people are doing that, but this is a … Even if we slammed the door on the Russians and they were unable to use social media to affect elections again, we haven’t stopped the problem of disinformation.
BS: Disinformation is endemic to internet communications. How do we account for that? What’s going on? How does it work? What are we going to do about it? Our goal was to literally lay out the map of the mechanics of how disinformation operations work. It’s not just that Russian agents woke up in the morning and bought some ads on Facebook.
Right. No, no. God, no. I mean, it’s easier to talk about it that way, for sure.
BS: It’s easier to talk about that. We wanted to systematically identify all the players in the market, describe what the tools are that they use, and to make a point that these are the standard tools of the digital advertising industry, which is one of the most effective innovations in digital commerce, and that those tools are the tools the disinformation campaign uses because they are indistinguishable from any other advertiser.
BS: Until we deal with that problem, we’re not going to solve this problem. It is very likely to get worse because the technologies of the advertising industry, from microtargeting to behavior tracking, segmentation and influence, are getting better and better.
Right, and changing, also, as we change the technology.
I was just talking this morning, actually, on one of the cable shows. I said it’s not what happens in VR, what happens in AR, what happens when we have things embedded in our eyes, when we can be tricked and fooled with all kinds of cloud technologies and things like that. Things we can’t even imagine, which are coming down the pike in some … Essentially, any old episode of “Black Mirror” will help you understand that. I mean that in a joking way, but not really; it’s progressing rather quickly. As they clear one deck, another one comes open, pretty much. Correct?
BS: That’s right. It’s a moving target.
Right. When we get back, we’re going to talk a little bit about what these tools are and what are the ones that are really problematic. And then in the final part, we’re going to talk about what are the solutions. I do want to talk about some of the issues I have with you because I do think the administrations you worked for really did drop the ball in a lot of ways to regulate these companies before they got more powerful. It was really the Obama administration who was in power when these companies came to real true power.
We’ll talk about that and more with Dipayan Ghosh and Ben Scott. They are both fellows at the Washington think tank New America. They’ve just written a paper that’s gotten a lot of attention, a policy paper called “Digital Deceit: The Technologies Behind Precision Propaganda on the Internet,” when we get back.
We’re here with Dipayan Ghosh and Ben Scott. They’re both fellows at the think tank New America. We’re talking about their latest paper, which talks about, I think, pretty much the weaponization of social media and how it’s beyond the Russian bots, essentially, which Ben was just talking about, and how you push that back. Why don’t you start, Dipayan, talking about the landscape. What are some of the things that are happening right now? Because as Ben just said, things change rather quickly.
DG: Yeah, of course. There’s a broad ecosystem here. At the center of that are the advertising technologies that are managed and operated by the companies that are in the public attention right now. This whole issue of Russian propaganda over the internet is based on a series of technologies. The number one thing that Ben and I are concerned about is the long-term collection of behavioral data. In the report, we talk about …
Let me just stop you.
Explain that to the regular people. Just assume that everybody’s you. That is essentially what everything people do on the internet is recorded and how they behave, where they go.
DG: Absolutely. We talk about the internet. We also talk about technology, including mobile phones.
Right, especially mobile phones.
DG: Absolutely. I think it starts with, certainly, the use of major consumer web applications like Facebook, like Google, like Amazon, which increasingly understand our interests. They know our preferences, our behaviors, our beliefs, our likes, our dislikes. That information that they infer from clicking “Like” or searching for Recode on Google …
Well, there’s signals all the time.
DG: There are signals. Increasingly, they can determine our individual personal intent. That’s how all these technologies over the internet are monetized, by understanding our intent and targeting advertising.
And the more they know about you, the better. That’s gone back since when you were in a supermarket and they knew if you bought cat food, litter and skim milk, you might be a single lady, or something like that. Whatever they did. I used to cover retail and it was a big deal, the data they’d collect from your purchases at the time, which was minor compared to now.
One of the things I think we should underscore is what happens when we move into the mobile age. When you sat at a computer and went from website to website, that was all they knew. They could infer a lot from that, but now when you’re on a mobile phone, they know what you searched, then where you go, and then what you call, and then what you like. It creates an entire tableau. You can’t even understand how much. When you introduce it into the home, it’s even more so because you get a complete picture. It’s a photograph of a person’s life, essentially, and their needs.
DG: Absolutely. That was where I was going to go next, which is the implementation of cookies on the web, which really helps the major companies, as well as the smaller players in the advertising ecosystem, as well as consumer-facing retail companies like Nike or eBay, understand more about us.
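[Editor’s aside: the behavioral profiling Ghosh describes can be sketched as a toy model. A tracker keyed on a browser’s cookie ID accumulates the topics of visited pages into an interest profile that advertisers then target. Everything here, the function names, the cookie ID, the topics, is invented for illustration; it is not any company’s actual implementation.]

```python
from collections import Counter, defaultdict

# Toy model of cookie-based behavioral profiling (illustrative only).
# Each page visit is reported with the browser's tracking-cookie ID
# and the topic categories of the page being viewed.

profiles = defaultdict(Counter)  # cookie_id -> topic frequency counts

def record_visit(cookie_id, topics):
    """Accumulate the topics of a visited page into the user's profile."""
    profiles[cookie_id].update(topics)

def top_interests(cookie_id, n=3):
    """Return the n most frequently seen topics for this cookie ID."""
    return [topic for topic, _ in profiles[cookie_id].most_common(n)]

# Simulated browsing history for one pseudonymous user.
record_visit("abc123", ["sneakers", "running"])
record_visit("abc123", ["running", "marathons"])
record_visit("abc123", ["sneakers", "nutrition"])

# Advertisers then target the inferred interests.
print(top_interests("abc123"))  # → ['sneakers', 'running', 'marathons']
```

The point of the sketch is how little machinery is needed: a persistent identifier plus a visit log yields a targetable interest profile, which is why the paper treats long-term behavioral data collection as the root of the ecosystem.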
All right. Ben, when they’re doing this, that is their business plan. There’s nothing else. I mean, that is at the heart of Google’s business for sure, getting you to what you want. Same thing with Amazon. Same thing with Facebook. Not Apple or others necessarily, but in that zone. Can you talk about that? You just described what they are and what their business is.
BS: Yeah, I mean, that’s a fundamental point of the paper, that the scandal over Russian disinformation on the internet does not depend on the Russians using some tricky new technology.
BS: They’re using the fundamentally normative tools of the digital advertising industry that are the center of the business.
Talk then about what you’d change, because you’re just saying, “Well …” I’m going to use the cigarette analogy, even though this doesn’t actually kill people. This is a little different. It warps people. It mentally warps people. If your business is predicated on people smoking cigarettes, you can’t really shift into another business, or can you? You essentially described what they do and say, “This is their business and their business is easily manipulatable in all kinds of malevolent ways.”
DG: Well, I’m not necessarily saying … Yes, we are describing how this ecosystem works. We’re not necessarily saying that it needs to be broken down and changed and that we need to change the fundamental ways that we pursue monetization on the internet. Rather, we’re saying that there are some key technical flaws in the system that need to be solved.
All right. Go through them. Ben, why don’t you start?
BS: Well, for me, the core of it comes down to the harm created by leveraging personal information and the social graph to enable political advertisers peddling this information to manipulate your prejudices in the same way that a retail advertiser would attempt to manipulate your consumer preferences. That’s the core of the problem. That’s what’s generating filter bubbles, alternative media realities that people live in that inform their political views and choices.
How do we disconnect from that pattern? We have to look at whether it is appropriate to collect and store political information as a part of the data profile, whether it’s appropriate to sell that to advertisers, whether it’s appropriate to sell it to political advertisers. If you’re going to sell it, what kind of transparency should be in place? How should the consumer know why and how they’re being targeted with what kinds of messages? None of those questions are anticipated under current law, and we’re not having that discussion yet in the context of Russia, national security and Twitter bots.
I think the companies aren’t thinking of it either, because they don’t — and we’ll talk about how they think, because they don’t think. They don’t anticipate malevolence, oftentimes, or else they willfully ignore it. I’m not really clear on which one it is. I often think they see the opportunity and not necessarily the problems that could come with it.
All right. So, that’s one, obviously. Second one? Another one?
DG: Well, for me, this is very much related to Ben’s point, but the flaw, to me, the major flaw, is that it is hard to detect malevolence. There has to be a way to build around this. There has to be a way to detect the fact that there is an agent working, let’s say, for the Russian government who wants to persuade an American voter or 100,000 American voters with a particular message.
Or confuse them or distract or anger them.
DG: Or confuse them or distract them. I think that we have to start thinking about our regulatory system. Traditional media is the space for which our regulatory system has been designed. It’s eroded in terms of its influence. It’s obviously still very influential, and you could argue that …
No, the game is at Facebook. The game is …
DG: Increasingly, yes. As traditional media erodes, new media is coming to the fore. Yet, our regulatory system is designed for that old world. What Ben and I have been thinking about is how do we translate the regulatory regime for this old system into one that can be applied to the new one? That, I think, can solve this major flaw of detecting malevolent actors in a number of ways.
Ben, let’s talk about the technology of doing that, because a lot of what Mark Zuckerberg, the CEO and founder of Facebook, and others have talked about — and I just interviewed Sundar Pichai, the CEO of Google — is AI and how the ability to use it will give them more tools to do this. Is this just more …? They tend to talk about AI and other things, focused on the future of jobs and this happy, shiny, “we’ll take care of it” future. Pretty much, they’re not bringing a very happy, shiny future at this point, and they’ve made enough mistakes that people have begun to question their ability to handle the technology.
Can you talk a little bit about how that happens? If you want to create a new regulatory framework, you’ve got to, obviously … Because look at the lobbying money that Google and others are spending. It’s enormous. I think it’s one of the biggest this year. How do you then create something that actually does what you’re saying?
BS: The first thing you have to do is create political will. You have to create political will by educating the public about what’s at the root of this problem, which is why we’re so focused on shifting the debate from Russia and national security, as important as that is, to the fundamentals of the digital media marketplace that trades on data to target persuasive messaging.
We need to begin to control the way that’s done; otherwise we’re going to continue to see adverse effects on the public. People need to understand that. I think, ironically, it could well be that the political will that we’ve never had to address commercial data privacy in this marketplace won’t come from e-commerce but from political disinformation.
Okay. Get people there.
BS: We’ve got to get people there. Second is …
Can I just, before you answer that second part?
How come they’re not mad? Well, because they never are. I mean, there could be one data breach after the next and nobody seems to care.
BS: That’s a question that has been vexing for consumer advocates like me for many years, that people are willing to trade away their privacy for very little.
BS: As much as I might like it to be different, that is how it is. The companies recognize that, and as long as their users aren’t complaining, they don’t believe they have a problem.
BS: That’s what I think is different about this moment. This moment where we are now in a political context where people are realizing, “Wow.” Something fundamental has shifted. We are divided in a way that we haven’t been in generations. We just live in different media environments with different sets of facts. How did that happen? And how can we begin to undo it?
At the root of that is data privacy. Can you get people to understand the connection between what they’re mad about and what the problem is? That is our public education challenge. Once we have accomplished that, gotten people there, then we can begin to talk about, all right, how are we going to solve that problem?
That’s the second point.
BS: That is where I was going with that, is once you’ve gotten people there, then you need to think about, all right, how much of this can be solved with new technology? Can you deploy an AI system to detect and remedy disinformation in the way that companies have done it with child pornography, terrorist speech, things of that nature?
BS: I think there’s certainly a role for technology to play. I would welcome all the innovative minds focusing on it. But I think ultimately there have to be, at the very least, brightline rules about what the public needs in order to be protected from harm, and in order to maximize benefit. This is the new media environment that provides information and news to our democracy. We need to protect it and make sure that it has integrity.
You’re talking about regulation that doesn’t seem to be coming, even though there’s been a lot of hearings and everything else. Perhaps I’m cynical and old, but I feel like this has been going on since the beginning of the internet. Nobody seems to have the political will despite even … I know this could be …
You’re always looking for a moment that will change everything. Everyone’s always talking about a privacy or data Armageddon that will finally get everyone to realize the way this stuff has control of our lives. We don’t understand. This moment may be that, but again, it’s highly politicized. People are even arguing whether it even matters. I think it gets pulled down in the miasma of Washington politics.
If this is a moment, there is political will, who has it? And how can that be done in a non-political way? Because it seems like that’s the only way to do it, but whoever’s being disadvantaged is going to be louder, like the Democrats right now. Whoever isn’t is going to be quieter, like the Republicans. When the Democrats were in power, they were very silent about these issues. I mean, now they’re suddenly born-again internet haters, but they were, as I said on Twitter, they were pandering suck-ups for a long time. Not you in particular. Not you two, but they were. There’s just no other way to put it. They were pandering suck-ups to the tech industry.
DG: Well, I think that the discussions in D.C., and I’m talking about the testimony that Colin Stretch participated in for Facebook and Richard Salgado for Google …
You know, they sent their most boring executives, but go ahead. Part of the strategy.
DG: I think that those discussions, this current interest in Washington and from the public to think strategically and critically about the industry, can lead somewhere. I think Ben just suggested it in his last statement about using AI. I think that these discussions can really lead to industry action in the, I wouldn’t say immediate, but shorter term, to try to use different signals, like the geography or origin of the advertiser, the mechanisms for targeting, the intended audience, and the content of advertisements or organic content. Those types of signals can be used to try to draw out with, let’s say, 95 percent confidence that, okay, this advertiser, this guy, is a disinformation agent. He or she is trying to malevolently influence a voter. I hope that these discussions can lead us there in the next one or two years within the industry.
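[Editor’s aside: the signal-combination approach Ghosh sketches, scoring an advertiser on origin, targeting mechanism and content to flag likely disinformation agents, could look something like the toy scorer below. The signals, weights and threshold are invented for illustration; no platform’s real detection system is shown.]

```python
# Toy disinformation-risk scorer (illustrative; weights and threshold invented).
# Combines the kinds of signals mentioned in the interview: advertiser origin,
# targeting mechanism, and content characteristics.

WEIGHTS = {
    "foreign_origin": 0.40,     # advertiser geography inconsistent with audience
    "microtargeted": 0.25,      # narrow custom/lookalike audience targeting
    "political_content": 0.20,  # content classifier flags divisive political messaging
    "new_account": 0.15,        # advertiser account created very recently
}

def risk_score(signals):
    """Weighted sum of the boolean signals that are present, in [0, 1]."""
    return sum(weight for name, weight in WEIGHTS.items() if signals.get(name))

def flag_for_review(signals, threshold=0.6):
    """Flag the advertiser for human review when the score crosses the threshold."""
    return risk_score(signals) >= threshold

# A hypothetical advertiser tripping three of the four signals.
suspect = {"foreign_origin": True, "microtargeted": True, "political_content": True}
print(risk_score(suspect), flag_for_review(suspect))
```

A real system would learn the weights from labeled examples rather than hand-set them, and would face the harder problem the speakers emphasize: these signals are also produced by perfectly legitimate advertisers, which is why the threshold and the human-review step matter.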
I think in the longer term — and touching on your broader point about Democrats, Republicans coming together and thinking about privacy — as we know, there is no baseline privacy law in the United States like there is in Europe. These two places are culturally different. In America, we have, for a long time, valued the strength of the market.
I do think that this is a moment, but I think it’ll pass. I think that we will be in that long standing trench of trying to figure out where we go on data privacy. It’s frustrating. It’s a long battle. There have been many people who’ve participated in it. Ben and I are part of that group. I think it can happen, but you’re right. There is political gridlock right now on that issue.
I know Ben has to go in a minute, and then we’ll finish up the last section with Dipayan. Ben, it reminds me a tiny bit of the oil industry, like how much influence they had. Are they stopping electric cars? You know, who killed the electric car? We all know that pollution is bad for humanity. We all know the lack of safety regulations in mines was bad for people, but things continued because of political influence and will. Now, everybody is jacked into the internet and they do know. Besides this issue, there’s also the issue of tech addiction, of how they design these systems to keep you, again, in the slot machine of attention, which I think, if you think of slot machines, you really do start to understand what’s happening here.
When you talk about this idea in this paper … What I want to do, because you’ve got to go, is if you could just … You write this paper, point out the stuff that people do know that this is … Who do you see as having … Of this actually turning into real change? The only one I see is Margrethe Vestager in Europe. I’m just using her as an example. There’s a whole bunch of commissioners there. Or European regulators seem to be the only ones who are holding internet companies’ feet to the fire. I only say that because when I interviewed President Obama a couple years ago, he was calling what they were doing in Europe protectionism, not good policy about privacy. Instead he was saying they’re trying to protect their crappy companies there.
BS: Short and sweet, I would say I’m not holding my breath for Washington to take any aggressive regulatory action any time soon. They’re not solving any problems, much less this one. However, I do think that the intelligence and analysis that comes out of the debate over these issues in the United States is being watched carefully by Europeans. In May, they will implement the General Data Protection Regulation.
Yeah, they will.
BS: There are many European regulators, not just in Brussels, but at the data protection authorities all across the EU, who believe that what we are pointing out in this paper is unlawful under the GDPR. I predict there’s going to be a Pandora’s box of adjudications on these questions as the Europeans test both how far the law allows them to go, and where there is political will to take action.
BS: And is that going to be partially informed by Europeans’ own economic interests to promote their own tech sector? Of course it is. Is it also informed by a principle of commitment to protecting data and the integrity of democracy? Yes.
Right, but the name of the game is America and China, isn’t it? Where there’s a little less respect for that.
BS: True enough.
True enough. All right, Ben, you have to go. When we get back, we’re going to talk to Dipayan about what are the things we have to do to make this happen and where it goes from here, and whether the internet companies are finally understanding the damage that they’ve caused and what they need to do about it.
We’re here with Dipayan Ghosh and Ben Scott. They’re both fellows at the think tank New America. They’ve written a really interesting paper you should read. It really is actually readable. “Digital Deceit: The Technologies Behind Precision Propaganda on the Internet.” And it isn’t just propaganda. It’s also targeting precision, manipulation precision, advertising, essentially, on the internet, when we get back on Recode Decode.
I’m here with Dipayan Ghosh in a studio in Washington, D.C., talking about Washington, D.C. He’s from the think tank New America. We just had his partner in a report that the pair did, Ben Scott. They’re co-authors of a new policy paper that’s the talk of the town called “Digital Deceit: The Technologies Behind Precision Propaganda on the Internet.” That would be the talk of the town in Washington, wouldn’t it? Nothing too exciting.
Let’s talk about the current environment, because Ben was just talking about Europe having commitment to do this, but in a lot of ways it’s not enough. It’s got to happen here in this country. One of the things I did talk to you about when I was on Twitter was, first of all, the money that these companies have, the influence, and not just that but the admiration that a lot of politicians have had for innovation. This is the one industry that we win at across the globe and still continue to dominate. Apple, Facebook, Google, all of them, Microsoft, Twitter — Twitter in their business. We dominate. The U.S. dominates.
Talk a little bit about how if you want to change these things, you’re also … And they make a very good argument that you’re hitting innovation. That China’s just going to do whatever it wants. Right now, even in Silicon Valley, there’s a lot of people going, “Well, in China, they don’t have to deal with the niceties of sexual harassment. They don’t have to deal with being diverse,” or the kind of things that are also being discussed in Silicon Valley. Can you talk a little bit about that?
DG: Sure. What a loaded question. It’s a hard one to answer, but I think …
Well, please unpack it and answer whatever question you want.
DG: I think the place where we have to all come back to is our human values. What do we value in this country? What do we value as people? Well, for me, for one, I value the right to privacy. I value the right to individual agency and autonomy. I think that, as we discussed earlier, the fundamental business model of many of these internet companies is in contrast with that.
Engagement. Keeping your attention, even in a healthy way, that’s what you’re saying.
DG: Absolutely. These two principles just don’t match up. That’s okay. We’ve always tried to figure out how we can organize markets in a way that is respectful and meaningful for individual consumers. I think that that’s exactly where we have to come back to. I think, in short, we really need to think about the regulatory regimes that apply to data privacy, as Ben mentioned earlier, consumer protection, anti-trust and a few others.
We really need to go back and assess how our regulatory regime in the United States is written and applied and think about how the internet can be under a new set of guidance from government and informed by the public in a way that can bring us back to those human values that we want to respect as consumers continue to use these engaging products.
Do you imagine these companies feel the pressure? Do you think it’s genuine, some of these … They’re now starting … Mark Zuckerberg, every five minutes, he writes a 6,000-word essay on feeling badly, or, “We’re going to change it,” or, “We’re going to fix it.” I literally have lunches every day. What’s interesting about it is about a year or so ago, I was more attacking of them. They were like, “Stop complaining about our responsibility.” I think the phrase they tried to use, because they’re geeks and they like it, is “with great power comes great responsibility,” which is from the movie “Spider-Man.” It’s actually a Voltaire quote.
They’ve taken the money, they’ve taken the power, they’ve taken people’s … They’ve done lots of damage, but now they have to fix things. Whereas Facebook has said, “Move fast and break things,” I’m always like, “You’ve broken enough things. Let’s try to fix them.” Do you imagine that they are part of this answer? Or this too, you can’t let the powerful people control the outcome?
DG: They as in the companies?
DG: I think the companies are part of the answer. I think the internet has created so much in the world. I think that, as many leaders have said already, on the whole, social media and the internet platforms have, and still do, create so much value for …
DG: So much positive. This is a flaw that can probably be solved if we can overcome these things like political gridlock and so on and so forth. I do think it’s achievable, but we need to think now about these things. When we have the opportunity to push them forward again, we have to seize it.
I want to talk about how to do that, because the thing I do stress, and I’ve said it … I want you to explain it. You’re to blame for all the Obama administration and Clinton people. You now carry the weight of all of them on your shoulders. Why didn’t they do something earlier when they had a chance? I think what they intended to do was rely on the laws already in place around all kinds of issues. I remember saying, “These are real new problems.” We need new laws, like when we went from horses to cars. Cars are different. They’re sort of the same, but they’re not. They move people forward in time or space or whatever. Why was there not? Was it just the starry-eyed, “Isn’t Elon Musk dreamy?” kind of thing? What was the problem from being in it?
DG: Well, technology’s inspirational. I mean, Facebook is inspirational. Google is inspirational. Tesla and SpaceX are inspirational technologies and ideas that move people.
Also they have a lot of money.
DG: They do have a lot of money, but it’s because they engage people across these ideas in such an emotional way. I think that’s important. That has guided a lot of the rhetoric about these companies and this industry over time.
Obviously, I don’t speak for the administration. Only the president does. I can say personally that while I was there, I felt as though we were doing everything in our power to push forward fundamental privacy reform. We’ve already stated that there has been political gridlock on that issue. Obviously, in a broader context, there was gridlock on every issue in 2015 when the Obama administration …
Health care, everything. Right.
DG: Yes. Released the Consumer Privacy Bill of Rights Act of 2015. I’m not saying that was a flawless piece of … It was called a discussion draft legislative proposal. It was not flawless. It was advised by industry, by consumer advocates, by …
I love your “advised by industry.” Sorry.
DG: Well, what I mean by that is that there were a lot of experts within the administration who had come from …
Yes, they had.
DG: … Google and Facebook and other companies. Similarly, we were talking directly to the FTC. I myself, at the time, considered myself, and still do, a privacy engineer and advocate. It was advised by a lot of voices. It fell flat. We intentionally released it on a Friday night, I think.
“Taking out the trash,” it’s called.
DG: Because we knew that it was dead on arrival. Let alone in Congress, in the public discussion, we anticipated it was going to be attacked from all angles, whether we were talking about EPIC, the left-leaning privacy advocacy group, which I deeply respect, or industry groups.
IAB or whatever.
So, you wrote something nobody liked.
DG: We wrote something nobody liked, which the long-running joke is that that’s where you want to be.
DG: Yet, because of all these reasons …
And other things happening. You’re busy with health care, you’re busy with this.
DG: Of course. It wasn’t necessarily the administration’s top priority at the time. The president was, obviously, in his second term and wanted to leave a legacy and get things done that could really have momentous impact for many, many people. This, of course, can. But I don’t think any of us, myself included, anticipated all the harm that could come to the country from a lack of fundamental privacy legislation.
Oversight. Let’s talk about where we’re going. I think one of things … I mean, here we are with this administration, which seems more hostile to tech for sure, but in a really odd way. Bezos, Washington Post. That’s really not helpful. That’s not really … It doesn’t have any direction. It’s just sort of free-ranging rage toward it. At the same time it has been helped by it, looks like it, a little bit. How do you imagine this administration? And then I want to talk about what you see happening. What’s actually going to get done?
DG: Well, it’s very hard to read what this administration is thinking about technology.
I don’t think they’re thinking at all.
DG: Yeah, maybe you stated it more directly.
I don’t think there’s anyone over there thinking. I don’t think they filled those jobs or anything like that.
DG: Well, yeah. That’s a great point. OSTP, the Office of Science and Technology Policy in the White House, is empty.
There’s a real estate agent running it, but anyway, go ahead.
DG: I shouldn’t say that. They have long-standing staff there …
DG: … who work very hard.
But it’s decimated. Let’s be clear.
DG: It has been decimated and cleared out. I mean, it should be cleared out and then filled up again by people who are qualified and aligned with the president’s views.
Except we don’t know what those are.
DG: We don’t know what those are.
He likes the Twitter. That’s pretty much all we know.
DG: He likes Twitter. I don’t know where he is on Apple and encryption, exactly.
We’ll know when the next thing happens. That’s when he’ll …
DG: Maybe. Maybe we’ll know. So, it’s hard to read. Who knows whether or not he’s re-elected, but at least for the next two or three years, it seems as though we won’t have much besides deregulation of the telecommunications industry, which is already happening. I hope it can be stopped by Congress, but we’ll see. But we don’t really know what’s happening with the core tech companies. It remains to be seen. I think that if they start cozying up to the administration, there could be prolonged …
DG: Gridlock. Yes, exactly.
That’s the goal, I think, in a lot of ways. Even though they’ve pulled off the counsels, and they’ve tried to make … I think their issue is their base, their employees, which is their base, don’t like the cozying up. So, they tend to want to react to that, first of all. At the same time, their interests … I read a piece saying, “Here’s where their seven interests lie with them,” including tax repatriation, especially tax repatriation. Because a lot of things they are aligned, and it all has to do with corporate profits, essentially.
In that story in the New York Times, which was really interesting, the Democrats are starting to … We did a Cory Booker interview a while ago where he was quite anti-tech, I would say. He’d say thoughtful, but it was pretty anti-tech from a Democrat. And others are talking about it. Kamala Harris certainly, and when she was attorney general of California, was very tough on privacy issues in that state, which is similar to Europe in that regard. Similar, not totally similar, but … Where are the people doing that? Name some names here who you think are going to be important going forward.
DG: Yeah, sure. I think that …
And states too, because I think it might be on the state level. We might see some …
DG: Yeah, in the Senate, I think it’s hugely important and inspiring that Warner and Booker and the like, and McCain, are speaking out about this and taking this issue on. I should also mention people who have had a long-standing commitment toward consumer privacy as well, Markey and Blumenthal among them.
Yes, Markey has been.
DG: That is important. I would love to see more people, especially in the Senate, standing up and thinking about consumer privacy. It doesn’t have to be in a way that you state rhetoric that is directly in contradiction of the internet companies. I think that something like the Consumer Privacy Bill of Rights legislation introduces a lot of interesting ideas that the companies could absolutely get behind if it were done in the right way, but I would love to see people like Tim Kaine and …
Elizabeth Warren. Kamala Harris.
What about the House?
DG: In the House, things are …
There used to be a few geeks.
DG: Yeah. I mean, things are more difficult. I think Alison is … I think he can be a real advocate, but it’s just hard.
Yeah, you can’t come up with any, can you?
DG: No, it’s hard because I’m thinking of the commerce committee. One of the issues is that it is … For a long time, the companies have been very close to the commerce committee. It’s very hard to see a way in for consumer advocates. I would have to think more about that.
Yeah, all right. So, where else in the states? Any state that is particularly …
DG: Yeah, I mean, I think there are some states that have really taken on interesting and novel approaches to privacy. I mean, California has had a long-standing commitment to privacy and has been thinking about, for example, broadband privacy.
The think tank that Ben and I worked for, New America, has done a significant push for broadband privacy at the national level with the FCC, and in California, once the FCC dropped the ball. I think California is one of those states that I’d look to. Certainly Washington, Texas, Illinois and Connecticut have. Yeah.
Lastly, and then I want to get some predictions from you, the FCC. FTC and the FCC. Where are we with them?
DG: Well, it seems as though the FCC’s intent is to just deregulate the industry.
Yes, deregulate everything. No more new laws.
DG: Rip everything down. It obviously hits hard for me. One of the issues that I worked on with many people in the Obama administration was net neutrality.
DG: It’s gone. I mean, I shouldn’t …
Congress should have taken this up. This shouldn’t be a volleyball every time there’s a new administration.
DG: Of course. It shouldn’t be. I have strong emotions about it that …
Don’t cry. Don’t cry, Dipayan.
DG: I could go on for a long time.
You just wept over net neutrality. That makes you a Washington person.
DG: That’s an example where President Obama did have a strong commitment starting from his days in the Senate, tried everything he could, and people accused him of putting his finger on the button of the …
Yes, they did.
DG: You know, I think that with the FCC, it’s frustrating to see a lot of the things that are happening. I mean, the FCC’s role is to bring regulation and enforce it on a huge industry that has a lot of power over regular people.
And influence. Right.
DG: And to rip down the regulatory framework that has overseen …
Yeah, they’re not doing nothing. I was thinking of having you say that. “They’re not doing nothing.”
They’re not going to do nothing. And not for a while, I think, for a long time.
DG: I think not for a while.
Lastly, people. What do people have to do? And what are some of your predictions? I don’t see any regulation here. I just don’t see … I think they can’t agree on lunch in Washington anymore. So, getting to that is the last thing, and creating more controversy seems … Again, there has to be some sort of digital Armageddon, which even Equifax hasn’t kept people’s level of outrage up. That’s just plain old incompetence and greed, essentially, that everyone can react to, right? Like, “Wow, they really took advantage of me,” and people don’t seem to care.
What can plain old people do about this? How do you maintain a level [of outrage]? I think what they’re meant to do is anesthetize us into liking your apps and liking … Having engineers figure out how to keep you more and more engaged with this, and less and less enraged. It’s like soma, almost, from [“Brave New World.”] That’s how I look at it.
DG: Well, I think that people have the voice. When they come together, they can have an incredible voice. I’m thinking back to net neutrality, for example. John Oliver talked about net neutrality on his show in 2014. That pushed four million people to comment to the FCC toward the goal of bringing strong net neutrality regulation forward.
Of course, we’ve heard a lot about bots and fake accounts commenting to the FCC now and impersonations. But I think that that is the fundamental thing that a citizen can do. Comment to her government that, “I want you to take these actions on my behalf and my kids’ behalf.”
DG: That’s not just the FCC. That’s offices, too, on the Hill. That’s the No. 1 thing I would say.
Call your congressman.
DG: Call your congressman. I think Ben had mentioned earlier in this podcast that consumer education is a huge element to …
What you’re trying to do here, obviously.
DG: … to the particular problem of disinformation and privacy, more broadly. I think that we all have to be part of that. The company that I used to work for, Facebook, has actually taken a lot of incredible steps to try to inform people about how it collects data and how they can make choices about how they share information with Facebook.
Are you with a group … There’s so many ex-Facebook people now are like, “I can’t believe what we did.” Do you feel that way?
DG: No, no. Absolutely not.
You know, there’s like, “Regrets, I have a few. Except for the billions. I now enjoy my boat and plane.”
DG: Without naming any names, I know who you’re talking about.
There’s lots. There’s not just a few, but go ahead.
DG: Recently, there have been a couple of notable people. I don’t think Ben and I are … I know neither of us are trying to swing at the knees of the industry that has become the medium of communication today. Instead, what we want to do is develop productive conversation and help drive this conversation toward something that can, frankly, make the world better for people.
So, is this overkill? I mean, I just said George Soros. I don’t want to get a lecture from him about …
DG: Sure, sure.
Come on. Manipulator of so many things. You know what I mean? Everyone feels the ability to pile on now.
DG: Yeah. I thought that you were asking more about ex-Facebook employees than …
Well, that too.
DG: No, no. You’re absolutely right. This is a broad issue now. It’s one that everyone cares about, wants to have an opinion on, and I think that’s important. It just shows the importance of getting it right. I’m not with the camp that has said that, for example, Facebook was complicit. Having worked there at the time, I can say that that’s … I believe that that’s not the case.
I agree with you. I just think they weren’t paying attention. They don’t think like that. I think it’s hard to understand their mentality. I’ve talked about this before, but often I’ll go in and I’ll name six bad things that could happen with one of their products. They’re always like, “Kara, you’re so negative.” You know what I mean? I don’t think they’re being tricky and they’re wink-wink, nudge-nudge, we know the problem. I think they think about opportunity over problems.
DG: I think that that’s absolutely right, and I think that that’s how the market is designed. That’s the underpinning of the American capitalist system. That’s okay.
DG: I don’t think that we should revert — not revert, but change our system to the European one. I agree with President Obama when he stated that there might actually be some commercial reasons as to the particular points of advocacy from some of the vice presidents of the European Commission and EU, for example.
DG: I think that that’s important for us to think critically about.
Right. With this very briefly, because we’ve got to go, this report is the first thing. What’s next?
DG: Well, as we state in the conclusion, what we would like to do is … Just very briefly, in the conclusion we talk about a few different steps, high-level principles that can be pursued, including consumer education, by industry, by public advocates, and the rest. And then we talk, very briefly also, about steps toward a new regulatory framework, not just for the industry but for communication overall. Amongst those potential areas of further inquiry, I think, are privacy, consumer protection and anti-trust. I think we lay out a couple of initial directions, but we’re not fully knowledgeable about exactly where to go yet. We’re conducting more analysis and want to …
This is just the beginning.
DG: This is the beginning, and we want to think more about where we go from here.
Great. Thank you for coming on. I really appreciate it. I’m glad you tweeted back at me.
DG: Thank you for having me.
I was just trying to cause debate. I’m mostly mad at the Democrats because I saw them at all the parties. They’re like, “Oh, the internet.” I’m like, “You didn’t see that when you were enjoying the lovely caviar canapes.”
DG: Well, you’re a lot nicer in person than you are [online].
[laughter] That’s how I do it. See? That’s how I do it. That’s how I drag you into my web of interesting discussions. I always want interesting discussions, whether I agree with people or not, and that’s what’s most important, because we’ve got to talk about these things because we need to solve them. We don’t want our society to go down and down and down into some miasma of anti … no privacy and we’re staring at phones and we’re being taken advantage of. Because I think with Silicon Valley … The good part of Silicon Valley is very good, and the bad part is problematic.
Anyway, I’m here with Dipayan Ghosh, and Ben Scott was just here also. They’re two fellows at the think tank New America and they’ve written a report that you should read. It’s a really important one. A policy paper called “Digital Deceit: The Technologies Behind Precision Propaganda on the Internet.” It’s more than just Russian bots. That’s my addition.
Dipayan and Ben, thank you for coming on the show. We really appreciate it.
DG: Thank you.