The data privacy scandal has Facebook scrambling.
On this episode of Too Embarrassed to Ask, Recode’s Kurt Wagner talks with Kara Swisher and Lauren Goode about the Facebook-Cambridge Analytica scandal. Wagner says reports of a political data firm exploiting a loophole in Facebook’s old data platform have severely undermined public trust in Facebook.
You can read a write-up of the interview here or listen to the whole thing in the audio player above. Below, we’ve posted a lightly edited complete transcript of their conversation.
If you like this, be sure to subscribe to Too Embarrassed to Ask on Apple Podcasts, Spotify, Pocket Casts, Overcast or wherever you listen to podcasts.
KS: Hi, I’m Kara Swisher, executive editor of Recode.
LG: I’m Lauren Goode, senior tech editor at The Verge.
KS: You’re listening to Too Embarrassed To Ask, coming to you from the Vox Media podcast network. This is a show where we answer all of your embarrassing questions about consumer tech.
LG: It could be anything at all, like should Kara Swisher delete her Facebook?
KS: I have to use it first, I don’t use it at all. I never use it.
LG: Do you have two Facebooks? Do you have a personal and professional?
KS: I have several. Yeah, I’ve got that. I have a lot, I have like 750,000 fans or whatever the hell you call them.
LG: Likes, followers.
KS: I don’t know. I never go there.
LG: You don’t use them?
KS: I use Instagram and I use WhatsApp.
LG: Oh, therefore you’re not using Facebook.
KS: Yeah, I know, but I use their properties. I use Facebook properties.
LG: No, I’m joking.
KS: Who owns Waze? Is that Google or Facebook?
KS: All right, anyway, no, I don’t use main Facebook, it’s too heavy-handed for me. Anyway, so send us your questions, we’ll talk about that more. Find us on Twitter and tweet them to @Recode or to myself or to Lauren with a hashtag #TooEmbarrassed.
LG: We also have an email address, TooEmbarrassed@Recode.net, and a friendly reminder, there are two Rs and two Ss in embarrassed.
KS: As always, Lauren.
LG: As always.
KS: So, the reason we’re talking about Facebook, Lauren, what’s the reason? They’re in a bit of hot water, wouldn’t you say?
LG: As we tape this podcast right now, it’s Wednesday, you’re going to hear this on Friday, but it’s Wednesday about six pm Pacific, and Mark Zuckerberg is doing an interview live on television with CNN right now, but he also spoke to some media outlets, including you.
LG: Including Recode and Kurt Wagner.
KS: Right. Just a few precious ones.
LG: Just a few.
LG: About the Cambridge Analytica story, which has really blown up over the past week. So, we’re delighted to bring in Kurt Wagner, Recode’s social media editor, who’s going to join us. I think The Verge’s Casey Newton may pop in in a moment.
KS: Pop in, he’s trying to catch up to our scoop.
LG: He’s filing furiously right now, as are a lot of the news media. Yeah, this is a really … It’s interesting because Facebook sharing your data, Facebook is a free service sharing your data, is not a new story, it’s not a new theme, we all sort of implicitly understand the exchange that goes on when we sign up for a service and we use a service like Facebook. This story in particular has really captivated people. Kurt, why is this happening?
Kurt Wagner: I think it’s a transparency issue. A lot of people do know what Facebook does with your data. I think what caught people off guard here is that, one, some 50 million users found their data in the hands of someone that they did not give permission to have it. Two, we find out that Facebook actually knew about this three years ago and never said anything publicly. So I think there’s this betrayal of trust right now. Not so much that, “Hey, we didn’t know Facebook …”
KS: Mark called it a breach of trust.
A breach of trust.
We didn’t know … It’s not so much, “Hey, we didn’t know Facebook had our data,” it’s more, “We gave it to you thinking one thing, and now all of the sudden we’re learning another.”
KS: More importantly, they didn’t protect it. They weren’t monitoring … What happened is, in 2007 — and I was at the 2008 F8 where he announced this — they did something called Facebook Connect. One of the ways it grew the platform, and the thing that made it big, was bringing all these developers onto the platform to do all kinds of things. There was one called Super Wall from RockYou, where you could put pictures. So they were bringing in lots of apps to get activity going on, which was …
LG: Those app makers were tapping into Facebook’s API to get the data.
KS: Well, in exchange, to bring them, they gave them precious data. They didn’t have a lot of rules, they had rules around it, but they didn’t monitor the rules. So, they had laws, but they didn’t enforce the laws, or they didn’t know what people were doing. So, all this enormous data went out for seven years. Wasn’t that right, Kurt?
More than that. I mean, if it had been 2007, that’s 10 years, right?
KS: Well, no, they sort of slowed it down in 2014.
I’m sorry. Well, in 2014, they restricted what they allowed. So, let’s pretend you signed up for Words With Friends back in 2013: the app would have also had access to all of your friends’ data.
In 2014, they said, “No”: if you give an app permission, Kara, it can take your data, but it can’t take all of your friends’ data.
Yeah. So for about seven years or so, they were not only giving away data of the people who agreed to it, but also everyone in their network was kind of losing their data, as well, without their permission.
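The before-and-after permission change Kurt describes can be captured in a toy sketch. The scope strings below echo Facebook’s documented pre-2014 `friends_*` Graph API permissions; the helper function and the numbers are purely illustrative, not real API code:

```python
# Before Graph API v2.0 (April 2014): one consenting user could expose data
# about ALL of their friends via extended friends_* permissions.
OLD_SCOPES = "user_likes,friends_likes,friends_photos"

# From v2.0 on: the friends_* permissions were removed; an app only learns
# about friends who also installed and authorized that same app.
NEW_SCOPES = "user_likes,user_friends"

def friends_visible_to_app(all_friends, app_users, api_version):
    """Toy model of which of one user's friends an app could learn about."""
    if api_version < 2.0:
        return set(all_friends)                  # the whole friend graph
    return set(all_friends) & set(app_users)     # opted-in friends only

# One quiz taker with 300 friends, only 2 of whom installed the quiz app:
friends = {f"friend_{i}" for i in range(300)}
installed = {"friend_0", "friend_1"}
print(len(friends_visible_to_app(friends, installed, 1.0)))  # 300
print(len(friends_visible_to_app(friends, installed, 2.0)))  # 2
```

This is why roughly 270,000 quiz takers could translate into tens of millions of exposed profiles: under the old rules, each consenting user silently brought their entire friend list along.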
LG: So, where does Cambridge Analytica come into the picture? Talk about that, and how they were mining the data, and how they weren’t exactly transparent about what it was being used for.
Yeah. They got ahold of all their data actually from a researcher, a professor from Cambridge, who created a personality app, I guess, and some 270,000 people used it, 300,000 people is what Mark Zuckerberg said today. So they all signed up to take this personality quiz, and as a result, all of their friends handed over their data unknowingly, as well. Then that professor gave the data to Cambridge Analytica, which is a data firm. That is where the issue happened.
KS: Right, so it’s pass-along, it’s like a virus.
KS: They passed it along and they didn’t have … Facebook was just not monitoring. Look, what Cambridge Analytica did was suspect, a misuse; they said they were going to do something and they did something else. They said they were going to destroy the data, they didn’t. They’re just liars, right?
LG: Mm-hmm. They said, just for background for people who are listening, they ultimately used the data in a way that influenced the … This is a U.K.-based firm …
KS: Yes, then they used it for …
LG: … that ultimately used the data to influence in some … Well, the fact of whether they actually did influence people is questionable, it depends on how you feel about psychographics, but they used it with the intent to influence the U.S. election. Then when Facebook did become aware of it, they insisted that Cambridge Analytica basically would certify that they’d cleaned everything up.
They said, “We deleted it,” and now we’re finding out they didn’t actually …
KS: Also, they shouldn’t have had it in the first place. The whole question is how Facebook didn’t monitor the data it gave out. It was handing out data like candy to get these developers on its platform and then it wasn’t monitoring the data. It’s not just Cambridge Analytica, it’s like who did they give it out to? There’s tons of companies they gave data out to that don’t exist anymore, tons and tons of those. Who knows where the data has gone?
LG: Where does that go?
KS: It goes into the great data … Data sets are critical to marketers and everyone else, and Facebook handed these out for free, essentially, for getting people on their platform for benefit to Facebook. Essentially, you, the user, are the product, you are the product they’re selling. It wasn’t that you didn’t know, it was so confusing, it went from person … Especially the friend graph. If Kurt gave it out, I didn’t agree for Kurt to give it out.
KS: So that’s the problem with this, they just did no monitoring. We did an interview with Mark tonight where he said, “Yeah, we didn’t.”
Yeah, and we asked him, we said, “Is it even possible to go out and get it back?” Can you go out and find some app from 2012 that had 100,000 users and therefore the data of maybe 20 million users, could you go out and get all that back? He said, “Not always,” right?
It’s kind of like what we wrote today, it’s like putting the genie back in the bottle. The data’s out there. Once it’s off of Facebook’s servers and onto someone else’s servers, you don’t have much control over it.
KS: They’re not the police, they can’t go in and get it. People could hide it, it could go into dark parts of the web. When he was asked if he could recover some of the data, he admitted not always, and I think it’s more than not always. Not at all, like, really, pretty much.
Again, once it’s out there, it’s out there, and they used it to build their business to what it is today. The responsible use of that data brings up lots of regulatory problems, there’s all sorts of violations of possible agreements they had made with the government before …
Yeah, the FTC might be investigating now, after all this Cambridge Analytica stuff, to see if they violated a consent decree that they signed in 2011. There could be a financial fine, which I don’t think is that big of a deal for someone like Facebook, they have so much money, but I think more concerning would be if Congress comes in and says, “Hey, we’re going to start regulating the data that you take in, because we no longer trust you to do this on your own,” all of a sudden.
Facebook’s whole business is based on that data, that targeting, specific hyper-targeting of ads that requires that data. If Congress says, “You can’t collect it,” or, “You have to collect it in a certain way,” that could change the whole advertising landscape that Facebook is built on.
KS: It’s a rolling controversy, it just keeps going. First it was fake news, then it was the Russian bots, then it was the fake advertising … it’s all the same thing.
LG: Yeah, there’s a convergence of issues that are happening right now, and they’re all contributing to this distrust with Facebook. When you look at … there’s fake news, like literally fake news websites that are having this presence on Facebook, there is the Russian propaganda and Russian influence in the U.S. election-
KS: Advertising lies.
LG: Right. In general, it just seems like there’s this …
KS: It’s the same story, lack of control of its platform, lack of monitoring, lack of responsibility around the data that it’s supposed to protect.
LG: Then on top of it, just as sort of a meta-theme, is people right now wondering if Facebook is good for them in general.
KS: Well, that’s a whole nother thing.
Is it good for your health, on top of all of this.
LG: Right. Right.
Are they stealing your data? Oh, by the way, is it making you depressed?
LG: Oh, by the way, does it make me sad?
LG: So, we’re …
KS: Can I just say, in the middle of this there was also this idea of remaining a neutral platform, which Mark would not go there. We’ve all tried to press him saying, “You have to have values and rules and things like that,” and he said … He keeps in this line, it’s a very Silicon Valley line, that they don’t want to have their personal ideology influencing Facebook rules or regulations. I’m like, “Why not?” It’s your company, kind of thing.
He really controls it because he’s got that special stock, this is the quote, “A lot of the most sensitive issues we face, there are conflicts between real values, right? Freedom of speech and hate speech or offensive conduct, where is the line?” Sounding more like an ethics student than the billionaire CEO of one of the world’s most valuable companies. “What I’d really like to do is find a way to get our policies set in a way that reflects the values of the community, so I’m not the one making those decisions. I fundamentally feel uncomfortable sitting here in California in an office making content policy decisions for people around the world.” Well, he has to, it’s his company. That, I don’t agree with.
The end of that quote, actually, was the best part.
Which is him basically saying, and I’m reading over your shoulder now, “Who chose me to be the person that basically makes these decisions? I guess I have to because of where we are now, but I’d rather not.”
So it’s kind of that first, almost his first admission I’ve ever heard, of him kind of being like, “I really want to be a neutral platform, it’s not really working, and now, I guess, it falls on me to have to make these tough decisions.” He’s never really said that before.
KS: Which, I’m sorry, I’ve always thought that was just bullshit. Not from him, he’s a very earnest and thoughtful person. Let’s be clear, this is not Travis Kalanick at Uber talking, this guy really does think about it. The fact of the matter is, he has a responsibility and he’s got to start making choices, and they just don’t want to. They keep saying, “We’d rather have the community do it,” but the community has nine different opinions.
LG: That’s also a very data-driven approach. It’s like, how do you actually take the temperature of entire communities of 2.2 billion people around the world?
KS: And too easy to game.
LG: You do it using data and you say, “What do you …”
KS: It’s too easy to game.
LG: It’s kind of like, you vote for the most reputable publishers, you vote for what you want, and I think they’ve been hiding behind that idea that if they just had enough data, then it’s the user base that’s deciding, but that’s not … Kara, I think you kind of pushed this idea of, how did you not anticipate these bad actors though, as you’re building this massive platform.
KS: Yeah. They never do. Facebook Live, they … They have to take responsibility, that’s what adults do. This is their company, they’ve made billions of dollars off of it, they’ve decimated industries, like they really control the online advertising market. They need to be responsible and make choices. Making choices means you piss people off, making choices means you have to give up some things, they can’t have everything. They can’t have the world’s biggest platform and not be responsible for it. I just don’t … I don’t know why we’re even arguing over this situation. If they don’t want to do it, get out of the way and let someone else.
LG: Just to backtrack a little bit, we’re speaking right now, literally on the heels of this mini media blitz that went on this evening, on Wednesday evening. Prior to this, Zuckerberg was silent for about five days after the story broke last Friday night. So, where was he?
KS: Kurt, where was he?
Yeah. He was working. It was so bad that they came out with a statement that said he was working, he was working around the clock is what they had to say.
KS: Apparently around the clock. That’s what you do, you work around the clock.
You work around the clock. I think this was an example of …
KS: Kurt was working around the clock, by the way.
Yeah. What day? I don’t even know what day it is right now, yeah.
LG: Lucky Facebook reporters.
Yeah, this is great. No, I think this was an example of something that they learned from the really big scandal they had 18 months ago, right after the election when he comes out like a few days after and he says it’s crazy that fake news could have influenced the election. Do you know how many times people pointed to that interview and said, “Hey, remember that time Mark Zuckerberg said it was crazy and now look, he looks super naïve, he looked like he had no idea what he was talking about.”
In this scenario, I think that they remembered that interview, or that statement, and they said, “Well, before we get all the facts, the last thing we want is to put Mark out there in front of the press to say something that we’re going to have to backtrack later on when we know more details.” That is my hunch. They have not really come out and said specifically. He said in our interview, he was like, “Oh, one of the reasons it took me so long is I was going through this … I wanted to unveil a plan for all this.”
KS: Yeah, he’s like that.
“Before I say something,” but he could have said something a few days ago.
I think they just didn’t know enough and they didn’t want him to say something they were going to have to walk back.
KS: They badly handled this in the beginning, when he first said, “We had no impact on the election,” then, “Maybe a little bit,” and then, “Okay, maybe more.” “Oh no, there’s more Russians.” It’s like cockroaches.
LG: Right, right.
KS: If there’s one Russian, there’s hundreds.
LG: Then it became a personal …
KS: By the way, not all Russians are bad, just these Russians.
LG: Then it became a personal resolution of his in the new year to essentially fix Facebook. It went from, “No, this is not a problem, no problems here, nothing to see here,” to, “I need to fix something.”
LG: That was an acknowledgement.
LG: So what do you think happens from here?
I do think that, as we reported today, there’s a real chance that he could testify in front of Congress now and I think that …
KS: He’s open to it.
He’s “open to it.”
KS: That’s not a yes.
No, it’s not, but I guarantee that they’re all going to ask him now, right?
If he’s open to it.
KS: He doesn’t have a choice if he gets subpoenaed, FYI, he has to.
I think that could happen, I think that’d be a really big deal. I think this FTC investigation could be a big deal. I don’t fully know how realistic it is at this point that they would be regulated more severely, the way we were kind of talking about earlier, but hell, I’m afraid to say that anything is off the table at this point, I think that it’s possible.
I think if you look what’s going to happen in the next week, you’re going to see a lot more about their policy stuff and changes. I think that that’s the immediate plan for them is probably going to be, “Here are all of the things we’re doing to protect your data right now.” It’ll be things like, “We’re going to put News Feed alerts so that you remember to go check and make sure that you’re sharing with the right people, and that you’re severing ties with apps that you maybe used five years ago that you no longer have a relationship with.” I think big picture is that this is not …
KS: It’s not good.
This is far from over.
KS: The stock has gotten really hit because people do intuitively understand this goes to the heart of their business, that’s one of the parts.
KS: The second part is, again, I really like … You like Mark, I like Mark.
I do. I do.
KS: But the slow rolling. I like Sheryl …
LG: Where is Sheryl in all of this?
That’s a better question.
KS: She’s working around the clock.
Yeah. I think that’s a better …
When you think about Sheryl, she built Google’s ad business, or was a huge part in building it, she built Facebook’s ad business, for sure, she’s been there 10 years. What are the two companies right now that are in the middle of this entire ad dilemma? It’s Google and Facebook, right?
She’s very visible on Facebook, but it’s a lot of her “Lean In” stuff, it’s a lot of her philanthropy, and I think there’s a lot of people who would love to hear more from her on this topic.
KS: The only thing I would say, I’m going to push back because today, when I was on CNBC, they were talking about, “Well, why doesn’t Sheryl talk about this?” Mark’s the CEO of this company and he is the founder, he’s the CEO, he’s the technical founder, Sheryl’s not technical, these are technical, highly technical issues. He’s the one that has to talk.
I know they want to bring in Adult Lady, but he’s an adult. He’s an adult man with children, he’s married, he’s been running it for a long time, he’s a very smart man. I talked about this earlier, you’re juvenilizing these Silicon Valley men, “Let’s bring Sheryl to clean up.” She’s absolutely responsible, I a hundred percent agree, but he has to be the face. Just because she’s smoother and talks better, he’s the one, he has the controlling stock, he’s the one responsible, he’s the one that should talk, he’s the one that should take responsibility. It’s fine to have Sheryl, or Chris Cox, who’s head of the platform, or Dan Rose, any of these executives, or the CTO should probably speak, too, but really, it falls to Mark. Mark Zuckerberg wants to be the CEO of Facebook, he has to … Years ago, when he wasn’t being as adult as he was, he had a card that said, “I’m the CEO, bitch,” on his card, which I thought was funny, everyone didn’t like it, I thought it was so funny. But he’s the CEO, bitch.
LG: You want that business card.
KS: Yes, you do.
LG: Has this inspired either of you to reconsider your own Facebook accounts?
KS: I always monitor my security preferences.
No. Yeah, I did actually go through my settings and kind of just poke around since it had been a while, but no. As you pointed out at the very beginning, I kind of know what I got into when I signed up on Facebook. I think I’m also a little bit different in the sense that I write about it all the time. I don’t think it’d be possible for me to do that and not be on Facebook.
LG: You can’t just check out.
I rode from the airport here today and my Lyft driver told me he deleted Facebook.
KS: Oh, wow.
LG: Interesting. Did you ask or did the Lyft driver volunteer that?
No, we were … I’m trying to think. He was asking me what I did and I told him that I wrote about Facebook, and then we started talking about this data scandal, and he was like, “Oh, you know, a few weeks ago I actually deleted my Facebook so I don’t have to deal with any of that anymore.”
LG: That’s really interesting.
KS: I just don’t use it that much. There’s just …
Yeah, I don’t really either, to be honest. I’m much more about … I spend much more time on Instagram than I do Facebook.
KS: Which is a Facebook property.
Correct, but way more time on Twitter, as I’m sure you guys are, given our jobs.
LG: Yeah. I have a professional Facebook page, so I’m not inclined to delete that. My personal one, I think I am using it less, I haven’t done a very sophisticated analysis of my own usage, but I think I am using it a lot less. Yeah, there is an element of it that feels a little bit like “Hotel California,” it’s just very difficult to check out. Some people have brought up …
KS: Can you sing that please?
LG: Yeah. I know, I’m singing on the other podcast, right?
KS: Yeah, you did.
LG: A couple people, reporters, and I don’t want to give credit to the wrong person, have brought up this idea, too: just saying, “Oh, well, just delete your Facebook,” in some markets or in some countries, seems almost impossible.
KS: It’s ridiculous. You should just responsibly run it.
LG: It’s the way people … It’s like the primary way people connect with certain people. It is synonymous with the internet for some people in certain markets.
KS: They also Instagram and WhatsApp, and WhatsApp is an enormous property.
KS: So they’ve got … The overall leadership of this company has to take this privacy seriously. One thing Mark said, I think, that was super interesting was around the “mistakes were made” section of our interview, in which he said, “I made a mistake,” so I appreciated that. The idea that it was built incorrectly at the beginning, which is back in 2007, and especially around privacy. He said he came to realize people did not want their privacy violated, and he just came to realize that.
“Frankly, I think I got it wrong,” he said, in a sentiment that most Silicon Valley moguls are loath to admit. “There was this values tension playing out between the value of data portability, being able to take your data and some social data, the ability to create new experiences on one hand, and privacy on the other hand. I was maybe too idealistic on the side of data portability that would create more good experiences and created some, but I think the clear feedback from our community was that people value privacy a lot more.”
LG: What does that say about the mentality of the people who made Facebook and continue to build Facebook?
KS: Data portability, it means money for them.
LG: Do they just really value this idea of …
KS: No, they don’t.
LG: … openness and data moves freely and things like that.
KS: That’s their word, but you know what? It makes money for them. That’s why. Privacy does not make money for them. Right, Kurt?
Yeah, I think that’s a huge part of it.
KS: Come on.
At that point, Facebook was a venture-backed business trying to rapidly scale and add as many new users as possible. If I’m downloading 10 new apps a month and I’m using my Facebook identity to log into all 10 of those, I’m probably not leaving Facebook. There’s a huge value to them in doing that. I do, though, having spoken — and Kara has talked to more Facebook executives for longer than I have — I do believe that they are drinking the Kool-Aid in terms of that mission, though. They truly believe the whole … You don’t think?
KS: No, I think it’s such bullshit. I think they’re lying to themselves.
I think they believe it.
LG: I would say that lying to themselves …
KS: Of course they believe it, they became billionaires.
LG: … and drinking the Kool-Aid are kind of the same thing.
KS: Yeah, it is, but I think they believe it because they made money on it.
KS: I think, ultimately they pretend they don’t care about money and then they have giant houses and planes. So, I don’t know. I just feel like …
I’m not trying to say it’s not … I guess what I’m saying is, I think that it can be both. I think that it can be a good business and that they can believe in this broader mission of everyone connecting.
KS: Yes, libertarianism. Yes, it’s in that thematic …
It’s that idealistic idea of, “Oh, well, why would anyone ever use Facebook Live to murder somebody?” right?
The rest of the world is like, “Yo, the internet sucks, people do stupid stuff on the internet all the time.”
KS: Well, to me, that’s willful ignorance then.
KS: It’s absolute willful ignorance, pretending the inventions do not have consequences in the real world. You know what? Adults know about consequences.
KS: Maybe my 15 year old doesn’t know about consequences, but certainly, Mark Zuckerberg should.
Right. They don’t foresee a lot of them.
KS: Ultimately, after 10 times of this, it’s like, listen, you don’t have kids, but if my kid did it 10 times, I’d be like, “Okay, he means it,” kind of thing. Anyway, I’m giving a little parenting advice to Kurt.
I know, thank you.
KS: I’m such a scold. I am a scold.
LG: Too embarrassed to ask.
KS: Am I too much of a scold?
KS: I don’t think I … I’ve been banging on this drum for a while. With great power comes great responsibility, which was actually written by Voltaire, even though all of the geeks think it’s Spider-man, but no.
LG: I know, you told Sundar Pichai that during your interview with him.
KS: Yeah. He argued and …
LG: You were like, “Google that.”
KS: All right, we’re here with Recode’s Kurt Wagner and in a minute, we’re going to get through some questions from our readers and listeners about Facebook. But first, we’re going to take a quick break, a word from our sponsors. Lauren?
LG: Hashtag #Money, that thing that’s driving all of this scandal.
KS: Yeah. Kurt, do you want to say it?
Yeah, sure. What am I saying? #Money.
KS: No, that’s not … Come on.
LG: My stomach is growling.
KS: Do it like a sports tag.
KS: Very nice. I think I’ve found my new #Money person. Anyway, we’ll get back to you.
KS: We’re back with Recode reporter Kurt Wagner, talking about what else Facebook … Kurt has done an astonishing job this week because there’s so much news coming out of Facebook. He’s doing tons of stories, kudos all around, but we’re going to answer some of the questions that our readers and listeners have been asking. Lauren, would you read the first question?
LG: Absolutely. The first question is from Two Lamb Fam, who asks via Twitter, “Why does Facebook even give other apps access to all of that user data? Their targeted ads product doesn’t require other companies to own the data. Advertisers just tell Facebook who they want to reach and Facebook serves ads to that, no need to hand over any data. #TooEmbarrassed.” Kurt, is that true? If so, why the access?
Yeah. Well, this is what we’ve kind of been riffing on this whole conversation so far: it was a huge way for Facebook to grow in the early days. If they’re bringing in other apps, Facebook benefited in the way that if I’m logging in through Spotify, maybe I’m posting back to Facebook and saying, “Hey, here’s the song that I’m currently listening to,” right?
KS: It was done to create users.
Create users, create content, create a dependency on Facebook, I’m not going to delete my Facebook account if it’s my login information for every app on my phone. So there’s a lot of different reasons that Facebook saw value in this. I think it’s been more recently, obviously, that they’ve realized, “Oh, maybe this isn’t always the best approach.”
KS: I would agree with that. I think they don’t need to. They needed to grow the company and now, of course, they’ve pulled back because they don’t need them anymore, and so they should control all their data, and then they should protect it. Let’s hope they don’t have a hacking after this.
Right. The question is, how much is already out there, right?
KS: It’s out there. Come on.
KS: Come on. That’s the thing, he was close to saying that.
I know it is out there, I’m saying how much? How many developers have your information, Kara?
KS: I don’t use Facebook that much. What “Likes” do you have? I never “Liked” anything, I hardly put my school in there.
I bet I’d laugh, I haven’t looked at my “Likes” in a long time. I’m pretty sure Rascal Flatts, I “Liked.”
KS: All right.
The show “Friends,” I liked.
KS: That’ll get you fired here at Recode. All right, next question.
LG: Why did the show “Friends” go off the air?
I don’t know, I’m a huge fan.
KS: Such a good show.
LG: Oh my goodness, which “Friends” character do you identify with the most?
Everyone thinks I’m Ross, which is a bummer, because he’s like the worst.
LG: Yeah, but you’re not a Joey.
KS: You’re not a Joey.
LG: Maybe a Chandler.
KS: No, you’re not really a Chandler.
LG: I don’t know. You know what this sounds like? A personality quiz.
Let’s get a BuzzFeed quiz on this, right?
KS: I’m trying to think.
LG: You know what happens with personality quizzes.
KS: You’re not any friend.
LG: Were you going to say he’s a Phoebe?
KS: No, you’re a Phoebe, obviously.
LG: I am a little “Whoo.”
KS: Whatever. I’m sorry, I’m really tired, it’s been a long day, we did a long interview with Mark Zuckerberg, who did a very good job, I thought. The next question: “This only talks about Facebook, the social network. I wonder what Facebook the company does with intimate and more personal details from Instagram and WhatsApp.” I agree, they have other things you may not know they own. Kurt?
Well, they don’t sell it, or they claim they don’t sell it, and I believe them only because I think it’d be very bad business for them to sell your data. They use it, though, to show you targeted ads. Right?
That is the whole point of all of this data that they collect, is that they know that you’re male or female, and you’re in this age group, and you live in this city, and you like “Friends,” or you don’t like “Friends,” or you’re a Joey more than a Chandler. They know that stuff about you, which is why their ad business is so good.
LG: What’s the one ad that you both see consistently on your Instagram?
KS: I don’t go on Facebook.
LG: On your Instagram.
I get a ton of those ads that follow you around the internet. Right now, I’m getting a lot of golf club ads. I like to golf.
KS: Of course you are. I could guess that about you.
LG: But you already own them, right?
Actually, fun fact, and what a waste of ad money, I looked them up online, I went and bought them in a physical retail store, brick and mortar, and there was no way apparently for the online advertiser to know that I’d already made the purchase. So for the last six weeks I’ve been getting golf club ads, and I just laugh every time because I’m like, “I already made this purchase, man, you’re wasting your dollars on me.” Yes, I do like to golf.
LG: Yes. I get followed by …
Most people at Recode make fun of me for that.
LG: The funny thing is that I don’t even like to … I don’t really like to decorate that much, and I’ve been followed by this Parachute Home ad on Instagram for months now, and everything is the same sort of very, I don’t know, southern California aesthetic.
KS: It’s sheets. Brooklinen is our sponsor, but okay.
LG: Is that the same company?
KS: No, they’re not the same.
LG: I don’t know, I just get followed by home décor ads a lot. I must have “Liked” accounts at some point.
KS: You know what I get? I’ll tell you what I get. I read the New York Times every day and I had to finally literally complain to Twitter, and they took it down, I think they went and added a special Kara Swisher squad. I complained to the New York Times CEO, too. It was a shirt company that had a rainbow sort of painted shirt, so that it was like a … It was a really ugly white shirt with gay rainbow splatter, and I was like, “I’m never buying that shirt, stop.”
LG: That was on the New York Times?
KS: Every day. Every five scrolls.
LG: So you think it targeted you?
KS: Some gay thing. Why would I want a rainbow paint-splattered shirt? It was crazy.
So what we’ve learned is you just go complain to Jack Dorsey at Twitter.
KS: I did that, yeah.
Everyone out there, just call up Mark Zuckerberg like Kara would, and tell him that you don’t like the ads you see.
KS: I might go to Sheryl for that.
Okay, call Sheryl.
KS: Yeah. Okay. I’m just saying, they’re irritating. Okay, next one.
LG: Next one is from Fernanda Beltrao, who wrote via email, “My first question is, should we really call it a data breach? No one stole …”
KS: No one calls it that, Fernanda.
LG: She said, “No one stole the data, right? Facebook sold it. Also, is there any way that Facebook can keep control of the data they sell and make sure it’s not used in an unethical way? Or, I don’t know, don’t sell it at all.”
LG: Well, we’re kind of talking about this one.
Yeah, there’s a few things here.
KS: Go ahead, Kurt, just take this one. The answer is yes and yes.
Let’s unpack this question.
One, Facebook doesn’t sell your data, we just talked about that. This was not an issue of Facebook selling data. It was also not a breach because, as Fernanda pointed out, there was no technical hacking, there was no breaking through the firewall into the back systems. Facebook gave this data away to a partner that had used its API, that was all above board, by the books. That partner then gave that data away, which was a violation of the rules. So, as we’ve talked about before, Facebook doesn’t sell your data, it gives it away to certain partners.
Yeah. Really, there’s no way for them to kind of keep tabs on where it goes after the partner has it, and that’s really what the problem is here.
KS: It’s like the clap. Not that I …
KS: Sorry. Oh Kurt, I’m sorry.
No, that’s good. That just caught me … You know. I was in Facebook data mindset.
KS: It just goes. It’s like if you opened a pillow up and spread the feathers, they’re gone.
KS: You can’t get them back, or not all of them.
It’d be a pain.
KS: You can’t, it’s gone. It’s gone. All right, feather pillow was a much better thing than the clap.
All right, Mike Stehle via email, “Does a Facebook user/account holder essentially have access to all their friends’ data? In other words, if I have a lot of friends or followers, can I access their data? Can I voluntarily pass that data to others such as Cambridge Analytica? Couldn’t any Facebook user with lots of friends who is sympathetic to Trump simply consent to Cambridge Analytica accessing the data for their friends?” I don’t know that. Kurt?
LG: That’s interesting.
Yeah. I read this before and I went and tried to do a little looking. Obviously, if you are friends with someone you have certain information that they share with their friends, right?
I would probably share my location, name, school, all of that stuff, with the people I’ve agreed to be friends with. I guess there’s nothing stopping you from going to all of your friend profiles individually, tracking all of that information …
KS: That’s crazy.
Collecting it and then sending it to someone. It seems like a huge hassle and I don’t think any …
KS: You can.
It’s not scalable in the way that any advertiser would want it.
KS: It’s an interesting question. Then, “We know from the recent indictments that 13 Russian individuals/companies set up Facebook accounts, posing as U.S. citizens or groups. Did that give the Russians access to the data of those individuals that followed or friended those fake individuals/groups?” Yeah, I suppose they did. “If so, is there a way for FB to trace to see if those people — the unwitting friends of the Russians — were then targeted in the manner described by Cambridge Analytica?” By the way, I did see a New York Times piece about when they found out they were going to Russian events, they were like, “Yeah, that’s okay.” Like they went to some of the people who got duped …
LG: Who went?
KS: New York Times went to some of the people that got duped into going to events the Russians put together. They’re like, “Well, so what? I still agree.”
LG: Oh geez.
KS: Yeah, exactly. All right, were targeted in the manner … So, what about that?
This question is basically asking, if you’re a page or a publisher, can you get all of the data from the people who follow your page or “Like” your page, in the way that you would get friends’ data. I have a professional Facebook page, Lauren, you do as well, I read this question, so I went onto my professional Facebook page and I tried to see, can I get all the data from my followers?
What I was able to do was see aggregated data. So I could see, for example, that I had 10 followers from Seattle, Washington, or I could see that I had 20 followers in the 18-to-24 age bracket, but I wasn’t able, unless I just missed it, which is possible, but I wasn’t able to go in and look at individual profiles, or collect all of that personal granular data. All I could get was big-picture stuff.
LG: Yeah, but even with aggregate data sets, it’s been shown through data science that you can work backwards and find out at least who people are.
The value of that is, if we’re Recode, which we are, and we go to an advertiser and we say, “Hey, we want to give you a …” What is the kind of ad now? “Native ad, native content, or whatever. Here’s the demographics of our followers.”
LG: Yeah, I have 10 followers in Seattle.
Yeah, they might say, “Oh great, just the person we wanted to reach, we’re going to pay you money for something.” So there is value to it, but I don’t think it’s the same as having the individual granular data of all these users, which is what we’re talking about here with the Cambridge Analytica situation.
KS: Yeah, 100 percent. Just to be clear, we’re here at Vox tonight and we have … I tweeted out the other day Vox sells data, too, but it keeps it anonymized, it’s quite conservative, and we don’t give it out to third parties in the same way. Anyway, you can go read it. I tweeted it out, I’ll retweet it again. All publishers do this, but not in this massive amount and with this much information. We don’t have people’s “Likes,” we don’t have people’s behaviors, we don’t have … Just read the story, essentially.
LG: It’d be good if 2.2 billion people were subscribed to Vox content.
KS: Yeah, that would be great. I sure would have a lot more value … Everybody must wear pink or something like that, I declare it. Okay, so we’re going to take a quick break, one more word from our sponsors, more questions after this. Lauren?
LG: I thought it was Kurt’s turn.
LG: Part two.
KS: Kurt, do that again, please?
LG: Oh my God, you little …
Is that good? That was different. That was part two.
LG: Move over, [Michael] Buble.
That was more country music than I intended.
KS: I liked it. I like it, Kurt, you’re hired. When we get back we’ll ask more questions.
KS: We’re back with Kurt Wagner of Recode who covers Facebook. We just finished a 20-minute interview with Mark Zuckerberg about mistakes were made. Would you call it that, mistakes were made?
Mistakes were made.
KS: Uh-oh, uh-oh, a little bit of uh-oh.
I’m sorry, whoops.
KS: So sorry. Whoops.
KS: Yeah, that kind of stuff. Yeah. He was very thoughtful about a lot of stuff.
I thought he was. He actually answered all the questions.
KS: He did.
And I thought he didn’t really dodge.
KS: He didn’t.
I know he didn’t fully say yes to the …
KS: He didn’t agree with me on every issue.
He didn’t agree, but I thought he actually answered the questions pretty appropriately.
KS: He did. We’re going to put the whole transcript up. I have a different thought about it, he doesn’t agree with everything …
KS: … and us, but that’s okay, he made his case.
LG: Did you ask him about what the possible fallout would be from all of this?
LG: What did he say?
KS: Not good. Right?
Yeah. That was at the very end, he was running to an all-hands meeting with staff. Which was kind of cool, he was literally on the cellphone as he was walking to the all-hands meeting.
KS: Yeah, he could have hung up on us.
You kind of said, “Hey, is this a big deal for Facebook’s legacy?”
KS: How bad?
He was pretty much like, “Uh, yeah, people seem pretty upset about it.”
KS: He gets it.
He wasn’t being naïve about the fact that this is a big story.
KS: Yeah, exactly. He was good. Look, people are still furious. Right now, on Twitter, I’m looking at people just really hating Facebook right now.
LG: What are they saying?
KS: That they stole our stuff, how dare he, he should be in jail, a lot of stuff. A lot of stuff that’s not nice. I feel your pain. I feel your pain, I agree.
LG: Well, yeah, and those are the users. They matter.
KS: They did not responsibly run that platform the way they should have. There’s no two ways about it.
All right, next question, EF something: “What’s the difference between how prior campaigns used Facebook data, especially Obama, and what Cambridge Analytica did?” Kurt?
This is a really good question, so I tried to … I wasn’t actually covering Facebook back when Obama was having his 2012 campaign, but what I gathered, because it’s now been brought back up to the surface in the last couple days, is that they also did something very similar. The difference is that they claim they used their own app. So, whereas Cambridge Analytica got a lot of this information from that professor that we talked about at the very beginning, which violated Facebook’s rules, the Obama campaign is saying, “Well, we actually just created our own app, people opted in, and as a result we were able to gather the information about them and their friend networks to do targeting.”
LG: Was this in 2008 or 2012? I would imagine 2012?
2012 is my understanding, which would have still been before they made the changes, so it makes sense. It doesn’t sound like there was a huge difference in terms of the data that each group had, it just was how the data was collected. So, not saying that Cambridge Analytica necessarily did anything different than Obama did, but they got their data in a different way.
LG: So what you’re saying is that the Obama campaign also knew you “Liked” “Friends.”
They knew I “Liked” “Friends” and Red Robin and Rascal Flatts.
KS: What? What is Red Robin?
These are the things I probably “Liked” on my Facebook profile.
LG: Are the days of innocent … You know what? I don’t even know if campaigns were ever innocent.
KS: We’ve been using data forever.
LG: I was just going to say, data’s been used and manipulated in elections for as long as …
KS: They went to Facebook because it was like, “Why do you rob banks?” “Because that’s where the money is.” Facebook’s where the people are.
LG: I guess my question is, does this really change anything now that we …
KS: It gets worse and worse. It gets worse and worse, as long as these companies don’t take this seriously. There may be a point where they just can’t sell the political … Again, they’ll find a way to get at this data. This is a treasure trove. AI, for example, needs huge data sets to be effective and they have the biggest data sets, them, Google, Amazon, these data sets are valuable beyond their … They’re just money in the bank, so to say.
KS: All right, next question, Lauren, why don’t you read it.
LG: Next question is from Swaroop Satheesh.
KS: Oh, that’s a great name.
LG: “With Facebook scandal and Uber autonomous car getting into a fatal accident” — it was a terrible story — “do you think we’re going to see a paradigm shift in the way tech companies treat data?”
KS: I don’t know if they have anything to do with each other?
LG: Yeah. Data and …
KS: Look, autonomous cars are going to … This is a tragic event, but you’re going to see this happening, hundreds of people die in car accidents every day, they’re all tragic, every one of them. So you’re going to see this as autonomous cars roll out, it’s going to be a lot of … Over the years, any technology has its price. In this case, it was tragic. I don’t know if it has to do with data. That had to do with sensors and the pedestrian.
I think it’s more to do with responsibility, right?
KS: Yeah. Yeah. Good point, Kurt.
It’s less specific to data, more about how we hold these companies accountable. Clearly, Facebook’s going through that now. I haven’t, believe it or not, even been following the Uber thing as much, just because Facebook has been so crazy the last couple days, but I’m sure that Uber will have to hold itself accountable and people will hold it accountable.
KS: Yep, a hundred percent. All right, next question is from Diego Siles, “The president of Bolivia is going for unconstitutional reelection”— that’s his opinion, I don’t know much about Bolivia and politics — “next year and plans to have a social media team.” I’m surprised he doesn’t have one already. “Will Facebook be controlling this for things outside the U.S. considering it’s a market not particularly interesting to them?” That’s an issue, because Facebook has a lot of impact in countries like Indonesia and others, they’ve affected things by fake news.
LG: The Philippines.
KS: They have impact everywhere, and the massive impact they have in these other countries where people rely on them is really … They’ve got a business that’s super complex and super prone to controversy, I think.
KS: That’s a nice way of putting it.
LG: It’s not only about people in certain markets getting their news entirely from Facebook, but it’s the way that certain governments are able to manipulate Facebook data and the messages that are being shown to people in a very undemocratic way. It’s of concern.
I only have one point on this, which is that — I believe it was almost a year ago, right before the French presidential election, Facebook came out and said they banned like 30,000 fake accounts or bot accounts, that they were afraid could try and sway that election. I realize that France is perhaps different than Bolivia, but at the same time, I think Facebook does not want to be a manipulative service in any country or for any election. Obviously, we’ve talked primarily about the U.S. presidential election for the last 18 months or so, but I can guarantee that they’re focused and thinking about stuff that’s happening in other parts of the world, as well.
KS: Right. Absolutely. Next question, Lauren?
LG: Michael Pacholik, “How do I become …”
KS: These names are so good today.
LG: Everyone, Kara likes your name. “How do I become a ghost on Facebook without deactivating my account? Convincing my friends to abandon it with me is a losing battle and we as a group use it to organize events.” So he wants to ghost, but he still kind of wants to use it.
It’s actually not that hard.
KS: It’s not.
You have your account and you don’t post.
LG: And you don’t “Like” things.
Yeah. You can participate in a group conversation without posting publicly. Even if you do want to post, you can change … I think there’s just so many little details about Facebook in terms of privacy that people just aren’t aware of. So, every time you post you can set who can see that post, you can show it to everybody on Facebook, you could show it just to your friends, you could actually eliminate your ex-girlfriend or boyfriend from seeing it. There’s a ton of controls that you have, it’s just a matter of understanding them, knowing where to find them.
If you want to be on Facebook but you don’t want people to really know you’re on Facebook, you can create an account, give them the bare minimum information they would need from you, which is probably just the name and an email, and participate in things like private groups and that’s about it.
LG: There you go.
KS: Yep. Question from Alan Hui? Okay. “What’s the chances of Mark become POTUS now that this storm happened?” I’m not so sure he was going to be POTUS. I never went with that one, I don’t think even Kurt …
Yeah. I wasn’t a believer.
KS: It’s not good, but here we have Trump, so …
Yeah, I was going to say, people seem to forget things pretty quickly.
LG: Yeah, especially when it comes to business deals and interactions.
KS: I don’t know, we’ve got a porn star and a former Playboy model suing the president, so I don’t know. I feel like the data breach should be …
Yeah, I wouldn’t say there’s anything that disqualifies you from being president now.
KS: The data was still …
KS: I don’t know, that’s a good question. I don’t think he’s running. It’s not good. It’s not good. It’s not good for him. He did point this out, he said he’s made mistakes before and he’s going to make them again.
Oh yeah. Honestly, five days ago, six days ago, when this first broke, I was like, “Well, you know, this is certainly a notable story …”
LG: Another Facebook scandal.
“But you know, this’ll be over in 48 hours.” I’ve been pretty blown away by the reaction. You can just tell people are fed up.
KS: The timing. People are tired.
Yeah, people are fed up.
KS: People are fed up, and because of the election, the political part of it, even the idea that they may have even slightly impacted the election — and people will debate how much or how little — that’s really disturbing.
KS: If you need to focus on someone, rather than the Russians, you focus on Facebook.
LG: Well, a lot of times, in any relationship, whether that’s with a product or service or in real life, it’s not the thing that seems to … What’s the saying? The straw that broke the camel’s back?
LG: It might not be the biggest thing, but it might be a buildup of things that have betrayed people’s trust.
KS: You know I’ve been hammering on this, it’s people are getting a very lizard sense that technology might not be for the good. Like self-driving cars, automation, robotics, all these things, AI, I think people understand very clearly in the back of their minds that these things are going to have real consequences.
LG: What you’re describing, too, are also very … Those are technology, and I think what’s happening here is there’s this confluence of events where the culture of technology is meeting with the products and services in a way that people aren’t comfortable with.
KS: It’s the political part.
LG: It’s not just whatever’s going on with Uber, it’s the Uber culture that we’ve heard a lot about, and the evasiveness, and that whole “move fast and break things, and ask for forgiveness later” kind of ethos. When that starts to butt up against the actual … It starts to feel very real to people, when they’re not just reading about those stories in the newspaper, but when it actually impacts the products and services they use every day. When they can see that and it’s a very tangible thing, I think that’s when you have this perfect storm of events.
I would say the last point is that this is a very personal thing for potentially every Facebook user, right?
All the dilemmas and drama and issues we’ve dealt with so far have been pretty big-picture, honestly. Like, fake news, was it used by the Russians to whatever? A lot of that, first of all, apparently half the people don’t care or don’t believe about it, a lot of people aren’t in the U.S. so they don’t care, they don’t believe about it. This is like every single Facebook user has data within Facebook.
LG: Right, or has connected to some third-party app.
So, this is a problem that does not just affect people who are disgruntled about the 2016 presidential election.
This is something that could theoretically impact two billion people who use the service, so I think that’s why we’re seeing even more …
KS: And it’s the political …
I think there’s a political spin, too.
LG: Do you remember when there were websites you could click on where it would tell you if you had read something that was made by some type of fake news/Russian bot, if you went and clicked on it, it would say, “No.” Like in my case, it said, “No, don’t worry, you didn’t read anything that actually came from one of those sources.” You kind of feel like, “It’s okay, I think I have sense on the platform.”
LG: When it comes to … I, at one point, you mentioned Words With Friends earlier.
LG: I connected my account with something that does this data scraping. It really does open it up to so many people.
Right. Well, and not just if you did it, but now we know …
LG: And my friends.
… friends do it.
LG: Right, prior to 2014.
Words With Friends probably got my data.
KS: Yep, a hundred percent. All right, last question, Lauren?
LG: This is via email from Liz Weeks, one of our most loyal listeners, who sent a lot more questions than this, but we’re going to read a couple. “First and foremost, I have a rudimentary understanding of what it means to ‘delete’ data, I just assume even if it’s deleted by me or even a company, it’s still out there somewhere in the ether.” Good point, Liz. “When Facebook promises to delete data once and for all, what precisely do they mean?”
It’s an amazing question because I don’t know if anyone has a super, super strong answer for that. Basically, traditionally, it means if you’re deleting it, you’re wiping it off of a company’s servers. They have these servers where they store messages and posts and videos, so that when you open the app and you say, “Oh, I want to look at that vacation photo I haven’t looked at for two years,” it’s stored somewhere on their server so that you’re able to look it up. If it’s deleted, that means it’s wiped completely off that.
I think the issue here that Liz is getting at is that once the data leaves Facebook’s servers, and once they share it with Words With Friends or Spotify or with Airbnb or whoever it may be, it’s now living in two places. Facebook can only delete it on the servers that it controls. It has to rely on these third parties to also treat it responsibly, take care of it, protect it, and that’s where we’re running into the issue: How many tens of thousands of developers that created an app that was cool for three weeks and then disappeared, how do we know that they were practicing safe user privacy regulation?
LG: Yeah, that’s the issue.
KS: That’s the issue, they didn’t monitor it. It’s monitoring, monitoring, monitoring.
So you should assume that perhaps most of the things you’ve given Facebook probably do exist somewhere out there, even if they’ve deleted it.
LG: Fun. Another question from Liz, “What role does Joseph Chancellor play? I would like to understand if he’s a psychiatrist for Facebook or if he was simply given access to Facebook data. A) What checks does Facebook have on researchers using that data for non-academic purposes and B) do they have any conflicts of interest provisions?”
I tried to look into this a little bit. Joseph Chancellor is the former Cambridge Analytica employee who is now employed by Facebook. I do not know much. My understanding from what I’ve read is that Facebook is now exploring … He’s still there, he’s still employed there, I believe, and Facebook’s now looking into whether there was any wrongdoing from him, was there a connection of some kind, was he helping?
I think it’s very much an innocent-until-proven-guilty kind of thing, because I think this guy could very well have just been an employee there and he’s now an employee at Facebook. That’s pretty much what I know. I assume, especially given the gravity of the situation, that Facebook is looking into that very closely.
KS: Yeah, absolutely. Overall, this is the last question, Kurt, what’s the next story?
I think the big, big story that’s going to be important and also might not happen right away, is what impact all of this has on Facebook’s executive team.
KS: Right. The cohesive team, they always brag about their cohesion.
Yeah. Kara and I have talked about this a ton. Facebook’s executive team is very close, they’ve almost all been there for 10 years or more, a lot of them were the original crew that helped build Facebook from Palo Alto.
KS: You’d say OG.
LG: Who are some of those people?
KS: Not these people.
Yeah, I wouldn’t call them OG.
KS: Original geeks.
What? Say what?
LG: Who are some of these people?
Like Andrew Bosworth, who goes by Bos, Naomi Gleit.
Mike Schroepfer, Sheryl Sandberg.
KS: Dan Rose.
Dan Rose is a good one. I think he’s been there 12 years.
KS: Chris Cox.
KS: It’s more than a dozen years.
Chris Cox is head of product at Facebook, he’s literally like Zuckerberg’s best friend, they travel together, there was like paparazzi photos of them in Hawaii.
KS: It’s a very cohesive … Elliot Schrage is there.
I think the issue is, we just found out this week that their chief security officer is leaving over some disagreements about how to handle all of this stuff.
KS: He wanted more transparency.
I can’t believe he’s …
LG: The whole team, Nicole Perlroth reported that …
KS: Great job on that, New York Times.
Yes, awesome story. I can’t imagine he’s the only one who’s had a disagreement about this internally. I can’t imagine that there aren’t people who … Obviously, Mark is responsible, Sheryl is responsible, but there have got to be other people who have been there a long time who are responsible for what’s going on here.
So, are there going to be more exits? Are there going to be people who are asked to leave because, “Thank you for your service but you’ve screwed up”? Then are there going to be people like Alex Stamos, who say, “Well, we disagree with how things have gone,” or, “This is just too much for us and we’re going to leave.”
LG: How’s the board reacting?
The board gave a statement in support of Mark and Sheryl just a few hours ago, and I actually thought that was really interesting because Mark and Sheryl are both on the board and Mark basically controls the whole …
KS: Controls the board. It’s like a Russian election.
People were making a big deal, they were like, “Oh, the board came out with a statement in support,” and I was like …
KS: Well, he’s won by 76 percent.
KS: What a surprise.
LG: I call a board meeting.
KS: Mark is not like that, but the fact of the matter is, he controls the board, period.
KS: Period, period, period, end of story. If he didn’t … By the way, boards are like … Come on, look at what the Uber board did. This guy practically killed a puppy in front of them, he would have had to do that, I don’t know what he could have done and … He didn’t kill a puppy.
Facebook in particular has been around for a long time. Marc Andreessen is not going to come out and publicly chastise Mark Zuckerberg.
KS: Marc Andreessen is not going to slap around Mark Zuckerberg. Never. Then Peter Thiel, this is not a group of people that are going to object, they’re going to stick together. That’s the issue is the wagon, whatever you do with wagons.
LG: No, what do you do with wagons?
KS: You circle them.
LG: That’s right, okay.
KS: You circle them. So, circling wagons is what Silicon Valley does.
LG: I didn’t know if you were going to say like, the wheels are coming off. I was really wondering where you were going with that.
KS: No, the wheels are coming off some of the wagons, and wobbly wheels, but they’re never going to do this, not Mark Zuckerberg, not. He’s like, no, he’s the top top, do you know what I mean?
KS: Nobody’s going to mess with him. The question is, are they going to do their job? These boards, none of these boards in Silicon Valley — and by the way, across the country, really, come on — I just don’t expect any kind of courage from any of them. As it’s shown over and over again, the Yahoo board, the … Just every … It’s just not going to happen, right?
Yeah, I agree.
KS: They’re going to support him.
I saw the statement and I thought, “Of course.”
KS: I’d like to see one board member saying, “This sucks.”
It would have been way, way, way, way more interesting if they had not shown support for Mark and Sheryl.
KS: Yeah. Getting to what Kurt was talking about, and we will finish on this, is this cohesion. I had a back and forth with Elliot Schrage, who’s the head of policy and comms essentially, he came from Google with Sheryl. I put my hand up, and he’s like, “Oh no.” And I said, “You guys brag about your cohesion, that you all get along.” I said, “Is that a problem? Because there’s nobody, an irritant, in the room.”
LG: Right, it’s a bunch of yes people. Well, not …
KS: Not yes people, that’s too easy. It’s a very different kind of thing, it’s a cohesive mentality of these people that agree …
They believe in their mission you were talking about earlier.
KS: They’re in agreement. They’re in violent agreement. They don’t want to get angry at each other, they’re very cohesive, they’re incredibly smart and everything else. So, there’s nobody like … I was joking with Marc Andreessen and I was texting with him, and I was like, “Put me on the board, that’ll be …” He didn’t respond, but it was really interesting. It’s like, you need irritants in these companies to say, “No,” and that doesn’t happen. Anyway, we’ll see. There’s lots to come, right Kurt?
A lot more.
KS: Get some sleep, Kurt.
I would like to.
KS: Get up early.
I’m doing CNN International at 10 pm tonight.
KS: Fantastic. I’m going to pass you a lot more.
KS: Okay. All right. Thank you, Kurt.
Thanks for having me, guys.
LG: Thank you Kurt. It was really good chatting with you.