[00:00:00]

Episode 27. The trinity of trinities, or three cubed. The 27 Club, you want admittance? Famous people who died at 27 include Kurt Cobain, Amy Winehouse and Jimi Hendrix. The number of letters in the Spanish alphabet. Bienvenidos to my podcast, bitches. Go, go, go. Welcome to the 27th episode of The Prof G Show. On today's episode, we speak with Yael Eisenstat. Yael is a visiting fellow at Cornell Tech's Digital Life Initiative. True story.

[00:00:38]

I taught at Cornell Tech for a semester when they had a floor in the Google campus, and I was fairly underwhelmed. Cornell had this enormous opportunity, incredible positioning: the third university, Roosevelt Island, support of the mayor, a lot of funding. A lot of Cornell alumni stepped up. What went wrong? They held on to the same model, kind of lame tenured professors, and I think the faculty there is fairly underwhelming. I'm sure I'm going to catch shit for that.

[00:01:05]

And I find that their tech offering is somewhat anemic and that they are not commanding the space they occupy.

[00:01:11]

They're a little competitive, competitive juices flowing.

[00:01:14]

But I was actually very excited about Cornell Tech and think they have underwhelmed anyways.

[00:01:19]

Anyways, on today's episode, we're talking with Yael.

[00:01:23]

She works on technology's effects on civil discourse and democracy. She previously served as the elections integrity head for political ads at Facebook. Look at that: elections integrity and Facebook, not words you find in the same sentence often. She's also a former CIA officer and White House advisor. We discussed the damaging role Facebook plays in our elections and the online threats to our democracy. She also worked at Exxon, and in sum, her Wikipedia profile should just say: total badass.

[00:01:54]

She's also having a bit of a moment. She gave a TED Talk that went viral, and she's just in general a very thoughtful person. And something I love about her is that while I don't know her exact political leanings, she comes across as moderate to me. Anyone who works at Exxon is likely not going full woke to get the progressive pedicure, if you will.

[00:02:12]

Anyways, the big news, the big news: ByteDance denied Microsoft's bid for TikTok, which leaves Oracle as the winner. Well, not quite. Oracle would serve as TikTok's trusted technology provider, which means ByteDance is not actually selling TikTok to a US company and therefore holds the reins on the algorithm, or continues to control the algorithm. Microsoft's bid was rejected because it would have taken over this powerful algorithm had the deal gone through. What was Microsoft's biggest mistake?

[00:02:43]

Simple: the same mistake that almost everybody in this country has made, and that is they took the president at his word and thought that ByteDance was actually going to have to sell. And Microsoft proposed actually taking over the company, taking over the algorithm, putting their security in place, having their engineers dictate or control the algorithm. Microsoft has had some success with consumer companies. They have the cash flow. They have the security. They seemed to me to be the likely acquirer.

[00:03:08]

But it appears that holding fundraisers for the president is, in fact, the deciding factor. And that is, Larry Ellison and his president are two of the few that have come out of the closet as Trump supporters.

[00:03:19]

My bet is there are a lot more Fortune 500 CEOs who are going to go into the voting booth and vote red, because I think they mostly vote with their pocketbook and think, whoever it is that can put more money in my pocket, rather than having a lot of faith in government, in our kind of closeted chambers, if you will. Anyways, these two, to their credit, I guess, are fairly out and proud about their support of Trump. And what do you know, the Sequoia- and General Atlantic-backed ByteDance figures out a way to not sell.

[00:03:49]

This is another example of how China has usurped global leadership from the US. We've had ten years pulled forward in ten weeks, and the new geopolitical leader is, in fact, China. We've been played. This is similar to the trade war, where the intention was good: China can't expect their technology companies to have free rein over our markets while meticulously and deliberately kicking all of our technology companies out of mainland China.

[00:04:15]

However, however, going about it as a series of one-offs based on the president's id, or personal biases, or who is throwing fundraisers for him seems, I don't know, seems like we've become fucking Russia. I mean, this is just totally out of control. What happens when China turns around and says: you know, we'd like to crash your markets, and we've decided all supply chain facilities from Apple have to turn over to Huawei within 45 days? Could that not spark a major selloff in the Nasdaq and potentially spark, I don't know, a market crash?

[00:04:48]

What happens when India, Brazil, Canada, Indonesia say, you know, Facebook, your second largest market is in Indonesia.

[00:04:57]

We'd like your hosting to be done with a local provider, or we are going to force a sale within 45 days. This isn't even a sale. This isn't even a sale.

[00:05:05]

What happened here? What happened here? Error number one, Microsoft took the president at his word. Error number two, this was legally unenforceable. You were going to have to get Google and Apple to pull the TikTok app off of their app stores, which would have caused a legal battle, because neither Apple nor Google wants to be forced into taking certain apps off in certain regions. This was never legally enforceable, so it's likely that the legal advisers whispering in Trump's ear said: hey, boss, we recognize you think you're in a reality show where you wake up and deploy this ridiculously bad business judgment thinking you're going to be the hero at the end.

[00:05:40]

But the reality is, legally, you're up Schitt's Creek without a paddle. So maybe if your buddy Ellison comes in and turns it into basically an investment and gets the cloud business, and they seem happy, you can recover from being way too far out in front of your skis, declare victory and leave. ByteDance seems happy. Although the bottom line is, I don't think this deal closes. I think they're going to play wait out the clock, beat the clock, and then there'll be a Biden administration, based on all the polls I'm looking at.

[00:06:11]

And we're going to see if, in fact, ByteDance ends up closing this ridiculous transaction. Another example of how we have passed the baton of global leadership to the Chinese. Another example of how governing by id just doesn't work. Yeah, it makes sense that China should not have free rein in our markets without any sort of reciprocity. But it has to be policies. It has to be things that are enforceable. It has to be certain standards and protocols.

[00:06:37]

The companies need to know the rules they're playing by so they can make appropriate investments. Can you imagine how pissed off Microsoft is? As evidenced by their press release, basically saying: yeah, we were actually going to have security here, we were actually going to do what you wanted, or what you said you wanted to happen. So good luck with that over at Oracle.

[00:06:56]

But what's the insight here? When the dog puts his nose in the air and he smells something in the air and he goes: something's up, something's up. I smell a bear, right? I smell a bear. Or is it that great chicken dinner that mom's making?

[00:07:10]

I don't know where I got that great chicken dinner. Anyways, dogs are very intuitive. What is the intuition here? What is the insight?

[00:07:16]

I spent some time on TikTok, and a little bit of it reminds me: my son turned 13, and it literally feels as if yesterday I dropped him off at preschool and today I came home and he was surfing and was a seventh grader who rolls his eyes and won't kiss me any longer. But that's another story. That's another story.

[00:07:37]

So along those lines, along the lines of time just flying by, I decided to check out TikTok. I'd never been on it. And I went on it last Friday, and I lifted my head and it was Monday. This shit is unbelievably addictive.

[00:07:52]

We're talking MDMA, heroin kind of addiction.

[00:07:56]

Anyways, it got me thinking, it got the dog thinking. And that is, in my head I'm thinking, like when a dog walks into a room and doesn't know why it's in that room: what is so powerful about TikTok, and what does it mean for the rest of us? What's the learning?

[00:08:10]

What it comes down to for me is signal liquidity. Signal liquidity: trademark, hashtag, all rights registered to Prof G. Signal liquidity. The example I always think of is Netflix. And that is, if I'm watching season three, episode four of House of Cards and I watch it all the way through, the AI on the back end of Netflix goes: well, we think, and we're so confident, that Scott's going to like season three, episode five, that we'll begin playing it in three, two, one, without asking him to find his remote and click.

[00:08:42]

Yes, or what have you. That for me is kind of how AI has changed my life, if you will. And with this signal liquidity, there are a couple of things. One, I picked, I clicked, I found House of Cards and I watched it all the way through, and I'm sure there are several other signals there. But with TikTok, what you have is signal liquidity that is just exponential. For every signal that Netflix gets from me to inform their AI algorithms, by virtue of the short-form video that's on TikTok,

[00:09:14]

they get 120 signals, and they get exponentially more than that, based on which videos I watch all the way through, the topics of the videos, what I like, what I comment on. And slowly but surely, slowly but surely, I end up with a stream of videos of chiropractors adjusting people's necks. For some reason I find that fascinating, and I do, and I didn't even know I found that fascinating. That is what is so fucking scary about TikTok and the algorithm here: it seems to know what you want before you know what you want.

[00:09:45]

Anyway, this thing is so good. It takes that signal liquidity and it calibrates in on content that you find fascinating or enjoyable, and you go into a rabbit hole and you look up and boom, it's two hours later. So I think it comes down to signal liquidity.
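To make the signal liquidity point concrete, here is a minimal sketch in Python: each short video throws off several implicit signals (watched to the end, liked, commented on), and even a crude per-topic tally calibrates toward what you actually watch. The signal names, weights and topics below are all invented for illustration; this is a toy model, not TikTok's actual system.

```python
from collections import defaultdict

# Hypothetical weights for the implicit signals a short-video app can observe.
SIGNAL_WEIGHTS = {"watched_through": 1.0, "liked": 2.0, "commented": 3.0}

def update_preferences(prefs, events):
    """Fold a stream of (topic, signal) events into per-topic affinity scores."""
    for topic, signal in events:
        prefs[topic] += SIGNAL_WEIGHTS[signal]
    return prefs

def rank_feed(prefs, candidate_topics):
    """Order candidate videos by the viewer's inferred topic affinity."""
    return sorted(candidate_topics, key=lambda topic: prefs[topic], reverse=True)

prefs = defaultdict(float)
# One session: the viewer watches chiropractor clips to the end twice and
# likes one; watches a single skateboarding clip; never touches cooking.
events = [("chiropractic", "watched_through"),
          ("chiropractic", "watched_through"),
          ("chiropractic", "liked"),
          ("skateboarding", "watched_through")]
update_preferences(prefs, events)
print(rank_feed(prefs, ["cooking", "skateboarding", "chiropractic"]))
# → ['chiropractic', 'skateboarding', 'cooking']
```

The point of the toy is the density, not the math: short clips generate many such events per minute, so even this crude tally converges on your tastes far faster than one watch-through signal per 50-minute episode.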

[00:10:01]

Let's compare and contrast that with another short-form video platform that launched around the same time, or was exposed to Americans around the same time: Quibi. Quibi went for a star mentality and got these famous 60- and 70-year-olds.

[00:10:16]

Right. And by the way, a consumer doesn't care that you produced Shrek. The consumer just really doesn't care, or what you ran. The consumer doesn't care. Not to say they aren't fantastic. But please name a tech firm that has been successful where the founders are in their 60s. That is incredibly ageist, and guess what? Business is ageist, and so is the human brain. Going back to those 27-year-olds who killed themselves with heroin: they were remarkably creative. What happens to the creative brain after 30?

[00:10:43]

Jesus Christ, U2 hasn't written anything in 15 years. Michael Jackson couldn't slip and not spit out a number one song until about the age of 27, and then he stopped, and for the life of him couldn't get a hit. But anyways, anyways, there's something unique about the young brain. There's something unique about young entrepreneurs. Back to Quibi. Why is it not working? One, the founders were too old. Two, not enough signal liquidity. And three, this old notion of overproduced, expensive content. And what is TikTok? More signal liquidity sitting on top of free content that is created by users.

[00:11:20]

And then the genius is that the algorithm begins zeroing in and calibrating on what type of seven-and-eight production value content you absolutely love, whether it's Labradors on skateboards, versus hoping that you can spend three or five or seven million dollars on Cesar the dog whisperer doing a series on dogs that embrace extreme sports.

[00:11:46]

And we're going to find a way to take all of that content and begin to slice and dice it, with the use of the signal liquidity and this algorithm, to get you to the seven-or-eight content that you love, or that hits your sensors for whatever reason, whatever those sensors might be, versus trying to find the 0.01 percent of content they're going to put their money behind and try to get to an eight or nine, and trust that it tickles those sensors.

[00:12:10]

Or put another way, the new forward-looking platforms are more about inspiring low-cost content, figuring out a way to get dramatically more of it, and then building the signal liquidity such that you can get to the seven-or-eight-out-of-ten content that is more relevant to you, versus nine-or-ten quality content that may or may not appeal to you. And this blows my fucking mind.

[00:12:37]

Why is it dangerous? Because with this type of signal liquidity, with this type of algorithm, there's someone on the other end. There's always a human on the other end of the algorithm. There's a human on the other end of the supposedly benign algorithms at Facebook saying: we don't give a shit about the health of the commonwealth or teen depression, we just want the algorithms to figure out a way to get more engagement. And then the algorithms figure out that the ultimate way to get engagement is enragement.

[00:13:01]

And then, when Facebook executives catch shit because their enragement and tearing of the fabric of society and depressing teens is bad for us, they decide to protect the algorithms and come up with bullshit like "we don't want to be arbiters of truth" or "we don't want to be in the business of determining what's right and what's wrong," such that they can let the algorithms become the fucking antichrists of technology. But what could happen here? What could happen here?

[00:13:23]

Someone making the same kind of decision on the front end of the design of these algorithms could say: all right, I want Biden to lose. I see Trump as being more favorable for my interests because he is tearing apart America, because the pandemic will continue to rage on, because he will likely turn America into a shit show: virus-ravaged, polarized, extreme, a society that will literally begin to collapse under its own self-indulgence, weight, narcissism, lies, conspiracy theories. All right.

[00:13:53]

That's what I want. Now, how could I get the algorithms in TikTok to play a role in that? Simple: I'm going to start sending you content that appeals to you that undermines the credibility of the Biden campaign. Now, it might be humorous videos. It might be videos of Trump rallies that are appealing to me. It might be videos around the economy. It might be misogynist videos about Kamala Harris, or videos that are racist or subtly racist.

[00:14:24]

If there is such a thing as subtle racism. But slowly but surely, I start calibrating in on the soft tissue around individuals' biases. Which receptors are most open to these signals that undermine the credibility? Or maybe we don't even go there. Maybe the signals immediately start telling the algorithm, informing the algorithm: hey, it's hard to get these people off of Biden. You're never going to get them to Trump. I know.

[00:14:48]

Let's tell the algorithm to recognize that and immediately go to discouraging them: extremist positions from both sides, conspiracy theories, misinformation around voting. Let's just suppress the vote. Let's just get people so fed up, so confused. Let's muddy the waters such that there's zero visibility, such that come Election Day, in the areas that lean Biden, we're going to confuse them and discourage them and suppress the vote. That's what these algorithms could do with their signal liquidity and with their massive amounts of content, such that they could begin zeroing in and slicing the cheese so finely, so finely,

[00:15:25]

that they get you the perfect type of cheese, because there are more flavors, and their ability to slice it and test it and test it over and over gets you to the exact cheese that you cannot stop eating, my brother, you cheese-eating weirdo. Anyways.

[00:15:40]

TikTok, TikTok. To summarize: this was not a sale. This was a refinancing at an inflated valuation that includes a big cloud contract for Oracle. The president got over his skis, was legally out on a limb here, and is going to declare victory and move on. I think it's more likely than not this deal will not close. And tick tock, tick tock: signal liquidity and algorithms are dangerous, and we should be concerned.

[00:16:11]

We'll be right back. Small businesses have unique needs, specifically survival. And despite the current uncertainty, one thing that remains unchanged is the importance of having the right people on your team. The team with the best players wins. Anyways, when your business is ready to make that next hire,

[00:16:30]

LinkedIn Jobs can help you by matching your role with qualified candidates so that you can find the right person quickly. LinkedIn is an active community of professionals with more than 690 million members worldwide.

[00:16:42]

Well, that's obviously not just here. There's only 350 million people here. It's just us and Canada and Mongolia.

[00:16:47]

No, it's worldwide. LinkedIn Jobs puts your job post in front of qualified members every day, so that it's seen by people looking for jobs like yours. It's easy to use and helps you organize your candidates all in one place so that you can prioritize your time and energy where you choose. That is redundant; that is what it means to prioritize. Anyways, when your business is ready to make that next hire, find the right person with LinkedIn Jobs. You can pay what you want and get the first fifty dollars off.

[00:17:12]

Just visit LinkedIn.com/prof. Again, that's LinkedIn.com/prof to get fifty dollars off your first job post. Terms and conditions apply.

[00:17:36]

Welcome back. Here's our conversation with Yael Eisenstat, a visiting fellow at Cornell Tech's Digital Life Initiative, where she works on technology's effects on civil discourse and democracy. Yael also served as the elections integrity head for political ads at Facebook back in 2018, and has a really impressive background in the national security sector, including stints as an advisor to the White House and with the Central Intelligence Agency.

[00:18:01]

Yael, where does this podcast find you?

[00:18:04]

I am sitting in my apartment in New York City. You have also taken the world by storm. After 30 years, or 20 years, of good work, you're sort of an overnight success. I keep seeing your name everywhere. It was a coup to get you. So, first off, let's just start with this: you were the global head of elections integrity for political advertising at Facebook. Isn't that an oxymoron?

[00:18:31]

One could say it is a bit, yes. But when they reached out to me, they offered me that title. I said: don't hire me if you don't mean it. And so, yeah, that's what they said they were hiring me for. And did they mean it? They did not. In my case, they did not mean it. And just to be really blunt: I came in with this sort of mandate, according to the recruiters and everyone I spoke to, of building a new team, very shortly after the Cambridge Analytica scandal became public.

[00:19:04]

And I mean, the reality is, I came in to do this, and on the second day they changed my title and job description. So I guess they didn't really mean it; on the second day they can't say I screwed up. So, I mean, yes, there is a contradiction in a job like that.

[00:19:21]

Was your job to actually try and figure out how to ensure that there is some integrity, given that bad actors have weaponized the platform? Is it really about integrity of the platform as it relates to elections, or is it to get more money from political advertisers, or is it to create a veneer of security? What did success look like for you, in their eyes, at Facebook?

[00:19:49]

Let's just talk about political advertising for a second, because, as I'm sure you're aware, I actually don't think political advertising is the biggest problem on the platform. But for this particular role, there were some legitimate integrity efforts they were trying, such as: let's make sure that Russians can't pay in rubles to buy ads on our platform. I think for the foreign interference part, it's pretty clear what the mission should be in terms of cleaning that up and making sure it doesn't happen again.

[00:20:19]

And it's a lot less politically risky for a company like Facebook to try to figure out how to not let Russian actors exploit the platform through political advertising. The trickier questions, though, come from a much broader lens. I am looking at this in terms of: how are you affecting our democracy? And that includes domestic actors. That includes a whole bunch of things that get much more politically tricky for the company. In my experience, anyway, there was no appetite for me to go deeper than the sort of reactionary moment. They were building that ad library.

[00:20:57]

They were putting out new requirements for how to verify political advertisers. Those were very sort of reactionary tech responses. They weren't asking the questions of how we are affecting elections in general and what we can do to protect against that.

[00:21:13]

And you said that advertising wasn't the most dangerous thing about Facebook's role in elections. What is the most dangerous thing?

[00:21:20]

So, I mean, it clearly played an important role, especially in 2016. And I know a lot of people's talking point likes to be: well, the Russians only spent this much money on ads, and therefore it wasn't a big deal. Let's be really clear, though. They might not have spent a lot of money on ads, but they used them, and some of this stuff is a black box. Some of it we'll never know, because there's just not transparency at a company like Facebook.

[00:21:44]

But to say they had no impact through ads is not true. I mean, they got to use those sophisticated targeting tools. I would not say it is not important. That said, at this point, you have a platform that is fundamentally successful because it has succeeded in using our human behavioral data to then try to persuade us, whether it's to buy Nikes instead of Adidas, whatever it is. Ultimately, it's a persuasion machine trying to get us to do something: to be on their platform, to engage more, to buy that, or to look at the ads, maybe not buy, but click on the ads they want us to click on.

[00:22:23]

But what does that do to political speech, to how we think about political rhetoric, how we think about truth versus fiction, how we think about how we even consume information? These are the things I am much more concerned about. I really don't care if you show me a Nike ad versus an Adidas ad. I do care how you are affecting my ability to discern truth from fiction, to even understand the information environment at all anymore. And yes, bad actors are going to exploit the hell out of that.

[00:22:56]

I mean, I'll just say, I know you've been asked this, but the idea that they can continue to say nobody could have seen this coming, when it comes to what the Russians, for example, did in 2016: maybe nobody at Facebook could have, but I guarantee you people who worked on Soviet Union information operations, propaganda, Cold War stuff, could have seen it coming if they had understood how Facebook worked.

[00:23:26]

Is it Facebook's fault that there are bad actors out there? No. Is it Facebook's fault that they are more concerned with growing and dominating the entire world's information ecosystem than with figuring out how to not enable those bad actors and provide them tools to disrupt our democracy? Yeah. That's where I put the blame at their feet.

[00:23:47]

Isn't the danger, or one of the dangers, a bigger danger than the advertising itself, that the algorithms promote content that takes you one way or the other, or toward the absurd, or muddies the water, or just discourages you from turning out and voting? It's the actual content and the algorithms' promotion of certain types of content. This whole freedom of speech debate is not the real threat, in my opinion.

[00:24:16]

Absolutely. And it completely contradicts the freedom of speech argument, or the mirror-to-society argument, whichever you want to give. I'm not going to give the whole attention economy speech; you've heard that before. But at the end of the day, this is how they make money. They make money by keeping us engaged, and there are plenty of people who can speak about what that means. But in terms of elections and political speech, it means that the algorithms steer us. They steer us towards what content we view.

[00:24:50]

They steer advertisers towards what content they target us with. They recommend groups to us. This is not just me going on Facebook and seeing exactly what all of my friends posted on any given day. And the most dangerous part of this is really when it comes to things like voter suppression, when it comes to things like completely destroying the public's trust in our election system to begin with. But to be clear, this isn't something that just suddenly happened; it's been there from day one.

[00:25:23]

Their product was built to steer us towards certain kinds of conversations. And so, I mean, I first got into this five years ago, when I was looking at what was causing the breakdown in civil discourse here. And that was well before we were talking about whether or not the Russians manipulated the platform for the elections. So I just think there is something fundamentally unhealthy in the way this platform has been built, the way it's monetized. I mean, I'm a public servant at heart.

[00:25:55]

I spent most of my life in the national security world. I question the idea that the world is a better place if we have frictionless virality and everybody can immediately, in less than half a second flat, boost their message out to the entire world, no matter how damaging that message is, and the algorithms can do whatever they want and amplify it and spread it without anyone asking: just slow down for a second. Is this something that's worth being amplified?

[00:26:24]

Is it? I'm not saying take it down, but is it worth being boosted and spread and targeted? Those are the questions I want to get at.

[00:26:33]

Yeah, it feels like there's a difference between frictionlessness and accelerants being poured on things that might be damaging to the commonwealth, right? I don't think people are arguing that we should shut down anti-vaxxers. But if anti-vaxxers represent X percent of the population, and the algorithm recognizes that they create a ton of controversy and more engagement and more ads, should they get 10x the amount of oxygen they would naturally get at a representative cocktail party?

[00:27:01]

If you had a cocktail party that consisted of the population of America, would you let the anti-vaxxers, the white supremacists, stand on a table and just dominate all the conversations?

[00:27:13]

Isn't it that, for some reason, these algorithms have decided that what's good for advertising just happens to be the most inflammatory, damaging things?

[00:27:22]

Right. So, you know, there are a lot of people who love to use these arguments that algorithms are neutral; some people will even say we don't even know what the algorithms are doing. But there's a critical step they're missing there: at the outset, you decided to set a goal for your algorithm. You decided to train your algorithms to ensure maximum engagement on your platform. And the algorithms have figured out that the way to keep us engaged is to feed us the most salacious content.

[00:27:54]

I know some people don't like the term clickbait, but it is the most clickbaity, salacious content that wins. It is human nature: if you are offered two things to look at, and one is super exciting and salacious ("oh my gosh, just click here, you won't believe what you see"), that is going to engage more people than something super wonky like "according to these three sources, this is what I found about X, Y or Z today."

[00:28:18]

And so that's what's happening. And now Facebook is in this whack-a-mole, reactionary, responsive stance of: but we're taking this down, and we're taking that down. But they never, ever talk about: OK, maybe the problem here is actually how our algorithms are delivering content, or connecting people, or recommending groups. I mean, I have lots of examples that just remain black boxes. We'll never get the answers to them, because there's no transparency around how these things work at that company.
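The point that the goal is chosen at the outset can be made concrete with a toy ranker. The post titles, predicted engagement rates and the "civic_value" field below are invented for illustration; the sketch just shows that when predicted engagement is the only term in the scoring function, the clickbait item wins by construction, and no after-the-fact moderation changes that objective.

```python
# Hypothetical candidate posts with a predicted engagement rate
# (e.g. from a click model) and an unmodeled "civic value".
posts = [
    {"title": "You won't believe what you see!", "p_engage": 0.12, "civic_value": 0.1},
    {"title": "What three sources say about X",  "p_engage": 0.02, "civic_value": 0.9},
]

def rank_by_engagement(posts):
    # The design choice happens on this line: the objective is engagement,
    # full stop. civic_value never enters the score, so it cannot
    # influence what the feed shows first.
    return sorted(posts, key=lambda p: p["p_engage"], reverse=True)

top = rank_by_engagement(posts)[0]
print(top["title"])  # → You won't believe what you see!
```

Swapping the key function for one that also weights civic value would be a different product decision made at the same place: the front end of the design, not the black box.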

[00:28:55]

So there are few people who would be more qualified than you to discern whether I am being paranoid or whether I have insight around this issue.

[00:29:03]

I'll outline a scenario, and you tell me where I land: neurotic, paranoid, or common sense.

[00:29:09]

If I were working for the GRU and I went to Putin and said: OK, you can spend four billion dollars on a new nuclear-class aircraft carrier, or you can give me five hundred million, and I'm going to identify the fifty thousand most influential people in the US who tend to have what I'll call anti-Russia tendencies, a talk track, a narrative that's anti-Russian, and I am going to deploy, using content farms and humans and crawlers, an army of people and technology to do a couple of things.

[00:29:45]

One, undermine their credibility any time they bring up Russia, any time they talk about Trump, who is perceived as pro-Russian, versus an anti-Russian candidate. I am going to weigh in on Twitter, on Facebook, and in a thoughtful way say: hey, Scott, I love your stuff, but every time you bring up Russia, you get it wrong. Or, whenever there's an opportunity,

[00:30:06]

when you say something provocative, I'm going to weigh in and try to pick a fight and turn your Twitter feed into a cesspool of anger, so that people just turn off: they turn off you, they turn off your ideas. I think that would be a really smart allocation of capital for a foreign government versus investing in traditional armaments. And as a result, I feel as if there are bad actors on my platform every hour of every day.

[00:30:31]

Am I paranoid or is that common sense?

[00:30:33]

God, I love this question so much. I mean, you're definitely a little paranoid, but it's also common sense. Doesn't mean I'm wrong, right?

[00:30:41]

Oh, listen, I'm going to move away from my elections hat here and just say: I did spend my life in the national security world. I recommend everybody go look at the video of the KGB defector Yuri Bezmenov from the 80s. He did an entire interview about the Soviet Union's plan to demoralize America. And he completely lays this out: it's going to take a generation or two to just inundate you with so much information that you don't know what to trust.

[00:31:12]

You don't know what to believe. It kind of lays out what their grand strategy is. What you're saying, can I say if that's exactly what's happening or not? No. But think about it. Russia gets to play a much larger role than they actually have the military capabilities or the economy to do, because this is such a cheap operation that just requires this true vision of Russia exporting its philosophy to the world. That does not include tanks, that does not include drones.

[00:31:46]

I mean, technologically it is maybe not the lowest lift. It is sophisticated, but it's inexpensive, and it's completely consistent with what someone like Vladimir Putin's ultimate goals have been for a very long time. So, I mean, I can't tell you your paranoia is misplaced. You're right. This is, in part, exactly what they were doing. And we had this example. I hope I'm not going to get it wrong.

[00:32:14]

I'm working from memory here, but just a week or two ago Facebook talked about how the Russians were at it again, and they were paying Americans, who didn't even know they were being paid by Russians, to actually create the content now. And they will continue to shift. It's not like Facebook created an ad library, so now the Russians aren't going to try anymore. And what really concerns me is this: do I think that some of the cybersecurity experts are great at what they do? Sure.

[00:32:47]

And the FBI is also watching this now. And Facebook doesn't want to be caught having the Russians overtly manipulate our elections again. But at the end of the day, you do have a platform with tools that can be used in very dangerous ways. And those tools still exist and they've never been regulated. They are a free-for-all, this idea that more tech is going to solve it all. And, I mean, I don't want to overly focus on Russia.

[00:33:15]

I think we have just as many bad actors in the US right now. But on the Russia question, this is not new. This is an age-old ideological battle, and they now have a much less expensive way to wage it.

[00:33:28]

And what if they bring you back to Facebook and they say: OK, our number one priority, our number one stakeholder, is the health and well-being of the Commonwealth, not shareholder value. What would you recommend, or what would you have them do? First of all, despite everything, I have looked at this from every angle possible. I do not think it should be up to Facebook to fix it. I think they should fix certain things, for sure.

[00:33:52]

But the idea that this industry, this many years on, is still one hundred percent unregulated (that's an exaggeration, but it is largely unregulated), and if you ever talk about some of the ways that we should impose responsibility on these companies, it's all "free speech," it's all "you're trying to curb free speech." They go into these absolutist, completely binary arguments. It's frustrating. So first of all, I would absolutely define responsibility for this industry.

[00:34:22]

But for the company itself, I mean, I would first and foremost say: you have to change your entire business model. You have to figure out how to monetize your platform without using my human behavioral data against me. I don't want to have to click through twenty-seven things to figure out what my security profile is and what happens with my data. No, you should ask me before you do anything with my data.

[00:34:53]

So first and foremost, they have to change their business model, and that's something they'll never do. Why would they? That's, this is your area more than mine: the markets keep rewarding them. No matter how much I might scream from a rooftop, they're not breaking the laws, and the market keeps rewarding them.

[00:35:12]

Yeah, it feels as if Netflix hasn't been weaponized because it's subscription. If you think about it in terms of tobacco, social media is nicotine. It's addictive, but in and of itself nicotine doesn't give you cancer. It's the tobacco, the delivery mechanism. And in this case, it's the ad-supported business model that is really the stuff that gets you sick. Right. So while I agree it has to come down to a change in the business model, doesn't it also have to come down to what I refer to as the idea of deterrence?

[00:35:42]

And I'll use an extreme example.

[00:35:45]

The Rosenbergs conspired with the Russians. We decided to take a mother and a father and execute them.

[00:35:52]

And when social media platforms make so much effort to delay and obfuscate what is actually going on on their platforms because it might reflect on them in a bad light, even if it means delaying action against bad actors, at what point does that negligence become criminal? And do you think, outside of a total change in the business model, this is going to get better unless the deterrence becomes stronger? I'm not talking about a five billion dollar fine. I'm talking about a fifty billion dollar fine.

[00:36:23]

I'm talking about a perp walk.

[00:36:24]

Does this ever get better without substantially increased downside?

[00:36:29]

So I actually don't think it does. And that's from someone who is somewhat optimistic; I wouldn't keep fighting this hard if I wasn't. I hate that I'm coming to the conclusion that I don't know if it is fixable. Listen, first and foremost, I do not believe it is fixable under its current leadership. I do not think it is fixable no matter how much society, civil rights leaders, academics, journalists push. I know that some people love to say that because I am not a computer scientist and I am not a lawyer, what right do I have to think that I should be part of this debate?

[00:37:03]

Because I am a member of the public. I am someone who spent my whole life fighting for democracy, and I am a consumer of your product. And I fundamentally do not think that Mark Zuckerberg will ever be persuaded. He made a very intentional choice to grow at all costs, to scale at all costs, to dominate. He uses the word dominate: to dominate the world's social media landscape. And I don't know that there's any change in him, because we've made it very clear there's no way to punish him.

[00:37:34]

And part of the reason there's no way to punish him is because we've never actually created laws that apply to these companies. Our Internet laws, I mean, they were written in the nineties. And so I don't think it's fixable without actual strong regulation. I do think there's a way to do it without having to destroy the whole company. I mean, can I give you an example of a world that I would like to see?

[00:38:06]

Let's use a real example. Let's use this boogaloo example from May. We had the situation where there were two men who met in a Facebook group. For anyone who doesn't know, boogaloo is the movement that's basically advocating for a civil war. So they meet in a Facebook group. They sketch out their plans, according to court documents, I believe, using Messenger. Then they go and meet in person for the first time, and they go and kill a federal security guard in Oakland.

[00:38:36]

They're exploiting the Black Lives Matter protests and they end up killing a federal officer. And so there are like three different layers here. The first question is: should Facebook bear responsibility for the fact that there are boogaloo groups and content on their platform? And maybe not; you could argue both sides of that. You could argue free speech. I'm not going to actually weigh in on an answer on that. But then you go to the next level.

[00:39:00]

Well, what about this: should Facebook bear responsibility for the fact that these two men met in a Facebook boogaloo group on their platform? And the only way we could answer that question is if we knew whether those two men actually went online and searched for that group, or whether Facebook's recommendation engine recommended that group to those two guys and they met because Facebook's platform actually steered them towards a group that they weren't looking for to begin with. And, you know, you and I will never know the answer to that question, even if the widow of the officer who was killed in Oakland decided to try to take Facebook to court over this.

[00:39:38]

I suspect it would be thrown out based on Section 230. We would never get to the discovery process. We would never be able to find out if those two men were connected using Facebook tools, as opposed to whether this was their intent to begin with. And because we'll never know, nobody will ever be held accountable. Why shouldn't we at least get to the point of being able to find out that piece of information? What about Twitter? My sense of Jack Dorsey, despite the nose ring, despite the silent retreats: simply put, he doesn't give a shit. And I see so much rage and hate on Twitter.

[00:40:21]

And I don't know if you listened to The Daily, the New York Times podcast, where they asked what he's doing to try and prevent this, and the best he could come up with was this feature they were testing in Canada that would prompt you to say: do you really want to send this article? You haven't read it yet. I mean, that's kind of the sum of their efforts so far. And granted, they decided to stop taking political advertising.

[00:40:42]

That was a pretty big, pretty easy gift for them because they were making no money there.

[00:40:46]

My sense is Jack Dorsey and Twitter are just as bad, their complexion is just as irresponsible. It's just that their negligence, their delay and obfuscation, their lack of regard for the Commonwealth doesn't sit on as big a platform. But it's just as toxic.

[00:41:05]

So it's interesting. I mean, I've heard you talk about Twitter before, and I'm going to start out with, to be frank: I struggle with this one. I don't have direct experience. I mean, on the surface, it definitely looks like Twitter is trying at least harder than Facebook is. And from my own experiences of trying to push for some of the things we care about, and working with people at Twitter, they are at least more open to trying to figure out and grapple with: what have we become, and what do we need to change now?

[00:41:40]

Still, a lot of it is still the sort of Band-Aid, whack-a-mole, reactionary fixes. But it's funny, I listened to that New York Times Daily podcast, and I very, very rarely tweet about this kind of stuff, but I did a whole Twitter thread about: come on, Jack, it's so frustrating, you couldn't even answer these questions. But I do think he's more thoughtful. I think he is more open to at least admitting that maybe some of his ideas are not perfect.

[00:42:14]

I think some of the people who work for him seem pretty dedicated, at least in the election space, for example, to trying to take stronger stances. Whereas Facebook is immovable; they are never going to change their ideology that more speech is better, that it counters bad speech. They're never going to change all of their ideologies that drive me insane. That said, I don't know enough about Twitter to know if I completely agree with you.

[00:42:40]

I struggle with this one, to be honest.

[00:42:42]

But let me ask you this.

[00:42:44]

There's being thoughtful and responsive, and then there's speaking in slow, hushed tones to give you the impression I'm thoughtful, when, if you listen to what I say, I'm not saying or doing anything.

[00:42:59]

And so let's say we write off Facebook; we know what we can expect there. Twitter, less bad. Where do you put Google on the spectrum?

[00:43:11]

So Google is another one where it's a little bit mixed. I mean, there are lots of things I would be very strong about, how I feel about YouTube especially. I mean, you can just listen to Rabbit Hole if you want a podcast that goes into it. But I am not as much in the space of Google's dominance in ads; I'm not an antitrust expert. I do think they have done some work to clean up some things, like auto-populating search results.

[00:43:42]

I mean, the fact that you used to be able to start typing a phrase about Jews and it would auto-populate all the most horrible things you could possibly imagine. They've cleaned some of that up. YouTube is what I'm focused on more, because it's that same model, right? It's the engagement model. It'll be recommending videos to you. It's that sort of mirror to society, and yet they're using information they've tracked about me all over the Internet to try to persuade me to watch something.

[00:44:13]

And the only way that I am going to stay on and click the next thing is if it is a little more exciting than the thing I watched before it. Otherwise, why would I click? So the next thing is more extreme, more titillating. So that part of Google, the YouTube business model, again, that is more my expertise. In terms of the other things about Google, are they too big? I don't have the answers to that.

[00:44:38]

But one more thing, going back to Twitter. Yeah. On the answer that Jack gave, I try to give him the benefit of the doubt, because I do know he's made some changes that I think are interesting, and they've taken a stronger stance, to be frank, on speech, including from our current president, than Facebook has. And so at least I give them credit for that. But in that interview, at the end of the day, he still implied, pretty frankly, that growth is the solution.

[00:45:07]

More voices will lead to a better society. And if, this far in, that is still the unbelievable end-all-be-all response from Silicon Valley, more voices... I mean, more voices at the table certainly makes for a more robust democracy. I'm all for it. But more voices with no guardrails and no rules, and without fixing any of the ways your platform has been weaponized to spread hatred and division and all of that, is not the solution.

[00:45:39]

And when you look forward: OK, so 2016, the election interference on Google, Twitter, Facebook, versus 2020. Is it the same, better or worse? Outside interference, you mean? In terms of foreign interference, or any bad actors, internal, whatever, trying to use these platforms? I would say, well, it's interesting, right? If it's citizens trying to influence the outcome, those are two different things.

[00:46:07]

So I would say bad actors, bad, bad, bad actors.

[00:46:10]

OK, so I think we do have a better grasp in terms of the foreign interference angle. I also am encouraged to see that there seems to be collaboration between government and the platforms. The recent takedown was because of an FBI tip to Facebook, according to everything I've seen in the news. So that's good. I think we have a better handle; I think the platforms don't want to be caught again allowing foreign actors to severely intervene. It doesn't mean it's perfect, and there's still a long way to go. At the end of the day, still, nobody's ever been held accountable for any of that.

[00:46:43]

So that's another conversation. But I think we've solved a lot for the threat of 2016. I don't think we've solved for the threat of 2020. And while foreign actors are still a very important threat, I think we don't have a grasp of the domestic actors, whether it's people who truly are spreading chaos for profit or doing it for all sorts of other reasons.

[00:47:10]

We haven't really had any look at the coordinated, inauthentic behavior that happens on the domestic side on a platform like Facebook. The fact that they've allowed it as long as they have, because it was politically difficult: it is a politically difficult decision to tackle domestic bad actors when you need to stay on the right side of the administration in power. And also think about it: if you're a smart company, you know this. If you're a smart company, you want to play both sides of the fence, because if you have a long game, you need both sides of the fence to not regulate you.

[00:47:48]

Right. You don't know who's going to win next. So, tackling the domestic bad actors, the coordinated, inauthentic behavior. And let's just be frank; before I'm accused of being a liberal for saying this, maybe just consider that I'm saying this because I'm looking at facts. There is a movement on the far right that Facebook is not tackling strongly enough, because it is politically complicated for them to do so, because they've cozied up to Trump and said: we'll leave these folks alone if you leave us alone. From an outsider's standpoint, it feels like there's this unholy alliance between Trump and Facebook.

[00:48:27]

So I cannot confirm there's any sort of actual overt, unholy alliance. But there's also, you know, this effort to play this both-sides-ism thing. Right. Like: we're going to put labels on all posts about voting. Now, that is the most politically convenient decision, because all that's doing is saying, I'm not going to make any judgment on who's being bad here. And at the end of the day, do I think he actually has some sort of deal with Trump?

[00:49:00]

Some "I'm going to sway the election in your direction" deal? I certainly hope not. The bigger question is, why does one man even have the power to sway an entire election? That's a bigger question. But to get back to your question, I'm rambling now: conservatives have been very, very good at mastering this talking point that there's an anti-conservative bias at Facebook. I personally never saw it. In fact, the only time I was very surprised by a decision we made about an appeal on certain content, it definitely went the conservative way, not the other way.

[00:49:37]

But they're very good at mastering the talking points. So there's the public pressure now, where Zuckerberg can say: anything I do, somebody will be unhappy. But the way he handled, let's just be frank, the way he handled Trump's posts about the looting and shooting, very blatant attempts to lie about voting procedures, that's the line that has been crossed. That really makes me say: you really don't want to be on the wrong side of this administration.

[00:50:09]

You must have your reasons, because otherwise you would enforce your policies evenly, as opposed to saying, I enforce my policies against everybody except for the president. So what are we missing as we tend to look at the dumpster fire that is the elections? It's always the stuff you're not watching that tends to jump up and bite you. What are you worried about?

[00:50:32]

So I'm really worried about what happens after November 3rd. And you are starting to see some people talk about that. Even Zuckerberg actually made mention of it in his post a week or two ago. But my biggest concern now, in addition to everything leading up to the election, is just this: picture that November 3rd hits, and we've already been inundated with every reason not to trust this election. All the chaos that's been spread, whether by the president, by bad actors, or by foreign actors.

[00:51:03]

And then we come to a situation where, let's say, for example, more people from one side voted in person, because that is what their party has been really promoting, and more people in the Biden camp voted by mail, because that's a lot of what we heard on the left. And on November 3rd, Trump declares victory, and he starts declaring victory on Facebook, on Twitter, on every social media platform. And exit polls start to show that, and the media starts to talk about it.

[00:51:29]

And then, as the votes start getting counted, as they start coming in slowly, slowly, the numbers start to shift. And what is he going to do then? He's going to immediately claim: see, I told you they would steal the election. And my biggest concern is not even about the chaos that all of that is going to bring. But we're in a very volatile time, with covid, with the pandemic, with social justice at the forefront of so much, with fires in California. There's so much volatility.

[00:51:57]

We're so anxious. And, you know, Facebook definitely contributes to a lot of that by allowing all of the salacious content to constantly flood our feeds and by not ensuring that only trusted content about elections is allowed into our feeds. And he starts dog whistling to his supporters to get out in the streets. And I'm just really concerned, because of all the disinformation that has been allowed to spread on these platforms about the election, about the period from November 3rd until we actually have a verified result.

[00:52:30]

That's the most volatile time. And if those platforms do not take very bold steps, including possibly not allowing candidates to talk at all about results, bolder steps than they are used to, and I know they won't scale globally, but this election is in crisis, so let's talk about how to protect this election right now. If they don't take really big, bold steps after November 3rd, I'm very concerned about how the platforms are going to be used to really spark what is already a tinderbox of anxiety, and what that might look like.

[00:53:02]

And advice to your twenty-five-year-old self? Oh, so many things. I think if I were to narrow it down to one, I would advise myself to seek out mentors, especially female mentors. I grew up in a very, very male-dominated world, the national security world. And I was so tough at the time; I thought I was tougher if I could do it on my own and not ask for help. And I think seeking out mentors, not like what's happening right now, where I get five hundred LinkedIn requests a day and "can I pick your brain,"

[00:53:37]

but actually investing in finding someone who inspires you, but who also can really give you advice that is not necessarily helping me find a job, that is helping me think about my goals and my way forward. I think seeking out a true mentor, especially more women mentors, would be something I'd tell myself. Yael Eisenstat is a visiting fellow at Cornell Tech's Digital Life Initiative, where she works on technology's effects on civil discourse and democracy.

[00:54:08]

She previously served as the Elections Integrity head for political ads at Facebook and is a former CIA officer and White House adviser, and she joins us from New York. Yael, stay safe. Thank you. It's great chatting with you.

[00:54:22]

We'll be right back after this break. We've heard for years that it's important to have a diversified portfolio: stocks, bonds, mutual funds, that kind of thing. But if you've ever looked at a breakdown of the most successful portfolios, you'll typically see a diversified set of real estate. So why isn't it one of the first asset classes you consider when you're looking to diversify? Simple: it hasn't been available to investors like you and me, until now. Thanks to Fundrise, it's easy for all investors to diversify by building you a portfolio of institutional-quality real estate investments.

[00:54:54]

So whether you're just starting to invest in real estate or looking to add more, our friends at Fundrise have you covered. Here's how: Fundrise is an investing platform that makes investing in high-quality, high-potential real estate as easy as investing in your favorite stock or mutual fund. Fundrise's team of real estate professionals carefully vets and actively manages all of their real estate projects. And with their easy-to-use website, you can track your portfolio's performance and watch as properties across the country are acquired, improved and operated via asset updates. The platform manages more than one billion dollars in assets from one hundred and thirty thousand plus investors to date.

[00:55:29]

Start building your better portfolio today. Get started at fundrise.com/profg to have your first 90 days of advisory fees waived. That's f-u-n-d-r-i-s-e dot com slash profg to have your first 90 days of advisory fees waived. fundrise.com/profg.

[00:55:51]

Want to know the secret to staying sweat-free this summer? What's the secret? What's the secret? I recommend Tommy John's ultra-breathable underwear and bras. They have a range of summer-ready, breathable options. Their Cool Cotton underwear for men and women is like having your own body air conditioning. Tommy John's Cool Cotton is made from premium natural Pima cotton for enhanced airflow and evaporates sweat super fast, keeping you drier, cooler and more comfortable than regular cotton. All of Tommy John's layers are built for next-level comfort. Next-level comfort.

[00:56:23]

Whether you're on the hunt for lounge pants, lazy-day joggers or the softest summer-ready tees and polos you've ever worn, Tommy John has you covered. Tommy John is so confident in their underwear that if you don't love your first pair, you can get a full refund with their best-pair-you'll-ever-wear-or-it's-free guarantee. Tommy John: no adjustment needed. I wear Tommy John. For a limited time, go to tommyjohn.com/profg to get 20 percent off sitewide on your first order.

[00:56:48]

That's tommyjohn.com/profg for 20 percent off sitewide on your first order. tommyjohn.com/profg. See site for details.

[00:57:07]

Algebra of happiness. I have been thinking a lot about time. Actually, that's sort of misleading; I'm always thinking about time. I'm fascinated by it. This notion that time is probably the most important metric in our life, because it has to be static, it has to be trusted, it has to be immovable, because everything we do, whether it's launching missiles or showing up to have lunch with friends or when we're supposed to work, is largely dictated or predicated on this immovable, steady, static, entirely credible, valid, trustworthy thing called time.

[00:57:42]

Time waits for no man. And it's based on celestial objects, specifically the Sun. I was watching Game of Thrones last night, and Khaleesi says to Khal Drogo, you are my sun, my moon and my stars, to connote you are my everything; to say you are my celestial object, because these are the anchors for the most trusted metric in the world, and that is time. But time is, I believe, malleable. And that is, time is a function of celestial movement.

[00:58:13]

A year, right: it takes us three hundred and sixty five days to get around that spherical ten-billion-trillion-ton ball of hot plasma called the Sun. And at the same time, it takes the Earth about twenty-four hours to rotate. And these things are immutable. What's not immutable is our perception of time. Remember when your mom told you that we weren't going to go to the movies tonight, we were going to go tomorrow night, and it felt like that would be years?

[00:58:35]

And you are just outraged.

[00:58:37]

You're outraged. Who do I call? Let me speak to the manager.

[00:58:41]

Mom has put off the movie for twenty-four hours. Anyways, your perception of time is entirely malleable, and this is advice for parents: time becomes incredibly porous and soft with kids. Literally yesterday, as I mentioned earlier in the show, I was dropping my son off at pre-K and feeling very emotional. I remember when I dropped him off at pre-K nine years ago; it was after one of the many school shootings. I can't remember which one.

[00:59:13]

And when I came in to drop him off, there was some semblance of an attempt at security: they had this big wood, metal-plated door, and they had to buzz you in. And I remember thinking, God, how sad, how fucking sad, that you have to drop four-year-olds off at a place where they are thinking about security. But that's not the story. The story is, later that afternoon I picked up a thirteen-year-old and took him surfing.

[00:59:44]

And one of the big themes of my work over the course of the past four months has been that covid-19 is not as much a change agent as it is an accelerant. And we talk about that in the context of business and how you make money. I think health care is going to change dramatically; I think the 17 percent of GDP that is health care is up for grabs, because telemedicine and remote medicine and the way we consume and distribute medicine have accelerated ten years.

[01:00:09]

But what can we take away personally?

[01:00:12]

And what I would ask of all of us, or what I'm trying to do, whether you do it or not: I'm trying to imagine that the next ten years with my kids are going to go even faster. And the advice I would give to dads is there's no such thing as quality time; there's just time. And imagine someone were to tell you, OK, you've got ten more years with your kid, if your kid's eight or nine and in ten years they're going to be at college or have left the household, but that ten years was going to be one year.

[01:00:40]

If the rest of the time you had with your kid, the total sum of the time you had to spend with them, to enjoy yourself, to teach them, to love them, to be affectionate with them, to express your paternal and maternal emotions, was only 12 months, how would you behave? How would you prioritize your time? What would you do? Then make that your life, make that your relationship with your kid. Because trust me on this.

[01:01:05]

Trust me on this, it goes by so fast.

[01:01:11]

And in addition, for your own selfish reasons, you want to take advantage of that, to really embrace that relationship. I've got to think that the next 30 years are going to go even faster, and selfishly, I want my kids near me toward the end, and I want them to look back on their childhood and think: dad wasn't about quality time, dad was just about time. He was there. He was probably there even a little too much. And granted, a lot of this comes from a privileged position, because I have the ability and the security and the wherewithal to spend a lot of time with my kids.

[01:01:46]

And some people just don't have those options.

[01:01:48]

But we all make tradeoffs within a certain band of the amount of time we allocate to things, whether it's our friends, whether it's our work, or whether a lot of the time we're just in our own heads and not engaged. Because I'm telling you, brothers and sisters, it is going to go so fast.

[01:02:06]

You want them at the end to look at you and think: yeah, dad was wrong, dad screwed up a lot, but dad was there. There is no quality time; there is just time. And it is rocking and rolling and going fast. Our producers are Caroline Shagrin and Drew Burrows. If you like what you heard, please follow, download and subscribe. Thank you for listening. We'll catch you next week with another episode of The Prof G Show from Section4 and the Westwood One podcast network.