On this episode of Invested, Michael hosts Sam Lessin, a General Partner at Slow Ventures.
Sam is currently an intern at The Information and has co-founded two companies, Fin and drop.io (acquired by Facebook in 2010). Between 2010 and 2014 he was a VP of product management at Facebook, where he managed the People, Places, and Things product group and the Identity product group. Sam started his career at Bain and Company, and attended Harvard ('05). In his spare time Sam enjoys skiing and kite-surfing.
Please rate this episode 5 stars wherever you stream your podcasts!
Follow Sam on X: https://x.com/lessin
Subscribe to Invested here: https://content.aleph.vc/invested
Learn more about Aleph: aleph.vc
Sign up for Aleph’s monthly email newsletter: https://newsletter.aleph.vc/
For the transcript of this episode, go to: https://content.aleph.vc/podcast-episodes/sam-lessin---episode-29
Subscribe to our YouTube channel: https://www.youtube.com/@aleph-vc/
Follow Michael on Twitter: twitter.com/mikeeisenberg
Follow Michael on LinkedIn: https://www.linkedin.com/in/mieisenberg/
Follow Aleph on Twitter: https://twitter.com/mikeeisenberg
Follow Aleph on LinkedIn: https://www.linkedin.com/company/aleph-vc/
Follow Aleph on Instagram: https://www.instagram.com/aleph.vc/
[00:00:00] Sam Lessin:
I think the happiest and proudest moments I've had in my careers have been those places where someone walks in, and everyone's written them off, and you're like, “But you might be right, or people are missing the point, or people are missing why you're great even though everyone thinks you're not right.”
And when you get to enable those people and then you're fucking right, and you see those people become super successful–I don't know. Like, I really take pride in that in a lot of ways.
[00:00:26] Michael Eisenberg:
Well, welcome everybody. I am absolutely thrilled to have my old friend Sam Lessin join me on the Invested podcast. Sam, good morning.
[00:00:34] Sam Lessin:
Hey, great to see you, Michael. It's been too long, but excited to be on the podcast.
[00:00:38] Michael Eisenberg:
It'd be better to do this in person rather than virtually, but I haven't gotten to California in a while. And I guess you haven't been in Israel in a while.
[00:00:43] Sam Lessin:
You know, I'm trying–I have a trip on the books to Israel for I believe October, and we'll see if I can make it happen or not, but I really want to get over there. It's been too long for me.
[00:00:53] Michael Eisenberg:
Shabbat dinner at the Eisenbergs when you come. So let us know.
Sam Lessin:
Deal. I'm in.
Michael Eisenberg:
All right. So I don't really want to introduce you, I want you to introduce yourself. But for our listeners, Sam is a GP at Slow Ventures. He was a co-founder of two companies, Fin and Drop.io, acquired by Meta. He was the former Vice President of Product Management at Facebook, and he writes for The Information and then takes a screenshot and posts it up on Twitter, which is where I find it most of the time.
But Sam, tell everybody who you are. That's kind of the professional side of things.
[00:01:20] Sam Lessin:
I mean, I think you just did. I started my career very briefly in consulting at Bain and Company a million years ago. I said, “What's the opposite of this?” after two years. Faxed in incorporation documents.
Started a company in Brooklyn. Facebook acquired it. Spent a bunch of great years, you know, reporting to Mark, working at Facebook through the IPO, which was a total ride and super fun and, I don't know, something I felt great about. And then I said, “Look, I really liked the early stage stuff. I don't like the big company stuff.” And so we started a venture platform, Slow, which has gone great. We can talk about investments there. We're on our fifth fund cycle. And then, yeah, look, I love building, and I love tinkering. And so I started a company after that with Andrew Kortina, who was the co-founder of Venmo.
I was Venmo’s first investor personally. And so, you know, he and I really wanted to work together, and we started some stuff, mucked around. I don't know. I generally try to have fun.
[00:02:14] Michael Eisenberg:
Not bad. But I don't even know how we first met. How do we know each other?
[00:02:19] Sam Lessin:
Was it, was it Aryeh? It might've been Aryeh.
[00:02:22] Michael Eisenberg:
Might’ve been.
[00:02:23] Sam Lessin:
Or it might have been Eden, because–I'm not 100 percent sure, but it's been sufficiently long that we'd have to go trace it through email.
[00:02:31] Michael Eisenberg:
Yeah. Aryeh, for those that don't know, is Aryeh Bourkoff, the CEO of Liontree, who was the first guest on Invested, by the way.
Sam Lessin:
Oh, wow.
Michael Eisenberg:
And he had maybe one of the most all-time controversial comments on this podcast. When I asked him what's something he thinks that nobody else thinks, he said, “I'm not sure democracy is the best form. Some dictatorships are actually good for people.”
[00:02:52] Sam Lessin:
I'll challenge that. That's not truly an original opinion. It's just a tough one to verbalize in 2024. Throughout history, that's been said before.
[00:03:04] Michael Eisenberg:
Okay. Fair enough. So what we'd like to dig into to start with is just to ask you, what are your core personal values? And you already said that, you know, having fun and tinkering drives you, but what are the core values that–what drives you day in and day out?
[00:03:16] Sam Lessin:
Look, that's a really interesting question, and not one I frequently get asked. The first thing that comes to mind is family. I deeply care about my immediate family. I grew up with a lot of great cousins. And I think just putting family first and certainly kids first within the family is just something I really believe in, investing in. You know, the next generation–I was just biking with my kids right before starting this podcast before they went to school.
And that really is probably the thing that I care the most about and is most important. Beyond that, you know, it's interesting. In terms of values, I think a lot about what I can personally try to uniquely bring to the world. I think that's an important thing. Everyone comes with different perspectives and from a different place, and I really think that thinking about not just how to make a buck, but what can you give that's different? What can you give uniquely? And I think part of that is recognizing how incredibly large the world is. The world's so big and there's so many people doing so many different things.
And thinking about that–not just what must be or what should be, but where you can make a difference–really matters. And I guess the last thing that comes to mind, which maybe goes hand in hand with that and maybe doesn't–and I don't have practiced answers to this, Michael, because I don't usually get asked about my values–is, look, I have a lot of respect and appreciation for the broad diversity of humanity.
You run into new people who are just themselves in their own quirky way, and pursuing their own thing. And I have a real love and appreciation for that. So I think helping people, learning who they really are, helping people be the best at what it is that they do, and helping them just get confident in expressing themselves in whatever way–it’s this thing I get really excited about because it's just, humans are so infinitely creative, and playful, and interesting when you let them be, right? When you don't put them in a box.
[00:05:05] Michael Eisenberg:
What do you think it is that you're actually uniquely suited to deliver on, or do better in this world than anybody else?
[00:05:13] Sam Lessin:
It's a great question. And look, I'm 41, and I've had kind of a really fun, interesting, I think impactful career so far in a lot of ways. But it's a question I'm asking myself right now, which is, how do I want to spend the next 10 years? Like, what do I want to do in a highly productive era of my life? And where do I think I'm really uniquely suited?
And I think in some ways, we're all intersections of these weird Venn diagrams of experiences, right? I wouldn't say I'm smarter than anyone else, although I don't know, I'm confident to say I'm pretty smart. I do think that I've had a heck of a unique set of experiences, based on where I was born and the things I was exposed to, and the things I got to see, and the lessons I got to learn throughout my life. And so for me, look, there's a reason I left Meta when I left, a decade plus ago, which is–I really didn't want to be a manager.
I was like, this doesn't feel different. I did it for a little bit. I figured out what it was. It was demystified to me. And I was like, I need to go do something that's going to be more authentic to myself, but what am I best at? I think what I'm best at is finding people, and finding ideas that other people have written off and saying, “But what if they're right?” and then every once in a while being right about that.
I'm the type of guy who in my venture practice, we go very early. I mostly lose money, right? I mostly light money on fire. Every once in a while though, I'm right. And when I'm right, I'm really right. And you know, that works for everyone. And you know, if you think about that, what does that really end up meaning you do?
Well, it means that you have to be willing to entertain crazy people. You have to be willing to entertain crazy ideas, and take them seriously, and listen to them. You have to be willing to kind of show up every day and actually talk about a lot of ideas, and a lot of people that are actually silly or stupid, right?
I think that's actually one thing that people miss about venture capital in terms of what you’re uniquely suited for, is, you sit down–especially if you're in the early stage–you hear a lot of bad ideas. And you have to be willing to show up every single day and hear another bad idea on the way to the good idea.
And it takes a lot of patience. It takes a lot of appreciation for people, and their ideas, and their hard work. It should take a lot of respect, right? Because again, these are people, whether or not they're right, they're pouring their lives and souls into this weird thing they believe that's different than everyone else.
And I don't know. I will say that on a bad day, it's really frustrating, right? ‘Cause on a bad day, you say, “Oh, my friends who do late stage VC, or work in these other industries, or are senior managers at big companies, their ratio of good ideas to bad ideas they hear is a lot higher.” They have to deal with a lot less crazy, right?
But then every once in a while, and I think the happiest and proudest moments I've had in my careers so far, maybe what I'm uniquely suited for has been those places where someone walks in, and everyone's written them off, right, or a lot of people have written them off because otherwise you wouldn't be talking to them, and you're like, “But you might be right, or people are missing the point, or people are missing why you're great, even though everyone thinks you're not right.”
And when you get to enable those people, and then you're fucking right, and you see those people become super successful. I don't know, I really take pride in that in a lot of ways.
[00:08:27] Michael Eisenberg:
How do you deal with the losses or the zeros?
[00:08:31] Sam Lessin:
You know, you talk about what you’re uniquely suited for–I actually think this is something I don't know that I started out in my career uniquely suited for, but I've gotten extremely comfortable losing money, right?
And I think that that's actually something that is in some ways–I don't think anyone wakes up on day one and they're like, “This is great. I feel awesome about dropping a million bucks and watching it just get incinerated,” right? Or whatever it ends up being. But I do actually think it's a learned and practiced skill in a lot of ways.
And so it's fine. I actually talk to entrepreneurs–right now is actually an interesting period because, as you well know, we've been through this cycle in the last few years where VC was so hot, right? And startups were so hot that everything kept getting funded. So all these things that you would seed early on, you'd kind of be like, “Why aren't these companies dead yet?” Right? Like, I know which ones are the good ones. And there's a bunch of others where it's like, “Wow, I'm glad they're kind of bumping along. They're still figuring it out. Someone will fund them to keep doing their experiment, but they won't die.” And what I've actually liked in the last, even, few weeks is, I've gotten so many calls from entrepreneurs being like, “It's time to throw in the towel. This isn't working. I'm going to go do this corporate job. I'm going to change direction. You know, we're done in a lot of ways.” And the thing you hear from a lot of first-time entrepreneurs is that they're really apologetic about it. And I was like, guys, it's just money to me. It's your life and your time.
Like, do not worry about me. Now, you know, if you can do right by me and make me some money back while you're doing the right thing by yourself, great. That's honorable and I appreciate it, but do not optimize for me. Like I'm fine. And then that is my game, is, if you think about early stage, what we do is you mostly lose money, but you get so ridiculously overpaid when you're right that that's the entire equation. That's literally the business model. So I'm quite okay with it. I try not to lose enormous amounts of money, but losing small amounts of money repeatedly, I'm all about.
[00:10:27] Michael Eisenberg:
I'm interested to hear what you think about this. I've made the argument, actually–made it to the military here over a bunch of years–that the world actually looks much more like an early-stage venture capital investment than a hedge fund. Meaning, this is not a risk management business; if you want to change the vector of society, or, put differently, make sure that you're prepared for what's going to go on in the world, it actually looks more like a venture capital investment.
There's kind of a lot of middling failures along the way, there's a lot of failures, but if you get one of them right, it works. By the way, I'll argue that tragically, the Hamas attack on Israel was another one of those venture capital investments. They kind of played at this, an MVP so to speak, and killed a lot of people.
Tragically, it's obviously not the same as losing money. But the world looks like this actually, and if you have a risk management model where you're kind of trying to minimize risk all the time, you miss out on big events, whether they're political, military, economic, technological. But if you have a venture capital mindset you can catch these.
Do you disagree with that?
[00:11:32] Sam Lessin:
Look, the reality is, as the world gets faster, and faster, and faster, and technology gives us more and more leverage, and you have these incredible platforms where if something works, you can scale it up ridiculously super quickly, which didn't used to be the case.
There used to be a lot of friction in the system, right? But that is certainly the way the digital world works. And so much of the physical world now follows the digital world, for better and worse. So I do think there's some truth to that–it's almost like a roulette wheel where the winnings used to be 100:1 and now they're 10,000:1.
And what you're willing to experiment with rationally–where the rational balance point is when the payoff is 100,000:1 versus 10:1–is very, very different. And so I think the reality is, yes, the world has moved in this direction, for what it's worth. It's not all good, even if it's rational. But the fact that people realize this, right, does push the bounds of experimentation in ways which are pretty extreme. And again, as a business model, as trying to find the next thing that's going to be really important or change the world, again, I'm all for it.
I actually think expanding the bounds of human experimentation is good, but we are managing a planet with billions of humans, and real-life consequences, and bad things. And again, if nothing else, I'd say we live in an increasingly chaotic time because of how that works.
It's rational, and it has its places. I think it's amazing, right? Like I think it pushes the balance in all sorts of great directions, but it's certainly not all good.
[00:13:10] Michael Eisenberg:
I made the comment at a conference probably five or six months ago in Abu Dhabi. I won't say everything I said, ‘cause it was Chatham house rules, but you know, this notion of AI increases chaos or entropy in the world because all of a sudden, if there's a small number of chaotic actors on a relative basis to the rest of the world, most of the world just wants to get along with life. But they're now empowered with superpowers, and you've actually increased the rate of chaos or the level of chaos in the world pretty dramatically, by endowing it with superpowers. Part of the point I made was, you can correlate between the increase in compute power and the increase of chaos in the world. How do you think about that?
[00:13:47] Sam Lessin:
I think that's basically it. Look, there's two ways to look at it, right? One is exactly what you said, which is yes, more leverage equals more chaos, and also more opportunity, right? Like, there's amazing things that can go great, but things can also go badly very quickly. The other way I'd go at it is–you know, the reality is, I do also broadly believe that markets are efficient.
Like, the reason I like seed investing and what we do is, I think it's, in a lot of ways, the least efficient market in the world, right? Which means the last thing I'd want in life is to be known as just a market participant. Like, if you ask what the most upsetting concept is, it's to say, “I lived my life as a market participant.”
God, what could be more boring? Right? You want to be the person who said “capital is my tool, I'm looking for places to deploy it.” That's the highest leverage where no one else is. And it's good for me. And it's good for capitalism. And it's good for the whole, you know, I have a really positive story about it.
But so, if you think about AI and all these things, and you believe in relatively efficient markets–AI can be adversarial, right? There's a balance point to it. So you think what you really get is a system where there's more and more pressure pushing against each other, right?
Like, all the systems, the entire world, the white-hot centers–everything's pushing and trying to keep itself in balance. The problem is, if this thing slips a little bit, you have so much energy in the system that it goes flying off in a crazy direction, right, in a lot of ways. So that's kind of the model I would have in my head for what this looks like.
[00:15:16] Michael Eisenberg:
I want to get back to AI, but before that, I want to actually ask you a question. I'm realizing now that you basically left Facebook and became a venture capitalist roughly around the time that you had your first child. Is that right?
[00:15:27] Sam Lessin:
Let's think. Not quite. So, I left Facebook in 2014, and our first child, I think, was 2017.
I should know that. 2017. So, relatively. I had just gotten married, though, and we were certainly on the course of starting a family and figuring out what the next stage was, is what I'd say. And I definitely did my 20s in that part of my career in an all-encompassing, all-in, let's-build-Facebook mode.
[00:15:53] Michael Eisenberg:
Do you find that having a family has changed your ability in any way, as an investor? Does it make you more understanding of crazy, more understanding of uncertainty? Or zero impact, you think? You’re just Sam.
[00:16:06] Sam Lessin:
You know, here's what I basically say. The big impact kids have, I think, is this: I remember when I worked at Facebook, when I did my first startup, even at Bain and Company–I worked way too hard at Bain, by the way, for what I was getting paid and the relative impact, but I learned a lot–basically, if you had a problem, you could spend almost infinite time on it, right? The number of times I slept in the office at Bain and then at Facebook was off the charts, right? And you basically could just work your way through any problem. You could just boil the ocean. You had the energy to do it.
You had the time to do it. Now it's funny. It's like the number of hours I have to dedicate to work is so far down, right? Compared to what it used to be, because I used to work all the time, and now there's a whole bunch of reasons I would never operate that way. It's not just literally the raw hours, it's just the energy. It's like, if you are up all night working, you're a wreck the next day, right? You just can't do it. You can't do the thing. You know, when you don't have kids, you can blow your brains out for a month, right? Getting something done or a project done, and then you can just rest and sleep for two days, and come back.
And you just can't do that when you have kids. So. It definitely changes how you have to approach things. I think I'm far more efficient now than I was before. I know where the value is. If you've ever read my screenshot essays or some of the stuff I work on, I am like the king of believing in 80-20, right?
People will send me notes all the time being like, “There's a typo on the third line.” I'm like, “I don't care.” It proves it's human, right? Like it's just totally irrelevant. So I think it changes how you can work. But essentially, I don't think it's changed my mentality as much as it's changed my approach.
And then I guess there are a few things you're probably not supposed to say on podcasts. Like, here's the reality. You know, backing parents of young kids in startups–very complicated conversation about whether that's a good idea or not, right? Because there are things, you know, there are places that if it's going to require the brutal work schedule, da, da, da, you're going to rip their marriages apart. You're going to rip their families apart in a lot of ways. And for some people, it's incredibly motivating, right? They're like, I have to put food on the table. This has to work. It's a healthy dynamic. But in some ways, it's a very, very unhealthy dynamic. So I don't know, you can go in a lot of directions with that.
[00:18:19] Michael Eisenberg:
By the way, for what it's worth, I'll tell you that in our portfolio, at least–and maybe things are different in Israel–a significantly high percentage of our best performing companies are founders who've had children during the startup run. Or are parents of young children. I don't know what to make of it, but it's definitely factual.
[00:18:39] Sam Lessin:
I believe that. Well, how about this? Let's just try, let's bullshit in the spirit of ‘we're probably wrong but we'll try on some right things.’ So Israel is a way better social network and support system than the United States, right? Like, in a whole bunch of ways.
The community is smaller, it's more integrated. I think that probably actually changes the dynamic somewhat, right? You know, you screw up in the U.S., there are more consequences.
[00:19:00] Michael Eisenberg:
The unheralded hero of startups in Israel is the grandma.
[00:19:05] Sam Lessin:
Yeah, a hundred percent. I mean,
[00:19:08] Michael Eisenberg:
Who lives nearby in Israel.
[00:19:09] Sam Lessin:
Totally, there's that whole dynamic. I mean, obviously because of the army and the culture and kind of things like that, I assume that exactly when people are starting families versus their service, and startups, is all a little bit different. So different societies stack differently, but it is interesting. I'll give you an example.
I'm actually a big believer in providing social services, although I'm really pretty Libertarian in a lot of ways. But I'll tell you the reason why is because I actually think that if you have a strong social safety net, if you have a lot of the groundwork laid, you get more innovation, you get more people who can afford risk, right? And I think that's a good thing.
[00:19:46] Michael Eisenberg:
Yeah, I agree with that. So I want to make a couple of sharp turns here before I get to AI.
Sam Lessin:
Sure.
Michael Eisenberg:
You just posted an excerpt of your thoughts on X about Pavel Durov from Telegram. You asked, “Is it possible that Telegram was an elaborate ruse by the Russians the whole time, that he's a double agent?
Yes, it's possible. And in our era, where a single person or a tiny number of people can use technological leverage to touch untold people–” that's what I was referring to earlier about the AI–”and impact massively important and scaled things nearly alone, more than 0 percent of tail risks is, as many people like to remind us…a lot of risk.”
And so what do you think about Pavel? Is he, like, a hero of free speech?
[00:20:27] Sam Lessin:
I don't know. I don't know him personally. I know a lot of people who know him. And again, I agree with a lot of the things he said and a lot of the story around him. I also know a lot of people who think he's pretty nuts in various ways. And the basic point there that I would just push on is, I just don't think we can trust individuals in that way at this point in history. That's why I'm a big believer in crypto. If you think about where we've done the best–what are the places where I've actually, literally made the DPI that has driven our fund and a lot of things? It came down to having a better viewpoint on where the value was in crypto, and how to approach that as a small fund over the last 10 years. We seeded Solana, which was a 1200+X return to our LPs in three and a half years.
Like we seeded a lot of great stuff. I was personally a seed investor in Ripple, and early in Bitcoin. And for me, it all comes down to this whole thing, which is the faster you go, the more the formal rules matter. And I think that's one of the key things.
Michael Eisenberg:
Explain that?
Sam Lessin:
At the end of the day, there's two forms of trust, right? There's social trust and then there's formal trust, is what I'd say–like mathematical proofs. And historically the world's obviously built on both. There's a lot of places where social trust–what it means to have retribution, identity, reputation, the stuff I actually spent a lot of time at Facebook on–is critical to how our society functions. And then you have literal mathematical formal verifications and their value, and things like that, and what that opens up from an open-source perspective. And look, again, leverage and everything is going up. You've never seen more super-empowered individuals than you have with an Elon Musk or things like that before in history, right?
And I mean, they've been close maybe, but not to the same degree. And so as power goes up everywhere, if there is a one percent chance that Pavel was working for the Russians the whole time, that's an enormous risk, right? And maybe we should care a lot more about encryption, and decentralization, and things like that in a world where that's true. You have to think about what you really can't screw up.
[00:22:37] Michael Eisenberg:
So you make the point that there's trust, right, which is social, and then there's trust which is mathematical. There's a third area, which is regulation, which is not the same thing really. And then the power of the state, which is what happened here. State power overwhelmed, call it, individual power. If a state is kind of a license to use force, they used it and incarcerated the guy. And so, where does the line get crossed where you say, okay, given this slightly-greater-than-zero-percent risk, a state should take action and jail the individual?
[00:23:15] Sam Lessin:
Well, look, the answer is, I think it's a long conversation. I don't know the details of the situation. I think it's good that the state has a monopoly on the use of force. I don't think we want people running around with private armies anytime soon. And look, I'll be honest: in my career, I've generally been–I grew up in the era where everyone was an internet utopian, right?
We were going to stitch the world together, all the rules of the internet were the same. They were basically mathematical rules, although the dirty secret was they were always enforced by a social human layer on top of that, but we really tried to make it as formal as possible, and we were all going to talk to each other openly. That actually worked quite well for a long time, because I think the states did not understand the internet, right? They didn't read enough science fiction. And so they kind of let it be for a long time to develop. This is a very similar pattern to crypto in the early days, right–it was a toy. And because it was a toy, no one paid attention to it; it was often a sideshow, and it got to develop on its own for a long period of time before anyone paid attention.
Obviously, states and communities now pay attention. They realize how important the force is, et cetera, and they say, “Hey, what is our say here? What is the democratic part? How do different communities want to regulate speech differently?” Things like that. The thing historically has been, I would argue, that the states have really been, A, behind the ball in terms of how they understand and think about and act on these things, and then, B, kind of random in how they apply their force, if that makes sense. And this is because technology moves so much faster than they do.
They're not designed to regulate or come up with rules at this pace, and because the leaders have historically had no idea what they were talking about.
[00:24:57] Michael Eisenberg:
And not just historically. Currently.
[00:24:58] Sam Lessin:
I would actually argue–I don't want to name names, because I couldn't, but my sense is that there are people in government–I'll give you one name, Pete Buttigieg–who I think basically understand the internet. There is a generation–but the problem is, just because individuals in the government understand it doesn't mean the systems overall do, or that they make good rules. And so you look at AI regulation, this whole tip-of-the-spear thing–what a mess, what a mess.
And on one hand, I don't begrudge the states for screwing it up or saying a bunch of asinine things in terms of how they're approaching it, because they literally just don't, as an organ, have the capacity to process this stuff or deal with it. The problem, honestly, is that when you start watching what happens with some of this regulation and how it's applied, it's when actually powerful individuals, companies, things like that, start manipulating the states for their own ends.
And that's the thing I worry the most about–not the states, which are kind of bumbling a little bit. They have some ideas that matter, but they don't really know how to enact them. It's almost like you have trust in people, you have trust in math, and then you have this, like, 10,000-pound gorilla wandering around smashing things, kind of doing the bidding of other actors in the system without knowing what it's doing.
[00:26:12] Michael Eisenberg:
Do you worry, certainly when the Twitter files came out, do you worry about the state influencing these mass media platforms, in that case? And the people who control them to influence public opinion? That's my view of TikTok for what it's worth, right? Which is the state influence. That's TikTok. I'm talking about democracies for a second now, which is like Twitter, Facebook, et cetera. Do you worry about government influence there at all?
[00:26:33] Sam Lessin:
So here's what I basically say. I mean, just, sorry–you and I are both very aligned on TikTok. Like, the fact that the United States allows TikTok inside of our country is one of the greatest unforced national security errors in the history of the world. Like, when we rewrite the history of the world in a hundred years or a thousand years, you know, this will be, like, gawked at, right? Like, how could you possibly screw that up, right? Is what I would say.
Michael Eisenberg:
I agree.
Sam Lessin:
That is different, because there are literally state representatives on the board. And it's very clear how the power structure works there. Here's the thing. I've spent too long on the inside of these large platforms to not believe that there's, A, a lot of pressure around.
And we talked about the plates or like, you know, when there's so much pressure in one place, what happens? This is what's happened with speech broadly. If you think about it, fundamentally, speech is distributed. It used to be, you’d sit in a bar with someone. There was no actor in the middle.
There was nothing to influence. Now, as these platforms have gotten bigger, and bigger, and bigger, all of human speech is concentrated in fewer and fewer companies and fewer spaces. And you know what? It's super easy to stick a hook in there. The old joke goes, even in Russia they'd say, “In the USSR, people always had free speech. They just might not have freedom after speech.”
It's obviously a little bit of a joke. The idea that you can now centrally monitor the way you can, and centrally control is terrifying in a lot of ways. So then, what are you back to? You're back to trusting either algorithms or humans, right?
If you trust the algorithms, or you're trusting kind of the transparency–that's one thing, you can mathematically trust it, you know, from a crypto perspective. Or you are trusting whoever is leading these things, and their organizations–the human side of it. Look, I do trust Facebook.
I think it's super complicated. I argued incredibly aggressively for certain policies that never came to bear in terms of how speech was treated internally, but it's a delicate balance and a hard game. And I know you have serious people who are trying to do the right thing, but yeah, I mean, that's the real question. It's kind of a crazy world where we're being asked to trust individuals and human organizations at this scale.
[00:28:44] Michael Eisenberg:
Do you worry that, like, Zuck, or Elon–I mean, Elon's not just Twitter, right? He's got a rocket company, and a telecom company. I've argued that he's a state in a guy, right? He's got an energy company. And Zuck controls a large percentage of human media on the planet, human conversation on the planet, you know. And WhatsApp is encrypted end to end. Do you worry that these guys walk into a France and get picked up and arrested?
[00:29:10] Sam Lessin:
Yes, I do worry about that. I think that is a problem for them and for the world in a lot of ways. Which again, you go back to the whole state that has the monopoly on the use of force. And that's not, I don't want that to change anytime soon, by the way, and it's not going to change anytime soon, but yes, I think that when you have powerful actors at that scale and tensions around those, those are real concerns, I think for them and for kind of where the world plays out.
And then the question is, are they strong enough to stand up to that? Are they able–let's pretend Elon can never leave the United States again, right? You know, is the U.S. strong enough to protect him? And then how you think about that playing out long term, I think, is a really weird science-fiction-but-true narrative, for sure.
[00:29:55] Michael Eisenberg:
I'm pretty sure he can come to Israel, as could Zuck.
[00:29:58] Sam Lessin:
I agree. I agree.
[00:30:00] Michael Eisenberg:
I don’t say that facetiously, I say that seriously. I think this question of the rights–rights is the wrong word–what do you do with powerful people in a world that has become fundamentally borderless digitally, and that moves a lot of people physically? You can do what the UK did, which is shut down: you know, “I'm sorry, you can't Tweet about this. You'd be arrested.” You can arrest the guys–you know, Zuck, and Elon, and Pavel. And I'm not saying they're the same, by the way. I mean, I don't know Pavel from a hole in the wall.
[00:30:30] Sam Lessin:
I'll say it explicitly, they're definitely not. But thematically, I agree. I mean, like, look. I’ll give you a few things. One is, the real answer is you have to throw away the keys.
Michael Eisenberg:
Explain.
Sam Lessin:
Well, again, even take encryption. If you believe in free speech, no one in the world is going to be powerful enough to believe in free speech, and then encrypt and leave themselves an out or a backdoor, right?
It's just not how the world works. And this is, I think, probably the Telegram mistake: free speech, asterisk–it's not actually end-to-end encrypted. That's not a value. You can't maintain that, right? Like, you have to have a system that formally says, basically, you can say, “Hey, shoot Pavel.
It doesn't matter.” Like, there's nothing you can do, right? This, I think, is the kind of mentality you have to have. So this is the thing I think is very difficult about compromise, and where we are as a society: we are moving in a direction where a lot of compromises that used to work–the messy middle–you could get away with. You could be like, “Okay, well, the speed limit is 60 miles an hour, but if it's an emergency, you go 70.” Here's the problem. You're in a self-driving car. It's formal. It can't go more than 60, right? Like, you actually have to encode those rules. And those human rules are squishy in the middle, in a lot of this stuff.
And so I actually think that's where we're going. If you're one of these people, or in general, we have to have this really hard conversation. We have to rehash a lot of historical compromises for a technological world that is more deterministic–you do actually have to have a hard line.
That's why I say, if you're Pavel, you have to throw away the keys. I'd also say like, the other way to look at this is the science fiction story angle. I don't know if you've ever read Snow Crash? It's a great book. I actually, when I was at Facebook, one of the things I instituted, I ran a lot of the PM on recruiting and onboarding for many years.
And I made everyone read what I call ‘the canon,’ and do a book club when they joined Facebook. This is well before Oculus or whatever. One of the books they had, every single new PM at Facebook had to read and then debate was Snow Crash. And in Snow Crash one of the characters, the way he does it is he has a nuclear weapon hooked up to his heart.
So if he got killed, there'd be a nuclear explosion. So that's what Pavel could do if he's not going to fully end-to-end encrypt.
[00:32:44] Michael Eisenberg:
By the way, I think we've actually hit the core of this conversation. I'm so happy we're here. What you're really articulating is that this new digital world, which operates in bits and bytes, needs a much more formal digital framework in order to manage it. And a lot of things that make us human don't make it in that world on some level, because it gets cut much harder, right? When you think about smart contracts and crypto, which is something we'll get into in a second, those are automatic–those are like automaton contracts. And there's none of this, “Oh, I'm really sorry, it's not exactly what I meant.” Or kind of pleading my case–that's not okay, because it's not going to work. And when you have people of this scale, it's really hard to craft these smart contracts, because the power is so immense, and constraining it could actually constrain freedom. You could constrain creativity. Like, this is tricky.
[00:33:41] Sam Lessin:
I'm liking this. I like that we're here too, because this is a really fun, and hard, and interesting topic. Think about it this way. There's two types of technology we now have. There's what we've had for the last 20, 30 years that I've been working on, which is effectively highly deterministic systems, right?
You know, one plus one equals two, full stop. And you can make it more and more aggressively deterministic, where everything can be monitored and kind of, there can be rules that are formal yes or no's on everything. Again, to your point, that's actually not how the real world has historically worked, but it is very powerful and scalable. So we've been trying to shove a lot of the world towards that formula. I will say, in the last two years, though, there's something else that's happened, right, which I think is a whole other thing, which is the AI thing, which is effectively non-deterministic computing.
And so our options really are, if the world has to go faster and faster and faster, faster than humans can process, certainly faster than democracies can process, you basically have to make this decision, or I have to choose between, is this a formal verification–where like the speed limit now is not sort of 60 miles an hour, it is 60 miles an hour exactly–and if you're wrong, you're wrong, right? Or do you hand it over to AIs, which is another whole thing which I'm very skeptical of, but some people are really into, where you say, “Eh, we can't exactly say it's 60, we'll feed it all into the AI and whatever the AI says is, is correct.”
But those are kind of your options.
[00:35:08] Michael Eisenberg:
I think there's a third option for what it's worth, and I wrote about this like a year ago, which is the Community Notes model, which basically says the following. There's this speed, there's this mass media that builds up, but I'm going to throw this other force called human beings at this problem at scale, and try to get to a truth that is faster than the corrections that mainstream or mass media make.
It is less deterministic than the algorithms of, call it Facebook or whoever, and this brings kind of a human element to it. I think the tricky part is it can't cover a hundred percent of what's out there.
[00:35:42] Sam Lessin:
I will say, here's what's interesting. This is what I spent all of my time at Facebook on a decade ago.
And so when I started at Facebook, I ran profiles. I ran a lot of the kind of product teams outside of newsfeed and messaging, which were always separate. But I started something called the Identity Product Group, and the whole point was this. The whole point was, anything written on the internet, who the fuck knows, right?
It's like, the whole point is, how do you connect the real world in a trusted way to the digital world? And what does that start with? That has to start with properly encoding identity and true reputation. 'Cause the problem with Community Notes is you still don't really know who's writing them, who to trust. Is it a bot? It's a war of attrition, where there's always going to be such an incentive to–what you need to get back to is, these are real humans. Which, by the way, was the fundamental Facebook innovation. You go back to 2004, and the whole point was no one trusted anything on the Internet. Facebook hijacked FAS.harvard.edu to prove to all the Harvard kids that they were talking to real Harvard kids. They forced a real-name policy and photos. And all of a sudden you had the first trusted local digital community at Harvard that made sense and worked.
And so the history of the company was about that, which is how do you take a physical world trust, move it online, and then create this like place where there is real reputation. There is, if you say the wrong thing, it has real world consequences for your operations in flesh space.
Like all of the systems of human trust that work physically, we could transport online and use it at scale, right? So I love the theme and the direction. It turns out that as beautiful a dream as it is, and there are places that can work a little bit, it is unbelievably hard to do at scale.
And the number of technical and practical problems you get into when you say, “Okay, can we trace these humans all the way down? Are these real people? How do you extend trust? What is the value of social capital versus financial capital?”–all these types of things get really hard. So I agree with you intellectually. But it's funny–if you picked one problem area that I absolutely adore, and that through my career I'm also incredibly burned out on, it is this.
[00:37:59] Michael Eisenberg:
I made the argument a bunch of years ago that ultimately we'd need to come back into smaller groups of people, to kind of coagulate around trust. Meaning, at some point people will have defaults–and I'm this way already, by the way: anything I see online is default not true, anything I see in the media, default not true, period. I trust Sam, and so if I see you–go ahead.
[00:38:22] Sam Lessin:
I agree that's exactly what's happening, Michael, but I'm actually really sad about this.
Michael Eisenberg:
Sure.
Sam Lessin:
Let me tell you, like, here's a historical arc. It used to be, in like the medieval era, that's exactly how it worked–you had your tiny little community.
You hired your friends’ kids. You worked with the ten noblemen you knew, because if they fucked you, you'd fuck them back, right? And like, their daughter was married to your son and whatever else. It was this highly stratified, small system of trust in local loops.

And what happened over the last 150, 200 years has been beautiful, which is capitalism–which says all dollars are the same. I don't care who the hell you are. A dollar is a dollar is a dollar. You want to buy this thing, you can buy this thing. That's a beautiful open system.

And then, interestingly, recorded media. It used to be, someone says things–like, who do you trust? I don't know if it's right or not. You trust the Church, maybe, right? The Church was the broker of reality. Noblemen were, you know–Joe Schmoe Peasant had no ability to influence. But now Joe Schmoe Peasant had a camera. He had a microphone, right? And all of a sudden, this whole ‘truth can come from anywhere’–well, it's kind of true, because it was sufficiently hard to fake recorded media that whoever had it, had it, right? And all of a sudden reporting and truth could be much more open.

The problem with AI and where we're going is, I think you're exactly right, which is trust is re-collapsing into local networks. Everything is default untrustworthy, and I think it really screws with a lot of what I love so much about the last 200 years, which has been how open our system has been. We're going to get very closed again.

Yeah, look, there are opportunities for this. My wife runs a publication called The Information that she founded and has grown. It’s not as big as the Wall Street Journal, but it is big. And in its community, it's trusted, right? But here's the thing. If you look at the history of–you go from large media to high-priced private newsletters, this has happened before. This actually happened right after the printing press, which is the news got really scummy.
And no one trusted it. So anyone who actually had a stake in reality, the noblemen, whatever, they all started subscribing to these super high-priced Italian newsletters which were supposed to tell them what was actually going on in different courts around the world, not what the scummy popular opinion was. So, these things go in waves, but we're in a dangerous part of that cycle.
[00:40:52] Michael Eisenberg:
Do you ever wonder, you know, you mentioned that the church was a proxy, or an arbiter or a convener of trust communities. I've often said that, and I'm not the only one who's made this argument, that Capitalism in the United States was built on the Protestant covenant.
And I'd offer the synagogue covenant as well.
Sam Lessin:
Yeah.
Michael Eisenberg:
So, do you wonder whether we need to get back to an era where there's religious convening in order to broker trust? I'll also mention, by the way, one of the things I noticed about Israel, and this year–from my accent, you can tell I wasn't born here–is that the army is a broker of trust.
Army units where people sweat together and hang together. Lack of military service in the United States has meant that that's a much smaller community. So you think that these things come back like service and church and synagogue in a way that enables trust?
[00:41:39] Sam Lessin:
So something has to come back, is what I basically say.
And I do think people need organizations and common nodes, is what I would basically say. Now the problem is obviously–just like we talked about with individuals going rogue, you can talk about, like, rogue Popes–the fact that everyone agrees with the church on a certain topic doesn't mean it's right. But it does mean at least everyone's swimming in the same direction, if that makes sense.
Michael Eisenberg:
Right.
Sam Lessin:
And you do over-concentrate power sometimes in too few and scary hands, but yeah, I mean, look, I kind of was hoping that this would come out of COVID. I was kind of hoping that the world and the United States would have this moment, where COVID fucked up things so royally that out of the ashes of that, like some sort of new trust would emerge where we'd all kind of get behind something in a way that was coordinated and helpful, and positive and whatever that could be.
But here's the funny part. My kind of take on what happened in the U.S. with COVID–especially in whatever year we are in, you know, the end of August, 2024, where the economy looks okay in the U.S., at least that's the popular narrative–is we kind of got through it too easily. You know, we printed too much money. People hung out at home with Zoom. And again, that's not to discount the millions of people that died. And again, I want to obviously do all the things so I don't get canceled on this. But I think, like, when–
Michael Eisenberg:
Getting canceled is okay, it's fine. You'll get through it, you're fine, yeah.
Sam Lessin:
It's fine. I've been there so many times, I'm fine with it too. But basically what I'm saying is, you don't want to have to have a world war to get everyone on the same page. Like, that's really, really expensive and terrible. But the question is how do you, in a world of abundance, in a world where we can kind of print our way out of stuff, in a world of current American hegemony where we have whatever it is, I do think a lot about what are those moments that can re-bond the United States.
You know, candidly, when you get into it, you're in Israel–how much does 10/7 re-galvanize society in Israel? And I don't know the answer to that. Like, has that been a moment?
[00:43:43] Michael Eisenberg:
A lot in my opinion. I mean, there's still kind of the political discourse that is extremely divisive, but I think on the popular level, what you'll see coming out of this is a recombination of society into a different vector.
Sam Lessin:
Yeah.
Michael Eisenberg:
And unfortunately, I think it actually takes wars, for what it's worth. I think it's an interesting question, you know, World War II, for example, had a lot of draftees. I don't know if the same thing is true in the United States today.
But I want to pivot one second, exactly on this topic, to your investments in crypto.
Years ago, I had a back and forth on Twitter with Marc Andreessen. He said that crypto conveyed trust and I said, “No, it's a replacement for trustless societies.” Is that why you like crypto so much? ‘Cause you think trust is gone and now we have to go to mathematical proofs because there's no trust?
And while you're answering that question, thinking about it, I'd love to hear why you did Solana at the time, which is obviously this killer investment–you referenced 1200 percent return.
[00:44:41] Sam Lessin:
1200 times return. It's more than a percent.
[00:44:43] Michael Eisenberg:
1200 times, sorry. I hope you wrote a very big check.
[00:44:44] Sam Lessin:
It's never big enough, but big enough, big enough. And it's actually more than that, but it was good. Look, so my interest in crypto–I'll be honest, it's actually funny. I have framed the first time anyone ever asked me about Bitcoin: someone sent me an email, a person I know reasonably well, but not incredibly well.
He said, “Hey Sam, have you looked at Bitcoin? What do you think about it?” I was working at Facebook at the time. And I wrote back this incredibly dismissive email. And this isn't like 2000, this was early, maybe 2010, 11. But I remember at the time, Bitcoin was trading around like less than a penny.
And I basically, my dismissive one-line email back was, “It's super interesting. It seems like the price has already run up too much.”
But I will tell you my path into it is a few fold, which is, not that many months later, a few things happened. One is, you know, I had seeded Venmo from zero. I was Venmo's only investor. And I'm really proud of that. Not because it was the biggest financial return in the world, but because Kortina and Iqram, the founders, I was willing to bet on them when–I mean, they're wonderful people, but even they would say they're weird.
And it took someone to see the brilliance. And I'd like to pat myself on the back. The second is their approach to how do you think about the financial graph, and things like that, was just very different and avant garde, and people didn't want to invest in it. So I was willing to write that check.
So I was always very interested in how you move money. And money is a form of communication more than anything else. And so sitting at Facebook, spending more time on this, you know, looking at what was going on–you just saw, all of a sudden, this was a self-replicating system that was decentralized, that was fundamentally a communication system. Money is a form of communication. This is going to grow just like a communication network. And it was growing like a communication network. And so I was able to make a reasonably sized bet on Bitcoin–you know, sending Western Union money to Bitstamp, right?
Which was the only way to buy crypto at the time, and things like that, the whole nine yards. And then I kind of seeded Ripple and a bunch of other stuff. But yeah, I mean, for me, the idea that you can have global free communication between anyone in the world instantly, for free, and the idea that you can have global free movement of money between anyone in the world instantly, for free, in theory–they're the same story, right?
And if you see what happens, you know, with community, with communication, how that changes commerce and how people interact and the relationships that exist–to me, money is just another version of that, right? You know, setting aside everything else. And so that was very, very appealing to me and very interesting from an early stage.
Now, the interesting thing–you asked about Solana in particular, and again, I'm proud of that. It was also a massive, massive return for our fund. The reason it was such a good return is, at the time, it was extremely uncool. Like, the reason Solana was such a good return was that, at the time, people were very excited about Ethereum.
That was kind of the story of Silicon Valley, and smart contracting, and this is it, and dah, dah, dah, dah, dah. And then there were a few high-flying projects like Dfinity or whatever, which, I mean, candidly, the A16Zs, the big boys of the world, were backing, right? They said, okay, we have the right tech, we have the right pedigree of people doing the right thing, and we funded that up the wazoo. And the reality is, the reason Solana was such a spectacular return versus an okay return is because, on a relative basis, no one would give them money, right? They were weirder people, right? They weren't part of the Silicon Valley mainstream the same way.
And they had a really, really compelling vision of what the future of, effectively, a smart contracting platform could be, or how it could work, or what would really work, that ran contrary to a few things people were very dogmatic about in Silicon Valley, right?
You know, we can get into that, but it's pretty technical. The upshot is–I hilariously talk about relationships and trusting people–the way I ended up doing that investment was twofold. One is, I had an anti-Ethereum thesis at the time, right? Which–I had invested in Ethereum like everyone else.
But I didn't think it was the be all and end all. It was not Bitcoin 2.0, right? There's a bunch of things that were different about it versus being the only possible outcome. And second, Raj, one of the two co-founders, I had known him through three startups before it was Solana, right?
This goes back to relationships, and people, and trusting people over the long period of time. So when he came to me and said, “We're doing this, it's on the thesis you believe in.” And I'm like, “Raj, you know, I've watched you through three companies. I'll back you on this one.” It was the right intersection at the right time.
But yeah, I think it does. And that, again, I'm proud of it partially because it was a good thesis, and I was right, and whatever else, but also, we talked in the very beginning about getting excited about finding those voices and those people that are counted out a little bit, or haven't had their big success or didn't work at Facebook, right?
But they're the right voices, and I think that was a really great intersection for me on those two things.
[00:49:50] Michael Eisenberg:
I sense some snark about the Silicon Valley elite, or you know, the A16Z club or….
[00:49:56] Sam Lessin:
Well, yeah, I mean, like, look, as with all things, yes. I would say there are asset managers and there are investors. And investors try to make returns, and asset managers are the ones that figured out that actually venture capital is an okay business. It's not a great business. What's a really great business is asset gathering. There are people who are trying to find the next thing, or the next people, or whatever, to invest in. And there are people who are trying to move gross dollars and make a market, right? And I don't like the asset managers. I think they distort the markets. I think it's a very cynical way to make money, because basically you're not trying to find anything.
You're trying to pay slightly more on a promise, and then convince a bunch of LPs to give you more capital. So yeah, I'd say again, as individuals, as individual deals, they're all people. Behind all the institutions are people, and there are people that make good decisions and bad decisions.
And there are people that have, it's more complicated. But when you think about institutional incentives and like where the world has gone, I am pretty bummed at what Silicon Valley has largely become in the last few years. And I think it's a discredit to technology. I think it's a discredit to innovation.
And, yeah, those guys do kind of annoy me, is what I would basically say.
[00:51:17] Michael Eisenberg:
What do you mean you're bummed about what Silicon Valley has become?
[00:51:21] Sam Lessin:
Well, let's take this AI thing. I'm very, very anti-investing in AI companies. Now, I want to be clear that I think companies like Meta are going to make an untold fortune on AI.
It's an incredible technology for specific applications. And there are clear ways to make money. AI as a technology is good. LLMs are cool. Like they can do cool stuff. They can help you build cool product features. You know, the reality is, the narrative in Silicon Valley the last several years that I've been pushing on is, because people are trying to deploy so much money, because there's this incredible incentive to tell everyone that everything is the be all end all, AGI, you know, multi-trillion dollar, everything is different this time, right? There's this narrative that there needs to be this next wave, and it's not intellectually honest. What it is, is people who read science fiction and are great storytellers building a market.
So they can deploy an enormous amount of money irrationally. And I just think it's disingenuous. In some ways, this is my big thing with Elon, by the way–I find Elon to be such a complicated character. I've been a SpaceX shareholder for years. I think it's incredible. Some of the stuff he's done is amazing.
But every time he gets on stage and says things that I know he's too smart to actually believe, it drives me nuts, because I think it's intellectually dishonest. And I actually think that's my current snark around the A16Zs of the world, the mega funds. It really comes down to that, which is, it's one of those things where they've gotten so big, and they're such good storytellers, that I worry the people in leadership positions, the people who are actually driving these things, have kind of lost their souls and are talking a book, and a PR game, versus a real game of innovation.
[00:53:15] Michael Eisenberg:
But don't you think about AI in particular, you made the point about Facebook, right? So it is a pretty transformational technology, and certainly it's an application of efficiency in many areas, and its ability to scale things up with smaller numbers of people. I see that every day.
[00:53:32] Sam Lessin:
Sure. But that's just normal. That’s just more technology. I mean, the Facebook ad, again, the Facebook ad auction is going to become unfucking-believably efficient. It already has. It's going to keep–the ability to turn the crank on that with AI is off the charts.
The ability to turn the crank on–have you ever actually tinkered yourself and tried to run Facebook ads? Building copy, copy variant testing, all that stuff. It's like, you couldn't come up with a better technological step for making more money as an ads platform. You know, that's different than this AGI bullshit, right?
That's different than the NVIDIA is worth X trillion dollars, is my basic view.
[00:54:12] Michael Eisenberg:
What my view is of NVIDIA being worth X trillion dollars–though I don't have a strong perspective on it–is, I'm old enough to remember Wintel, the Wintel monopoly.
[00:54:20] Sam Lessin:
Yeah.
[00:54:21] Michael Eisenberg:
Windows and Intel. I think NVIDIA is Wintel in one place, right? It's got the software overlay, CUDA, and it's got the chip manufacturer, at least for now. But I think there will be other competitors, as there were over time to Intel. But I'm starting to think, like, iconoclast Sam is at work here, and you wanna attack the establishment.
I don't say this in a negative way, because you have a vastly different perspective, maybe coming from outside the Valley network originally–New Yorker, went to Harvard. But I think there's–I'm trying to find the words for it–which is, you set up a smallish fund, Slow. You called it Slow–which is not fast, even though the world is speeding up–so that obviously creates friction in the brand name. A16Z, which has become this brand, has raised large amounts of capital. You've done well for yourself. And so you sit kind of on the outside of it (by the way, while your wife is the ultimate kind of insider whisperer at The Information) and say, “No, this has gone off the rails.” Do you relish that position?
[00:55:20] Sam Lessin:
Absolutely. So I'd say a few things. One is, let's be honest. I make my money on being an outsider-insider, right? And by the way, my bet structure, if you think about it–even if I wanted to be an AI investor, which, if you didn't get the drift, I don't–there's no money to be made in it for me, right?
Because anything good gets bid to infinity, and anything that's bid at infinity, for a seed investor who's only deploying a few million dollars, is just not worth it. Like, the math doesn't ever pencil out. And so it's incredibly self-serving for me to shit all over this stuff, is what I would basically say. And I'm transparent about that.
But I would say on the flip side, here's the thing. It's also, you know, the outsider-insider thing is like, you always do the best when you're kind of between ecosystems. You never want to be fully in an ecosystem. You want to be a translation layer, right?
That's where all the value is created and the money is–when you understand multiple domains or sit between them. And look, I like shitting all over the big VC funds, but I also get a lot of value from the fact that they always take my phone calls.
[00:56:32] Michael Eisenberg:
You know, your partner Will told me to ask you, is there anything that would get you to be a seed investor in an AI company?
[00:56:41] Sam Lessin:
Yeah, I would seed an AI company with–and I actually have done this–people that I feel like I know better than the rest of the world, and the world is discounting for stupid reasons.
And if the price was right. And if I had a compelling narrative story about why this time it's different. So I think it goes back to, you get to do this, I get to do this. It's the coolest job in the world. People spend years figuring shit out about the world. And then what do they do?
They come and they tell you their crazy ideas that people haven't heard yet, or that are different. And you're like, why–you know, why you, why this? Tell me something about this industry I've never heard. And then you get to learn professionally, and place bets, where–here's the best part, Michael–I love the freedom of being allowed to lose money. Right?
If you're A16Z partner 1005, and you're trying to write a hundred million dollar check, you better not fucking lose the money, right? Whereas I get to write checks all day long with very little oversight, to weird shit.
And when you're wrong, it's fine, because people only remember when you're right anyway.
[00:57:56] Michael Eisenberg:
I once heard that from Bob Kagle, like 22 or 23 years ago. “People remember your winners. They forget your losers.” And that actually had an incredible impact on my career risk appetite.
[00:58:06] Sam Lessin:
That's true, unless you really fuck up, right? So the key is to not invest in FTX. The key is to not fuck up morally in a way, or to like drop the ball very, very, very publicly. But the nice part about being a seed investor, I mean, I'll give you an example. At Slow, we probably run about a 1 percent fraud rate. About 1 percent of the things we've invested in turn out to be fraudulent, and it's really funny–we get letters, for instance, as a firm every once in a while, from State Correctional Facility 3827 saying, you know, “Your prisoner, blah, blah, blah,” and it's because every once in a while we've screwed up and invested in something that's like straight up fraudulent, and the person's ended up in jail.
Right? And like, that's not okay if you're moving huge amounts of money, or if the person murdered someone. But if you're trying to find the edge, right, and the next thing, you're going to screw it up every once in a while.
[00:59:01] Michael Eisenberg:
So you're saying, it turns out that the line between crazy and criminal, or crazy and fraudulent, is actually pretty thin.
[00:59:07] Sam Lessin:
Well, I'm saying that I as an evaluator of crazy and honest versus crazy and fraudulent, am not perfect.
[00:59:14] Michael Eisenberg:
Okay. I like that framing. Speaking, by the way, of being the iconoclast, the outsider. So you made this bid to kind of get on the Harvard Board of Overseers–your alma mater.
Sam Lessin:
Yeah.
Michael Eisenberg:
Why'd you do that?
You know, I mean, I was in favor, by the way, I would have voted for you. But Harvard didn't accept me.
[00:59:33] Sam Lessin:
So, look, Harvard had a tough year, if you hadn't heard. What was interesting was, I loved Harvard. I thought it was incredible. So many of my friends, my colleagues, my wife–I met at Harvard.
Like, I think it's an incredible institution. And for years, I had been the one in my friend group–everyone was saying, “Harvard's off the rails. Harvard has these problems. It's not the school you went to.” And I'd always been the apologist. I'm like, “Nah, you guys are overstating it. It's fine. There are some problems. There are always some problems in institutions of that scale.” And then after 10/7, I was like, “Oh my God, I'm wrong. Harvard is off the rails.” And it's easy to just walk away at that point, but I said, “No, this institution really matters to me. I think it's important in the world. It's not just a personal thing. It sets the tone for a lot of higher ed in the U.S. We should see what the situation is and how to put it right.”
So I put this bid in last minute, and you know, I got more votes than anyone as a write-in candidate for Overseer. I almost got 1 percent of all alums to vote for me. I still didn't get on the ballot–by, you know, 30 votes, which we can debate at a later date. Not a big deal.
But what's interesting, Michael, is I pivoted that, and we built the 1636 Forum. We now have 20,000-plus alums subscribing. You know, most of the senior administrators at Harvard and members of the Corporation all subscribe to our newsletter. We're helping people organize donations strategically.
Like we've kind of pivoted it into this really interesting outside alumni advocacy group. I've gotten more and more effective about, what are the real problems and how do you push it forward? So, you know, this goes back to tinkering. I'm a huge believer in tinkering, and I think one of the nice parts about being a venture capitalist is, in some ways, I always joke that it's kind of like being a tenured professor of capitalism. Which is like,
Michael Eisenberg:
I don't want to be a tenured professor!
Sam Lessin:
Well, of capitalism. And what I mean by that is, unlike if you're operating a company, you have like shit to do every day, like you can't cancel meetings. Like, you have people to manage and things like that. And the nice part of being a VC is, if you're like, I have a thesis I want to go learn about, or I want to try something, you can kind of reorganize your time to go try that. And you learn a lot doing it.
And so I'm constantly building software projects. I'm working on things. And this was the thing I was like, I'm going to try to push on this. What have I done? I've learned a lot. I think we're making a difference at Harvard. I think we have a direction to make more impact, but by the way, it's also a great lesson in 2024 advocacy, digital technology, and the pieces that go into that. What needs to be built? I actually am going to release a project called Standard Aptitude, which I built from a lot of the insights I learned about what's actually going on on college campuses for interviewing and hiring. So to me, it's kind of, you muck around in the world, and you wander into interesting corners and you learn things, and those inform lots of interesting directions in terms of what to invest in and what to build and how the world works.
Michael Eisenberg:
So is Harvard redeemable?
Sam Lessin:
I'm actually an optimist. So here's what I'd say. It's not just–like we talked about, the formal one-zero versus, you know, others–it's not a binary yes-no. Can Harvard be the perfect institution? No, and it never was, by the way, either. Like, there's gonna be a bunch of departments that are messed up, and weird policies in places, and things funding, you know, endowed chairs that make no sense. Like, it's going to be mucky. Can Harvard do good in the world again?
Can it be a beacon of truth and academic excellence and free speech? I actually believe it can. There's not a silver bullet. There's a lot of things that need to change to get there, but I do believe it's possible. And I think it's important. We talked about trusted brands before–I would argue that it's not the church in the U.S. It might not be the army. I would actually argue that the Ivy League played somewhat of that role for a segment of America in an important way, in leadership positions for a very long time. Which is, everyone knew each other, you kind of had all taken the same classes. There was a baseline of trust and consistency that really mattered. Not, again, for all American society, but the reality is for a lot of smart, very driven people that were in a lot of different pockets. And I think that's important, I don't want to give up on that. I think that's important.
[01:03:46] Michael Eisenberg:
Is there anything that would cause you to give up on Harvard?
[01:03:50] Sam Lessin:
Yes. I mean, look, for me, the question is–and it's a bummer–but it's always a question not of whether something is possibly redeemable, but effectively of thrust-to-weight. Which is, you only have so many hours in the day and days in your lifetime, right? And where is your time best spent? You know, Harvard–if you had the wrong president and the wrong head of the Corporation, and they wanted to go down a path that's not focused on academic excellence, and they really made strides toward that, they, you know, were not interested in academic free speech, et cetera, and they really took a bunch of steps–my basic view is, it's just pushing a boulder uphill, and that's the best way to ruin your life and waste your time, right? This is the thing you learn from startups. You want to be running, if not downhill, at least on a flat surface with a downhill on the horizon, is what I basically say. The second you look at a mountain and it's straight uphill–it's doable, it's possible to redeem it–but it's just not how I'd spend my time.
[01:04:52] Michael Eisenberg:
The academic year starts in a couple of days. You think we'll see more or less antisemitism on campus over the next year?
[01:04:58] Sam Lessin:
Great question. Well, I actually have about a hundred-page slide deck–it's called ‘Back to School’–about what I've learned about the situation at Harvard and higher ed, and what the key leverage points are, that I'll be publishing hopefully in the next day or two, getting it out there. It won't be perfect.
Michael Eisenberg:
I will include that in the show notes.
Sam Lessin:
There will be typos, but I think it will have some good points in it. Here's the interesting thing. My default is it's going to be bad. If you think about it. Like none of the fundamental problems have shifted. You know, last year there was a lot of quote unquote enforcement of policy that kind of went awry, and ended up amounting to nothing. There's a lot of reasons to be skeptical.
I've actually talked to several current students–you know, people who worked at The Crimson, people who have much more on-the-ground, informed opinions–and they're actually optimistic. What they say is, “Look, we talked about how a lot comes down to people. A lot of the key instigators are gone. They've graduated, right? They've moved on.” They've said that, you know, the reality is that while the enforcement was fairly weak at most of the schools like Harvard, one, a lot of the students were shocked there was any at all, and they all want to graduate and get jobs.
And then beyond that, you know, the schools have really–they're prepared now in a much better way than they were. The last stuff, quote unquote, caught them off guard. We'll see. And then the last thing is, there's a whole thing about the graduate schools, some of which are pretty cray cray, and kind of their relationship on campus and things like that.
So, it's interesting. I was default expecting, even a month ago I would have said I think it's gonna be pretty bad. But, I've heard some hopeful things from students in the last several weeks of talking to them.
[01:06:32] Michael Eisenberg:
Given what you said, do you think it was correct that they let these kids off from the disciplinary action, ultimately letting them graduate?
[01:06:38] Sam Lessin:
No, I don't.
[01:06:39] Michael Eisenberg:
That's bad moral hazard, I would guess, right?
[01:06:40] Sam Lessin:
Well just to tie it together, we talked earlier about this whole formal versus informal systems, and enforcement. If you have a policy of one plus one equals two written down, and you do not enforce it, it is no longer a policy, right?
And so this comes back to what we were talking about before, which is, how do you manage humans? How do you manage systems in highly dynamic, complicated situations? And one of the most important things–you have kids, I have kids–is, if you don't set a rule, you can negotiate.
If the rule is set, you have to follow through on the rule, whether you want to or not. Now, I will say–and this goes back to the whole media thing, what you trust and don't trust–it's always more complicated than it looks. It would be great if Harvard had a single policy for these things. It doesn't. Each school has its own policy. The policies conflict. People look at one school's policy and they get confused about the policy. So I'm not letting them off the hook. They screwed up, and they should have enforced their policies. But I'd also say it's not quite so black and white as people make it out to be.
[01:07:42] Michael Eisenberg:
I want to finish up with a couple of reasonably quick questions, although one of them, I think, will be harder. So, Arielle Zuckerberg told me I should ask you something. She said, “First of all, you have off-the-charts charisma and infinity hot takes,” and what she said I should ask you is: what motivates you to keep going?
I think I know the answer to this question, but I don't know if they're your words. What motivates you to keep going?
[01:08:01] Sam Lessin:
Well I'll tell you the flippant answer I always give is, I love it when people think I'm wrong, but I'm right. And then I can later say, “Ha ha, I told you so.” So I'm mostly in it for the fuck you, I'm right.
And that's kind of what I care the most about–if I can, every year or two, get to point and say, “I told you I was right, and you didn't believe me, and fuck you.” That's just a great feeling. And I would argue that capital is a great way to express that. It's like one of the most pure systems for that.
So, I mean, that's one thing. But look, I will say, I think about our portfolio and the things I'm most proud of, and it really, really all comes down to people. It's individual entrepreneurs who you were able to enable, and they remember it. And they're like, “You didn't do this, by the way. This is my thing.” And any venture capitalist who thinks they did something is kidding themselves. But like, you opened a door, and that made all the difference. And that feeling I absolutely love, especially when people are being counted out. I mean, part of my anti-establishment thing is like, the kid who went to Stanford and Y Combinator and has 87 term sheets from every single fucking person on earth–like, fuck that kid. They'll either be fine or they won't be fine, right? But it's not like they weren't enabled. I think the key is, can you give someone a shot?
And then when those shots happen, and convert, and make a difference, that just feels awesome.
[01:09:32] Michael Eisenberg:
That line reminds me of my good friend Roland Fryer, professor of economics at Harvard, who I love dearly. So both Kevin and Will told me to ask you a question. I would have asked it anyway, by the way, because I knew your dad, who was an incredible person. And he was pretty legendary. And what I'd love to know is, how did your dad impact you, impact your career path and kind of some of the most important lessons you took from him?
[01:09:55] Sam Lessin:
Oh my God.
[01:09:56] Michael Eisenberg:
I know you were pretty close with him.
[01:09:57] Sam Lessin:
I was, in so many ways.
I mean, the sad, sad slash nice thing for my dad is, my dad had a lot of sickness in his life, and knew he was going to die young. And so I think that, you know, he died at 56, and I think he made a point of writing a lot, but also just spending a lot of time and really teaching me.
I felt like I was under his tutelage for years–through the internet, you know, 1.0, and his career and things like that. I miss him a lot. There's so many lessons. I mean, he wrote a book, ‘Lessin's Lessons,’ which is actually quite fabulous, about his life lessons across the board. I don't know.
I think that he really was a joyful guy who appreciated life, and appreciated quirky people, in a really deep and personal way–that's one of the things I took away. I think I took away dreaming from him. And, you know, kind of seeing the big picture, and a lot of amazing things.
The guy also loved, you know, The Post, and newspapers, and world events, and all sorts of stuff. So it's hard to distill, but I took a lot from him, for sure.
[01:11:01] Michael Eisenberg:
Amazing. One last question. For the listeners who don't know, Sam is an incredible skier. An incredible skier. I'd love to know.
[01:11:08] Sam Lessin:
I got that from him. He made me ski a lot growing up. But yeah, go for it.
[01:11:12] Michael Eisenberg:
I only saw him in the boardroom. That was, I guess, tougher than skiing with him. But he was an incredible board member. What are some of the lessons you take from skiing to your business life?
[01:11:24] Sam Lessin:
That's an interesting question.
I do love skiing. I ski a lot. I grew up skiing. And at this point, we do some pretty wild things. Here's what I basically say: one of the things I think I've figured out over time is, people think that I'm a risk-seeking person, right, in a lot of ways.
I certainly sound like one on TV. I actually think that by default I'm not. So I'll give you an example: the first time I went helicopter skiing. Fucking scary, right? Like, you're landing on top of the mountain, sometimes on a knife edge, there's like blades flying around, there's all this crazy stuff.
And your heart rate goes up, and it's scary. And now–again, you still need to respect the environment, you need to respect the machines, you need to respect safety, there's a lot that you need to be prepared for–but there's this thing of understanding calculated risk, right?
Understanding what actually feels unsafe versus what actually is unsafe, because there's a difference there. And where that line is, of your personal ability and what you can take on. There's just a lot of lessons from being around those types of things, so that now, all of a sudden, you know, I can go around and have a blast.
And my heart rate isn't necessarily going up. It's not like you're at an 11, right? In a lot of ways. And so the difference between perception and reality of risk, I think, is a really important thing from skiing. But I think it's something you practice.
I'll give you another example which is relevant, which is, last year I got my pilot's license. And here's the thing, I am not one of those people who got on an airplane for the first time and was like, “This feels awesome, I love flying airplanes.” I got on an airplane for the first time to fly it, and then especially to land it–landing's the hard part–and I was fucking terrified. I have a Garmin watch, and I was getting an abnormal heart rate as I was trying to figure this stuff out. And then by the time you get your license–again, I'm not the world's greatest pilot, I don't have 10,000 hours, but I can fly alone and I have a license–you no longer have the abnormal heart rate.
And so I think the biggest thing–in a lot of life, but certainly in startups–is just understanding, one, perceived versus actual risk, which you certainly get in more extreme skiing. And two, that learning to manage your emotions and your reaction to stress is a practiced thing, not an innate thing. Those, I think, are the really key lessons, and I think they apply to entrepreneurship a lot.
[01:13:48] Michael Eisenberg:
I love that. I wanted to ask you, as a last question: what is one thing you think about the world and its future, or technology and its future, that you think nobody else asks or thinks? I think it's a perfect last question after your last response.
[01:14:03] Sam Lessin:
Oh, man. What is one thing that no one–“no one” is enormous. Eight billion people–I'm sure someone thinks almost everything.
[01:14:10] Michael Eisenberg:
The unique Sam view.
[01:14:13] Sam Lessin:
So I'll tell you the thing that–I'm not sure this is fully unique, but I've certainly been on it for a while, and I'm really excited about it as a unique view–which is, I am really excited about entrepreneurship in America, but I think the model of the Mark Zuckerbergs and Elon Musks and, you know, Jeff Bezoses is in the rearview.
So let me explain what I mean by that. You know, Americans love to be entrepreneurs. They love to build–so do Israelis, right? The freedom, the creation, the uniqueness–there are so many things that people love about it. But for 20 years, I would argue, the narrative of what an entrepreneur is has been captured by the Forbes headlines, the Zuckerbergs of the world, or whatever.
And so you've had thousands of kids pouring out of universities and saying, “I want to be an entrepreneur. And what that means is I'm going to Silicon Valley, and I want to be the next trillionaire.” And I actually think the big thematic thing that I'm very much after, and that I see happening, is: everyone's just as bullish about entrepreneurship, but that era of chasing that is over. And the reason is that there was this moment in history, with the birth of the internet–I don't think AI is it, although everyone wants it to be another echo boom of it–where you really could change the world and build platforms that had never been seen before, on terra nova, at that type of scale.
And I just think what we found is that there's like 10 of them. They're mostly done. And if you're going to spend your life pursuing that, and then almost certainly failing–that sucks, right? Whereas what doesn't suck, and where there are enormous opportunities, is in the towns of America, having relevance in small businesses. All these things which are not going to make you a hundred-billionaire, but can provide incredible lives for you, incredible meaning in a community, and purpose in all sorts of places.
And so I guess it's not a tight answer, but I'd say the thing I believe is that entrepreneurship is alive and well. But the narrative of what entrepreneurship is, is moving away from Silicon Valley and trying to be the next kajillionaire, to instead being really important and meaningful in your community, whether that's a community online of creators, whether that's a town, like it's really shifting pretty dramatically and pretty quickly.
[01:16:36] Michael Eisenberg:
Sam Lessin, thank you so much. This was an incredible conversation. It went in places I didn't expect it to go. So thanks for playing along.
Sam Lessin:
It was fun.
Michael Eisenberg:
It was fun! You can learn more about Sam on X–that's @Lessin, spelled L-E-S-S-I-N. And if you enjoyed this podcast, please rate us five stars on Spotify, Apple Podcasts, or wherever else you listen.
And Sam has a podcast, also called More or Less, with Dave Morin and Jessica Lessin.
Sam Lessin:
And Brit Morin.
Michael Eisenberg:
Is that correct?
Sam Lessin:
It's the two couples. It's the More-ins, get it? The More-ins.
Michael Eisenberg:
Oh, I got it. I love it.
Sam Lessin:
And the Less-ins. And the Morins think everything is great, and the Lessins think everything sucks. That's kind of the shtick.
[01:17:15] Michael Eisenberg:
And I highly encourage you to listen to Sam and Jessica's and Dave and Brit's podcast. Sam, thanks for joining.
Sam Lessin:
Thanks for having me, Michael.
Executive Producer: Erica Marom
Producer: Sofi Levak & Yoni Mayer
Video and Editing: Ron Baranov
Music and Art: Uri Ar
Design: Rony Karadi