Narratives
125: Jeff Huber - What comes after modernity?

In this episode, I'm joined by my friend Jeff Huber to discuss just what comes after modernity. We also discuss the current moment, can we build a definite vision of the future, Peter Thiel, the role of Christianity, and a whole lot more. You can check out Jeff's work at https://twitter.com/jeffreyhuber

Transcript:

William Jarvis 0:05

Hey folks, welcome to Narratives. Narratives is a podcast exploring the ways in which the world is better than in the past, the ways in which it is worse than in the past, towards a better, more definite vision of the future. I'm your host, William Jarvis. And I want to thank you for taking the time out of your day to listen to this episode. I hope you enjoy it. You can find show notes, transcripts, and videos at narrativespodcast.com.

Will Jarvis 0:40

Well, Jeff, how you doing this morning?

Jeff Huber 0:43

Doing pretty good. It's pretty cold down here, it's kind of funny. Yeah, it's good. It keeps you awake. That's right.

Will Jarvis 0:49

That's right. It's good. Well, Jeff, thanks so much for taking the time to come on the show. Do you mind giving us a brief bio and some of the big ideas you're interested in?

Jeff Huber 0:58

For sure. So my name is Jeff Huber. I live in San Francisco. My day (and somewhat night) job is running a fairly new startup called Chroma; we focus on AI tooling. And then my night-night job, I guess, is some of the big ideas that I've been interested in for a decade or more now, which, one way to say it, is: what comes after modernity? And to frame that up a little bit: as many people have observed, it's not really a novel observation in some sense, but it's seeing a lot of the apparatuses and beliefs that were certainly true when I was born, and even growing up, just fading away year over year, and that leading to a lot of sort of weird and sometimes violent manifestations. And kind of wondering, what comes next? You know, are we at a Fukuyaman End of History? Are there new, good forms of government? Do we just regress back into, you know, clans and more scorched-earth, zero-sum war? What happens next? And of course, it's the future, so nobody really knows. But maybe that's why it's the most fun thing to talk about. Because nobody knows, there are truly no experts, right? It is truly anyone's best guess. And that's the fun part. So, I love that.

Will Jarvis 2:29

I'm curious, I want to get started with the first question; it's something I've always been curious about. How do you think about defining pre-modernity, modernity, and then something like post-modernity?

Jeff Huber 2:39

Totally. So, pre-modernity, in my mind: one thing that I like to do is use this framework and define, for any given time, what do they believe about the nature of man, or mankind; the nature of the material universe; and then the nature of history, or the meaning of history. And so in pre-modernity, you know, man was violent, maybe sort of a Hobbesian man, right? Nature was enchanted: you know, we worshipped the rocks, we believed that there were divine and spiritual forces running through all this material stuff. And history was cyclical; there was no dialectic or progress to history. It was just sort of, who's in power this century, more or less. That was the framework, right? And then my reading of what happened is, really because of Judaism, and then Christianity, and then Christianity being spread through Rome and through the Roman Empire, it spread the idea of man as rational; the idea of the material world as knowable and studyable, because there's a good God; and the idea that progress is possible, because there's a dialectic, there's a meaning of history and a progression, you know, in the sacred texts of Christianity. And so those ideas, secularized, led to liberalism and democracy as we see them today in the West; they led to the scientific revolution, because you can study nature and you can understand it; and they led to the technological revolution, because we can make new technologies that improve our lives and make them more abundant, etc., etc. And post-modernity, usually how it's defined, is a skepticism of meta-narratives. That's one simple way to define post-modernity; there's much more there, right? There's your Derridas, your Foucaults, your Deleuzes and Guattaris, the whole crew of French postmodern thinkers, the Frankfurt School.
And I don't actually think of them as after modernity; I think of them as hyper-modernity. They took the skepticism that modernity had for meta-narratives and pushed it further: they are the accelerationists of modernity, fundamentally. And in that way they are not representative, in my mind, of what comes after modernity. They're instead representative of, you know, sort of late modernity; they are the end of modernity, fundamentally, they are modernity eating itself. And I think it's kind of obvious that the nihilism that post-modernism generally brings is not really tenable or holdable by any human individual in practice. Right. But still, it represents this hyper-modernity in my mind. So. Got it. Got it.

Will Jarvis 5:33

That makes sense. That makes sense. I'm curious: it seems really difficult to judge the current moment if you're sitting in the current moment. Yes. This is a really difficult question, but how do you think about parsing where we are currently? Is it just trying to compare it to the past as best you can? It just kind of feels like you're always looking at the side of an elephant, you know? How can you tell it's an elephant if you're right on top of it? It seems really difficult.

Jeff Huber 6:03

Yeah. Yeah. Or, you know, fish don't know the water they swim in, right, that kind of thing. I agree it's challenging. That's not to say that it's impossible, and it's not to say that it shouldn't be attempted. Yeah. And very few people attempt it, to be fair, right? And the people that do attempt it, I commonly find, do a really bad job. That's not to say that I'm doing a good job; I might also be doing a bad job, but, you know, love to get that feedback. So yeah, it's sort of two questions, right? How do we evaluate the current moment? And then, what is the current moment? So maybe to take the second one first: what is the current moment? I see that, essentially, the belief in man as rational has started to fall apart. Maybe you can point to Dan Ariely's Predictably Irrational, you know, the whole behavioral economics movement, as saying: hey, actually, we don't believe this, we don't believe that man is rational. And people went, oh yeah, actually, you're right, we don't believe that. And, you know, man as rational obviously has deeply intrinsic to it the idea of man as individual as well, and increasingly that's not a popular thing, for people to be individuals with individual choices and individual rights; it's more about mass collective identity. And so, again, it's hard to point to exactly where and why and how this started. But I think that, on the topic of man, mankind, and the philosophy of that, it's evident to me, at least, that people don't really believe that man is rational anymore. Another good example of this: if you believe that man is rational, then the best way to create a society is to make everybody extremely educated.
Because that would mean that people are smarter, and then they can make better decisions collectively, right? Society would be better. And yet nobody actually believes in education anymore either. Nobody believes, well, a lot of people disagree with my political points, but if they were more educated, they'd agree with me. Nobody actually thinks that; people just want to wield raw power, as opposed to trying to convince their opponents, or the next generation, through rational arguments. So that's the rationality point. On science: there's just so much fraud, fundamentally, that happens in academic science, especially today. You have p-hacking left and right; you have people having the research completed before they write the grant application, because then they know the grant application will work and they get more money; you have people just doing marginalia over and over and over again; in the social sciences, you have people just applying critical theory to X one more time, making no real contribution to the future. And then in technology, this is not unique to me, right, but there's a general observation of a slowdown in technological progress and ambition, and I think that also slots in here. So that would be the observation. How would you even come to believe that potentially we're exiting this thing called modernity into after-modernity? It's to look at what values we held during modernity, and then just look around you, ask your friends, look at the news, and say: well, is it obvious and evident?
Or, you know, is it even apparent that in the world around me, people broadly believe that man is rational, that the world is understandable, and that progress is possible, right? And it seems like on all those vectors, the answer is no. So that's one man's analysis, at least.

Will Jarvis 9:18

I like it. I like it. Even this event we're part of this week, AI Grant, it's quite interesting, because it almost seems like, to your point, it's something like: well, thinking, the ability to reason, all these things, rationality, is too difficult, so we're gonna offload them to the computers that can take care of these things for us.

Jeff Huber 9:37

Yeah, there's definitely a reading of AI as not a deterministic, optimistic future, but kind of an indeterminate, you know, maybe pseudo-pessimist, pseudo-optimist future. How does it get better? We don't really know; we just need more data, and maybe it'll get better. Will it be good? We don't know. Maybe it'll turn everything into paperclips, or maybe we'll get, you know, a happy, perfect society at the end. So, you know, I don't think that's the only framing of AI, but that is the pop-cultural framing of AI. It is indeterminate in that it is neither good nor bad, sort of like the protagonist in many modern TV shows, right? There's no good or bad protagonist anymore; it's always complicated. Yeah. And I feel like that's true about AI too. It's complicated.

Will Jarvis 10:27

Ya know, you make such a great point there. Like, we saw Sam Altman last night, you know, with this vision of this abundant future driven by superintelligence. And then, like you said, right across the bay here in Berkeley, it's all about the alignment problem, it's definitely going to kill us. Maybe that will peter out now that those people don't have the money anymore after the FTX explosion. But it does seem to be something like, wow, either it's just absolutely going to kill us, or it's absolutely going to be this crazy abundant thing, but it's unknowable what's going to happen, and we just need to push through it as quickly as possible.

Jeff Huber 11:03

Yeah, I mean, in general, I think a more positive view of the meaning of history (and this dovetails a little bit into ideas I have about what healthy forms of these views could come after modernity, on progress specifically) is being just less utopian about it all. Neither saying that AI will cure all ills in the world and make a perfect society, nor saying that AI will end the world and be the end of mankind, right? Both of these are either very utopian or very dystopian. Yes. And, you know, I sort of fundamentally believe that only one being has the right to start or end history, and that being is God; any attempt to play God is, you know, necessarily blasphemy, basically. Yeah. And the call of the person in the world is not to end history, or to start history; the call of the person in the world is to work at the margins to make the world a little bit better than they found it, basically. Yeah. And it kind of goes into the life extension thing as well. Should we make it so that humans can live forever, upload themselves to the matrix, or are able to replace our organs forever and ever? I think that is essentially blasphemous. But I think the idea in general that we should give everybody five more quality-adjusted life years is a good thing. So I think that also applies to my views on AI, which is: neither will it be this utopian thing that makes everything perfect (humans are not perfect; it's not gonna be perfect), nor will it be this totally dystopian thing that won't be able to be controlled and will take us all over. I think it's probably somewhere in the middle.
And that's the most healthy thing to do: just, again, view the marginal impact and try to make progress at that margin.

Will Jarvis 12:49

Right, that makes sense. That makes sense. Does this feed into why you're working on AI, and what you're building at Chroma?

Jeff Huber 12:55

It's not fully connected yet, I don't think. Yeah, the streams, at least in my conscious brain, are a little bit different, though maybe there's some subconscious psychoanalysis here and there which has linked them together. You know, I certainly think that the future is open, that the future is possible. I believe that people broadly do not believe in the future, do not care about the future. Most people are extremely obsessed with litigating the sins of the past; that's really all they care about, or maybe, you know, litigating the potential sins happening in the next month. But nobody believes in the far future. Obviously, long-termism is very in vogue these days, even though its financing will maybe be a little more meager in the next year. But I think long-termism is not a new idea. Long-termism goes back to the individuals that worked thousands of years ago to build large monuments across the surface of the earth. I often think about the cathedrals in Western Europe: these giant structures that took over a century to build, took tens of thousands of man-hours, probably, equivalently adjusted, near a billion dollars to make. To work on something that you cannot finish in your own lifetime, I think that's a big deal. And obviously now you have things like the Long Now clock and such, which I think are trying to create a modern symbol of that. But yeah, back to the question: how does this interact with my day-to-day work? Yeah, I'm not exactly sure.
I guess one other quick sidebar here: we talked about this last night, but I think that fundamentally the two primitives in the world are near-free intelligence and near-free energy. The two primitives are energy and intelligence, and making those near-free unlocks new abilities for abundance in the world. Yeah. And scale, fundamentally, and obviously, you know, power. I think there's a really good two-by-two from, I think, Andy Crouch about power, which I think is a useful framework. The two-by-two is power and vulnerability: no power or lots of power, no vulnerability or lots of vulnerability. And people only ever analyze power along one dimension. We understand what no power and vulnerable looks like; that's the homeless person on the street. And we understand what power and no vulnerability looks like; that's the dictator. Yeah. But what is power and vulnerability? I mean, from a Christian perspective, that's probably Jesus, or God: both transcendent and immanent, right? Yes. And I think it's underrated; people don't ever think about things in those terms. And so when you think about technologies that could create a lot of power, obviously the individuals that steward them are really important, but it's not instantly, you know, bad for the world; there can be good incarnations of it. And it's not that it'll naturally just happen that way, that the cards will just fall that way. I think it's up to us, collectively, to choose which way these things fall. And we have to choose.

Will Jarvis 16:05

Yeah. This is kind of a weird question, but do you think it is possible to create a new definite vision of the future? You know, Peter Thiel is always hammering this: we need this, we need this, we need this. But then you ask him and he's like, I don't know, I don't know. And you're like, wow, you know, you're a lot smarter than I am, so I don't know how the heck I'm supposed to figure this out, right? But yeah, do you think it's possible to do that at this point?

Jeff Huber 16:29

Yeah, it's funny you bring up Peter. You know, if you have one critique of Peter, that's the best way to get Peter; none of this other stuff that people write about him really is a good critique. The best critique of him is: okay, where's yours? What is it? And, I don't know, maybe there's a generational aspect of this, right, where he's kind of like: okay, my job is to be the signpost, not necessarily to point the direction, or whatever. But

do I think it's possible to do it? Yes, I do think it's possible; I mean, anything's possible. Do I think it's probable that it will gain global scale? No. Do I think it's possible that it'll gain scale in some local communities? Yes.

And I think that, basically, the turn you have to make to get to the possibility of a definite vision of the future is the turn from late modernity to being open to what's next. So, you know, it's basically the turn from thinking that this is the end of history to accepting that there are going to be new possibilities, new chapters, new forms of government and new forms of science and new forms of technology; that there are ways to do those things that are new and different. I also want to put some context around "new." New in and of itself doesn't work, because newness is flighty, and newness doesn't guarantee that it will resonate at scale, right? So I think it has to be enriched and informed by history, and for whatever population you're talking to, it needs to almost tap into their psychosocial primitives to be able to have traction and get scale. And then I think newness is in some ways overrated, in that there are a lot of really good ideas in history that have been lost and ignored, and you can probably, to a large degree, just pick those up and repackage them, and that's good enough. So yeah, "new" is so intrinsically a modern word, right? The idea of newness is coincident with modernity. But the idea of these hybrids of the past and the current moment and the future is maybe more of an after-modernity framework for thinking about what is new. So anyways, that got a little meta pretty quickly there. But it was fun.

Will Jarvis 19:06

Absolutely, I like that. That's cool. Where do you see Christianity fitting into this whole puzzle?

Jeff Huber 19:13

Yeah, I mean, in the West, Christianity is still the water in which we swim. That's not to say you can make this argument by looking at church attendance or polling or whatever else. What I mean by that is more, you know, the work that Tom Holland has done in his book Dominion, and other people have talked about this too, it hasn't just been him, obviously: the idea that the values which we as the West still hold are intrinsically Christian, basically. So yeah, I think it's the water in which we swim; it's still the culturally dominant thing. There's even an argument that atheism is also Christian in some sense, because it was only the belief that nature could be de-animated which led downstream to the idea that atheism is possible. But I'm sure atheists are gonna hate that, so please do not DM me on Twitter about that; I will not argue with you. But yeah, I think it's really important. When I think about the possibilities around this definite-optimism view, philosophy, theology of the future, it being about looking for founts, or sources, of thought, as I mentioned earlier, these sort of hidden, lost ideas from the past that could fall on the rich, fertile ground of where we are right now, I think looking at Christian history is a really good way to do it. So, a lot more to unpack there, but that's the overview.

Will Jarvis 20:46

What kinds of things do you think are useful to unpack from Christian history? Do you have any specific examples that come to mind?

Jeff Huber 20:55

Yeah, there are a bunch of vectors here. One vector is the vector away from rationality, rationality as God, at least. I think it's a good thing that people are generally rational, and when people are irrational, it kind of gets under my skin. You know, I get under my own skin pretty frequently for that reason. So it's not that we want people to be more irrational; it's that we don't view rationality as the end-all, be-all, as God, basically. And I guess observationally, one way that you could observe this change is: are people on the whole getting more mystical? Interesting. And to me, the answer is definitely yes. Yeah. You know, you've heard many people say that New Atheism is cringe now in a way that it wasn't 15 years ago. All my friends are becoming more weirdly mystical in different ways. Burning Man is huge now. I think, to me, it's self-evident that there's almost this new hippie movement, which is much more open to, I mean, one way to say it is, other-dimensional possibilities. Yeah. Besides just the cold hard facts of rationality and atoms, right?

Will Jarvis 22:20

It's interesting. So do you think we should lean into that more as we go?

Jeff Huber 22:24

Well, it's gonna be both good and bad. Yeah. I mean, you'll probably have, you know, new Jonestowns; you'll have new crazy cults, right? I do think that's definitely gonna happen, so that would be the bad side. But the good side is that, you know, one of the lies that I think the Enlightenment and modernity told about mankind was that he was this fundamentally Gnostic being, and by Gnostic I mean that he's primarily a thinking being, and the body is this sort of sub-human necessity for the brain to move around. People joke about this being true of things like Presbyterians, or Protestants: the body is just a bag of flesh to move the brain around, basically. And that's just a very modern idea. I think that's just not true about humanity. Humans need transcendence; they need mystical experiences, right? That's just a very important part of being human. Some people call that, obviously, religion, God, whatever. And so I think that, in that sense, being more open to this as a culture means that more people will explore this, and more people will be able to be more authentic to their humanity, which is probably good. And then there might be some very specifically good flavors of it as well, I guess, beyond just the generic goodness. But

Will Jarvis 23:58

That makes sense. That makes sense. How do you think about this problem where, okay, we understand now that the scapegoat is innocent? Yeah. And yet people still try to do it; we're still trying to scapegoat others, etc., etc. Yeah. How do you think about what comes next? Do you have any thoughts on what we do next with that?

Jeff Huber 24:33

Yeah, so, you know, maybe your audience is already quite familiar with Girard, mimetic theory, etc. I do think that's one strong candidate for a view of mankind post-modernity, after modernity: man as mimetic, not man as rational. You know, the exact political implications of that I have not unpacked. I do tend to think that the view of man as mimetic is sort of a post-liberal red pill in hiding. And, you know, post-liberalism has a bunch of crazy people in it, so that's, I mean, yeah, it's sort of a trigger word in some ways. But yeah, the idea of what will happen to scapegoating: I don't know. I mean, Girard would say that scapegoating will become more and more extreme because it becomes less and less effective. The less effective it becomes, the more it's unveiled, the more it has to try to assert that it still works, and so the more extreme it gets. Yeah. It's not actually evident to me that there's a progression like that. But I think another way to think about this is: one thing that I would like to see (whether it's possible or not, and whether it would be on net good or not, is another question for sure) is a world that is more local and less global, simply because global contagion is bad. And, you know, with global contagion, basically what will happen, maybe eventually, is that we will have a truly globalized world, and there will still be violence, and then a single person will come along, named the Antichrist, who will promise that he will restore peace and security to the face of the earth, sort of in exchange for our souls, basically.
So this is, again, the Christian story of the Antichrist, and basically global scapegoating. You know, just look at, quote-unquote, "the current thing," right? It is increasingly becoming a global contagion, and I think that is very dangerous, extremely dangerous. And so, in some sense, if you believe that man is mimetic, you'd rather have a relocalization of communities simply to prevent the risks of global contagion. There might still be scapegoating, but at least it would be at the community level, which seems less bad than scapegoating at the global level.

Will Jarvis 27:03

Yeah, definitely: less scale, less, you know, industrial methods of doing it, etc.

Jeff Huber 27:07

Yeah, exactly. And also, these things don't necessarily line up community over community, right? There will be different variants of them; people will feel justified for different reasons. It won't be like it feels today. And, again, this is dangerous to say out loud, but there are certain, let's just say, certain political leaders in the world where it feels like, if they got taken out for some reason (and this is meta enough that it can fit almost anybody, right), we would all be pure again. Gotcha. And the world would be good again, and we would all be pure and good and clean again. Yeah. And the blindness that would come over a world that felt, all in one moment, that it was purified, is terribly frightening, right?

Will Jarvis 27:55

Like, this is a huge, huge red flag. Yeah. So do you think, at some level, people are overrating existential risks, then? Because if you get obsessed with these things, they can become all-consuming, and then you want to have some solution to fix it. And that looks like...

Jeff Huber 28:17

Yeah, I mean, I guess the question is: what is the right amount to care? Because never paying attention to it at all, never taking it seriously at all, is probably bad. Taking it way too seriously, and not being able to get out of bed in the morning because AGI is going to turn us all into gray goo three years from now, is also bad. But it goes back, frankly, to this broader idea of: is progress possible? Do we get out of bed in the morning and work hard to make it happen? Is it going to happen without us? Or is it related to how hard we work to try to address it? So, anyways, back to your question about X-risk: it's likely to be overrated by those that obsess about it, and likely to be underrated by those that don't pay attention to it. That's a very generic point, and exactly what the correct amount of rating is, is anyone's guess. But again, on the margin, people that are obsessed about X-risk could be a little bit less obsessed about it, and people that pay no attention to it could pay a little bit more attention to it. Yeah. The problem is that there are many existential risks that are oftentimes overlapping, and oftentimes at odds, and there are also the unknown-unknown risks that nobody's paying attention to because they're unnoticed. And so maybe there's some sort of, I won't try to pick the analogy here, but some sort of thing where, if we focus entirely on AI risk, we miss the other risk that truly is going to take us out. So yeah, I think, again, obsession over X-risk is fundamentally almost a religious eschatology.
It is a narrative about the end of the world, and that end of the world happens to be really helpful. You know, having an immanentized eschatology happens to be extremely helpful for giving you focus and meaning, yes. However, there's a reason people say you should not immanentize the eschaton: because when you immanentize the eschaton, really bad things happen. It justifies a lot, right? When you think the eschaton is imminent, and when you are working to immanentize it, basically any violent act, or any act of fraud, all of a sudden becomes totally fine. Right. And I think that's bad.

Will Jarvis 30:55

Perhaps to take all your customers' money, put it on red, and, yeah, I

Jeff Huber 30:57

wouldn't put it past them. Yeah, hypothetically, hypothetically, that might happen. Yeah.

Will Jarvis 31:03

That's a bit of a danger. But I'm curious if you have any thoughts on this: do you think there's something wrong with how a lot of people have been thinking about expected value calculations, like the discourse around longtermism, or just in the context of the FTX blowup? It seems like perhaps you can get these things brutally wrong, or underrate the risks of doing these things, or something like that.

Jeff Huber 31:31

Yeah, I mean, obviously, expected value has the word "expected" in it, and "expected" is basically the same thing as a guess. So as long as people have skin in the game, I have no problem with them making their own EV calculations. It's when it puts society at risk, or it puts innocent people at risk, or when the person who makes the bet doesn't have skin in the game, right? That's the more bad version of that. So yeah, there's the classic economist line that markets are perfectly efficient, so there are no $20 bills lying on the ground. Yet you have private equity and hedge fund groups that have been extremely successful at finding mispriced assets and, you know, whipping them into shape. And yet the average trader can't beat the market. So it's kind of unclear: has anybody got this figured out? We want to say no, but some people have been extremely successful. Does that just mean they've been lucky, and their time to be unlucky hasn't come yet? So yeah, the broad view is basically: I'm totally fine with people making their own expected value calculations, as long as they bear some of the downside.
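As an illustrative aside (not something worked through in the conversation itself, and with made-up numbers): a toy simulation of why a bet with positive expected value can still be ruinous when you repeatedly stake everything. The average outcome grows every round, but almost every actual path goes bust — which is exactly the gap that bearing the downside forces you to feel.

```python
import random

def play(rounds: int, seed: int) -> float:
    """Repeatedly bet the entire bankroll on a coin flip that pays
    2.2x on heads (probability 0.5), so each bet has EV = 1.1x > 1."""
    random.seed(seed)
    bankroll = 1.0
    for _ in range(rounds):
        bankroll = bankroll * 2.2 if random.random() < 0.5 else 0.0
        if bankroll == 0.0:
            break
    return bankroll

# The expected bankroll after n rounds is 1.1**n, which grows without
# bound. Yet surviving n rounds means winning every flip, which happens
# with probability 0.5**n — so nearly every individual path ends at zero.
results = [play(rounds=20, seed=s) for s in range(10_000)]
bust_rate = sum(r == 0.0 for r in results) / len(results)
print(f"bust rate over 20 all-in rounds: {bust_rate:.4f}")
```

The payout multiplier and round count here are arbitrary; any all-in strategy with these properties shows the same divergence between the expected value (great) and the typical outcome (ruin).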

Will Jarvis 32:49

Makes sense, makes sense. All right, going off that, I want to talk about efficient markets for a minute. Clearly, you're somewhat skeptical, because you're working on a startup. So you do believe it is possible to find these things. But at the same time, it does seem to be very difficult to beat the broad equity market. There's some tension there. How do you think about finding $20 bills on the sidewalk? And how have you tried to systematize that when you're building the new venture here?

Jeff Huber 33:20

I mean, the idea that there are no $20 bills lying around, that the market is completely efficient — I think that's just intrinsically end-of-history. There's nothing left to be done that would be good or profitable, and so we should just rent-seek and chill, which I'm not about. And it's not true empirically, right? No study has ever proven it to be true. So it's kind of this collective delusion, this sort of mass contagion, that we all believe when it's just not true at all. But I think it is coincident with, again, it is coincident with modernity. It is a very rational idea to believe the market could be efficient. It's a very rational idea to believe that a more efficient market over time will lead to more scientific and technological progress and all the good things. And so it makes sense that people believe it in context, but when you actually look at the facts, I feel like it doesn't make any sense at all.

Will Jarvis 34:25

Makes sense. I want to go back a little bit and talk about localism, globalism, a couple of these things. We're sitting here in this great global city of San Francisco, but it seems like localism is something you're quite interested in. How do you think about building community in a place like this? Has it been difficult? Has it been pretty straightforward? How do you think about that?

Jeff Huber 34:52

It's a good question. I don't think about San Francisco as a global city, actually. Maybe it's because of all the NIMBYs that are here, but it tends to be quite local, right? And quite navel-gazing, for better or worse. It does feel like a small place once you've been here for a while — and I've been here for over 10 years. You start to just see people you know on the street randomly; it can start to have a local community vibe. That being said, I've had many sets now of best friends that have just moved away, and you kind of get used to that. And obviously, because of the internet, you can stay in touch with them, and I do. So I have some sympathy for kind of the Balaji network-state idea, that communities will be digital primarily versus physical. That's underrated. But the physicality is probably still underrated as well. So yeah, I don't know — I have this dream, and maybe it'll happen in the next 10 years, maybe it won't, of building a kind of monastic structure somewhere in rural California. It would be a place where there are full-time, you know, almost monks or nuns, or other variants of that. And it would also be explicitly set up for groups of maybe five to 10 families and their kids to come and spend, you know, three to seven days in the same place together. And that being a tool that allows people to build some of these deeper, closer community connections. There's kind of nothing like that today. You could all go to Lake Tahoe and get Airbnbs next to each other, but that's not the same thing.
Because once your kids are asleep, you have to be in the same house as them — you can't leave at night. Or you can try to get a giant house, but there are only a few of those, and they're very expensive and ugly. So this is, again, one of my $20 bills that I think is lying on the ground — maybe not from a profit perspective, but certainly from a cultural-impact perspective: having more places or spaces that people can get away to, right? I don't think it's important that people live like this every single day, necessarily. But, you know, the idea of being a man of the town and then retreating to the countryside is as old as time, right? Yet if you want to build community in that way, there aren't really good tools right now. So anyway, it's kind of unrelated, but that's one vision I've had.

Will Jarvis 37:45

I quite like that a lot. Do you think physicality will become less important over time? Or have we reached a local maximum with remote work, etc., and will the efforts to make these things more immersive continue to the point where the physical just doesn't matter anymore?

Jeff Huber 38:11

I think it's kind of interesting, because in some ways the digital will matter more and more, but then maybe also the physical will matter more and more. Gotcha. So setting them up as opposed forces is what I would disagree with. Yes, it's sort of a false duality. Yeah, I think that physicality will become, again, much more mystical, much more sacramental, much more spiritual — at least on one vector.

But at the same time, I think it's foolish to say that VR will never be good enough to spend an hour a day in just because I hate it and I don't like Mark Zuckerberg. That's just stupid. And maybe it's not worth a quarter-trillion-dollar investment from one company, but some entrepreneur that nobody knows about, or who maybe hasn't even been born yet, will figure it out. And I think that, yeah, there are all kinds of, again, really bad things people will use that for, but also really good things.

Will Jarvis 39:27

Well, Jeff, what do you think the next 10 years look like for yourself?

Jeff Huber 39:33

Yeah, I'm pretty focused on building this company right now. I think it's really important to the future of, let's say, machine learning. I still like the line that machine learning is done in Python and AI is done in Excel spreadsheets. For whatever reason, AI is becoming less taboo to say in recent times, but I'm just going to say machine learning. So yeah, I think the project I'm working on, Chroma, is really important to the future of machine learning. Basically, we're taking it from alchemy today — where there are a few people who can figure out the right magical cocktail of ingredients to get it to do something — and turning it into more of an engineering discipline, where it's deterministic, where it's possible to make progress, make the model better, get results, be able to trust it, make it aligned, if you want to use that word, make it safe, if you want that word. And we think that Chroma is the way that that will happen. And we're building it in open source, so it's kind of going to be this community thing. So that's a big emphasis. I think that over the next 10 years we can get there; we can actually make that a reality. So that's exciting. Yeah, and maybe we'll build that monastery in the next 10 years, should the funds emerge. And, I mean, I think that a lot of my intellectual pursuits, or efforts, are really done for my own sake, and then because I don't see anybody else doing it. Like, I take these things really seriously, and it seems like nobody else is thinking about them or caring about them. And so it's like, well, crap, somebody has to, so I guess I have to.
So, yeah, it still feels like there's certainly a lot to unpack. If it is true that modernity is fading away, and it is true that there is an open future of what comes after modernity, then there's a lot of work to be done. There's a lot of work to be done just purely in formulating the ideas, building communities, bringing people into the fold, and, for lack of a better word, evangelizing some of these ideas. I don't know what role is mine to play in that, if any, but I certainly think it's extremely important, so definitely as the opportunities arise. I think in 10 years we can have some embryonic communities that are truly living day to day, week to week, with some of the values that will define the next 50 years. It's sort of like how you imagine the early hippie movement — there were hippies before the hippies, you know. They were doing the commune thing, they were living out life in a way that would become a mass phenomenon, right? And I think that something like that is going to happen again. It's not the hippie movement, but something different and new, and it will still be germinated by small communities of people living in intentional ways. And I think that getting to that milestone in the next 10 years is also possible. So definitely, those are the two big projects.

Will Jarvis 42:35

I love that. I love that. Well, Jeff, thanks so much for taking the time today. Where can people find you? Where should we send them?

Jeff Huber 42:42

Yeah, my Twitter is probably the best place to do so: @jeffreyhuber on Twitter. And I don't tweet that much — like once per week — so I'm an easy, low-commitment follow. I'm not going to fill your feed. So yeah, I'm busy.

Will Jarvis 42:56

Awesome. That's great. Well, Jeff, thanks so much again. I really appreciate it.

Jeff Huber 42:59

Thank you. It's been really wonderful.

William Jarvis 43:01

Awesome. Thanks for listening. We'll be back next week with a new episode of narratives.

Will Jarvis 43:13

One quick note before I let y'all go. We didn't really get into it in this episode, but Jeff's company Chroma is doing some really cool work in open source AI, and they're hiring. If you want to learn more, visit their Twitter at @trychroma.

Transcribed by https://otter.ai

Narratives is a project exploring the ways in which the world is better than it has been, the ways that it is worse, and the paths toward making a better, more definite future.
Narratives is hosted by Will Jarvis. For more information, and more episodes, visit www.narrativespodcast.com