Engineering Society & the Human Experience | Michael Sacasas
What's up everybody? My name is Demetri Kofinas, and you're listening to Hidden Forces, a podcast that inspires investors, entrepreneurs, and everyday citizens to challenge consensus narratives, and to learn how to think critically about the systems of power shaping our world. My guest in today's episode is Michael Sacasas, the executive director of the Christian Study Center in Gainesville, Florida, and the author of The Convivial Society, a widely read newsletter about technology, culture, and the moral life. I asked Michael to come onto the podcast today because I feel strongly that we are living through a critical moment in the evolution of human society, where choices are being made today, in some cases by us and in other cases for us, that will have an irreversible effect on the future of the human race.
Many of you are probably wondering, "Is he talking about AI?" Not necessarily. Advancements in artificial intelligence may very well be the catalyst. But if so, then AI is just the apotheosis of trends that have been in place for decades, if not longer, and which are fundamentally changing the nature of our lives, our societies, and our politics in ways that are not necessarily desirable. For many of us, understanding these trends is not enough, not where they threaten the values or traditions that we associate most strongly with human civilization and the flourishing of the natural world.
And so my goal in today's conversation is not only to help us understand the nature of these trends, and the technological and social structures bringing them about, but also to meditate on what we can do about them, how we can effectuate change, and on what scale we can hope to make a difference. You can find podcasts related to this one on this week's episode page at hiddenforces.io, where you can also access our premium content, including transcripts, intelligence reports, and key takeaway videos, by joining one of our three content tiers.
All subscribers gain access to our premium feed, which you can use to listen to on your mobile device, using your favorite podcast app. Just like you're listening to this episode right now. If you want to join in on the conversation and become a member of the Hidden Forces genius community, which includes Q&A calls with guests, access to special research and analysis, in-person events, and dinners, you can also do that on our subscriber page. And if you still have questions, feel free to send an email to firstname.lastname@example.org, and I or someone from our team will get right back to you.
And with that, please enjoy this absolutely wonderful and inspiring conversation with my guest, Michael Sacasas. Michael Sacasas, welcome to Hidden Forces. Thank you, Demetri.
A pleasure to be here. I tried to work in the Cuban pronunciation of your name. It's great having you on, Michael. You're long overdue, actually. You're several weeks late. My fault.
Not your fault. We actually had you scheduled to come on the show the Monday morning that our episode with Steven Kelly published on Silicon Valley Bank. And so what happened was I was doing these episodes. One I did with Jon Askonas on consensus reality and what's happened to consensus reality. And the next one was with Iain McGilchrist on the nature of reality and how our brains model the world.
And yours was the next in a series of episodes that I actually had planned, and that got interrupted because I felt like I really needed to cover financial markets. So we're back on track. Hopefully, I don't get pulled away from this track that I'm working on again anytime soon because I really want to focus on it. Before we get into today's conversation, which as I was telling you is very important to me, and I think it's very important to everyone, I would love for you to tell me and our listeners a little bit about you, your background.
What do you tell people that you do if you meet someone for the first time? How do you describe what you do and what you're interested in? That's a great question. So I have two lives professionally. So the easy part might be to say that in my day job, I'm the director of the Christian Study Center of Gainesville, Florida. We're kind of a para-academic institution, adjacent to but not part of the University of Florida. We host lectures, symposia, reading groups, classes. And so my job is the wonderful job of being able to put all that together, so I have kind of a foot in the academic world without being subject to some of the disorders of that world or the pressures of that world.
Alongside of that, and even prior to that, I write about technology stuff. I kind of just generally describe it as technology and culture, technology and society. Sometimes I add the phrase technology and the moral life to kind of clarify what interests me most. And that was an interest that probably goes back 20 years now to a time when I first encountered the writing of Neil Postman and Jacques Ellul. Later, other theorists became important to me.
But that work kind of highlighted to me the way in which we are shaped by technology. There's this realm of human interaction that is not merely intellectual, but has to do with the way that our material culture organizes the way we relate to one another, channels our thinking in certain ways, makes certain things possible, forecloses other possibilities. And all of that was happening, it seemed to me at the time, rather unnoted by most people. So we carry on with our day, and we tend to think in terms of ideas. Certainly I do, coming from a theological or religious studies background. So we focus on ideas.
If you come from a philosophy background, I think that's the tendency as well: ideas have consequences. Which of course is true. But then there seemed to me all of a sudden to be this additional layer of material, technological reality that was playing a really important role in shaping my life, the life of the communities I was a part of, the life of the wider society. And I thought at the time, it'd be important to try to understand that.
So that is a longstanding interest. I was in a PhD program that was somewhat aligned with those interests, but somewhat not. I finished without completing my PhD, so I was ABD. But I've continued to write. My writing was kind of born out of my experience as a grad student wanting to find a place to think out loud.
It took the form of a blog called The Frailest Thing around 2010, and now takes the form of a newsletter called The Convivial Society. So yeah, that's what I do in a sense, try to figure out how technology interacts with our expressed desires for the kind of life that we want to lead, the kind of communities we want to inhabit. That's so interesting. So there are actually two branches of what you said that I would like to touch on. One is this notion of ideas. Ideas have consequences.
And juxtapose that with the observation that actions also have consequences. And actually, you could argue that actions are more consequential. So that's one interesting thing, and that really is kind of a way of asking how much of you is an intellectual, a curious thinker? And how much of you is an activist, where your ideas lead you to certain moral convictions that you feel like you have an obligation to try and manifest into the world? That's kind of one branch. And the other is this interesting observation, more to the point of what you said. Which is that ideas, philosophy, culture, yes, we can think about and talk about what the ideal world is. But there are forces around us, in earlier times, those were natural forces.
The rhythms of daily life and nature. Now, those are kind of residual and evolving, humanly manifested outcomes in the form of technology, that are shaping our lives, whether we like it or not. Maybe we can focus on that first. On this idea that there are technological and systematic forces in the world, that exhibit their own logic in steering us away from the lives that we actually want to live. What is it that you're really touching on there? I'd love if you could elaborate on that. Sure.
So here's a very concrete example that I sometimes use to help put a little bit of flesh on that concept. I'll start with the fact that I'm a parent. So parents are notoriously sometimes lacking in the kind of patience that they want to exhibit towards their children. But there are all sorts of situations where we might find ourselves having this desire to be a little bit more patient with the situation, with the people we're dealing with, etc. But yet we can't quite get there.
So we wake up the next day, we think again, "Well, I ought to be more patient in these situations," and it doesn't happen. So we have this expressed desire to be a certain kind of person, a more patient person, or to exhibit this virtue patience. And so it seems to me we have an option. I suppose one option is to continue thinking hard about it, or kind of gritting our teeth in the morning and saying, "I will, I will, I will." The other option it occurred to me is to look at how our habits are being shaped.
So I should maybe clarify that I think of our moral formation, the formation of our character, along the lines that are sometimes labeled virtue ethics. So we have habits. These habits become inclinations and dispositions. These become virtues and vices, and that's our character. So it matters a lot what we do repeatedly, the way our practices structure our actions, and even our thinking. So if I look at my surroundings, if I look at the way most of us lead our lives, at every point at which a device, or a technology, or a system promises to save me time, I take it.
Just by default. It's not even a question. So the default virtue there, a default value is efficiency, time-saving. So one of the chief ways in which technology has been packaged, and sold to us, and deployed over the last 100+ years is this will save you time.
Great. What that means in practice is that there is no point at which I am actually cultivating the disposition to wait in an unhurried or unflustered manner. I'm not cultivating the skill I want to call patience through my habits. I'm in fact undermining my capacity to develop the virtue of patience because I'm systematically eliminating all the places where that habit might be cultivated in me. So I gradually am actually making myself less patient, less able to wait for a situation to unfold, or for somebody to speak their mind, whatever situation you think you need patience in, by the way that I have structured the material conditions of my life.
And some of that structuring, we'll probably talk about this at various points, but I have more or less agency in some of those situations. There are some times where I can't avoid the way that a life has been structured for me, but then there are other choices that I do have in terms of adopting certain technologies, or the kind of disposition I bring to a situation, or a system, or whatever. But that's kind of the idea. And I think you can multiply that across a variety of virtues that somebody might aspire to, or the kind of shape you want to give your community. There are things we desire. Expressed beliefs, preferences, values.
And they may or may not align with the implicit formative nature of the material culture that has arisen around us, and that we have to various degrees of intentionality and agency adopted into our lives. Does that make sense? It does, but I do want to dig deeper into it and kind of make it maybe more obvious for people. But yes, I think at the heart of what you're describing, there is a question of alignment. Between the world that comes into being, the material world that we live in, that arises from our actions. And the world that we want, that we desire, that's maybe consistent with who we are. And that raises the question of who are we? And I think that's a question that there's no easy answer to.
But what's perhaps more concerning to me is that we don't seem really engaged in asking that question today. I'd like to maybe probe a bit deeper on this observation about technology and how it saves us time, because absolutely it saves us time. And the knock-on effect of that, or what derives from that is that it saves you money, right? Because time is money. And money buys you wealth and material things.
And it seems that where all of that leads to, whether it's time savings or money, is ease. And there's something about the way in which all of our lives and the material world is oriented toward furthering human ease that I think is worth questioning the desirability of. So before we do that though, maybe it will be best to ask you to take a moment, Michael, and tell us about The Convivial Society, which is the name of your blog.
What do you write about there? And why the name Convivial Society? What does that refer to? The Convivial Society is a reference to two books by two thinkers who have been very influential to me. The first is Ivan Illich, and this is the most obvious reference, to a little book called Tools for Conviviality that appeared in 1973, the early '70s. The second reference, which is a little less obvious, is to The Technological Society by Jacques Ellul.
That book appeared in the 1950s. It's a big fat book that basically has one point, which is that what we today might think of under the heading of optimization, the imperative to optimize everything is the most important structuring principle of modern society. So those two thinkers, Ellul, Illich, have been very important to me. So when I was thinking of a name for this newsletter, I thought, "Well, that works."
And so it gestures at their impact on my thinking. And I think of Ellul being very much the critic. Illich is the critic as well. But that idea of conviviality also gives us something to aim at. So not just critique, but here's the alternative vision for human flourishing, if you like. That we can perhaps aim at, that might give us some way to not just say we don't like these things about the way our life is structured.
But to say, "Here's what we ought to be more deliberately aiming at." What does conviviality mean? Great question. So Illich has a funny line in the beginning of that book where he says, "I had a lot of friends counsel me against using this word," because in the American context, it refers to tipsiness. And so I think the word at the time had some association with drinking a little bit too much, and getting a little too inebriated, and happily so.
But he's using it I think, initially from its sense in Spanish. And generally one way of thinking is it does involve a kind of social engagement with others at a human scale that leads to people being seen, understood. They find a kind of measure of joy in that kind of companionship.
So human fellowship and companionship. Maybe not the kind that we imagine in the situation of the family, the household, but across a more public range of relationships. So joyful friendships. I think that's maybe one way of thinking of it. Now he takes it and uses it a little bit more specifically to speak about tools. Not that we can think of tools being friends in that way.
But he's looking at tools. There are various ways that he clarifies the term. So these are tools that we are able to use rather than being used by them. In other places, he says, "These are tools that I have the liberty to pick up and to discard.
I'm not compelled to use them. Nobody compels me to use them. They are tools that foster independence rather than dependence. So I remain autonomous in my ability to take them up as they serve me in my ends."
They don't require any certification, so they don't require high levels of expertise or training where only certain certified users are able to deploy them well. They operate at a human scale. They also have a tendency to bring people together rather than isolate. So these are some of the features of these kinds of tools. And it may be helpful to say that there are two points here I think that are worth making. One, he's trying to develop an alternative to industrial tools and institutions.
Writing in the 1970s, he sees basically the trajectory of a fully industrialized society being environmental degradation, social polarization, and what he calls psychological impotence, by which I think he means something like what we today might generically call mental health disorders. So he's trying to think of what is an alternative to these kinds of institutions and technologies. The other thing that I think is worth mentioning here is that he later comes to think that his own analysis was incomplete at best, at this stage in his intellectual journey. And that part of the problem was that he had not, at that juncture, the early 1970s, reckoned with what was already unfolding, which is a transition away from tools or instruments to systems. So there are a handful of interesting cybernetic thinkers that kind of become a part of his circle in Mexico in Cuernavaca at the Center for Intercultural Documentation, or CIDOC, which became the hub of his intellectual activity at that time.
And I think he has initially the sense that there's a path towards what he might later call conviviality here, but I think he becomes more suspicious of that as time progresses. And there's no one place where he develops this contrast between tools and systems. But in the places where he does kind of gesture or hint at the distinction, it seems to involve the idea in his mind that you can imagine yourself standing in a position of mastery with regards to a tool or an instrument.
Whereas the system envelops you in such a way that it's impossible to stand apart from it with that same level of autonomy. Yeah. Someone that comes to mind in this context is also Ted Kaczynski, because these are kind of some of the concerns that he expressed in his Industrial Society and Its Future. Ted Kaczynski was a great reader of Jacques Ellul. That is for certain. He's quoted repeatedly in his manifesto.
Yeah. So a lot of things to comment on. One, I want to just kind of reiterate what you said about optimization, because I think that that's powerful. And it reminds me of someone we discussed in our conversation, in my conversation with Iain McGilchrist, because I think he would see very much the kind of left brain imprint on society there, this need to systematize, to categorize, to put in order.
And there is this kind of weird thing. I don't know if you ever saw The Man in the High Castle. No, I have not.
It's actually a really beautiful show. I think it was on Amazon Prime. And the premise of the show is what the world would look like if the Americans had lost the war to the Japanese and the Nazis, who in this particular scenario created a compact. And the Japanese basically had the western territories, and the Nazis had the eastern part of the country.
And you saw two very different philosophies. The Japanese philosophy was more continuous, more analog, more in touch with the natural rhythms of the world, and capable of engaging with nuance, the so-called right brain, so to speak. And the Nazi world was so cleansed of humanity, so systematized, so regimented. And I don't necessarily want to try to draw a direct parallel between that and the world we live in today. But I just kind of did want to find a way to mention this because it does feel like there's a kind of OCD compulsiveness that's driving the optimization in our world today away from the human being. And even in the work of Ivan Illich or other scholars of that period that you mentioned, there was a recognition that even then, we were moving away from what was human.
And again, not to pile one on top of the other, but another thing that I often think about, Michael, is to what degree are we really aware of behaviors or changes to our world that are universally bad? And to what degree does this just reflect the loss of one generation's sense of what it means to be a human, and the transition to something new? So that if someone from the 18th century or the 19th century came to America today, would that person really fall into depression and not be able to live, because he or she would see us and not be able to communicate with us or be able to really be in presence with us? So does that resonate with you? Where would you take that? I think that's a great question. I think I understand the heart of it. And so it seems to me often if you talk critically about this or that technological arrangement or this specific artifact... I should say, this is important I think to say. This is not about being pro-technology or anti-technology, right? I think that's the way sometimes the discourse wants to frame it. This person is a techno enthusiast, or this person is a Luddite, or whatever disparaging term they want.
What I advocate to whatever degree it is possible is simply to think, just to think, to take the time to consider the implications of these tools. Because in fact, they are powerful. They can subtly and sometimes not so subtly change the world we're living in. And so the temptation I think sometimes is to say, "Are you against progress?" If you hesitate about the adoption of certain technologies, or if you want to raise some questions about it, are you against progress? And the idea is we always adapt, right? In every generation, people complain about X, Y, and Z. And the Victorians complained about novel writing, and now we think the novel is this incredible thing, fine.
And I think you can probably find examples like that. Now, two curious things. Almost all of those examples come up when people make that kind of move where they say, "Look, these people complained about X and we were fine." I mean, I think there are two things to say about that.
One is the idea of survivorship bias, right? We're fine. That doesn't necessarily mean everybody was fine. And the fact that you adapt to a situation doesn't mean that the situation is good for you. So there's that.
And the other thing is that many of them stem from the mid to late 19th century. And this is a point where industrialization radically reorders the human life world in a way that has no historical precedent before that. Now sometimes, if you're talking about media specifically, there's one prior example that comes to mind.
And that is Socrates complaining about writing and the effect that it would have on memory or recollection. And I think in many respects, I want to say, "Well, he wasn't wrong." And then there's something deeper I think going on with memory there that Plato was concerned about whatever.
But most of these examples of technologies that we sort of grow accustomed to now and think that the critique is invalid, because we accommodated, stem from a very specific period. And if I want to be really contrarian about it, I would say, "Well, I'm not sure how well we have adapted." So to question whether or not we have actually acclimated as well as we think that we have... So to me, Jacques Ellul is very interesting about this.
And I think part of the reason a lot of these mid-20th century, early 20th century thinkers wrote in the way that they did, in ways that I think we can now go back to and still benefit from, is that to some degree they had a foot in both worlds, in two very different life worlds, and were sensitive to kinds of changes that we now sort of take for granted or aren't even aware of. But what Ellul would say is that this way of ordering society for the sake of what he called technique, and he used the French phrase la technique, which basically becomes this drive to efficiency, to optimize, an imperative to optimize. And not just to optimize things that it makes sense to optimize, but just to optimize everything. He would say, "This is an inhuman way of ordering society."
And people in it get broken by that system. They're not well. And so you have then the last set of techniques that Ellul says arises are what he called human techniques. And these are the techniques that are necessary to keep the human component of the technological system functioning. And they present as being very humane. In fact, this is what is talked about as the humanizing of technique.
That it is for the sake of the worker, for the sake of the human being. But in fact it is not, from his perspective; it is for the sake of the system. And so even in the 1950s, he's saying these include things like pharmacological interventions. They include entertainment as a way of soothing, or maybe numbing, the restlessness, even the restlessness that is a result of this kind of society that we've created. And thinking of that, I think we can make a similar analysis today.
There are all sorts of things that we do to cope. Whether these are pharmacological interventions, or the growing percentage of people that are on some form of medication for a mental health disorder or anxiety. Ellul wrote 30 years before Neil Postman's Amusing Ourselves to Death. There's this kind of sense of getting through a day and being capable of little more than binging on Netflix, because this is just where we are mentally.
And then the other layer of that are things like the wellness industry. So these all, to me, very neatly fit into this category of humanizing techniques. And they're necessary because the default setting of the society that we've created, that we've generated, necessitates them. It does not of its own accord provide for our well-being, for the formation of satisfying communities, for mental health, etc., etc. So one simple question that we could ask is, are you well? Are we well? Or even, like one of the characters in Fahrenheit 451, Ray Bradbury's I think brilliant little book from the '50s, simply asks, "Are you happy?" And that triggers this series of thoughts in the protagonist that leads to the development of the plot of the book.
It's a trite question at one level, but I don't think it is, right? And so we can simply ask, "Are you well, are you happy?" And by what measures, right? Because I think it is obviously important to acknowledge by many material standards people are well, right? Modern medicine has in many ways improved the quality of our life. Illich has an interesting argument to make about that, which we can talk about maybe. But there are many measures by which modern society has improved the lot of human beings. But are there other measures maybe that are not as easily quantifiable, that are now actually not working the way we would want them to work? So that's how I think I would diagnostically just consider your situation, the situation in your community, your neighbors, and ask, "Is this the way we would like it to be?" Is it fair to simply say, "Well, don't worry about it.
We will adjust, we will acclimate." In some respects, the experiment upon which that assumption is based is a remarkably small segment of human history. So if you imagine human history extending for tens of thousands, hundreds of thousands of years. And then all of a sudden, you get the industrial revolution about 150 to 200 years ago, depending on how you date it.
This is a remarkably small slice of human history, a small sample size. And I think in many ways, the relevant judgments are still out with regard to the human being, the environment, etc. Maybe we might think differently about it given more time.
The experiment may not be playing out as well as we think if we extend time forward. I'm not sure. Yeah. Most of my conversations, Michael, are very structured. I feel like even my conversation with Iain McGilchrist and with Jon Askonas, which is more in kind of line with what today's conversation is about, we're pretty structured.
But I'm more inclined to feel my way around this conversation today for whatever reason. So I hope that turns out well for the listeners, but I just kind of want to put that out there, guys. I love what you said about coping. I think this is so interesting.
I'm reminded that quote by Krishnamurti that, "It is no measure of health to be well adjusted to a profoundly sick society." Yes, exactly. That's perfect. Yeah.
And it's so important to point out that we become used to things very quickly. We become acclimated to things. And if you look around, it's profoundly sad and alarming.
I don't quite know what word to use. How many people are on antidepressants and anti-anxiety medication? That's not normal. That's not normal. And everyone's being told it's for all sorts of reasons. But the kind of sanitized Western reason is that it's something about your biology. Okay, maybe that's true.
But why is that biology thing getting activated? Why was it not getting activated to this degree in the past? If you go to a polluted estuary, isn't it normal to feel kind of sad? And there are a lot of things in the world that I think understandably make us sad and make us anxious. Even if you look at the anxiety that people lived with in the early 1950s when they were dealing with the onset of the nuclear age. Look at where we are today. Is it not normal for people to be waking up anxious, worried that if they check their phone, something might have blown up in Eastern Europe? And this brings me back to the thing that I kind of brought up initially with you, Michael, this distinction between the intellectual part of us and the activist part of us.
Because I feel like we're living during a time where it's absolutely more important than ever to be contemplative, because I feel like a lot of the space for contemplation has been taken away, and contemplation leads to wisdom. And we need wisdom if we're going to survive through a period in time where we have more powerful technologies than ever before. And that, by the way, speaks to something that even Illich talked about, which is power and how power transforms society. When you have these powerful tools, it shapes a hierarchy, and it impoverishes people. Both the wealthy and the poor become sort of impoverished in a way, living in this new paradigm. But without wisdom, we're not going to make it through.
And yet we need to be able to contemplate. And there's this role for the intellectual. But also, at least the way I feel Michael today, is I feel like we're at a point now where we need to take action. Those of us that feel strongly about this, we have to find a way to take action. And one of the things that I would love to be able to do in this conversation at some point... Doesn't have to initially be now.
But is to kind of talk about how do we begin to take action? Because it feels like also, the things that you talk about and you write about so eloquently... And I really do want to tell this to listeners because I don't know how much of today's conversation is going to be a showcase of your writing and all the different thought pieces you've put out, but you've just written so many beautiful things, and they're so thought-provoking. So one of the consequences of the modern technological age is how it has undermined our political systems and our ability to act together as human beings to try to work through the issues in our society. It's created this kind of stasis and immobilization. And I don't quite know how to fix that. I don't know if you think about that at all.
How do we come together as a society and build the future we want, and not be overtaken by the current of this kind of optimization obsession, this efficiency, this market fundamentalism that has driven every decision at least since the 1990s? Obviously, brilliant question. I wish I had a 10-point plan to overcome this. I heard you have a 41-point plan. Apparently, I have 41 questions, which is very different. So here's the way I think about this, maybe to address this for myself. So I think of having a very little niche corner of whatever. I don't even know how to put myself on the map.
But some people read my stuff, and that's wonderful. I think what I am trying to do is to articulate as best as possible, both a sense of what is wrong, or what is maybe not working the way it ought to work. But at a deeper level than simply a question of, "Well, this is a glitch in the system. Let's fix the system." I'd like to maybe just question the system altogether a little bit in some cases. And then also kind of hint at some models of the good that we ought to be striving for.
Because I think maybe that's part of the problem. Part of the problem is that whole litany of things. Market fundamentalism, optimization, efficiency, these have become the default values. And people will aim at them, whether consciously or not; our society is ordered in that way. So you do need that moment of reflection where you question the thing that everybody takes for granted, and you have to stop and ask even the most radical question possible. So Illich is famously critical of compulsory schooling in the Western world.
So part of what I liked about Illich is that there were a lot of things people were complaining about regarding the modern world, and he takes on the two things most people would say are the good things: medicine and schooling, in any case. And tell people who Illich is, because I don't know that you mentioned- Yeah. Maybe you did. You briefly mentioned that he wrote Tools for Conviviality in 1973 and that he was a critic of modernity, but who was he, and why is he so important to your own thinking? Sure. So who was he? He was born in Austria. His mother was Jewish.
He became a Roman Catholic priest. He had a famously conflicted relationship with the Roman hierarchy by the time that he was in the Western Hemisphere working in Mexico. First New York, then Puerto Rico, then Mexico. He was a brilliant man, with a PhD in history from the University of Salzburg. He was conversant in a dozen languages. And he wrote these scathing critiques of modern institutions.
Schooling, medicine, transportation, technology. And tools in the sense of institutions, in Tools for Conviviality. And he's striving for a society where human beings can flourish. That phrase itself, "human flourishing," is now becoming weird in the discourse. But he means what is good for human beings given the sorts of creatures that we are.
And so he has a really interesting set of connections, and various intellectuals come to him, interact with him, are part of the center in Mexico. And he has this moment of fame as a public intellectual throughout the '70s. And then things kind of turn. It's interesting. I think it turns maybe to some degree because through the '80s, with what we sometimes call the neoliberal revolution, prosperity on the same terms seems to come back. And so some of the evident ways in which industrial society was heading in a bad or worrisome direction get papered over.
And we can talk about whether those were real, genuine fixes and improvements in the system, or just a kind of patch. Now Illich is, I think, in some quarters having a little bit of a moment, because I think what we are finding is that a lot of those critiques still hold. But without going into any greater detail, that's who Illich was. I find him to be a very powerful and compelling thinker. And again, in part because he was willing to ask the question that others might find absurd to ask. And so there was a little anecdote I was getting ready to tell.
So a friend of his... He died in 2002, by the way. Illich did. So a friend of his was telling me that they were at a conference about education. The friend is himself a lifelong educator, and they're walking. Illich says, "Why do we think we need to be educated?" And so the friend is telling me, he says, "I thought he just meant what he meant in Deschooling Society, which was that compulsory schooling was a problem, that it was the way we were doing education."
But in that moment, he simply meant, "Why do we think we have to be educated?" And so that seemed, I don't know, to my ears at the time, a ridiculous question to ask. But I think it's that kind of question. What assumptions are embedded in the way that we do the things that we do, and why we do them? And I think we've lost sight of why we do things. The purposes are just tacitly assumed, never stated. I don't know.
I'm going to go with your thing, that this is just going to be a wide-ranging and somewhat unstructured conversation. But to some degree, I think this is built into the way we even politically end up arranging the modern world: we bracket these questions about the human good, about ultimate values, about what particular shape the good life should take. We put those off the table. We create a kind of procedural arrangement whereby we try to maximize fairness and minimize harm.
And this is again, fine and good. But we do it in part by bracketing those large questions, that admittedly people fight about and sometimes get violent about. And I think this was part of the impetus to put those questions off the table and to strive for a maximal amount of pluralism, which is great. Killing people is not good.
But that mindfulness maybe keeps us from thinking most deeply about a question that I think is now being raised by our technological milieu: What is it to be human? Do we privilege the present configuration of what it means to be human? It goes back to your previous question. I mean, for some people the answer is, there is nothing to privilege. We are on the march, and we are going to happily race towards whatever the post-human might be, right? So I'm not in that boat, as it were. I think it is important to ask what a human is, what is good for a human being as we are presently configured, and whether or not the system that we have created serves those ends in the best possible way.
And again, my sense is that the way I summarize it is the human-built world is not built for humans. We've built a world that is built for economic optimization. It's built for the rationalization of our relationship to the world. It's built for power. I think this is explicit in the founding moment of the modern scientific technological project, in the early modern period. In the work of Francis Bacon, for example, who does so much to give the modern technological project its marching orders.
The point of knowledge is power. It is to deploy power over the world, to control it, to manipulate it. It was interesting when you were describing that show, The Man in the High Castle, the world it creates. Anytime somebody says there are two kinds of people, that's probably a sign that it's wrong.
But if we can say there are these two dispositions towards reality, maybe you can loosely characterize them as Western and Eastern. I think they're traces of it in the Western heritage as well. But one is to see the world as a realm for management, for control, and domination. Exactly, right. And the other, the way I put it, it is to see the world as a gift, first and foremost. This language is in Illich.
It's in Hannah Arendt, another thinker who's been important to me. They both have different ways of understanding the nature of the gift, but there is an integrity and a goodness to the world that we inhabit, ourselves included as part of it, right? Part of the problem is that through the modern period, we imagined the human being as something kind of distinct, totally separate. We created these dichotomies.
Mind, body, human, non-human. But in fact, we're all enmeshed in various orders of being together. And we bring that spirit, I should say, that spirit of mastery. Not just to the world, but ultimately to the human being itself.
To ourselves, right? To others, and then even to ourselves. So that spirit of mastery tries to find more and more levers by which to control reality, to remove the contingent, to make things as predictable as possible. So this is the spirit that has driven, I think, the development, and deployment, and the terms of adoption for the technological package that is Western modernity. I think it could have been otherwise, right? In principle, it could have been otherwise. But this is where we are. And so it's the tools.
The tools themselves have a bent to them. I think part of what I try to also challenge is the idea that tools, or technologies, or systems are neutral. That what matters most is simply what you do with them. And I think that of course it matters what you do with them, but that's not the end of the story.
The tools themselves will create certain possibilities, foreclose others. They will induce you in certain ways, encourage you in certain ways. They will frame the world for you in certain ways.
They will shape your perception in certain ways before you've even lifted a finger to do a thing. And those implicit framings, the shaping of perception, the inducements, or if you like the temptations, those are important because they are part of this package of often unacknowledged forces that are shaping and conditioning. Not determining. So I try and say, I'm not a technological determinist. But I think it is impossible to not recognize the way in which technologies over time do condition how we see the world, what we think are the possibilities of action, and maybe even foreclose the horizon of possibility to us.
They keep us from imagining alternatives, because they simply become... So Illich later on goes on to say there are these certainties that keep us from understanding the world in a better way, from recognizing we have some agency where we think we don't have any. One of these certainties for Illich was the idea of scarcity, or the promise of efficiency, or control. So anyway, there we are, kind of ranging all over the place. But I think this is the challenge before us: to recognize that there is not only this material reality that is now structuring our social world.
But that material reality came embedded with certain assumptions about what it means to be human, and the good life. And because it was embedded in our technological kit, it became for us the default way of thinking about the world. So there are these two interlocking ways in which I think we have been shaped by modern technology. Does that make sense? Yeah, it does. And I think it's appropriate to use this as an opportunity to work through some of these themes. Because otherwise, I feel like we're in danger of imposing a rigid structure onto the conversation that really eliminates what maybe is most important to discover.
I think there is this sense in which we have proceeded forward taking what are our assumptions, our normative choices, and expressing them as absolute truths or natural laws. So this idea that you always pick the most efficient outcome, that you always move towards progress. These are choices. They're value statements.
They're not actually indomitable forces of nature. But we seem to think of them this way. So progress is this thing: we need to progress, we need to disrupt. That's another big one. Disruptive technology.
You've got to disrupt. I also think what you said about tools, and this kind of organizational logic that's embedded in them, is true. I think also, though, I want to draw a big distinction between analog tools or industrial technologies, let's say like the railroad, and a social network. Because in the case of the latter, there's much more deliberate, conscious logic instantiated in the technology.
And we have been led to believe, we have this model in our heads, that modern technology is kind of like a hammer or a railroad. So that, for example, when we spend our time on YouTube, YouTube gives us more and more videos that aren't just more of what we like, but that themselves lead us to develop habits that are more predictable and therefore more controllable. And we're all becoming more controlled, more frictionless button pushers.
That's a choice that's being crafted at a very high level by a narrow group of people, who are getting rich by turning us into basically cows. And just controlling us, controlling our minds. And there's this really evil thing that's happening in society today, and people are aware of it on various levels. And it's resulting in, I think a lot of the friction in society.
Again, to use this very technological term. I think the, quote, technologists, or people in positions of power, let's put it that way, who benefit materially from this technology, but who are also spiritually impoverished. To go back to the Ivan Illich observation about power, because they're part of this dynamic. People are rebelling against this. Our systems feel like they're breaking down. We are increasingly at each other's throats.
And yet somehow, somewhere, the solution isn't coming out. And again, this is bringing me back to this observation about politics. I don't know how much of it is because the world we've built is so punishing for people that try to make a difference. What good person wants to be a politician? Who wants to go run for office, and be torn down, and have their life destroyed? And again, I'm kind of spiritually in this place, Michael, where having started the show in this place of curiosity and knowledge, I've really moved to a place now where I want to focus on solutions.
I'm heartbroken and distressed at what's happening in our society, truly distressed. I see even recently a somewhat separate conversation, but it still speaks to this. I have friends who are younger, and single, and out there. And you know what? They're really having a hard time.
They're having a hard time with the way in which people come together in the world. And dating is a great example of how these tools have transformed the landscape of intimacy, and romance, and socialization, and getting to know one another. And when I was growing up and beginning to date, I had to go find girls, go somewhere and speak to them in a bar, in a classroom.
Work up the courage to go talk to them, get rejected, get a date, get a number. And there was so much more of who I was that they were getting to meet, and so much more of who they were that I was getting to know. And we have this world now that's been regimented. To me, it's insane. The online dating world is awful. Why? Because it's good for a few people? Yeah, it's good for a few people.
You'll get the progressive argument that says, "Well, there are all these people who previously couldn't date, and now they can because of the dating apps." Okay, but we don't make decisions for society based on what's good for a small number of people. That's not the measuring stick. And that's also a problem, Michael.
We've lost the language to really decide what is good, what is not good, how do we come to a consensus about anything. And again, to bring it back to the practical, because I really do want to have some part of this conversation to be practical and solutions-oriented. I don't know. I don't know what to do, man.
As caring adults in this world, how do we address this problem? Yeah. I mean, I think we definitely lack a language of the common good, and a lot of the structures through which we might seek it. So the default public sphere, if you like, if we think of it as social media, is obviously not conducive to meaningful conversation or to dialogue. It's questionable whether it really even functions as a public sphere at all, rather than as a grand distraction from the ways that power will then be used. It's an evil form of mind control.
What is Twitter today? How much of Twitter is actually a public good or something that's helpful? There are aspects of social media that are helpful. I don't want to deny those things. I mean, how much of it is actually that? And how much of it is just making people angry at one another? Yeah, right. No, it's a disintegrating force, I think, at the social level.
And like you, I think I sometimes find myself talking about my use of Twitter as a kind of devil's bargain. So I acknowledge I've made some connections that became, I think, genuine friendships. My work to some degree has benefited from it. I don't know, but whatever. So there's all that. But then yes, there's the sum effect- Sorry to interrupt, Michael.
But I want to say this because it's so important, because the framing is important. I think we're stuck in this framing of, "Well there's some good things, but there's some bad things." But the bad things aren't necessary. We can get the good things.
It's the logic of the algorithm that produces the bad things. Those aren't necessary. A lot of those things just stem from the fact that we never had payments in the browser. And in order to monetize a lot of the software and services that exist on the internet, the default function was to build a marketplace of advertising and selling people's attention.
But that doesn't have to be. Because at a higher level, there's a political architecture that exists that remains in our society. We have a federal government. We have state and local governments. We have a system of laws.
We have a constitution. We have the infrastructure in place to address this problem. But we seem to have somehow lost the political language to understand any of that.
And everyone's just kind of either checked out, or you have some of these alternative solutions, which is, "We'll build an alternative utopia on a blockchain." Which is a ridiculous fantasy land that has nothing to do with actual- Yeah, absolutely. And look. Even by, I think it was 1980-something, there's Jacques Ellul's article where he argues that technological society has already eclipsed democracy in important, vital ways.
And whether we think of that in terms of the capture of politics, by money, by wealth, which of course to some degree are standard human problems. I was just listening to Tom Holland talk a little bit about the end of the Roman Republic, and there's a lot that's familiar about that in terms of wealth, and greed, and power, whatever. Degeneracy.
Yeah. So that is the question of, where is the political power, in the most positive possible sense? But to such a great degree, we've bureaucratized power. My political theorist of choice is Hannah Arendt. At one point, I think she favors citizen councils at a very local level. But I think part of it is the problem of scale. If you scale society to a certain point, bureaucracy just becomes necessary to keep it functioning at that scale.
I will only hint at this, because it's not a thoroughly developed argument on my part. But I think there's a sense in which part of what was happening in the 1970s is that the analog bureaucratic state was beginning to collapse under its own weight. And the cybernetic revolution, or computers, basically saved it by ratcheting up its power, not by changing the terms by which it ordered society and thought about the good life, etc.
But by simply doubling down on those ideals of control, and manipulation, and prediction, and power. It allowed it to operate more efficiently, at a scale that had grown beyond the capacity of analog institutions to manage. To some degree, I think that's maybe the function of artificial intelligence as it's now touted: to save the institutions again, rather than to rethink them and reimagine them.
That is another line. I'm going to say this. And every time I say it, I am fully aware of how absolutely impractical it sounds.
There are two things. I cannot imagine a way forward that is not on a very large time scale, like 100 years, 200 years. Because I think we need to start at the beginning in some respects. And it can't happen, there's no way of doing that at scale. And that's part of the problem.
That's, I guess, part of one of those certainties: that everything must be done at scale or it's not going to work, or it's not sufficiently powerful, or whatever. But I think the face-to-face encounter is primary. I just think we have to remember how to be human in that face-to-face encounter again. I think a lot of what you're describing, the way in which the public sphere is warped on social media, the way dating relationships are warped by the dating apps.
Part of what is happening is that we are alienating ourselves from one another. We're displacing the very risky, emotionally charged, potentially fraught, and dangerous face-to-face encounter for the safety, and the ability to manage and control, of the digitally mediated encounter. And so recovering that, I think, is really important. Recovering the face-to-face encounter.
And with those who are not already perfectly aligned with us, to begin to learn how to live with these differences, but to continue to recognize the human in the other, even when their values or ideologies don't line up perfectly with ours. And so Illich has this moment where he's talking about how one of the other certainties is the certainty of value. He wanted to eliminate the word value.
I've tried to follow in his footsteps. Because it is already a way of the economic colonizing all of society, even our language. So an interviewer asks him, "Can we overcome that? Can we do that? Can we change the way people think about the good? Can we recover the language of the good?" Not of value, but of the good, in the face of all of these social forces. And Illich very pointedly says to him, "Between you and me, right now, yes."
And I think that's a beautiful moment. Because it points to the fact that if we focus on the largest possible scale, we will be demoralized and discouraged. And we will find that impossible to actualize, and stifling.
I think that's a choice we have. We don't have to do that. I can say to you, in this face-to-face encounter, it is mediated, right? I mean I see you are on Zoom. Right? But it is mediated.
But nonetheless, there's a level of the personal encounter being preserved here. Out of the limelight, out of the attention economy, if you like, out of the stifling nature of the public sphere. Arendt has this wonderful passage where she talks about how all living organisms need darkness to thrive. And instead, we throw everything into the glaring light of the observable, the measurable, the public.
And I think we've eliminated some of these spaces where human life is incubated: in private realms, in realms that are not subject to quantification, not subject to becoming fodder for our public news feeds. And so to preserve these, to reimagine where these encounters can happen, to preserve them at that level, at that scale... I don't know. If you ask me what is to be done, I know that there are many other things that other people can work on, at the level of public policy, at the level of law, at the level of regulation. If you are the CEO of a startup, you have agency that I don't have, etc. But in terms of recovering the good, the human, of reminding ourselves of what is most important and what we should be ordering our lives around, I think it has to happen in that face-to-face encounter, and in our willingness to be vulnerable in that encounter. Because again, so much of this boils down to... you mentioned ease all the way at the beginning of our conversation, right? Yes.
We want what is easy. And maybe people have always wanted what is easy. But they had some sense that there is value in struggling, that not everything can be made easy. And if you try to make it easy, you're going to kill that thing. Well, it's the difference between what we want and what we need.
Yeah. Yes. Right.
There's a big difference between what we want and what we need. And also, what we want in the moment, and what we really want. I've used the example of ice cream. I might really want ice cream right now, but what I really want is to be in shape. I want to be thin, whatever the thing is that comes with not eating ice cream.
And the world that we've built today, not only does it strive for ease and immediate gratification. But the systems of control use that as a way of getting what they want, as a way of controlling people. Because it merges the insights of behavioral psychology with the most advanced technologies of control. Right. To generate desire. Yeah.
I mean, completely, a hundred percent. I think with Jon Askonas, we talked about Charles Taylor's notion of the buffered self. I think that's something that you've talked about as well.
You might have written about it in the analog versus the digital city. I feel like more and more people are voicing this desire to create spaces in the world. And I wonder in the same way that the nuclear age created a world that was less likely to go to war because the consequences of war would've been total annihilation, mutually assured destruction.
I wonder if, not in the same way, but at this point, with the rapid acceleration of technology, and lately things like deep fakes, and the expectation that in a few years you won't even be able to authenticate who you are by your voice or by your image, whether this is going to, by necessity, create a world where people stop doing certain things online. Because they simply can't trust them anymore; they've lost their sense of self. I do feel like there is this kind of tension here, or this battle line that's being drawn, between those who want to continue doubling down on this world and the people who want to break from it. Yeah. You know? Yeah.
No, absolutely. I think you're right. And this thing about scale I think is also interesting, because we have this notion in our heads that everything needs to be done at the same scale.
So either we all scale up or we all scale down. It doesn't necessarily have to be that way. We can decide what things we have to do at scale. One of them, for example, is yo