What Succession Tells Us About The Tech Industry
Are you concerned about the rapid growth and unchecked influence of AI and other scary sci-fi technologies? I certainly am, but more on that in a second. Succession is a great show you should definitely watch, but it's kind of a hard sell because it's hard to pin down exactly what it's about. I guess it's technically about business and stuff, but there are not a lot of things that I personally find less intellectually stimulating than finance, and I LOVE this show. A more nuanced reading might say something about family and intergenerational trauma [We could talk about...our feelings!] Or maybe about money and the banality of evil [I happen to be a billionaire.
Sorry] I think that the big thematic takeaways somebody gets from this show say something about that person and what they're... generally occupied with. Which is why when I watch it, all I hear is: Tech. Tech. Tech. Tech. Tech. Tech. Tech. Tech. Yes, my projection
of what Succession is about is the encroaching dominance of Silicon Valley. I know, it's a reach. See, for the past two years, I've been doing research in science and technology studies, so whenever I'm watching the show, *I* get caught up in all these little side plots about the tech industry and try to make sense of them in the context of my research. The only thing more compelling to me than the tech stuff is these two: [I'd castrate you and marry you in a heartbeat.]
But something happened in the recently aired final season that had me seriously questioning my theoretical, academic understanding of the tech world, as well as my beliefs regarding a solution to all the algorithmically assisted exploitation it's become notorious for. So whether you're a con-head fiending for more con-tent about your favorite show, or just someone who wants to learn about the power dynamics at play in the tech industry and beyond, stay tuned for some spoilers about who really won Succession, because it sure wasn't this guy. [an 8-bit version of the Succession theme plays] [get it? bc it's Succession but also computer stuff] [like, if there was an NES game about the Roy siblings, it would sound like this] Is it giving quiet luxury? I don't know, most of my clothes are from Goodwill. I should do some Kerry arm movements. This is for TV! So if you literally don't know anything about the show, there's the super rich, mean guy who owns all the things, and he has some kids that are all useless because they grew up as literal billionaires, and the main tension of the plot is who's going to take over this company that runs everything from amusement parks to an in-universe Fox News when he dies. [If he. Well, no Roman. It's not an if.]
If? [He won't die.] Apparently Succession is also about climate change. His three kids, Kendall, Siobhan, and Roman Roy, being the self-assumed obvious choices, are the main characters, and we spend most of our time following them in their fight to win the company and/or the love of their father.
[Ohh, daddy doesn't love his little carrot top, even when he does a little daddy dance.] The tech industry is more of an ominous background character than a lead, but it's...always looming. From Kendall's failed attempt at acquiring an in-universe BuzzFeed in the pilot episode, to losing the family company to an Elon Musk/Daniel Ek hybrid in the finale, computer nerds are constantly making life difficult for our favorite corporate family. The characters themselves have mixed feelings about tech. For Kendall, it's the next big business venture.
[We can get a tech valuation for a real estate proposition on this. I think it's hard to make houses seem like tech because we've had houses for a while.] But for his dad... [Tech is coming.
Tech is here. Tech has his hands around your throat.] it's perceived as more of a threat. As for the show's portrayal of tech, I would say it's critical in that way where a text just lays something out so bare that the audience can't help but form the criticism themselves. Kind of like how the show is critical of meritocracy. It doesn't really spell out a counterargument to this idea.
It just shows you how clueless and insecure the people making big, world-affecting decisions are, and you leave realizing that the people in charge of things don't know what they're doing any more than the rest of us. [Guess who just didn't kill anyone, but maybe only lost a couple of thumbs? I dunno. This guy!] I actually made a whole video about that idea in the context of scientific research.
Believe it or not, this isn't the first time I've talked about Succession on this channel that's ostensibly about science, lol. But more on topic, there are several points where the show critiques tech by just showing the absurdity of the behind-the-scenes decisions leading to the real-world consequences we've all seen play out. For example, Kendall eventually does manage to acquire fake BuzzFeed, and there's this really chilling episode where he goes from this: [I believe Vaulter is the future of this company. You can't question my belief in what you've all built here. We do not want an adversarial situation here.] to this: [I'm afraid I have to inform you, you are all dismissed. Hand in your passes, security will be coming around now.
Health benefits will be terminated at the end of the month. I'd like to thank you all for your hard work.] In case you missed it, tech workers have been experiencing massive layoffs in recent years, a profit-saving move that was really always inevitable in a sector so hostile to unions. And while plebes on Twitter are arguing about why it should have happened like this or why it had to happen like that, the mundane reason for labor abuses like this is that they're probably convenient for someone with power.
The genius of Succession is that that episode wasn't about all the laid-off workers or the precarity of their jobs-- it was about how Kendall gutted a whole staff because his dad told him to, and he just wants his dad to love him! [Why? Because my dad told me to. Because your dad told you to? Because your dad told you to???] And that's not a joke. It's literally because his dad told him to. [You think you can gut Vaulter for me? Or do you need help? I'll take care of it.]
Babygirl! We see another critique with the character of Lukas Matsson, the tech billionaire who buys the family company at the end of the show. When he's introduced, he's just another autistic-coded wunderkind who's made a wildly successful app. But as the show progresses, we learn more about his genius... [We built his whole rep. He's not even a real coder.
Someone gave him, like, a box of tech, and he took it to market.] more about his app... [There's a little issue that we're looking into with subscriber numbers being a little bit bullshit.] and more about his... [I was seeing this girl, and after we broke up, I sent her some of my blood.]
Wait, what was that last part? [Half a liter frozen blood brick] Ummmm [I just kept doing it] Let's just put Matsson on the shelf for now. Point is, they're taking something we already understand on some level--that the most successful tech bros tend not to be the geniuses they present themselves as, and also do some pretty questionable things-- [As a joke] but instead of focusing on the problems or effects of that, the show chooses to highlight the personal interactions and relationships that actually maintain their whole... [Just don't, you know... Don't what? Scream people are data and stick my d*ck in the guac?] thing. I think these critiques--of labor rights in the tech industry and the pseudo-geniuses that dominate it--are pretty easy for anyone to take away from the show if they have even a slight awareness of the real-life counterparts.
But the one I think is a little harder to pick up on-- the one that had me shook during my first watch-- is the fraught prospect of regulating tech. Get used to the thematic whiplash; this video is just going to be like that. I feel the need to disclaim: I am not anti-technology. I get the impression that some people interpret any criticism of technology as opposition to the whole human endeavor of making things, but I promise, I don't want to smash all the machines and send us back to the Stone Age. You can be critical of something because you like it and want to make it better.
So please, I'm not interested in your comments about all the wonderful things technology has done for us. I'm aware. But if there's anything I've learned in my stint studying technology and society, it's that you should be scared of an unregulated tech industry.
[Ready to spread some regulatory anxieties? Yeah, sure. Let's spread those surveillance capitalism heebie-jeebies.] Let's review some of the hits from our last decade of boundless *innovation*. No one read the terms and conditions before clicking 'I agree' on Facebook, so Cambridge Analytica decided to start selling elections to aspiring despots.
Gig companies like DoorDash and Fiverr basically used the logic of "But we're just an app!" to create a whole new class of workers that will cost companies a lot less money to exploit. And computers, which increasingly run everything, can be racist now! There's the continuous flow of misinformation, apps that are designed to literally addict you, and the huge can of worms that is currently being opened with the advent of AI. Let's open the AI worm-can with even more AI! What ethical issues does AI introduce to society? [AI introduces several ethical issues to society...] Oh wow, that is way too much, there's no way I'm about to read all of this. Can you make that shorter? [AI introduces ethical issues related to privacy, bias...] UGH so boring. Now say it like a finance bro.
[All right, listen up, folks. AI brings a bunch of ethical challenges to the table. We're talking privacy concerns, bias issues, transparency headaches, job displacement fears, and who's responsible when shit hits the fan? Plus, there's a whole security risk thing and these pesky autonomous weapons deals. Oh, and don't forget about manipulation and fake news messing with our heads.
Amirite?? But, hey, we're not just sitting on our thumbs. We got to collaborate, make some regulations. Got to work together to make sure AI is used the right way: responsibly, and for the greater good. You know? So let's get those brains and brawns together, folks and keep this AI party in check. To the moon! Bang bang! All right, team.] That was enlightening in...more ways than one.
But it did hit on some important issues, including some we've seen in passing on Succession. Like the growing use of deepfakes, [Bring the cruise ship experience to dry land and double the earnings of our parks division. That's really well-edited.] and the increasing ease of mass surveillance [We're listening?? It's complicated, but, but yet it seems that we are sometimes listening quite aggressively.] Considering all this, it feels safe to say that the tech industry in its current form can do some pretty serious harm to a lot of people.
Besides an oversimplifying 'because capitalism', the *why* of this conversation is outside the scope of this video. I've got one coming down the line that's going to be more about the why part of this conversation, so subscribe and hit the bell if you want to learn about the technocratic ideology that's ruining technology for everyone! But for now, I'm more interested in the reaction to all this. How do we, *as a society*, respond to tech companies that are clearly not operating in our best interests? [Regulate and strangulate.] I'm inclined to agree with Logan on this point. And the industry's aversion to regulation certainly supports it.
If tech companies weren't concerned that regulation would stifle their attempts to exploit literally everyone, they wouldn't be spending so much money on anti-regulation legislation. Like, I remember voting on Prop 22 back in 2020, which was a proposition in California that would let app-based delivery and transportation companies continue to classify their workers as "independent contractors" rather than "employees", enabling that exploitable gig worker class I mentioned previously. Turns out Uber, Lyft, DoorDash, Postmates and Instacart spent over $200 million on campaigns supporting the proposition, because of course they did. You can also see the threat of regulation strike fear into the hearts of tech CEOs in their mass exodus from Silicon Valley in California to states like Texas and Florida that have more of a pro-business regulatory environment.
Despite Elon Musk supposedly being in favor of regulating AI, he's already moved Tesla, the Boring Company and himself to the Lone Star State to get in on that sweet, sweet corporate freedom, and he's not the only one. Scholars have even argued that the discourse around tech ethics--via companies' ethics statements, or academic conferences, or even government commissions--largely functions as a form of pseudo self-regulation that makes *actual* regulation seem unnecessary. Think of all these conversations we've been hearing where it's like, 'What if AI becomes sentient and then it's the Terminator???' when the possibility of *that* kind of intelligence is a complete nonstarter and distracts from the very real and already relevant ethical questions AI brings up in its current form. YouTuber Shanspeare
actually made an excellent video on this topic recently that illustrates some of the self-boogeyman-mythologizing the Silicon Valley guys are pulling on us and frankly, it's iconic. So if evil tech companies are putting up copious amounts of money and effort to avoid being regulated, surely regulating them would be a good thing, right? I mean, I thought so... Until I watched the last season of Succession. Okay, we're back to the show.
So for context, we find out in like, the second episode of the show that Waystar, the family company, is kind of on the decline. And Logan, the mean dad, is looking for a way to boost shareholder value. Because...that's what business is, I guess. I don't know, I'm just a scientist.
Logan's view of the tech industry is... [If I find some tech f*ck gets landed before us, I'm going to kill him.] not the most positive? Early on in the show, Logan declares that his dying media company is somehow going to be able to withstand the changing times. [In three or four years time, there'll only be one legacy media operation left.
Let that be us. One firm to stand up to tech.] But over the course of a few seasons, things don't really work out for him. So eventually, taking an attitude of 'if you can't beat 'em, join 'em', he tries to acquire a tech company called GoJo. This is the company founded by Lukas Matsson, the fake computer genius I mentioned earlier.
[You're the genius. Damn right I am.] [I sent her some of my blood.] After some flirtation with the sons, Logan and Matsson meet to talk about the deal, and somehow they come out of it with the tables turned: GoJo is going to acquire Waystar. Then--spoiler alert--Logan dies, and somehow no one saw it coming. It's really wild how this show that centers on an old man who refuses to name a successor despite his failing health somehow caught us all off guard when they finally killed him off. That's how good the show is.
In the fallout of his death, Kendall and Roman become interim co-CEOs, and they decide that actually, they don't want the bajillion dollars they would get if the deal goes through; they want the power of still owning a legacy media company. So they plot to tank the deal. Now with the GoJo deal self-sabotage plan in effect, the show reaches a plot point it's been building up to for a while: the US presidential election. The show really took its time setting up this episode.
The first couple of seasons plant the seeds with familial ties to the sitting president, as well as Connor's decision to run. [I think I finally found a job I want to do. Okay. What is it? President of the United States.] I didn't really mention Connor because he's kind of like the non-main-character half sibling, but the things you need to know about him are that he's the eldest boy, and that he was interested in politics from a very young age.
We also see Logan leveraging ATN's political power to curb tech as early as season three. [I have a line for you on the president that we'd like you to start rolling out immediately. We just feel like our general ideological sympathy has maybe let them off the hook on too many specifics, right? Like big tech.] I know that clip was Shiv and not Logan,
but it's from season three, which was when she got to be the favorite kid, so she's kind of the proxy here. We see this dynamic play out in an even more explicit way a few episodes later, when the family stops by a conservative political conference to consider options for a viable presidential candidate, or, as Tom puts it, [Selecto el presidento]. Logan, who was still alive at the time, goes into the event with a pretty clear strategy: [He wants everyone hitting anti-tech positions hard. They're getting too powerful, everybody knows it.]
He has media power, the government's corruptible, so he uses the media to make the government stop tech. It's pretty straightforward. The president-picking episode is also when we're introduced to soon-to-be Republican candidate Jeryd Mencken, who's about to be pretty important to our point here.
For context, here's how the show describes Mencken: [YouTube provocateur bullsh*t, aristo-populism, r*pe is natural, it's all red pill, baby!] And I think that's pretty apt. To keep it in internet vernacular, Shiv is like the woke snowflake sibling. [You don't want a rocket launch? Would it help if it was a rocket full of Muslim granola?] Her brother Roman is something of a foil to her as the family's fashy edge lord, and it's him who convinces Logan to support Mencken [My team's playing your team, it's only spicy because if my team wins they're going to shoot your team.] Kendall's more of a politically neutral, let-the-market-decide kind of center between his two younger siblings, and we really see this political family balance come to a head in season four's very ironically titled episode 'America Decides'. It's finally Election Day, Mencken and his opponent are neck and neck, and in a twist that's uncannily reminiscent of the disaster that was the 2000 presidential election, Kendall finds himself in a position where he has to choose who his news network will call as the winner.
I'm going to talk about the plausibility of this scenario a little later, but for now, just go with it-- Kendall has to make a call that will likely determine who becomes president. As the decision approaches, his siblings are like an angel and devil on his shoulders. Shiv argues against helping the proto-fascist take over for, you know, democracy reasons, [Every vote must be counted. Eh.] and Roman argues for Mencken because his anti-tech stance will help the brothers get out of the GoJo deal they had with Matsson. [Mencken will block the deal. Straight up.
He'll kill it. New law, foreign ownership tightening, whatever, he'll screw it up. In exchange for our support tonight.] Can you guess who they go with? If you said the neo-Nazi who's going to be tough on tech, congratulations! Your prize is more validation for your disillusionment in the world. Yay.
In Kendall's defense, he did try to see if the non-terrifying candidate was amenable to his anti-tech agenda, [May the best man who will, you know, protect American jobs and rein in tech and is called Daniel win, you know?] but the dem didn't take the bait, so really Mencken was the only option. Now, I'll grant that it's actually a little more complicated than that. Part of why Kendall goes with Mencken is that Shiv had betrayed him by working with Matsson in secret.
And in fact, she set up a conversation between Matsson and Mencken that ultimately resulted in a finale where Mencken is the president-elect *and* Matsson still gets to acquire Waystar. [What was your philosophy exactly? Privacy p*ssy pasta.] I'm not going to get into all that here, but it's another interesting illustration of how these massive decisions are being made on the very human whims of these unimaginably powerful people. [So because we had so much chicken when we were kids, I have to like the fascist? Yeah.]
But here's the takeaway I want us to think about: Some billionaires want to maintain control over their mega media conglomerate that's being absorbed by a tech CEO, and in order to do that, they throw the sizable political weight of their news network behind the 'regulate tech' guy. Who happens to be a neo-Nazi, but that's not even the shocking part for me. That's just realistic at this point. The thing that left me really troubled was that regulation, that thing that I had for so long thought would be our solution to all the problems we see in the tech industry, was being wielded for the consolidation of corporate power. It's not 'regulate tech' so they don't take advantage of people; it's 'regulate tech' so they don't become more powerful than corporate America. And even though the show is technically fiction, this point gave me a reality check that I've had a seriously hard time contending with. I'm going to talk a little bit
about what I think this reframing of regulation means, but first: Where do you watch Succession? I can tell you what I definitely don't do, and that is watch it along with basically everything else I could want to consume online for free with the help of a particularly useful subreddit that rhymes with r slash retiracy. If I did do that, I'd probably want to set up a VPN first so that I don't get an angry letter from my internet company, but don't worry, I'm not trying to sell you a VPN. In fact, I'm probably the only YouTuber to ever recommend my viewers get a VPN purely out of my desire for you to be safe online.
Look, I've even gotten emails from a couple of VPN companies asking me to advertise them. But no, I'm not doing this for them. I just think cybersecurity is something you should bother to figure out if you're going to engage in..."retiracy". Which you shouldn't! But if you did, it would certainly free up your budget to take the money you would have been spending on streaming services that are already making billions of dollars a year and instead use it to support independent creators like myself on Patreon! On my page, you can find "Office Hours" videos where I answer questions Patrons have asked about the stuff I talk about here on YouTube, and a bunch of interviews I've conducted for video essay research. Memberships start at $3 a month, and all levels get to contribute questions to Office Hours, get access to those videos along with ad-free versions of my normal video essays, and sometimes get updates about this whole process of *making content*.
Check it out at the link below if you're interested and able, and otherwise, let's get back to this video that's definitely still about Succession. Okay. So now we're at the part of the video that's less me sharing knowledge I'm comfortable and confident in, and more me publicly grappling with what I think I know, so please bear with me here-- we're learning together. First, I want to preempt the criticism that I'm reading too much into a fictional television show. Succession has become known for a hard-earned verisimilitude made possible by the use of consultants. Like the wealth consultants who advise on how real billionaires dress, act, and talk.
[She's brought a ludicrously capacious bag. What's even in there, huh? Flat shoes for the subway? Her lunch pail?] But they also use consultants who are experts in law, business, or, most relevant for the election storyline, media and politics. Like a guy who served as national counsel on both Bush-Cheney campaigns, and a former president of CNN. Serious people. When asked about whether the election episode was true to real-life politics, Eric Schultz, another consultant on the show and former deputy press secretary under Obama, says this: "Yes. The reason why I think the show has a lot of power is because it is set against a backdrop that is entirely credible.
Even though a lot of the scenarios, of course, take dramatic license. It was important to the writers that those events take place in a context that felt entirely plausible." So while I know that the show is a made-up story, I do think that what happened in it is within the realm of possibility enough that analyzing it to understand real-world power dynamics feels...fine. I mean, the other tech storylines match up with real life to a pretty uncanny degree. The whole Vaulter storyline matches up with what's happened to a lot of internet companies beat for beat, and, as for the whole character of Lukas Matsson... [You had this cute little valuation and your numbers just came out as gay!] Do you think Grimes ever finds blood popsicles at her door? So given the, perhaps not real, but at least realistic premise of a billionaire influencing an election in order to regulate tech *just* enough to keep his legacy media company, let's look at what this says about the real world.
I believe what we're seeing here is an example of something called interest convergence. Interest convergence is a principle developed by lawyer and critical race theorist Derrick Bell to explain why the Brown v. Board of Education decision didn't *really* end racial disparities in American public schooling. Because critical race theory is an academic framework for thinking about how the legal system is influenced by racial dynamics, not, like, telling middle schoolers that slavery is bad. According to the research of Bell and others, the government decided to desegregate schools not because Black activists had finally convinced them that it was the right thing to do, so much as because it was a good move considering the optics of the Cold War. See, it was really hard for the US to lean on its branding as the freest country when the rest of the world could see how African-Americans were being treated here.
[Well you've been a democracy for like 50 years, so-- What? I mean, well, okay, not unless you don't count Black people, which is kind of a bad habit.] So to prevent the USSR from being able to argue that actually, no, YOU'RE the authoritarian evil one, America made some concessions. This analysis of Brown v. Board, as well as the broader geopolitical context, led Bell to propose in his 1980 paper that "the interests of Blacks in achieving racial equality will be accommodated only when it converges with the interests of whites." Extrapolating his principle beyond race relations, we can understand interest convergence as the idea that social change for disempowered people tends to only happen when their interests align with the interests of more powerful people.
So in this reach of a metaphor, the Cold War is the GoJo deal, Russia is Matsson, the American government is Kendall, and Black schoolchildren are the masses of people unprotected from the tech industry. [If I start second guessing, it collapses.] An important feature of Bell's theory is that changes made via interest convergence may not be as successful at achieving the goals of the disempowered group as they are at achieving the goals of the more powerful group. Again, looking at Brown v. Board, you *could* say that the US technically desegregated education. Though if you've watched my last video, you know that that didn't really happen, despite what the law became.
And back in our Succession story, you *could* technically say that Kendall and I agree that tech should be regulated, but 'regulate tech' is an extremely broad goal, and it could absolutely be realized in ways that serve Kendall's purposes, [Stop tech eating our lunch.] but don't serve mine. Being hard on tech could mean requiring social media companies to prevent mass misinformation-- something they absolutely have the capacity to do--but generally don't because not profitable. But it could also just mean requiring tech companies of a certain size operating in the U.S. to have an American CEO.
Which is basically what happens in the show; Matsson gets Mencken to acquiesce by offering to have the new GoJo/Waystar super company headed by an American CEO, which ends up being none other than the sesquipedalian Tom Wambsgans. [Ludicrously capacious] Speaking of Tom, it is WILD to me that anybody interpreted the finale as Tom winning Succession. Tom, the Pain Sponge, Wambsgans? [Would that be a problem?
Yeah. No, man. No, I could do it.] That's dark! Tom didn't win Succession; the collaboration between fascism and technology won Succession! But I digress. I think the point that's dawning on me in all of this is that, if we--you and me and all the other non-billionaires of the world-- are relying on powerful people to help save us from the exploitation of other powerful people, the impact of that help will always be vulnerable to the fact that our interests aren't the only ones being considered. Like when I see something like this: [I'm in favor of a regulation because I think it is a risk to the public, and anything that's a risk to the public, there needs to be some kind of referee.
The referee is the regulator.] I agree with literally all the words he just said. But I have a feeling that if he got HIS version of regulation enacted, it would do a lot less to help people like you and me, and a lot more to hamstring his competition in the emerging tech hype bubble that is AI. And it's actually more than a feeling; this is basically what's happening whenever you see a tech CEO advocating for regulation.
Go listen to this interview with AI expert Gary Marcus to learn more. I promise. It's a thing. Point is that to a degree, regulation is a form of change making that comes from the top down. And while that doesn't render it completely impotent, it does mean that it will generally be enacted in ways that serve those at the top. And that makes me a lot less willing to say that regulation is *the* solution to harmful tech.
And like, realizing that interest convergence is a thing isn't the only reason to question the efficacy of government regulation. Regulatory capture is a thing. Like when net neutrality policies were reversed back in 2017, the head of the FCC at the time was literally working for Verizon before becoming the guy who got to decide whether or not companies like Verizon could choose which bits of the internet to give you access to. And at a certain level, governments just aren't going to enact regulation that works against them.
Google of "don't be evil" origins has made billions in Department of Defense contracts working on things like drone footage analyzing software. Do you think the US government would do anything to regulate the questionably ethical work done at the behest of its military? [That was funny. I enjoyed that.] So, if as everyone on the show won't stop saying, [Tech is coming.
We are over. Okay? Make your accommodations] and as Kendall's disaster of a sabotage attempt has shown us, we might not be able to trust regulation to protect us from its increasing power, what can we do? It's funny, a while back, I had this little debate with a libertarian-leaning member of my family about AI, and he was like, AI is so cool, and I'm like, yeah, but it's also kind of dangerous. And he's like, well, do you not want us to make stuff like this? And I'm like, no, we should. But we should also regulate it. And he proceeds to hit me with an intellectual punch to the gut when he asks, Do you really trust the government to do better? And in that moment we became that epic handshake meme because, no, actually, I don't generally trust the government to do the right thing.
Surprising, I'm sure. But I DO think they have, you know, infrastructure and oversight and process and a certain institutional slowness that I trust a little bit more than the tech CEOs whose self-impression as cowboy innovators in the new digital Wild West is apparent from company slogans like 'move fast and break things.' Like, maybe that was cute when Zuck was saying it ten years ago, but now that Facebook's gone and broken some things, the motto hits a little different. I also know that despite the profit motives of exploitative technologies tending to corrupt regulatory frameworks, there are absolutely people doing very hard work to make said frameworks work in favor of all of us non-billionaires. I recently went to a seminar in science policy where my main takeaway was that if you have whatever kind of scientific or technical training, enjoy talking to people and are really good at being diplomatic, there's probably a lot you can do to make these regulatory frameworks work more in the favor of normal people through a job in policy.
*I* certainly don't have the stomach for that. But some people do, and I think that's a good thing. And of course, regulation can happen at scales other than the US government. There are cities and even whole other countries that have curbed the power of apps like Airbnb and Uber through local and national legislation, often brought about through worker strikes and public protest. It's a great example of collective action pushing tech CEOs' interests into convergence with those of the people they exploit. So despite the part of me that is, dare I say, a bit of a libertarian sympathizer, [All I want is a fair, flat tax.
Same for all Americans. But headed down to zero within the decade] I don't think my take here is that regulation is bad or it doesn't work; it's more that it has limitations. The STS scholar in me is saying regulate, but my inner abolitionist is like, we need to come up with something better. So rather than asking 'will regulation work?', the question I've been pondering recently is, 'what would it look like to work towards a tech industry that's actually beneficial for society?' If you saw my last video, you probably know where I'm going with this. [We need a more dynamic strategy.
Now let's call it, for the sake of clarity, the strategy of a thousand lifeboats] So this clip is from a moment where the company is in crisis, and Ken is trying to get everybody on board for developing solutions. And as little as I agree with him on the purpose of tech regulation, and as much as the speech is made out to be a failure in the show, [I said lifeboats not iceberg!] I *do* think he's onto something here. He gives this rousing speech to his team where each part of the company, each little tiny piece of the whole, can help save them all together.
And that's kind of how I think about social change in general. There is no one true solution to any big societal problem. I think that real change is brought about by many people, in many places, doing many things, on many scales, all towards a common goal. [Vaulter is a lifeboat. ATN Citizens is a lifeboat.
VR can be a lifeboat. VR is a...bubble. But yeah, no bad ideas] Technology theorist L.M. Sacasas applies this logic to the pursuit of a more humane tech world. "We should ask not only what is to be done, but who is going to do it. This at least acknowledges the fact that we may all have choices to make and some measure of agency to exercise, but we do not all have the same resources, skills or standing.
In other words, what a policy wonk in a senator's office can and should do is different from what a programmer working at a startup can and should do. Their scope for action will, in turn, differ wildly from that of a single parent struggling to pay rent and feed her kids. It's not just that she has far less power; it's that her responsibilities and obligations are also different. That she cannot do what the legislative assistant or the programmer can do does not imply that she ought to do nothing."
I really like this framing as opposed to a singular, top-down solution because it helps people see how they can be involved. Which of the thousand lifeboats they might want and be able to contribute to. And regulation *can* be a lifeboat-- but it's just one of the thousand. So both to relieve any digital dystopia despair, and as an exercise in collective ideation, let's consider some lifeboats together! In a video I made about Pluto a while ago, I ended it not with a conclusion on who's right about Pluto, but with an open question for the audience.
And wow, did y'all deliver in the comments! So let's do that again! The question of today's video is: in a world where our increasingly advanced technologies are being wielded primarily for profit, and often at the expense of humanity's collective well-being, what are our lifeboats? [Bring me a thousand lifeboats. Bring me a f*cking armada of eyeballs.] I'm going to start things off with a few answers in a second, but first, I want to invite anybody watching this right now to pause the video and leave a comment sharing what you see as an avenue, no matter how big or small, to prevent, counter or mitigate one or some of the harms we've seen brought about by big tech and their boundless *innovation*. If you did see my essay on systemic change, consider this an exercise in the logic of that video.
Understand the system as a whole, and then identify your points of intersection with that system to figure out your accessible avenues for affecting it. And if you didn't watch it, go watch it. All my videos are this good. So maybe when sharing your lifeboat, include, as Sacasas suggested, who's steering it. What kind of positionality--in career, resources, identity, whatever--is conducive to that particular thread of the collective systemic change effort? I'll pick a few interesting answers to share at the end of my next video, prioritizing commenters who also respond to other people's comments, because I'm trying to get y'all to have a conversation here.
[Look, this isn't a brainstorm, all I'm saying is everyone's invited.] And if you want to hear the Pluto comments and my thoughts, stick around to the end of this video because I'm finally getting around to them! Okay, so here are my three lifeboats to get things going. The first one is the kind of frustrating one, which is our own personal engagement with technology. It's one of those annoying things like recycling or going vegan because, you know, on some level there's a larger systemic problem that your opting out of probably won't do much to change, but also you kind of can't get around the fact that an individual action can be a part of a collective action that does have a significant impact on whatever system. Twitter is kind of like eating meat.
It's not that great for you, and your participation in the market does a teeny little bit to hurt everyone. I don't want to get *too* into that particular dialectic today, but for a deeper discussion, and what I think is a pretty good take on personal engagement with technology, check out 'What Is To Be Done', the essay by L.M. Sacasas that I quoted a minute ago. Okay, so we got individual action. What's another lifeboat? Well, there are certainly people who work within tech who are positioned to be some kind of lifeboat. I have spent enough time in the Bay Area to know there are a ton of socialist coders out there, and by that I do mean people with socialist politics who also work as software developers or whatever, and not people who write socialist code. At least not yet! In a really interesting article on what he called 'the engineer's predicament', Ben Tarnoff discusses the contradiction between many tech workers' progressive ideals and sincere desire for a more collectivist world, and the inherently capitalist framework of their work.
Rather than criticize them for being in this increasingly common and honestly understandable position, he describes the redirection of their dissonance into subversion. "So another desire develops to put their politics into code. They want to write software that will facilitate the creation of worker co-operatives, seed the internet with self-governing platforms, and equip movements and municipalities with tools for democratic decision making and participatory governance.
You may not care about how to write socialist software; the idea may even strike you as ridiculous. But if you are at all curious about the possibility of life beyond capitalism, you have a stake in the engineer's predicament." The question of how to write socialist code is a complicated one, so definitely check out the full article to learn more, but I can at least speak to one aspect of it in my final lifeboat: education! This one's definitely the most in my wheelhouse, being that I've studied all this technology and society stuff in the context of education.
So there's a lot of interesting research that suggests that in the United States, technical education for things like engineering and programming isn't as politically neutral as one might hope, and more likely promotes something of a techno-capitalist logic along with all the pure...math and stuff. I mean, if you still think the point of school is just personal educational enrichment, then yeah, it sounds ridiculous. But if you're thinking of the education system as being in place primarily to produce more workers, then it makes total sense. If you don't like my conspiratorial logic here, a particularly strong piece of evidence for this is Erin Cech's finding that "students' experiences over the course of their undergraduate programs decreases their interest in the public welfare considerations of engineering work." Of course, as statistically robust as that finding is, it still doesn't tell us anything about the WHY.
Again, topic for another video, subscribe if you want to hear more. But the important point is that the way future coders and designers are being taught currently is clearly not conducive for producing Tarnoff's socialist engineers, which means that teachers at every step of the educational process who can help counter these gross logics we pick up in STEM ed are absolutely going to be a lifeboat. So many lifeboats! Okay, that's enough of me talking. Hopefully I've given you enough to get the ball rolling. So if you haven't yet, go ahead and leave a comment as to what you think could be a lifeboat in this whole 'tech is cool, but also hurting us for profit' world that we live in.
I can't wait to see what you come up with! [All right. Thanks, everyone. Lifeboats!] Succession is about a lot of things. And I think one of them is tech. Clearly. I also think it's about duality. Like the duality of how my reaction
to this guy [Brownstones, water towers, trees, skyscrapers, firefighters and Wall Street traders.] is to simultaneously hate his guts and also want his dad to love him so goddamn much. Or the duality of hearing something like this [Is he a Jew, by the way?] Oh, come on. [What? Just a simple, friendly, slightly racist question.]
and think that he's absolutely depraved, but also you just want to give him a big hug. It's the same duality we see in new technologies like ChatGPT or whatever AI thing that is simultaneously, you know, cool and impressive and exciting, but also raises very reasonable things to be worried about. Or the duality in understanding that fixes like regulation can help and hurt; the duality of accepting many simultaneous avenues for greater change.
Two things can be true at once. Or I guess in our lifeboat analogy, a thousand things can be true at once. Far reaching systemic problems like technologically facilitated exploitation are too big for any one person to solve.
But that doesn't mean that the way each of us engages with it won't have an impact on how it develops and what our futures will be like. Supposed tech geniuses will pretend that an AI singularity terminator doomsday scenario is the inevitable outcome of humanity's computational evolution, because then we can't blame them when [Bleep bloop guys are going to data mine us all to death] But where their story fails is in the missing duality. Yes, we live in a world where apps have eroded labor rights and websites have undermined democracies. Where computers can fight wars and hunt people for us. Where our entertainment addicts, polarizes, and isolates us. And yet, we can still imagine, design, and build those technologies the aforementioned Tarnoff calls 'class traitors': "Some technologies refuse their inheritance, or use it to subvert the system that provided it.
Others are simply more malleable, more open to redirection. The challenge for a socialist software engineer, or really for anyone with an interest in piecing together a post-capitalist future, is to find a way to use the space opened up by these breaks, to effect the leap to a new technological system." I know that not all of us are going to design the computers that bring about our liberation, just like not all of us are going to become regulators that make those technologies work for us or educators that teach people how to make such technologies. Changing systems is a team effort, and while some players may have a bigger role than others, there's room for everyone on the team.
[And when a team is a team, it can't actually physically be beaten. It's impossible.] It IS impossible! Gotta love that productive delusion. Go team! Hi. Thanks for watching my video.
This one was pretty different for me, I don't usually do media analysis, and I'm not even sure if this qualifies as that, but I hope you enjoyed it! In a second, I'm going to share some comments from my Pluto video, but first, I've got to pay the bills. As I've mentioned already, all of my videos are made possible by the wonderful subscribers of my Patreon, whose names should be going down the screen right now. At the present moment, YouTube is my job, which is terrifying, but also made possible by the fact that some viewers like you have appreciated my work enough to sign up to donate a few dollars a month on my Patreon. In exchange, they get access to a bunch of exclusive content, ad-free versions of all my video essays, some interviews that I've conducted for them, and my favorite feature, "Office Hours". Anyone who's a member at any tier of my Patreon can contribute questions whenever I record one of those, and they're a really good time; I really enjoy being able to continue the conversations from my videos with the viewers on my Patreon. I am so grateful to all my patrons
and just want to thank them for continuing to support me in this wild online experiment of a career pivot. And if $3 a month isn't in your budget, or you just don't think it's worth it, that's fine too, you can do the algorithm things to help me out: like the video, send it to someone, leave a comment, subscribe to the channel-- these things only take a few seconds to do and they help me out so much because these goddamn algorithms are optimized for one thing and one thing alone, and that is your engagement. Okay. Enough of that. Time for the Pluto comments!
So if you haven't seen it, that video is about how the question of Pluto's planethood is actually quite complicated--scientifically, politically, epistemologically-- and instead of trying to close the video with like, a conclusive statement about what is the truth of Pluto, I use it as an opportunity to ask the audience what our other "Plutos" are, as in, what are the other topics where we learned one simple thing as kids, but in actuality, it turns out that that thing is a lot more complicated than just the middle school version? Honestly, I learned so much from the comments, and there were actually a couple that had information I wish I had known before making the video. Like how there are literally studies that use Pluto as a litmus test to understand how people's conceptual knowledge changes, or like, how Eris, the dwarf planet that's kind of responsible for Pluto being demoted, was literally named after the goddess of discord and strife, which is just too perfect. Like, how could I have missed that? But as far as the actual Plutos people brought up, I think the most common one was about sex and gender, which honestly makes a lot of sense. PrincessElinori left this comment saying "the science behind sex and gender is so complicated and weird, even for humans, let alone other organisms.
My focus is on marine biology, and it's funny to me how few of the organisms I work with are definitely male or female." Hopefully it goes without saying, but this is actually a very important misconception because some people will use like, middle school biology arguments about sex to argue against the legitimacy of transgender people, which is not only like, horrible and mean, but also scientifically unfounded. And people also brought up gender in contexts other than biology, like the socio-cultural interpretations of it that stick so hard because of our desire for psychological comfort, or even how the linguistics of gendered pronouns varies drastically between languages, for example, Arabic, which is highly gendered, and Mandarin, which is apparently very ambiguous when it comes to gender. Besides all the gender stuff, another common Pluto people brought up that I kind of anticipated was history.
Cucuserpent4 said that anything people have learned in elementary school about Indigenous Americans definitely falls into the Pluto effect category. And yeah, it turns out that all those stories about pilgrims and Indians that didn't portray U.S. colonization and expansion as genocidal were kind of bullshit. A bunch of other people brought history up as well, but in a more discipline-wide sense, like Gilliantheadventurer, who talked about how real history research *seems* like revisionism to people who don't understand that history is more than just memorizing events and dates, or hilmaperson6168, who used their experience in fashion history to explain how people tend to incorrectly make sense of history as a line of progression rather than a web of interconnected processes. Another kind of whole-discipline Pluto that was brought up was economics. Kayodesalandy3846, an economics Ph.D.
student, says that while they were teaching, they found that many students described economic models as fact, despite there being research that contradicts these models. Yeah, totally not concerning that people studying economics are learning neoliberal logic as fact. Yay. That example, like all the others I gave, is very informative, but also kind of a bummer, so I want to end on a less socially harmful Pluto, and that is instrument categorization.
This one was really cool. Locsoluv94, who actually left a comment several videos ago that provoked me into making the Pluto video, is apparently an expert in music and according to them, musical instruments are categorized by the mechanism that creates the sound. For example, they say the saxophone is made of brass, but it's not a brass instrument.
It's a woodwind because it uses a reed to make the sound. In the same way, the piano is not a string instrument because it has strings inside. It's a percussion instrument because the strings are struck with hammers. That comment was already very interesting.
But then estherscholz8400 left a comment saying that, because certain letters and sounds are created by different mechanisms in the mouth, like friction, air, or resonant frequencies, you could categorize them into percussion, woodwinds, and brass as per Locsoluv94's categorization, and therefore, language is a band. So basically, I guess I've spent the last hour performing a symphony for you all. Hope you've enjoyed it.