Post-Science Civilizations
This episode is brought to you by Brilliant. For centuries now, we have pushed ever forward with science, unlocking new truths and new mysteries at every step… this journey has come to define our modern civilization… but what would happen if one day, that journey stopped, having no further place to travel forward to? So, I’ve occasionally discussed in other episodes that there are fundamental reasons to think that we will get an answer to all the basic universal laws of physics and from there, all of science in general. And indeed, this might happen before we even get into the depths of space. By default we tend to think that’s absurd, that there will always be more questions to solve.
That science is like the mythic many-headed hydra fought by Hercules, which sprouted two new heads every time one was cut off. That might well be the case for knowledge in general, but for science itself there is less reason to expect it, and indeed quite the contrary. It's not that we’re creating ever-new questions, but that we’re speeding up the rate at which we answer a fixed and finite number of them. The hydra, in this case, may sprout two heads each time we cut one off, but it’s only got a thousand to sprout, and each new pair popping up from a stump just makes it faster and easier to whack two off. That the
complexity of the universe is not infinite, and so there's only so much science to figure out. To a society living several centuries into the explosion of knowledge and science that brought us to this modern time of technological omnipresence, the notion that this could ever stop seems almost impossible to believe. Humanity might blow itself to smithereens with its technology and thus end scientific exploration, but run out of science? Inconceivable! So, shortly before I wrote this episode, I’d mentioned this concept of ending science by learning it all, and someone asked what a post-science civilization might be like. And I realized that even though I often say science might have a finite number of veils we can lift, and so could end, deep down I don’t really believe it. I think most of us assume science is never-ending, and I presumably believe that too, because otherwise I would have already done an episode on the topic. Especially given that I often think of it as an event likely to occur within a few centuries or millennia, maybe even sooner, while I think of colonizing even our nearest neighboring stars as being on the same sorts of timelines, and the eventual galactic sprawl of civilization as not even getting fully underway for a million years or more. Usually, when we even consider the idea of maxing out our learning or technology,
the assumption is that it’s the other way around: we get out into the galaxy at large in the next few centuries, like Star Trek, but an end to learning would still be millions of years from now. I don’t want to spend too much time justifying the conclusion that science could end, so our handwave is that a civilization might get so far along with its science that it hits big walls of ever-slower learning, or finds that new discoveries are mostly being made by intelligences beyond anything human and it either doesn’t want to embrace that path or fears it might destroy itself if it learns any more, while still not opting to go anti-technological either, as we discussed in our Techno-Primitivism episode. Or we might be looking at a civilization coming off a collapse, with a mix of old technological artifacts and the primitive tools it can still make, like we looked at in Techno-Barbarians. So I want to examine five possible cases of post-science civilizations, or something close to it:
1. In a very distant future, eons from now, knowledge of science is completed.
2. In the relatively near future, we learn all the core rules of physics, and in the following centuries we get through the remaining questions in the other sciences, possibly slowing as only more obscure and less important questions remain.
3. Scientific and technological progress never halts, but each new step takes more and more effort, so we stop seeing a civilization where each generation’s technology is vastly different from the one before.
4. The remaining scientific mysteries have reached a point where only an inhuman and unfathomable mind can make real progress on them, and this is seen as undesirable or dangerous, or perhaps when those minds are created they just leave or become unresponsive.
5. The civilization in question feels that pursuing any additional science is likely to make survival nearly impossible, because it makes it far too easy for any anomaly to wreck them.
Perhaps one lone lunatic can mass-manufacture antimatter bombs in their basement, or science seems on the verge of proving that Free Will and Purpose are illusions, or it might simply be that they’re constantly dodging bullets from all the technological marvels they’re making and they think that, eventually, one of them is nearly guaranteed to get them. Now, while I’ve listed five cases here, I think they can overlap, and we’re not really planning to discuss each case in detail today; it's more that you can envision a civilization around each, and they would come in myriad forms. One thing to understand is that virtually all physics that we do above the quantum scale and beneath the galactic scale, even today, is very nearly identical to what it was before Einstein arrived on the scene, and most of the core principles of biology, chemistry, and geology are things you could still learn from the textbooks that were available when Einstein was a student. We’ve pushed the frontiers very far indeed, but many areas haven’t changed all that much, and there really is no reason to think that there’s an infinite number of physical laws and scientific mysteries to learn. As best we can tell, all branches of science are emergent properties either of physics directly or of another field emergent from it, like biology being emergent from chemistry, which is in turn emergent from physics, and from which medicine or neurology or sociology emerge in turn. But a firm knowledge of Quantum Mechanics isn’t really going to help you win at chess, even though the game and everything used to play it are emergent properties of the basic physical realities of the Universe.
However, there really are a finite number of strategies and tactics for that game, and indeed a finite number of possible meaningful games, since the game itself is finite in size, positions, pieces, and so on. Knowing all the rules of chess, which takes most folks little time to pick up, obviously isn’t the same as being a master of it. And so, when I talk about knowing all the physical laws of the Universe, mastering all of the emergent systems built on top of them would come later. Physics is the first field we would be likely to master, and then probably chemistry, as there’s a finite number of ways you can combine the finite number of elements with each other under various conditions of pressure, temperature, and so on, but it wouldn’t seem possible to cover all those conditions without knowing all the physical options first. If we live in a finite Universe, it stands to reason that the rules governing it must be finite, and even an infinite-sized Universe doesn’t imply infinite rules. It may be that there are unknowable numbers of alternate realities and multiverses, each with different rules, but we have no way of proving or experimenting with those other places yet. If we ever do, that changes things,
and a complete knowledge of science might also include the ability to change the rules or make new places with different rules. Whichever the case, we have no strong reason to think science is unending, and yet we really are very convinced that it is, and that makes me instantly worry. Extreme confidence as a society that something is true potentially leaves that civilization vulnerable to having that truth kicked out from under them, and their whole worldview with it. Many things can destroy civilizations,
some utterly mundane and expected, but extreme worldview shifts are often the true juggernauts, crushing and smashing all before them, and I think a civilization utterly convinced at a root level that science will always keep turning up new things and inventing new widgets, then finding out that wasn’t the case, would undergo just such a shift, equal to or maybe greater than those caused by things like cars or computers or civil rights and voting. I think it would be a Singularity, or maybe an actual Out of Context Problem, the terrifying big brother of the Black Swan that we discussed in that episode, and would represent a Fermi Paradox Solution. Indeed, as we discuss this today, I’m going to argue that it maybe should be added to our short list of Late Filters of the Fermi Paradox, those stages a civilization such as ours still has to pass through, as opposed to all the many prior filters we got through between our galaxy and world forming and us inventing the first rocketship. Now, let's start with our first case: that in a very distant future, eons from now, knowledge of science is completed. That doesn’t help us much with the Fermi Paradox; it is essentially the default point of view, where we either assume knowledge is infinite and science along with it, or that science at least might be finite but will need uncounted millennia to solve.
Such being the case though, it rather implies that classic sci-fi situation where you just keep making ever better engines for spaceships that let you move more ship, and faster, which lends itself to a rapidly settled galaxy and also raises the question of where all these leviathan civilizations are and what happened to them. As this is the classic sci-fi and space opera case, and the one probably most envisioned in Fermi Paradox discussions, we can largely gloss over it. It often ends with something like an Ascension Scenario, where the aliens go to a higher plane of existence to continue their pursuits, having solved everything of interest in this Universe and being ready to move to the next one. Or they delve too deep and go mad, or unleash some ancient terror that consumes them. See our episodes Gods & Monsters, Godlike Aliens, and Aloof Aliens for more discussion of that sort of scenario, but it comes down to handling the difficulty of a civilization that’s reached that level by basically moving it somewhere else, out of play: they are disinterested in pursuing resource acquisition as we might expect them to, and disinterested in communicating with or even acknowledging other lifeforms in the universe, or they are simply bound by non-interference rules, laws, and regulations, or they have been wiped out, either by themselves, by a sudden unexpected scientific discovery, or by some kind of cyclical natural event that we haven’t been around long enough to observe. Their goals no longer resemble our goals and thus, if they remain at all, we can’t spot them by obvious signs of their activity because they’re not doing the obvious stuff. It does still remain a problem if we assume the Rare Intelligence camp of the Fermi Paradox is valid and that we’re the Firstborn civilization in this region of the Universe, an example of a Grabby Alien Civilization or K4 Civilization, our topic for next week.
So, over the next few centuries or millennia, we master every world in the galaxy, and since our science exceeds what modern theory permits, we presumably can create things that make even the greatest of the Megastructures we discuss on this show look petty and small. And since some of those make use of entire galaxies to achieve their goals, or could move galaxies and harvest and consume whole stars to fuel themselves, all using only modern science, we should assume that a civilization whose science keeps expanding constantly for millions of years has had the time and power to achieve even greater feats, probably ones that make a Dyson Sphere seem as trivial as a child's terrarium. Or perhaps not; we shouldn’t assume an ever greater knowledge of science implies ever greater power and control. That really is not how this has worked thus far: we still use engines functionally similar to the ones we had centuries ago, using fuel to make steam or hot gas to shove things, and really only solar panels and thermocouples create power differently, and even they use principles discovered by folks who are all long since dead.
This needs emphasis perhaps: knowing the complete rules of a system does not mean you’ve mastered that system. You can know all the rules of chess, football, basketball, or poker and still have folks learning new techniques and strategies centuries later, especially as dynamics shift. Shoes with better cushions or cleats on them, just from something like rubber being discovered or improved, can shift the dynamics of the game of football a lot, but the rules didn’t change, and that’s basically what physical laws are too. That’s more what we mean with our second case: we discover all the laws of physics and how the Universe works, maybe even this century, so that before the first interstellar spaceship arrives at our first colony, we already know all the rules of how the Universe works. I suspect this is the case too, that while we have many more complex emergent systems branching off from physics, your baseline astronomy, particle physics, relativity, quantum mechanics, and probably chemistry will be settled out in a few more lifetimes.
What does this civilization look like? Well, I suppose we need to ask if a post-science civilization is one where all science is complete, or all technology too, or even just one where all available scientific mysteries are currently either solved or in the unsolvable bin. As an example, things like String Theory and multiverses, popular as they are in sci-fi and speculation in general, lack any proof for or against them, and have only a very few vague notions as to how they could be tested in this Universe. Often, those suggested experiments are leviathans of effort and time. As an example, we might find that we have a new particle in our theories, but testing it would require a supercollider a few thousand light years across, powered by several thousand stars and needing such precision in the collider’s track that we must empty out an enormous torus of space outside the track, many light years wide, of anything even as big as a modest asteroid that might perturb the particles gravitationally. A civilization might decide that is a good cause, but to get the resources to build it without throwing everything they have at it, to build something whose scale is proportionally what a modern supercollider is to modern civilization, might require a fully colonized swath of space ten thousand light years across, sufficiently settled and built up that most of those billion or so star systems don’t mind footing a bit of the bill and inconvenience. Thus, that’s the sort of thing that might happen in the year 100,000 AD, if we can get ships out into space able to move at 10% of light speed. The construction time might be tens of thousands of years too, and the first experiment
on completion might need ten thousand years just to let the first set of particles run the track. On getting that data, it might be that the answer is clear, easy, and complete. Done: one simple paper, written simultaneously and independently by some physicist in each system, a few weeks after the signal carrying the experimental data reaches them. And then it's back to another hundred thousand years of waiting while the next anomaly is found and an experiment is suggested, and then thousands of years for the paper suggesting that experiment to rattle around the galaxy gathering interest, discussion, and funding. Let's be blunt, this is a civilization where nobody does science full time anymore, and honestly it probably isn’t one with a lot of full-time instructors of science at the higher levels either. Let me give an example case: it’s the year 3000 at the University of Mordor on Charon,
Pluto’s largest moon. Humanity numbers some 100 trillion people, all but maybe a billion of whom live in this solar system, closer to the Sun than we are out here at Pluto. Our University got its funding principally to explore the ice and cryovolcano effects of the Pluto-Charon system, and three centuries after they got that mandate, they are still putting out papers. Indeed, since this is the year 3000, many of the staff are the same folks who were first hired for the job, being biologically immortal. Other than new researchers or assistants, nobody comes here to learn from us; we do videos and papers on the topic, or possibly straight brain-upload packets, and even the 100 million or so folks living in and around Pluto, many of them with IQs well above 200 on the modern scale and curious and scientific by nature, do not particularly care about our research.
So, it’s not a full-time job; I mean, why would it be? It is the year 3000; if you’re not post-scarcity by now, then odds are your civilization is some dystopian wreck or the extinguished ruins thereof. It is entirely possible that humans of the year 3000 aren’t much different from the way they looked in the year 2000 or 1000, but odds are you’ve got bits and pieces of machinery glued into you in some fashion. The typical person is probably at least as smart as Einstein and has probably had access to learning tools that make anything we’ve got now look ham-fisted. I don’t know what someone with an IQ of 200 at age 200 is like in terms of knowledge and interest, but the odds are that is on the dumb and young side of what will be around then. Post-scarcity civilizations probably aren’t prone to our typical view of hedonism either, loaded to the gills on various forms of the seven deadly sins and lounging in utter sloth. We get overweight and lazy as a result of more calories in than out or having low energy levels, and more likely than not, someone of the year 3000 has an Olympic athlete’s body, if not a superhero’s, and has probably been raised by something that would make modern helicopter parenting and intense mentoring look negligent. So, we’re not talking about a civilization where everyone
stopped doing science because they spent all their time glutting themselves on base pleasures and never got educated; we’re talking about a society where the baseline education of most folks is the equivalent of a modern Ph.D. in every field we’ve got, and they just can’t make a full-time job out of science because it’s slowed down, there are so many folks in every field, and they don’t need a full-time job anyway. I think you’d still have noted experts on topics, but with a lot less formality than today. Our current university and academia setup already seems a bit unstable going forward, and I’ve no great insight into what that will look like next century, let alone next millennium, but peer review and accreditation get very tricky when you start adding in century-long light lags between planets. Say we’ve got a telescope a thousand light years away that transmits back to Earth and many other places, and some colony 1500 light years from Earth, but only 900 light years from that telescope, is responsible for figuring out the theory a century ahead of when Earth even hears the data; its trillions of scientists figure it out, name it, and re-broadcast it to the civilization extending out from it in a rough spherical blob. And maybe ten thousand years later, after most folks are dead or have moved on to other interests, someone finally properly sorts out who did what and when: who duplicated it, who figured it out first or tenth, or who was fastest after getting the original data. Fundamentally, this is not a very incentive-heavy environment for science, and realistically, society isn’t begging them for new tech to make very easy lives even easier. Odds are
everyone is still very curious, but the driving methods of effort and funding will have shifted, and as we get problems that might need trillions of scientists laboring for thousands of years to make any progress, I just don’t think it would be as big a draw to folks looking for a challenge. We have an image of scientists challenging the impossible, but scientists tend to get exhausted and discouraged by a lack of progress and by peers telling them it's a waste of time. These things get worse if you have accelerated consciousness too. Indeed, I suspect just being bigger-brained would lend itself to impatience, even if greater courtesy and willpower might override that, but when we contemplate civilizations living the transhuman or uploaded, mind-speed superintelligence life, where the augmentation is aimed at simply speeding up how fast we think and experience things, then light lag becomes brutal. A scientist who experiences time 100 times faster inside their virtual world than a modern human does still has to wait a thousand years for that signal to come from that telescope, but to them it feels like 100,000 years instead. Alternatively, they can create and model entire simulated universes of amazing complexity in that virtual environment. Hypothetically, a scientist might freeze themselves to wait for results,
and if that were commonplace it might make for a different twist on post-science civilizations: a hiatus-science civilization, perhaps. Now, faster-than-light travel, depending on how it works, obviously changes all of this, as would portals to other planes, dimensions, or timelines, but we don’t know those exist and have no current theories on how we would even go about discovering them, and that may continue forever. That’s the whole End of Science thing: Light Speed and Entropy just get stuck in there with no plausible hope of ever overcoming them. If they are rigid laws, unbendable, and no other Universe is available, then you might have folks keep up the faith even millions of years after the last truly new discovery, but it's unlikely to be viewed as we view it now, as a thing that happens so often it will surely happen again. And a post-human society of super-geniuses might be very bad at believing things that rest more on matters of principle than on evidence. We often worry that things like Free Will and Purpose are just illusions and conceits of ours. If they are, then such civilizations might be very bad at lying to themselves, and turn hyper-nihilistic.
Everybody dies of despair or suicide except for the ones who turn into mad hedonist mixtures of Dark Eldar and Melniboneans, sacrificing people to Slaanesh while always seeking ever crazier and darker means of satisfying themselves, like drug addicts. Something like an end of science might look like that, especially as new technological innovations dried up, or it might just be a culture more like we were several hundred years ago, one that simply didn’t have that focus. Those folks were not short of mental challenges, riddles, or puzzles though. We all play games, and puzzles are probably common with this audience. I like crosswords and logic puzzles, and my wife loves Sudoku and logic puzzles too; she is one of those people who stubbornly won’t look at the back of the book for the answer, but I’ll do that, especially on crosswords, if it is some word I know I’m not going to know. Difference of styles: I mostly like puzzles as a method of brain warm-up and stretching with my morning coffee to get me ready for working on this show, so speed matters, and it's just practice, not a challenge. Neither one of us is under any illusion that our solving those puzzles helps improve humanity the way new scientific breakthroughs do, and somebody made that puzzle and mass-produced it, so we’re hardly the first ones solving it.
It's the same thing when I’m reading old science from back before even my mentor’s mentors were alive; I’m not expecting to get deep revelations out of it, just a perspective from a different time, and sometimes it’s a helpful aid to me, given what I do for a living. And this love of puzzles and riddles and challenges long predates any formal science, and I think it would exist long after the end of science and technology too, and would probably fill that need for folks pretty well. Most humans aren’t scientists, even most curious and puzzle-loving ones. Knowledge isn’t necessarily infinite either, but it’s a much bigger realm than just science, and it doesn’t seem likely that anything with even a vaguely human mind and lifespan could ever run out of knowledge. You could spend a thousand years just reading everything printed on science
in the typical University library, as they were at the turn of the century when I used to wander the bookshelves at my own college, amazed at how many rows were given over just to my own field of physics. It's been growing ever since, and none of us really expect that to end anytime soon. Now, a super-intelligent mind might get bored far faster and more easily than the average human mind does. There may be an infinite number of books you could write, but the fiftieth fantasy series you read that mimics and differs only superficially from Tolkien’s Lord of the Rings or the classic King Arthur tales gets pretty boring, and a supermind might get bored with it even more quickly than we do. We shouldn’t assume that this definitely scales up, that superminds can make puzzles and fiction that would occupy them as well as contemporary material can occupy us. That could go either way: superintelligences might be able to create exponentially more new and interesting things than they can all absorb without getting bored, or they might not be able to create enough of them to keep each other from getting bored.
Now, one thing that comes to mind: I said a moment ago that science has been growing ever since I was a student and none of us really expect that to end anytime soon, but what if it did? What if there were literally not a single new discovery after today? Just finish up a few projects and data sets and say “Oh, that’s where dark energy comes from, and that’s what a dark matter particle is” or “that’s why we can never get the answer to questions X, Y, and Z”, and then, done! No new solutions or answers. Imagine a world in which, as of today, no new scientific discoveries were ever made. Firstly, it would likely take a few generations before we started to fully believe it was done-done. I mean,
I can talk all day about the end of science, even an end within our lifetimes, but even as I say it, and with good reasoning at my back, it’s an ethereal concept I view as unreal. So I can’t see it causing a rapid apocalypse; folks would just shift from turning our brightest minds to new research, to principally technological innovation, and then to other fields of endeavor. A cabinet maker in their garage utilizes tools and techniques mastered long before they were born to make a piece of woodwork that’s probably nearly identical to those many others have made, and is not moaning in despair about a lack of creative options or a missing feeling of accomplishment; ditto the gardener or chef. But it’s not likely to magically end in one moment, and with fewer accomplishments in science every year, I think you would see a slow drain-off of folks, over a few centuries probably. Some might stay behind out of a perceived need to always keep trying to find something new, and I’d imagine some would view it as a ceremonial role, while others would be half-crazed followers of any random newly-reported phenomenon, trying to stir up some hype for it.
That’s a sad way to think of science ending; more likely, it would be a posthuman era, but there would still be a handful of dedicated old masters of the field who kept their eyes open and met occasionally to check if anything new was in, or if the textbooks needed revising, so to speak. They’re the originals, not the great-great-great-grand-students of the last scientist to produce real new science. A sudden end to science would be more like our cases 4 and 5, where that civilization fears that any new science is going to wreck them, and so they ban science, or alternatively, it did wreck them, and Earth’s entire surface got turned into a glowing hot radioactive cinder. Or more of one anyway, since Earth is already a glowing hot radioactive cinder with a thin cool skin of life wrapped around it.
Now, we’ve got plenty of fictional examples of cases where something like this happened; Ray Bradbury’s Fahrenheit 451 or Walter Miller’s A Canticle for Leibowitz come to mind, but while they are good stories contemplating post-apocalyptic scenarios, I didn’t find either to be super-realistic, and they don’t contemplate a civilization calmly and rationally choosing an end to scientific research, because in sci-fi any faction opposed to scientific research is automatically, by default, a pack of crazy fanatical Luddites. It is harder to make that case if you have a civilization in which simplistic AI, robots, automation, and 3D printing have basically brought you to post-scarcity, with folks easily able to feed themselves and enjoy many luxuries, and do so sustainably for billions of years, where folks really only need to work maybe five hours a week but most opt to do more, where disease and old age are basically things of the past and nobody lacks for any real need. In that society, and that’s probably where things are circa the year 2100 AD, imagine someone points out that the newer science is yielding only marginal improvements to people’s standard of living, both materially and in more existential respects, but that terrifying and harder-to-control dangers keep appearing on the edge of exploration: superhuman AI they don’t need, posthumans who might view them as insects, nuclear bombs people might make in their basements, weird space- or time-bending stuff. Then maybe when they ask “How shall we fix this?” someone would say “Maybe we need to stop doing any new science, or at least cut back on it?” and when they’re done exclaiming in horror at the concept, they might say “Actually, maybe we should.” I could really see that, but I don’t see it happening out of the blue, and I’d have difficulty seeing it appear in the form of immediate total bans and witch hunts, though those could follow if some faction ignored the rule and nearly killed everyone with crazy robot armies or something. I could see this happening, though, because we are not in times past: any rule we pass limiting further development is not going to disappear in the ages of time, since we have digital records and easy backups, and everyone will remember the why and how. So, civilizations that basically pause at 22nd-century technology might not ever lift that pause and might not really be well set up for interstellar colonization, or even inclined to do it. We talk about getting all your eggs out of one basket,
but in truth, there really is no plausible natural disaster that can wipe out a solar system, or even a single and alert planet; only intelligent threats can do that. Fear aliens or AI, not asteroids. And the reality might be, or be seen to be, that spreading colonies to countless other stars is more likely to spawn dangerous new threats to your civilization than is justified by averting some minor and improbable risk that colonizing your own solar system doesn’t already cover.
Also, giant space telescopes and transmitters and dishes of the kind we discuss for talking across the galaxy to aliens don’t serve much purpose: you assume that, like yourself, they either aren’t interested in new technology, or they are and thus represent a threat, because they’re recklessly playing with science and might spawn some galactic menace before killing everyone, including themselves, so why bother talking to them at all? However strange they may be, they're not likely any weirder than some of the things you’ve dreamed up in virtual worlds, and giving them your name and address in exchange for theirs is potentially a big risk for little tangible reward. And if that’s the case, it might be a valid Fermi Paradox Solution, better than many I can think of and maybe even a top-ten entry. But in the end, I think a post-science civilization is probably the one that most intelligent lifeforms live in. Most of our distant colonies, as we settle the galaxy and beyond, aren't going to be focused on science, because they will assume that the vast Dyson swarms of the Inner Sphere near Earth are already working on any new bit of science, better equipped and with greater and cheaper resources available than they have, and indeed quite likely to have already solved it before those colonies had even heard of it themselves, just due to light lag. This is likely what most of humanity beyond maybe the next few
millennia will be like: busy and pro-technology, but not focused on doing new science themselves, and so probably not too unlike our own cultures of the past, where things still changed, but the culture didn’t revolve around the idea that everything is redone every generation. It becomes a society where folks are probably more willing to build something that lasts ten times longer but costs five times as much, which we tend to be reluctant to do in this era of entirely new technology every generation. But I don’t think it would be a stale society, devoid of progress, stagnant, or hedonistic, any more than those in our past were, and many prior cultures or communities appear not to have added any new technology, let alone science, to humanity’s pool. Though likely many did add at least some, and it just didn’t get recorded, and what counts as science can be debated. Nonetheless, many were amazing and vibrant peoples and places all the same, so we shouldn’t assume a post-science civilization is a bad place to be.
Indeed, as a civilization that’s really only been heavily science-focused for a relatively short time, odds are pretty good that we’re the unstable anomaly, and that while learning is probably eternal, the science-focused phase is a brief period between when a civilization thinks stars are just points of light or distant gods, and when it has settled them all and dwells around them for billions of years to come. And those might be glorious civilizations, shining as brightly as the stars they’ve come to surround, but I’m glad to be living now, when there are so many mysteries still to answer about those stars and so many other enigmas. As we looked at today, it is possible one day we’ll run out of science or abandon it, but I wouldn’t bet on it anytime soon, and in the meantime a strong knowledge of math, science, and computer science is of incalculable worth in our civilization. Odds are pretty good that if you’re watching this show, I don’t have to convince you that’s true, but learning math and science can be intimidating to many, and the key to learning it is good explanations with hands-on examples and practice, and that’s why I’ve been recommending Brilliant for years for folks looking to level up their knowledge.
They’ve not only been a good sponsor for this show since its early days, they’ve been a good partner to work with, and have helped us a lot as a production, along with a lot of other shows too. I don’t know of any other show sponsor I’ve heard so much positive feedback about from other show creators for just being great to work with, and they can be a great partner on your own learning journey too; that seems to be the overwhelming consensus from folks who’ve signed up with them over the years and let me know how it helped them or a family member learn. Brilliant makes it easier for anyone to learn, be it the basics or advanced material; they have thousands of lessons, with exclusive new content added monthly, and their focus on interactivity and making learning fun just throws out the old paradigm of boring textbooks, like we try to do here on this show. With Brilliant, you can learn at your own pace, learn on the go, and learn something new, all while helping support our show. To get started for free, visit brilliant.org/IsaacArthur or click on the link in the description, and the first 200 people will get 20% off Brilliant's annual premium subscription.
So we still have one more regular Thursday episode this month, on the Grabby Aliens perspective of the Fermi Paradox, and what Grabby Aliens are and if we will become an example of them and if they are a Kardashev 4 Civilization. First though we have our Monthly Livestream Q&A, Sunday September 25th at 4 pm Eastern time, where my wife and co-host Sarah and I will be taking your questions live from the chat. After both of those it is into October to look at the idea of colonizing planetary rings, like Saturn’s, and if life might be able to evolve in such places. Then we’ll ask the question of what we do if all these options for space travel never pan out, and we are stuck here on Earth, and then we’ll have our Scifi Sunday episode to look at what happens if space travel does indeed pan out and what strange alien environments we might encounter. Possibly on your own Personal Spaceship, which we’ll look at on October 20th. If you want alerts when those and other episodes come out, don’t forget to subscribe to the channel and hit the notifications bell. And if you enjoyed today’s episode, and would like
to help support future episodes, please visit our website, IsaacArthur.net, for ways to donate, or become a show patron over at Patreon. Those and other options, like our awesome social media forums for discussing futuristic concepts, can be found in the links in the description. Until next time, thanks for watching, and have a great week!