Absolute Zero (2007) - BBC - Documentary
The greatest triumph of civilization is often seen as our mastery of heat. Yet our conquest of cold is an equally epic journey, from dark beginnings to an ultra-cool frontier. For centuries, cold remained a perplexing mystery. Nobody had any idea what it was, much less how to harness its effects.
Yet in the last 100 years, cold has transformed the way we live and work. Imagine homes or supermarkets without fridges and frozen foods or skyscrapers without air-conditioning, or hospitals without liquid oxygen. We take for granted the technology of cold, yet it has enabled us to explore outer space and the inner depths of our brain. And as we develop new ultra cold technology to create quantum computers and high-speed networks, it may even change the way we think and interact. This is the story of how scientists and dreamers over the past four centuries plunged lower and lower down the temperature scale to conquer the cold, enrich our lives and attempt to reach the ultimate limit of cold.
A Holy Grail as elusive as the speed of light: absolute zero. Extreme cold has always held a special place in our imagination. For thousands of years, it seemed like a malevolent force associated with death and darkness.
Cold was an unexplained phenomenon. Was it a substance, a process, or some special state of being? Back in the 17th century, no one knew, but they certainly felt its effects in the freezing London winters. (Simon Schaffer) 17th-century England was in the middle of what's now called "the little Ice Age." It was fantastically cold by modern standards. You have to imagine a world lit by fire in which most people are cold most of the time.
Cold would've felt like a real presence, a kind of positive agent that was affecting how people felt. And it fitted nicely with the most orthodox received view, which natural philosophers inherited from the Greeks, from Aristotle, hundreds of years earlier: that there are two agents in the world, hot and cold. They function symmetrically. They can combine or separate. Back then, people felt at the mercy of cold. This was a time when such natural forces were viewed with awe as acts of God.
So anyone attempting to tamper with cold did so at their peril. The first to try was an alchemist, Cornelius Drebbel. On a hot summer's day in 1620, King James I and his entourage arrived to experience an unearthly event. Drebbel, who was also the court magician, had a wager with the King that he could turn summer into winter. He would attempt to chill the air in the largest interior space in the British Isles: the great hall of Westminster. Drebbel hoped to shake the King to his core.
(Andrew Szydlo) He had a phenomenally fertile mind. He was an inventor par excellence. His whole world was steeped in the world of alchemy, of perpetual motion machines, of the idea of time, space, planets, moon, sun, gods. He was a fervently religious man.
He was a person for whom nature presented a phenomenal... a galaxy of possibilities. Dr. Andrew Szydlo, a chemist with a lifelong fascination for Drebbel, enjoys his reincarnation as the great court magician. Like most alchemists, Drebbel kept his method secret. Dr. Szydlo wants to test his ideas
on how Drebbel created artificial cold. When Drebbel was trying to achieve the lowest temperature possible, he knew that ice, of course, was normally the coldest thing you could get. But he would've been aware from experience that mixing ice with different salts could get you a colder temperature.
Salts will lower the temperature at which ice melts. Dr. Szydlo thinks Drebbel probably used common table salt, which gives the biggest temperature drop. But salt and ice alone would not be enough to cool down such a large interior. Drebbel was famous for designing elaborate contraptions, a passion shared by Dr. Szydlo,
who has an idea for the alchemist's machine. So here, we would've had a fan, which would've been turned over blowing warm air over the cold vessels there, and as the air blows over these cold jars, we would've had, in effect, the world's first air-conditioning unit. But could this really turn summer into winter? (Dr. Szydlo) The idea was to stir it in as well as possible in the 5 seconds that you have to do it. Dr. Szydlo stacks the jars of freezing mixture
to create cold corridors for the air to pass through. We can feel it's very cold, in fact I could feel cold air actually falling on my hands, because cold air, of course, is denser than warm air, and one can feel it quite clearly on the fingers. The vital question: would the gust of warm air become cold? I can feel certainly a blast of cold air hitting me as that 2nd cover was released.
Well, temperature, we're on 14 at the moment. Yes, keep it going. That's definitely the right direction. I think possibly even closer still would give us the best of... So, how would the King have reacted to this encounter with man-made cold? He would've been shocked, he wouldn't have known what was happening, he could've in fact been wondering whether there was some action of gods or some sort of forces, demonological forces, at work, and he would have sat there peacefully freezing as he did so. Had Drebbel written up his great stunt, he might've gone down in history as the inventor of air-conditioning.
Yet it would be almost 3 centuries before this idea eventually took off. To advance knowledge and conquer the cold required men with a very different mindset. King James' Lord Chancellor Francis Bacon was the first to apply the scientific method to the study of heat and cold. He believed it was important to conduct experiments and analyze the results rather than rely on the established wisdom of the ancients. For Francis Bacon, heat and cold turned out to be right at the center of his world view. One way of understanding why that's so is to think about why heat and cold matter to human beings.
They really mattered in the 17th century for two reasons: one is the weather and one is disease. There was, after all, an obvious tension in everyday experience between the healthy effects of warmth and the healthy effects of cold. Warmth made you healthy because it stops you radiating away the vital spirits within you. But cold obviously had crucial effects against decay as well.
It could preserve things for immensely long times, and maybe it could preserve Francis Bacon's body too. Bacon rarely carried out experiments himself, but his one foray into the preservative effects of the cold had disastrous consequences. He took a freshly killed chicken and stuffed it full of ice and snow to investigate how much longer the chicken meat might stay fresh. He was impressed by the results. The chicken did remain fresh for many days.
Unfortunately, during the process of exposing his own body to the cold, he caught pneumonia. As Bacon lay dying, it dawned on him that his fascination with the cold was going to cost him his life. The irony and tragedy of that sad experiment didn't dissuade his followers from doing more experiments on ice and snow and their vital or preservative effects. The men who followed Bacon were really convinced that if we could understand the way in which motion, cold and heat fitted together, we could save ourselves from disease; we'd unlock the mysteries of the universe.
This fundamental question, "What is cold?" haunted Robert Boyle, who was born the year after Bacon died. The son of the Earl of Cork, a wealthy nobleman, Boyle used his fortune to build an extensive laboratory. Boyle is famous for his experiments on the nature of air, but he also became the first master of cold. Believing it to be an important, but neglected subject, he carried out hundreds of experiments. (Simon Schaffer) He worked through very systematically a series of ideas about what cold is.
Does it come from the air? Does it come from the absence of light? Is it that there are strange, so-called "frigorific," cold-making particles? The dominant view, certainly in Boyle's lifetime, the view that he set out to attack, is that cold is a primordial substance: when bodies get colder, they're sucking in this primordial cold, and as they get warmer, they expel it. Boyle thought that was wrong, and he did experiments to show that it was wrong. Boyle was curious about the way water expanded when it turned to ice. He wondered whether the increase in volume was accompanied by an increase in weight. He carefully weighed a barrel of water and took it outside in the snow, leaving it to freeze overnight.
Boyle reasoned that if, once the water turned to ice, the barrel weighed more, then perhaps cold was a substance after all. But when they reweighed the barrel, they discovered it weighed exactly the same. So what must be happening, Boyle guessed, was that the particles of water were moving further apart, and that was the expansion, not some substance flowing into the barrel from outside. Boyle was becoming increasingly convinced that cold was not a substance but something that was happening to the particles, and began to think back to his earlier experiments with his air pump. Boyle's idea was that the air trapped in this glass container is springy, it's elastic, and as you try to compress it, it resists. Now, this is very closely linked in Boyle's program to the way he studies heat and cold, because his idea was that as substances like the air get warmer they tend to expand.
It's as though the little springs, out of which he imagined each air particle is made, were gradually unwinding, so they take up more space and they expand. Boyle's conclusion here was that heat is a form of motion of a particular kind and that as bodies cool down, they move less and less. Boyle's longest published book was on the cold. Yet he found its study troublesome and full of hardships, declaring that he felt like a physician trying to work in a remote country without the benefit of instruments or medicines. To properly explore this country of the cold, Boyle lamented the lack of a vital tool: an accurate thermometer.
It was not until the mid-17th century that glassblowers in Florence began to produce accurately calibrated thermometers. Now it became possible to measure degrees of hot and cold. Because they used alcohol, which is much lighter than mercury, they made thermometers that were sometimes several meters long and were often wound into spirals. But there was still one major problem with all thermometers: the lack of a universally agreed temperature scale. There were all kinds of different ways of trying to stick numbers to these degrees of hot and cold, and they, on the whole, didn't agree with each other at all.
So one guy in Florence makes one kind of thermometer, another guy in London makes a different kind, and they just don't even have the same scale, so there were a lot of problems in trying to standardise thermometers. Imagine that you want to make a scale of temperature. What do you do? Well, the obvious thing to do, and this was well understood by instrument makers and experimentalists in the 17th and 18th centuries, is to find something in nature which you know always has the same temperature and make that your fixed point. A better strategy still is to find two such phenomena in nature; then you have a lower fixed point, something rather cold, and an upper fixed point, something rather warm, and you divide the degrees of temperature between them into, say, a hundred convenient bite-sized chunks.
The problem, however, was to find, or define, a phenomenon whose temperature you guessed was fixed. So for the lower fixed point you might choose the temperature of ice just as it's melting. And then there's an almost indefinite range of possible candidates for your upper fixed point.
And Isaac Newton, for example, worked rather hard on constructing what he called the Scale of Heat. He defined, for example, the temperature which a human can only just tolerate when they plunge their hand into warm water. It could be the normal human underarm, the temperature of the human blood, the temperature of wax just as it's melting. The first temperature scale to be widely adopted was devised by Daniel Fahrenheit, an accomplished instrument maker who made thermometers for doctors in Holland. He used a mixture of ice, water, and salt for his 0 degrees; ice melting in water for his 32 degrees; and for his upper fixed point, the temperature of the human body, at 96 degrees, which is close to the modern value. (Hasok Chang) One of the things that Fahrenheit was able to achieve was to make thermometers quite small, and that he did by using mercury as opposed to alcohol or air, which other people had used.
And because mercury thermometers are compact, clearly, if you're trying to use one for clinical purposes, you don't want some big thing sticking out of the patient! So the fact that he could make them small and convenient seems to be what made Fahrenheit so famous and so influential. It was a Swedish astronomer, Anders Celsius, who came up with the idea of dividing the scale into 100 divisions. The original scale used by Celsius was upside down: he had the boiling point of water as zero and the freezing point as 100, with numbers just continuing to increase as we go below freezing. And this is another little mystery in the history of the thermometer that we just don't know for sure: what was he thinking when he labeled it this way? And it was the botanist Linnaeus, who was then the president of the Swedish Academy, who after a few years said, "We need to stop this nonsense," and inverted the scale to give us what we now call the Celsius scale. A question nobody thought to ask when devising temperature scales was "How low can you go?"
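The two scales described here line up on shared fixed points, and converting between them is simple arithmetic. As a quick sketch (a modern conversion formula, not something from the programme):

```python
def f_to_c(f):
    """Convert Fahrenheit to Celsius. The scales share two fixed
    points: ice melts at 32 °F / 0 °C and water boils at
    212 °F / 100 °C, so 180 Fahrenheit degrees span 100 Celsius
    degrees."""
    return (f - 32) * 100.0 / 180.0

print(f_to_c(32))             # melting ice -> 0.0
print(f_to_c(212))            # boiling water -> 100.0
print(round(f_to_c(96), 1))   # Fahrenheit's body-heat point -> 35.6
```

Note that Fahrenheit's 96-degree body point comes out near 35.6 °C, a little below the modern 37 °C figure, which is why the narration calls it only "close to the modern value."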
Is there an absolute lower limit of temperature? The idea that there might be would become a turning point in the history of cold. The story begins with the French physicist Guillaume Amontons. He was doing experiments on heating and cooling bodies of air to see how they expand and contract. - We're now going to put ice around that bulb and see what happens. And he was noticing that, well, when you cool a body of air, the volume or the pressure would go down. And he speculated "Well, what would happen if we just kept cooling it?" By plotting temperature against pressure, Amontons saw that as the temperature dropped, so did the pressure, and this gave him an extraordinary idea.
Amontons started to consider the possibility: what would happen if you projected this line back until the pressure was zero? And this was the first time in the course of history that people actually considered the concept of an absolute zero of temperature. Zero pressure, zero temperature. It was quite a revolutionary idea when you think about it, because you wouldn't naturally think that temperature has a lower limit, a zero, when at the upper end it can go on forever, we think, getting hotter and hotter and hotter. But somehow, maybe there's a zero point where this all begins.
So you could actually give a calculation of where this zero point would be. Amontons didn't do that calculation himself, but some other people did later on, and when you do it, you get a value that's actually not that far from the modern value of roughly minus 273 °C. In one stroke, Amontons had realized that although temperatures might go on rising forever, they could only fall as far as this absolute point. For him, this was a theoretical limit, not a goal to attempt to reach. Before scientists could venture towards this zero point, far beyond the coldest temperatures on Earth, they needed to resolve a fundamental question.
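The calculation Amontons's successors performed can be sketched numerically: measure gas pressure at a few temperatures, fit a straight line, and extrapolate to zero pressure. The data below are idealised (generated from the gas law rather than taken from any historical measurement):

```python
# Idealised pressure readings, following P = P0 * (1 + t / 273.15)
# with P0 = 1 atm at 0 °C.
temps = [0.0, 25.0, 50.0, 100.0]                 # °C
pressures = [1 + t / 273.15 for t in temps]      # atm

# Least-squares fit of the line P = a*t + b, done by hand:
n = len(temps)
mt = sum(temps) / n
mp = sum(pressures) / n
a = sum((t - mt) * (p - mp) for t, p in zip(temps, pressures)) / \
    sum((t - mt) ** 2 for t in temps)
b = mp - a * mt

# Extrapolate backwards to the temperature where P = 0:
absolute_zero = -b / a
print(round(absolute_zero, 2))   # -> -273.15 °C
```

Real 18th-century data were far noisier, which is why early estimates only came out "not that far" from the modern value.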
By now, for most scientists, the penny had dropped that cold was simply the absence of heat. But what was actually happening as substances warmed or cooled was still hotly debated. The argument of men like Amontons relied completely on the idea that heat is a form of motion, and that particles move more and more slowly as the substance they're in gets cooler and cooler. Unfortunately, the science of cold was about to suffer a serious setback. The idea that cooling was caused by particles slowing down began to go out of fashion.
At the end of the 18th century, a rival theory of heat and cold emerged that was tantalizingly appealing, but completely wrong. It was called "the caloric theory," and its principal advocate was the great French chemist Antoine Lavoisier. Like most scientists at the time, Lavoisier was a rich aristocrat who funded his own research. He and his wife, Madame Lavoisier, who assisted with his experiments, even commissioned the celebrated painter David to paint their portrait. Lavoisier carried out experiments to support the erroneous idea that heat was a substance, a weightless fluid that he called "caloric." He thought that in the solid state of matter, molecules were just packed closely together, and when you added more and more caloric, the caloric would insinuate itself between these particles of matter and loosen them up.
So the basic notion was that caloric was this fluid that was, as he put it, "self-repulsive." It just tended to break things apart from each other. And that's his basic notion of heat; cold is just the absence of caloric, or the relative lack of it.
Lavoisier even had an apparatus to measure caloric, which he called a "calorimeter." He packed the outer compartment with ice. Inside, he conducted experiments that generated heat, sometimes from chemical reactions, sometimes from animals, to determine how much caloric was released.
He collected the water from the melting ice and weighed it to calculate the amount of caloric generated from each source. (Robert Fox) I think the most striking thing about Lavoisier is that he sees caloric as a substance which is exactly comparable with ordinary matter, to the point that he includes caloric in his list of the elements. It's very easy to talk about the quantity of heat and to think of it as a fluid, whereas to talk about the quantity of heat and to think of it as a vibration of the particles of matter, which was the other alternative, is much more difficult, conceptually. It's a very hard model to refute if you can accept that there's a substance that doesn't have any weight. Indeed, for Lavoisier, heat, caloric, is an element, an element like oxygen or nitrogen. Oxygen gas is made of oxygen plus caloric, and if you take the caloric away, presumably the oxygen might liquefy. It's a very hard model to shift because it explains so much, and indeed, Lavoisier's chemistry was otherwise so extraordinarily successful.
However, Lavoisier's story about caloric was soon undermined. There was one man who was convinced Lavoisier was wrong and was determined to destroy the caloric theory. His name was Count Rumford.
Count Rumford had a colourful past. He was born in America, spied for the British during the Revolution, and after being forced into exile became an influential government minister in Bavaria. Among his varied responsibilities was the artillery works, and it was here in the 1790s that he began to think about how he might be able to disprove the caloric theory using cannon boring. Rumford had noticed that the friction from boring out a cannon barrel generated a lot of heat. He decided to carry out experiments to measure how much. He adapted the machine to produce even more heat by installing a blunt borer that had one end submerged in a jacket of water.
As the cannon turned against the borer, the temperature of the water increased until eventually it boiled. The longer he bored, the more heat was produced. For Rumford, what this showed was that heat must be a form of motion, not a substance, because you could generate indefinitely large amounts of heat simply by turning the cannon. Despite Count Rumford's best efforts, Lavoisier's caloric theory remained dominant until the end of the 18th century. Lavoisier's prestige as a scientist meant that few dared challenge his ideas. Sadly, this did not protect him from the revolutionary turmoil in France, which was about to interrupt his research.
At the height of the reign of terror, Lavoisier was arrested and eventually guillotined. The reason he was guillotined was not because of his science but because he helped run the privatized income tax service of the French state. There's nothing more unpopular, even in France, than a privatized tax collector.
Once he was guillotined, his wife left France and eventually met Rumford when he moved to Western Europe in the early 1800s. Rumford then married her. So he'd married the widow of the man who'd founded the theory that he himself had destroyed.
The marriage was short-lived. After a tormented year, Rumford left Madame Lavoisier and devoted the rest of his life to his first love, science. It would be nearly 50 years before his theory of heat and cold was finally accepted. A founder of the Royal Institution, Rumford continued to support the pursuit of science. And it was here that the next major breakthrough in the conquest of cold would occur. Michael Faraday, who later became famous for his work on electricity and magnetism, unwittingly carried out an experiment that would begin the long descent towards absolute zero.
He was asked to explore the properties of a newly discovered pungent gas called chlorine. This experiment was potentially explosive, which is perhaps why it was left to Faraday and perhaps also why Dr. Andrew Szydlo is curious to repeat it today.
We are about to undertake an exceedingly dangerous experiment, in which Michael Faraday in 1823 heated this substance here, chlorine hydrate, in a sealed tube. Is that sealed? (man) That's sealed, Andrew. (Andrew) That's absolutely brilliant! In the original experiment, Faraday took the sealed tube and heated the end containing the crystals. He put the other end in an ice bath. Soon he noticed yellow chlorine gas being given off. (Andrew) Because the gas is being produced, pressure's building up.
But because this side is so very cold, hopefully what we'll see is some tiny oily droplets of chlorine, liquid chlorine, being produced. It's the pressure which is causing this. Ray, this is where it starts to get dangerous, so if you'll now take a few steps back... When Faraday did the experiment, a visitor, Dr. Paris,
called him to see what he was up to. Paris pointed out some oily matter in the bottom of the tube. Faraday was curious and decided to break open the tube. Right, so let's have a look inside here. The explosion sent shards of glass flying.
With the sudden release of pressure, the oily liquid vanished. And there we are. Is that what happened? Yes, that's exactly what happened. It popped open, glass flew. And can you detect the strong smell of chlorine? - I can now. Absolutely. Well, he detected the strong smell of chlorine
and this was a major mystery for him. Faraday soon realized the increased pressure inside the sealed tube had caused the gas to liquefy. Later, he used the same technique to liquefy ammonia gas.
He noticed that on releasing the pressure, the liquid evaporated, triggering a dramatic drop in temperature. He predicted that one day this cooling might be useful: "There is great reason to believe that this cooling technique with ammonia may be successfully employed for the preservation of animal and vegetable substances for the purposes of food." But Faraday's idea of using ammonia as a refrigerant was ahead of its time. Besides, he had no interest in commercial exploitation.
Across the Atlantic, a Yankee entrepreneur had a very different philosophy and was about to commercialize cold. Frederic Tudor had a chance conversation with his brother that set him on a path to become one of the richest men in America. (Dennis Picard) The story goes, at the dinner table they were trying to decide what they had on their father's farm that they could make money off of.
And certainly there was a lot of rocks, but people weren't going to pay for that, so they came up with the idea of maybe ice, 'cause some areas did not have ice. And it seemed kind of crazy at first, but it paid off. When Tudor began harvesting ice from New England ponds, he soon realised he needed specialised tools to keep up with the huge demand. (Dennis Picard) We had the saws, and the saws were an improvement over the old wood saws. They have teeth that are sharpened on both sides and set, so it cuts on both the up and the down stroke.
The crew could clear a 3-acre pond easily in a couple of days. Tudor's dream to make ice available to all was not confined to New England. He wanted to ship ice to hot parts of the world like the Caribbean and the deep South. (Dennis Picard) When Tudor first tried to convince shipmasters to put his load of frozen water into the ships, they all refused, 'cause they told him that water belonged outside the hull, not inside. So he had to go find other investors to get the money to buy his own ship, and he bought a ship by the name of the "Favorite." New England became the refrigerator for the world, with ice shipments to the Caribbean, the coast of South America and Europe.
Tudor even reached India and China. Watching the ice cutters working Walden Pond, Henry Thoreau marveled that water from his bathing beach was traveling halfway around the globe to end up in the cup of an East Indian philosopher. Tudor, who soon became known as the "Ice King," began using horses and huge teams of workers to harvest larger and larger lakes as the demand for ice grew. During the latter half of the 19th century, the ice industry eventually employed tens of thousands of people.
(Dennis Picard) Tudor became the largest distributor of ice, and he became one of the first American millionaires. And we're talking about one of his ships going to the Caribbean giving him a profit of $6,000! Now, this is in a time period when the average family was earning $200 to $300 a year. So someone earning thousands of dollars was just inconceivable, and that was after losing 20% of the ice on the way there.
There were still huge amounts of profit. Tudor's success was based on an extraordinary physical property of ice: it takes the same amount of heat to melt a block of ice as it does to heat the same quantity of water by around 80 °C. This meant that ice took a long time to melt, even when shipped to hotter climates. What started out as a small family enterprise turned into a global business. Frederic Tudor had industrialised cold in the same way the great pioneers of steam had harnessed heat.
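The 80 °C figure quoted above follows from two modern values (my addition, not from the film): ice's latent heat of fusion, roughly 334 kJ/kg, and water's specific heat, about 4.19 kJ/(kg·°C):

```python
latent_heat_fusion = 334.0    # kJ needed to melt 1 kg of ice at 0 °C
specific_heat_water = 4.19    # kJ to warm 1 kg of water by 1 °C

# Temperature rise the same heat would produce in liquid water:
equivalent_rise = latent_heat_fusion / specific_heat_water
print(round(equivalent_rise))   # -> 80 (°C)
```

In other words, melting absorbs a great deal of heat without any change in temperature, which is why a well-packed cargo of ice could survive a voyage to the tropics.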
By the 1830s, the Industrial Revolution was in full swing. Yet ironically, it was not until a small group of scientists worked out the underlying principles of how steam engines convert heat into motion that the next step in the conquest of cold could be made. Only after solving this riddle of heat engines could the first cold engines be built to produce artificial refrigeration. How much useful work can you get out of a given amount of heat? By the early 1800s, that had become the single most important economic problem in Europe. To make a profit meant converting heat into motion efficiently, without wasting heat, and getting the maximum amount of mechanical effect.
The first person to really engage with this problem was a young French artillery engineer, Sadi Carnot. He thought that improving the efficiency of steam engines might help France's flagging economy after the defeat at Waterloo in 1815. Working at the Conservatoire des Arts et Métiers, he began to analyze how a steam engine was able to turn heat into mechanical work. The originality of Carnot's treatment, in my eyes, is essentially that he shows that in order to extract energy, to extract work from the heat engine, you need a high-temperature source, which is the boiler, and you need a low temperature, which is that of the condenser. And the essence of the heat engine, for Carnot, is that heat passes from the high temperature of the boiler to the low temperature of the condenser. In steam engines, it looks as though heat is flowing around the engine, and as it flows, the engine does mechanical work.
The implication there is that heat is neither consumed nor destroyed. You simply circulate it around, and it does work. So there the analogy would be between heat, in a steam engine, and water, in a water wheel. It's as though it's the flow of heat that's actually getting the work done in the standard steam engine. Carnot likened this flow of heat to the flow of water over a waterwheel.
He saw that the amount of mechanical work produced depended on how far the water fell. His novel idea was that steam engines worked in a similar way, except this fall was a fall in temperature from the hottest to the coldest part of the engine. The greater the temperature difference, the more work was produced. Carnot distilled these profound ideas into an accessible book for general readers, which meant it was largely ignored by scientists instead of being heralded as a classic. Well, this is the book. It's Carnot's only publication.
"Reflections on the Motive Power of Fire" of 1824: a small book, 118 pages only, published in just 600 copies, and in his own lifetime virtually unknown. Twenty years after the publication, William Thomson, the Scottish physicist, is absolutely intent on finding a copy. He's here in Paris, and the accounts we have suggest that he spends a great deal of time visiting bookshops, visiting the bouquinistes on the banks of the Seine, always asking for the book, and the booksellers tell him they've never even heard of it. Back then, William Thomson, who would later become Lord Kelvin, a giant in this new field of thermodynamics, was impressed by Carnot's idea that the movement of heat produced useful work in the machine. But when he returned home, he heard about an alternative theory from a Manchester brewer called James Joule. Joule had this notion that Carnot was wrong, that heat wasn't producing work just by its movement.
Heat was actually turning into mechanical work, which is a very strange idea when you think about it. We're all now used to thinking about energy and how it can take all different forms, but it was a revolutionary idea that heat and something like mechanical energy were, at bottom, the same kind of thing. The experiment that convinced Joule of this was set up in the cellar of his brewery. It converted mechanical movement into heat, almost like a steam engine in reverse.
He used falling weights to drive paddles around a drum of water. The friction from this process generated a minute amount of heat. Only brewers had thermometers accurate enough to register the tiny temperature increase caused by a measured amount of mechanical work.
Joule's work mattered because it was the first time that anyone had convincingly measured the exchange rate between movement and heat. He proved the existence of something that could be converted between heat and motion. That something was going to be called "energy," and it's for that reason that the basic unit of energy in the International System of Units is named after him: the joule.
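Joule's measurement amounts to equating the work done by the falling weights, m·g·h, with the heat absorbed by the water. A sketch with hypothetical figures (the masses and drop height here are chosen for illustration, not taken from Joule's actual apparatus):

```python
g = 9.81          # gravitational acceleration, m/s^2
c_water = 4186.0  # specific heat of water, J/(kg*K)

def paddle_wheel_rise(weight_kg, drop_m, water_kg):
    """Temperature rise if all the work done by the falling weight
    ends up as heat in the stirred water."""
    work = weight_kg * g * drop_m        # mechanical work, in joules
    return work / (water_kg * c_water)   # temperature rise, in kelvin

# A 25 kg weight falling 2 m into a drum holding 5 kg of water:
print(round(paddle_wheel_rise(25, 2, 5), 4))   # -> 0.0234 K
```

A rise of a few hundredths of a degree per drop shows why only a brewer's thermometer was sensitive enough to detect the effect.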
This apparent contradiction between Joule and Carnot was eventually resolved by Thomson in what would later become known as "the laws of thermodynamics". The first law, from Joule's work, states that, "Energy can be converted from one form to another, but can never be created or destroyed." The 2nd law, from Carnot's theory, states that, "Heat flows in one direction only, from hot to cold." In the 2nd half of the 19th century, this new concept of energy paved the way for steam power to artificially produce cold.
The flow of heat, from hot to cold, drives any refrigeration cycle, whether it's a modern fridge or a steam-powered ice-making machine. In the first stage of this cycle, gigantic pistons compress ammonia gas into a hot liquid. The hot liquefied ammonia is pumped into a condenser, where it is cooled and fed into pipes beneath the water tanks.
In the next stage the liquid ammonia re-evaporates, and the temperature drops. As the ammonia re-absorbs heat from the surrounding water, gradually, the tanks of water become blocks of ice. By the 1880's, many towns across America had ice plants like this one, which could produce 150 tons of ice a day. For the first time, artificially produced ice was threatening the natural ice trade created by Frederic Tudor. America's appetite for ice was insatiable.
Slaughterhouses, breweries, and food warehouses, all needed ice. Animals were disassembled on production lines in Chicago and the meat was loaded into ice-cooled boxcars to be shipped by railroad. Livestock on its way to the great meat-packing centers of the nation, to markets everywhere.
Food of every sort safely and quickly delivered in refrigerator cars. From New York to Los Angeles, restaurants were able to serve meals thousands of miles from where they once roamed. As fruit and vegetables became available out of season, urban diets improved, making city dwellers the best-fed people in the world. And to keep everything fresh at home, the iceman made his weekly delivery to recharge the refrigerator. (Tom Schachtman) Refrigeration makes a tremendous difference in people's lives.
First of all, in the diet, what is possible for them to eat. They can go to the store once a week. They don't have to go every day. They can obtain at that store foods that are from almost anywhere in the world that have been transported and kept cool, and then they can keep them in their own home.
Eventually the iceman disappeared as more and more households bought electric fridges. These used the same basic principles as the old ice-making machines. Heat from the food inside is drained away by the evaporating coolant and is dumped, at the back.
The electric pump drives this cycle of compression, evaporation and condensation. And that's how the fridge got its hum. The electric power companies loved refrigerators because they ran all day and all night. They may not have used that much power each hour, but they used it continuously.
So one of the ways that they sold rural electrification was the possibility of having your own refrigerator. In the early days, the fridge's icebox was used to freeze water, nothing else. Freezing was seen as having the same damaging effects as frost. The man who would change this idea forever was a scientist and explorer called Clarence Birdseye. In 1912, Birdseye set off on an expedition to Labrador, and the temperature dropped to 40 degrees below freezing.
The Inuit had taught Birdseye how to ice fish. He cut a hole down through the ice, which could be several feet thick, to the pool of water below, dropped a line, and brought the fish up. And as he did that, he found that they froze in this terribly cold air almost before they hit the ice. (Tom Schachtman) When you went to cook this fish, it tasted just as good as fresh, and he couldn't figure that out, because when he froze fish at home, they would taste terrible. So when he got back home, he finally tried to figure out what was the difference between this quick freezing and the usual freezing. Under closer examination, he could see what was happening to the fish cells.
With slow freezing, large ice crystals formed, which distorted and ruptured the cells. When thawed, the tissue collapsed and all the nutrients and flavor washed away. That's the "mushy strawberry" syndrome. People froze the strawberries that they picked in their garden, and they put them out on the table the next day and they're collapsed, they're all mushy.
But with fast freezing, only tiny ice crystals were formed inside the cells, and these caused little damage. It was all down to the speed of passing through the freezing zone. What Birdseye found out is that if you get through this zone very quickly, by flash freezing or quick freezing, you avoid this ice crystallization. And that made it possible for the food, when it's unfrozen and cooked, to taste just as good as fresh. The basic concept was simple, but it took Clarence Birdseye another 10 years to perfect a commercial fast-freezing technique that would mimic the natural process he'd experienced in Labrador.
In 1924, he opened a flash freezing plant in Gloucester, Massachusetts that froze freshly landed fish at minus 45 degrees. He then extended that to all sorts of other kinds of meats and products and vegetables and almost single-handedly invented the frozen food industry. Fridges and freezers would eventually become icons of modern living, but there was a less visible cold transformation happening at the same time. This would also have a huge impact on urban living: the cooling of the air itself. Three centuries had passed since Cornelius Drebbel had shaken King James in Westminster. Now at the dawn of the 20th century, air cooling was about to shake the world.
Tell me, what is the low down on this air-conditioning thing? Now you've started something by asking me that. Air-conditioning was about to transform America, and the person responsible was Willis Carrier, whose important breakthrough passed into comfort cooling mythology. Let's go back to that foggy night, when the young engineer named Willis Carrier sought the answer to a problem. The effects of humidity, of moist air, on industrial production. "Fog! Maybe that's the answer.
Let's see. Fog is water vapour that's been condensed. That's because the air has become cooler, and cool air can't hold as much moisture as warm air.
Maybe that's the way to reduce humidity. Cool the air and condense the moisture. It might work." Control the humidity by controlling the temperature.
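The physics behind Carrier's hunch can be sketched numerically with the Magnus approximation for saturation vapour pressure, a standard empirical formula (not something from the film): the warmer the air, the exponentially more water vapour it can hold.

```python
# Carrier's insight in miniature: saturation vapour pressure falls steeply
# with temperature, so cooling humid air forces the moisture to condense.
import math

def saturation_vapour_pressure_hpa(temp_c):
    """Approximate saturation vapour pressure of water in hPa (Magnus formula)."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

# Air at 30 C can hold roughly two and a half times as much water vapour
# as air at 15 C, so chilling saturated summer air wrings the moisture
# out of it, exactly the condensation Carrier set out to exploit.
ratio = saturation_vapour_pressure_hpa(30) / saturation_vapour_pressure_hpa(15)
```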
That was Willis Carrier's idea. (Marsha Ackermann) Carrier is sent to Brooklyn for a very special job in 1902. The company that publishes the magazine "Judge", one of the most popular full-color magazines in America at this particular time, is having a huge problem. It's July in Brooklyn and the ink which they use on their beautiful covers is sliding off the pages. It will not stick because the humidity is too high.
Carrier, using some principles that he's been developing as a young new employee of this fan company, finds a way to get out the July 1902 run of the "Judge" magazine, and from there he begins to eventually build his air-conditioning empire. The demand for air-conditioning gradually grew. In the 1920's, movie houses were among the first to promote the benefits. People would flock there in summer to shelter from the heat.
(Marsha Ackermann) The movies are wildly popular, and the air-conditioning certainly helps to attract an audience, especially if they happen to be walking down the street on a horribly hot day and they duck into this movie theater and have this wonderful experience. Air-conditioning became increasingly common in the workplace too, particularly in the South, where textile and tobacco factories were almost unbearable without cooling. (man) When employees breathe good air and feel comfortable, they work faster and do a better job. I think some people think that these were nice compassionate employers who were cooling down the workplace for the workers, but of course, nothing could be further from the truth. That was an inadvertent by-product. Actually, this was a quality control device: to control the breaking of fibers in cotton mills, to get consistent quality in these various industries, to control the dust that had bedeviled tobacco stemming room workers for decades. I mean, I think the workers obviously went home to their un-air-conditioned shacks in most cases and talked about how nice and cool it was working during the day.
Well, as anybody who's lived here for very long will tell you, even today, in the age of air-conditioning, we have plenty of sun and sweat and humidity; it's something you just have to deal with. It's silly to suffer from the heat when you can afford the modest cost of air-conditioning.
By the 1950's, people were air-conditioning their homes with stand-alone window units that could be easily installed. This wasn't just an appliance; it offered a new, cool way of life. (Raymond Arsenault) Walking down a typical Southern street prior to the air-conditioning revolution, you would have seen families, individuals, outside. They would have been on their porches, on each other's porches.
There was a visiting tradition, a real sense of community. Well, I think all that changes with air-conditioning. You walk down that same street and basically what you'll hear are not the voices of people talking on the porch; you'll hear the whirr of the compressors. Guess what we've got! An RCA room air conditioner.
I'm a woman, and I know how much pure air means to mother in keeping our rooms clean and free from dust and dirt. Control of the cold has transformed city life. Refrigeration helped cities expand outwards by enabling large numbers of people to live at great distances from their source of food. Air-conditioning enabled cities to expand upwards. Beyond 20 stories, high winds make open windows impractical, but with air-conditioning, 100-story skyscrapers were possible.
(Simon Schaffer) Technologies emerged, which not only worked to insulate human society against the evils of cold, but turned cold into a productive, manageable, effective resource. On the one hand, the steam engine; on the other, the refrigerator, those two great symbols of the 19th-century world, which completely changed the society and economy of the planet. All that is part of, I think, what we could call bringing cold to market. Turning it from an evil agent that you feared into a force of nature from which you could profit.
The explosive growth of the modern world over the last two centuries owes much to the conquest of cold. But this is only the beginning of the journey down the temperature scale. Going lower would be even harder, but would produce greater wonders that promise extraordinary innovations for the future. With rival scientists racing towards the final frontier, the pace quickens and the molecular dance slows as they approach the Holy Grail of cold: absolute zero. A century ago, the great polar explorers were pushing further and further towards the coldest places on Earth: the north and south poles.
The competition to reach these goals was matched by a less publicised but equally daunting scientific endeavour: the attempt to reach the coldest point in the universe: absolute zero. This mysterious barrier was a physical paradox as tantalizing as the speed limit of light, which can also never be exceeded. It was a frontier so enticing that rival physicists from all over Europe began a race towards this absolute limit of cold. This is a story of showmanship, setbacks, rivalry and despair.
The stakes were high. For the winner there was glory and the chance of the Nobel Prize. For the loser, the prospect of being a forgotten foot soldier of science. When explorers ventured into the Antarctic they experienced some of the coldest temperatures on Earth, reaching down to -80 °C. But this was nothing compared to the ultimate limit of temperature, absolute zero, at around -273 degrees.
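For orientation, those temperatures sit on the absolute (kelvin) scale as a simple offset; a trivial conversion sketch, using the precise value of -273.15 °C where the film rounds to "around -273":

```python
# Converting the documentary's temperatures to the absolute scale,
# where 0 K is absolute zero and every degree counts up from there.

ABSOLUTE_ZERO_C = -273.15  # the film rounds this to "around -273 degrees"

def celsius_to_kelvin(temp_c):
    """Kelvin temperature: the number of degrees above absolute zero."""
    return temp_c - ABSOLUTE_ZERO_C

antarctic_k = celsius_to_kelvin(-80)
# Even the Antarctic's -80 C is about 193 K, still nearly 200 degrees
# above absolute zero, which is why the race moved into the laboratory.
```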
Only in a laboratory, by liquifying gases could adventurers take the first steps towards this Holy Grail, a place utterly drained of all thermal energy. Among the front-runners in the race towards absolute zero was James Dewar, a professor at the Royal Institution in London. "- It will be the greatest achievement of our age... " In 1891, he gave one of his celebrated Friday night public lectures on the wonders of the super cold, to celebrate the centenary of his great predecessor, Michael Faraday. "... The descent to a temperature within 5 degrees of zero would open up new vistas of scientific inquiry, which would add immensely to our knowledge of the properties of matter."
(Simon Schaffer) James Dewar is a canny and, I think, very ambitious, practically-minded Scottish scientist. He could really show, both his colleagues and the fee-paying audiences who came to these immensely successful, brilliantly engineered lectures, some of the secrets of nature. Take this rubber ball... It bounces well, I think you'll agree.
But let's see what happens after a few seconds' immersion in liquid oxygen. Dewar invented the vacuum flask to carry out his research, and it's still called "a Dewar" to this day. Now, let's see what happens. (Kostas Gavroglu) This phantasmagoric aspect of science always helped science to be accepted by the public.
Though it is a little mystifying, it did play a role in having the public accept that these weird people in the laboratories are doing truly interesting, if not magical, things. James Dewar's life was defined by the cold. As a boy, he used to skate on a frozen pond in Scotland. He claimed in later life that his most formative early experience resulted from an accident on the ice. (Tom Shachtman) After Dewar fell through the ice, he was rescued, but when he got home they discovered that he had rheumatic fever, which put him in bed for eight months.
And he was in danger of having his limbs atrophy with palsy, and so the village joiner set him tasks to develop his limbs, and especially his hands, and one of the tasks was to make a violin. And he developed a great deal of mechanical aptitude which stood him in very good stead in later years when he had to create apparatus for his experiments. Dewar's dream was to take on the mantle of the Royal Institution's greatest scientist, Michael Faraday.
Seventy years earlier, Faraday had done experiments showing that under pressure, gases like chlorine and ammonia liquify. And as these liquids evaporate, their temperature drops dramatically. Faraday was curious to see if this method of pressurizing gases into liquids could be used for all gases. But some gases, what he called the "permanent" gases, would not liquify, no matter how much pressure he applied. So he abandoned this line of research.
"Faraday's was a mind full of subtle powers, of divination into nature's secrets... and although unable to liquify the permanent gases, he expressed faith in the potentialities of experimental inquiry. The lowest point of temperature attained by Faraday was -130 degrees centigrade."
For over 30 years no one could reach a lower temperature than -130 °C. Absolute zero remained an elusive and very distant goal. Now, Michael Faraday, in the early to mid 19th century, had left a kind of frontier for physicists and chemists: what he called the permanent gases (hydrogen, nitrogen, oxygen), which no means whatsoever seemed able to liquify.
And this was a kind of "no man's land" which one could not cross. And that was a standing challenge for the scientists of the later 19th century: "It must be possible to turn these gases into pure liquids." It was not until 1873 that a Dutch theoretical physicist, Van Der Waals, finally explained why these gases were not liquifying. By estimating the size of molecules and the forces between them, he showed that to liquify these gases using pressure, they each had to be cooled below a critical temperature. At last, he had shown the way to liquify the so-called permanent gases.
Oxygen was first, and then nitrogen, reaching a new low temperature of almost -200 °C. "Only the last of the permanent gases remains to be liquified: hydrogen, in the vicinity of -250 °C. It will be the greatest achievement of our age, a triumph of science." Dewar was determined to be the first to ascend what he called "Mount Hydrogen". But he was not alone.
The competitor Dewar feared most was a brilliant Dutchman, Heike Kamerlingh Onnes. Kamerlingh Onnes was younger than Dewar and to a certain extent looked up to the Scotsman as his senior. Dewar didn't have the same, if you'll pardon the expression, "warm feelings," towards his rival in the race for cold. Dewar recognized that Kamerlingh Onnes had a new radical approach to science and was planning an industrial scale lab. (Dirk van Delft) When Onnes took over the physics laboratory in Leiden, he was only 29 years old.
And, well, he gave his inaugural address here in this lecture room, the big lecture room of the Academy Building of Leiden University, and it was all there. He was explaining what to do in the next years, and he was talking about liquifying gases, making Dutch physics famous abroad, and well, it was amazing how farsighted all those visions were. Kamerlingh Onnes' lab was more like a factory. He recruited instrument makers, glassblowers, and a cadre of young assistants who became known as "blue boys" because of their blue lab coats. Later, he set up a technical training school, which still exists to this day.
Dewar and Onnes could not have been more different. Dewar was very secretive about his work, hiding crucial bits of apparatus from public view before his lectures. Onnes on the other hand, openly shared his lab's steady progress in a monthly journal. Onnes was the tortoise to Dewar's hare. In the case of Dewar, you had a brilliant experimenter, a person who could actually build the instruments himself, and a person who really believed in the brute force approach, and that is, have your instruments, set up your experiment, and try as hard as you can, and then, you'll get the results you want to get. In the case of Kamerlingh Onnes, you have a totally different approach.
He's the beginning of what later on was known as "big science". Unlike Dewar, Onnes thought detailed calculations based on theory were vital before embarking on experiments. He was a disciple and close friend of Van Der Waals, whose theory had helped solve the problem of liquifying permanent gases.
Though their approaches were different, Kamerlingh Onnes and Dewar used a similar process in their attempts to liquify hydrogen. Their idea was to go step-by-step down a cascade using a series of different gases that liquify at lower and lower temperatures. By applying pressure on the first gas and releasing it into a cooling coil submerged in a coolant, it liquifies. When this liquified gas enters the next vessel, it becomes the coolant for the 2nd gas in the chain. When the next gas is pressurized and passes through the inner coil, it liquifies and is at an even lower temperature.
The 2nd liquid goes on to cool the next gas and so on. Step by step, the liquified gases become colder and colder. Each one is used to lower the temperature of the next gas sufficiently for it to liquify.
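The cascade described above can be summarised as a small data sketch. The boiling points below (at atmospheric pressure) are standard reference values rather than figures from the film, and the particular chain of gases is an illustrative reconstruction, not either laboratory's exact sequence.

```python
# The cascade principle: each liquified gas boils cold enough to
# pre-cool the next gas toward its own, lower, liquefaction point.

CASCADE_STAGES = [
    # (gas, approximate boiling point in degrees C at 1 atm)
    ("ammonia",  -33),
    ("oxygen",  -183),
    ("nitrogen", -196),
    ("hydrogen", -253),  # Dewar's "Mount Hydrogen"
    ("helium",   -269),  # the step beyond, reached by Onnes in 1908
]

def cascade_is_descending(stages):
    """Check each stage liquifies at a lower temperature than the one before."""
    temps = [bp for _, bp in stages]
    return all(warmer > colder for warmer, colder in zip(temps, temps[1:]))
```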
In the final stage, where hydrogen gas is cooled, the idea was to put it under enormous pressure, 180 times atmospheric pressure, and then suddenly release it through a valve. This would trigger a massive drop in temperature, sufficient to turn hydrogen gas into liquid hydrogen at -252 degrees, just 21 degrees above absolute zero. Here was the risky bit: his apparatus was going down in temperature, getting very, very cold, and so very fragile, quite easy to fracture, while at the same time the pressures he was working at were very, very high, so there was the possibility of explosion. He took the most amazing risks, both with himself - he was a lion of a man in terms of courage - and with those around him.
All the equipment he was working with could have crumbled or blown up, and more than occasionally, it did. Dewar had many explosions in his lab. Several times, assistants lost their eyes as shards of glass catapulted through the air. In his notebook, he actually jots down many details of what happened to the apparatus, but not what happened to his assistants. So somehow you get the impression that the apparatus was more important than the assistants. (Frank James) Well, the assistants seem to have been quite loyal to him, 'cause they stayed working.
I mean, if you look at the picture of Dewar lecturing, there are two assistants, one of whom has lost his eye, but the painter manages to portray him with his lost eye facing the other way, so you don't actually see it in the picture. So clearly there was something going for Dewar with his assistants, in that he kept that sort of loyalty in a way that would be almost inconceivable in the modern world.
Over in Leiden, Onnes was facing anxious city officials who were so worried about the risk of explosions that they ordered the lab to be shut down. Dewar wrote a letter of protest on behalf of Onnes, but the Leiden lab remained closed for 2 years. (Dirk van Delft) Onnes had to wait and wait and wait. Dewar was already starting to liquify hydrogen, and Onnes had the apparatus to do so too, but he just couldn't start, so he had lost the battle before it had even begun.
The year is 1898. Dewar has been working on trying to liquify hydrogen for more than 20 years, and he's finally ready to make the final assault on Mount Hydrogen. By using liquid oxygen, they brought down the temperature of the hydrogen gas to -200 °C. They increased the pressure till the vessels were almost bursting and then opened the last valve in the cascade. (as James Dewar) "Shortly after starting, the nozzle plugged, but it got free by good luck and almost immediately drops of liquid began to fall and soon accumulated 20 cubic centimeters." Dewar had liquified hydrogen, the last of the so-called permanent gases.
To prove it, he took a small tube of liquid oxygen and plunged it into the new liquid. Instantly, the liquid oxygen froze solid. Now he was convinced. He had produced the coldest liquid on Earth and had come closer to absolute zero than anyone else. (Tom Shachtman) Dewar thought that he had done the most amazing feat of science in the world, that he would be immediately celebrated for it and get whatever prizes there were available.
And that didn't happen. I think for Dewar, it was the ambition of a mountaineer. You've climbed the highest mountain peak that you can see in the range around you, and just as you get to the top of the peak, there's an even higher mountain just beyond. That mountain was helium, a recently discovered inert gas.
Van Der Waals' theory predicted helium would liquify at an even lower temperature than hydrogen, at around 5 degrees above absolute zero. Now all Dewar had to do was obtain some. It should not have been difficult. The two chemists who had discovered the inert gases, Lord Rayleigh and William Ramsay, often worked together in the lab next door. Unfortunately, Dewar had made enemies of both of them by publicly criticizing their science and belittling their achievements, so they had no desire to share their helium.
Kamerlingh Onnes was faced with the same problem as Dewar, which was: Where can I get a supply of helium gas? And he actually asked Dewar to try and collaborate with him too, and Dewar said, I'm having such a problem getting the gas by myself, I can't possibly give you any. I'd like to, but I can't. Eventually, each found a supply, but Onnes' industrial approach paid dividends. After 3 years, he had amassed enough helium gas to begin experiments.
The tortoise was beginning to pull away from the hare. The liquefaction of these gases had become a matter of enormous pride and prestige for Dewar. But pretty quickly he ran out of resources. He was reaching the limit of what the budget would bear at the Royal Institution, and the helium supplies dried up. One day, when they were in the middle of working with helium gas, an assistant in Dewar's lab turned a knob to the left instead of to the right; the whole canister of gas escaped into the air, and they had 6 months when they couldn't do any work whatsoever. Dewar was furious.
(Kostas Gavroglu) At one point, Dewar writes to Kamerlingh Onnes telling him that he is not in the race anymore. He thinks that the problems for liquifying helium are such that he's not able to complete the job. The battlefields of science are the centers of a perpetual warfare in which there is no hope of a final victory. To serve in the scientific army, to have shown the initiative is enough to satisfy the legitimate ambition of every earnest student of nature. Thank you.
In the summer of 1908, Onnes summoned his chief assistant, Flim, from across the river. They were finally ready to try to liquify helium. At 5:45 on the morning of July the 10th, he assembled his team at the lab. They had rehearsed the drill many times before. Leiden was a small university town and the word quickly spread that this was the big day.
It took until lunchtime to make sure the apparatus was purged of the last traces of air. By 3 in the afternoon, work was so intense that when his wife arrived with lunch, he asked her to feed him so he didn't have to stop work. This was a man obsessed. At 6:30 in the evening, the temperature began to drop below that of liquid hydrogen.
It's getting very late in the day and the team is down to its last bottle of hydrogen and if they can't liquify helium now they're going to have to wait for months to try again and the temperature gauge is stuck at five degrees above absolute zero. And Onnes doesn't know why this is, and a colleague comes in and he suggests that that means maybe they've actually succeeded and they don't even know it yet. So Onnes takes an electric lamp type thing and he goes underneath the apparatus and looks, and sure enough, there in the vial is this liquid sitting there quietly. It's liquified helium. They had reached -268 °C, just 5 degrees above absolute zero and finally produced liquid helium. This monumental achievement eventually won Onnes the Nobel Prize.
When James Dewar heard that he had lost the race to Kamerlingh Onnes, it reignited a festering resentment. Dewar berated his long-suffering assistant, Lennox, for failing to provide enough helium. Only this time, Lennox had had enough. He walked out of the Royal Institution, vowing never to return until Dewar was dead.
And he kept his word. For Dewar, it was the end of his low temperature research. He must've been incredibly irritated... one can imagine the sort of irritation he would've felt when the news came in that the Dutch had liquified helium. And even today, Onnes' discovery of liquid helium is seen as a much more significant discovery than Dewar's work on liquifying hydrogen, which is slightly unfair, 'cause it's all part of the process of trying to achieve absolute zero.
It remained, that's very clear, a wound in Dewar's soul that never really healed. I think that Dewar emerges at the end of the story as a rather tragic figure, one of the very greatest late 19th-century British scientists, who in the end is frustrated by failing at something which hardly anybody could have expected him to achieve. James Dewar's dream of reaching absolute zero was over. He spent the rest of his life investigating other scientific problems, such as the physics of soap bubbles. He'd always been a loner.
Ultimately, his refusal to collaborate cost him the glory he felt he deserved. I think it's really impressive how often scientists do seem to be driven by the spirit of competition, by the spirit of getting there first. But what's really fascinating about these races, the race for absolute zero, is that the goalposts move as you're playing the game. The race in science is not for a predetermined end, where once you're there, the story's over, the curtain comes down. That's not at all what it's like.
Rather, it turns out you find things you didn't expect. Nature is cunning, as Einstein would have said, and she is constantly posing a new challenge, unanticipated by those people who start out on the race. Sometimes, an unexpected event triggers a whole new area of research.
This happened in Leiden as Onnes' team began to investigate how materials conduct electricity at these very low temperatures. They observed that at around 4 degrees above absolute zero, all resistance to the flow of electricity abruptly vanished. Electrical resistance dropped as if it had gone over a cliff. It was going down and down and down and then disappeared or all but disappeared. Now, this was an astonishing thing.
Nobody had ever seen anything like this before. There was nothing on Earth that had no electrical resistance. Onnes later invented a new word to describe this bizarre phenomenon. He called it "superconductivity." (Allan Griffin) We have a circular ring of permanent magnets, which are producing a magnetic field.
And now when we put a superconducting puck over it and give it a little push, the magnetic field repels the superconductor. The magnetic field from the track induces a current in the supercooled puck, which in turn creates an opposite magnetic field that makes the puck levitate. It produces a magnetic field like a north pole against north pole, and that's why you have the repulsion. As the puck warms up, its superconducting properties vanish along with its magnetically induced field. For decades after its discovery in 1911, the underlying cause of superconductivity remained a mystery.
Every major physicist, every major theoretical physicist, had his own theory of superconductivity. Everybody tried to solve it, but without success. There were more surprises ahead. In the 1930s, another strange phenomenon was observed at even lower temperatures. This rapidly evaporating liquid helium cools until, at 2 degrees above absolute zero, a dramatic transformation takes place. (Allan Griffin) Suddenly you see that the bubbling stops and that the surface of the liquid helium is completely still.
The temperature is actually being lowered even further now, but nothing in particular is happening. Well, this is really one of the great phenomena in 20th-century physics. The liquid helium had turned into a superfluid, which displays some really odd properties.
Here I have a beaker with an unglazed ceramic bottom of ultrafine porosity. Ordinarily, this container with tiny pores can hold liquid helium, but the moment the helium turns superfluid, it leaks through. We call this kind of flow a "superflow." Superfluid helium can do things we might have believed impossible.
It appears to defy gravity. A thin film can climb walls and escape its container. This is because a superfluid has zero viscosity.
It can even produce a frictionless fountain, one that never stops flowing. Superfluidity and superconductivity were baffling concepts for scientists. New radical theories were needed to explain them. In the 1920s, quantum theory was emerging as the best hope of understanding these strange phenomena. Its central idea was that atoms do not always behave like individual particles.
Sometimes they merge together and behave like waves. They can even be particles and waves at the same time. This strange paradox was hard to accept, even for great minds like Albert Einstein. In 1924, a young Indian physicist, Satyendra Bose, sent Einstein a paper he'd been unable to publish.
Bose had attempted to apply the mathematics of how light particles behave to whole atoms. Einstein realized the importance of this concept and did some further calculations. He predicted that on reaching extremely low temperatures, just a hair above absolute zero, it might be possible to produce a new state of matter that followed quantum rules.
It would not be a solid or liquid or gas. It was given a name almost as strange as its properties: a Bose-Einstein condensate. For the next 70 years