Why Moore’s Law Matters
Moore's Law is dead. Moore's Law isn't dead. Moore's Law is being replaced with something else. Moore's Law will go on for another hundred years. So on and so on. All this talk about Moore can get pretty tiring, and now we have here yet another thing about Moore's Law. We talk about it all the time, but what have Moore and his law really been about? In this video, we look at the story and long-lasting impact of Moore's Law.

But first, let me talk about the Asianometry Patreon. Early Access members get to see new videos and selected references for those videos before they release to the public. It helps support the videos, and I appreciate every pledge. Thanks, and on with the show.

When Gordon Moore wrote his famous 1965 article, the integrated circuit, or IC, was itself just five years old. Dr. Moore had not yet even founded Intel; he was then still the director of research and development at Fairchild Semiconductor. The article is less of a rigorous scientific analysis and more of a marketing white paper. At the time, Fairchild's ICs were not selling well outside of the government, so Moore was going around trying to persuade engineers to incorporate ICs into their products. The 1965 article's intention was to convince the reader that ICs were the cheapest way to make smaller electronic devices and systems.

What was initially dubbed "Moore's plot" was just one part of that argument. In a section titled "Costs and Curves," he notes that in 1965 the most economical IC, meaning the one with the lowest per-component cost, had about 50 components. The term "component" was then defined as a function or device, including diodes and resistors. In other words, more than just transistors. He writes: "The complexity for minimum component costs has increased at a rate of roughly a factor of two per year." He also produced the now-famous plot showing the component counts of Fairchild's IC chips over the past four years. Extending out the plot, Moore predicted that in 1970 an economical IC would have about 1,000 components,
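As a quick sanity check (a minimal Python sketch, not something from the article itself), here is what strict annual doubling from a 1965 baseline of 50 components actually predicts:

```python
# Strict annual doubling: components after n years = 50 * 2^n
def components(start: int, years: int) -> int:
    return start * 2 ** years

print(components(50, 5))   # 1965 -> 1970: prints 1600
print(components(50, 10))  # 1965 -> 1975: prints 51200
```

These are the numbers a literal reading of "a factor of two per year" gives, which is why the figures Moore himself quoted deserve a second look.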
and then ten years later, in 1975, an IC would have 65,000 components. I want to note here that the math was wrong. If the baseline was 50 components in 1965 and we double each year, then by 1970 we should have 1,600 components (50 times 2 to the fifth power), not 1,000. And by 1975, an IC should have 51,200 components (50 times 2 to the tenth power), not 65,000. I do admit that he said the rate was "roughly" two per year, but that "roughly" is doing some serious heavy lifting.

Anyway, Moore's core point is that the cost of producing a chip depends on the cost of printing and etching an IC design onto a wafer. He then argues that this cost is not correlated to how many components are actually on that design. So if the ICs get more complex, complexity being defined by component count, but the costs stay the same, then the cost per component trends down.

The 1965 article was widely read and debated. Moore's prediction that denser, more complex ICs would get cheaper on a per-component basis was controversial. The generally accepted wisdom back then was that system miniaturization always added costs rather than cut them. This presumption was for very real reasons. For instance, several research managers at Bell Labs argued that an increasingly complex IC was far more likely to have one or more of its components not work. This would of course ruin the whole chip, so in fact the yield rates for increasingly complicated chips would approach zero percent, making them unimaginably more expensive. Yet others pointed out the graph's small sample size, as well as the issue that the only data points used were from Fairchild Semiconductor. Classic Reddit and Hacker News stuff.

Regardless of the feedback, Gordon Moore and his comrades used Moore's plot to guide the technology roadmap, first at Fairchild and then at Intel. Moore pushed his R&D lab people to double chip complexity each year, knowing that more complex chips are more profitable. Three new technologies introduced in
the 1960s allowed this to happen. The first was new lithography equipment, like the contact printer, to improve the resolution at which we can mass-print chip designs. Equipment advances are one of the core drivers of innovation throughout semiconductor history.

The second major technology was a new gate structure: the silicon-gate MOS transistor. These are the IC's basic electronic component; there are billions on a single die, switching on and off. A transistor is made up of a gate sitting on top of a channel connecting a source and a drain. The gate allows or disallows a current to move along the channel from the source to the drain. What was special about the silicon-gate MOS transistor was that it used polysilicon for its gate. This gate performed far better than the previously used aluminum gate because it could bear higher temperatures without melting. The silicon gate was discovered by four companies, including Fairchild Semiconductor, at about the same time in the mid-1960s. But it was Intel who first bet the farm on it in the late 1960s, a bold technical choice that paid off in letting them cram together three to five times more gates than ever before.

The third technology came out of the design world. Intel's first breakthrough products using silicon gate technology were memory chips: SRAM and DRAM. These memories can be more easily integrated into systems and perform far better than the magnet-based ferrite core memories that had been used up until then. Remember those from the Wang Labs video? Anyway, the memory chip's fundamental building block is the memory cell, which stores one bit. Throughout the late 1960s and early 1970s, designers achieved shrinkage by slimming down the number of transistors in that memory cell's design. The first memory cells used six transistors each. Intel's first DRAM IC, the Intel 1103, first introduced in 1970, had three transistors per memory cell. Then came a one-transistor memory cell, first developed by Robert Dennard in 1966.
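That progression (6T, then 3T, then 1T cells) can be illustrated with a toy calculation. The transistor budget below is entirely made up; the point is only that fewer transistors per cell means more bits from the same silicon:

```python
# Hypothetical transistor budget for one die; real figures varied widely.
TRANSISTOR_BUDGET = 6_000

# Cell designs: 6T (early cells), 3T (Intel 1103), 1T (Dennard's cell).
for name, transistors_per_cell in [("6T", 6), ("3T", 3), ("1T", 1)]:
    bits = TRANSISTOR_BUDGET // transistors_per_cell
    print(f"{name} cell: {bits} bits")  # 1000, 2000, then 6000 bits
```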
This one-transistor memory cell would eventually enter the market in the mid-1970s with the 64-kilobit generation.

These three advances made a component doubling each year technically possible. But as it turned out, a one-year process node cadence was too fast for Intel's customers, system designers like IBM. They wanted new chips every three to four years rather than every one. This was reinforced by an American tax depreciation law that specified a six-year depreciation cycle for computer buyers. A new memory generation hitting the market each year was just too frequent. So Intel introduced a new generation of memories, with a 4x increase in density, every three years. Intel can sell these leading-edge products at the start at a good price. Once competitors like TI catch up, Intel can move up to the next step on Moore's plot, and the older product thereafter rapidly declines in price. This "wine vintage" model, as some call it, is well known. They ported this model from memory to microprocessors soon after the category's invention in 1971.
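A 4x density increase every three years is, incidentally, the same exponential as a doubling every 18 months. A quick Python check of that equivalence:

```python
import math

# A 4x increase every 3 years implies this annual growth factor:
annual = 4 ** (1 / 3)                          # ~1.587x per year
# Time needed to double at that rate:
doubling = math.log(2) / math.log(annual)
print(f"doubling time: {doubling:.1f} years")  # prints "doubling time: 1.5 years"
```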
Moore's plot worked because it was predictable. At any one time, Intel might not always have the absolute best microprocessor compared to those from Zilog or Motorola, but they could sell customers on that product's future. You could point to the plot and tell the customers that the microprocessor in front of them was going to get better down the line. That predictability helped customers gain confidence in the product. As Intel ascended to become America's most prominent and profitable semiconductor company, Moore's plot became more accepted by some of its peers. TI, AMD, and Motorola adopted it in some form for their technology roadmaps too.

In 1975, ten years after the original article, Gordon Moore gave a follow-up speech. By then, Dr. Moore had succeeded Robert Noyce as Intel CEO, bringing a great deal of publicity to his plot. At around this time, people like Caltech professor Carver Mead, perhaps its most fervent adherent, started referring to the plot as a law.

Also by this time, the benefits of silicon gate technology had started to peter out. Intel's engineers were having difficulties pushing out more complex ICs. The smaller the silicon gates got, the less control the gates had over the flow of electrons moving from the source to the drain, causing leakage issues. Still, Moore felt that his 1965 prediction had worked out. In that 1975 speech, Moore cited the existence of an IC with 65,000 components: a 16-kilobit charge-coupled device, or CCD, memory. But in recognition of the growing technical challenges of cramming more components onto an IC, Moore announced the revising of his law's cadence down from a doubling every year to a doubling every two years, starting in 1980.
Technically, Moore's 1965 prediction was right. The best kind of right. But that CCD memory turned out to be commercially uncompetitive; the chip would be far better known for its eventual use in solid-state imaging. The most advanced mainstream memory technology at the time, the 16-kilobit DRAM, was at least a year away from commercialization. Even then, this memory IC had less than half of the 65,000-component prediction. Reflecting on that 1975 speech, Moore seemed a bit surprised that so many people cared about his changing of the goalposts: "I don't think anybody was planning their business around it. Probably because I was basking in the glow of the first prediction being right, I don't think anyone was paying any attention to it."

So Moore's original ten-year extrapolation turned out to be quite wrong. Yet Moore's Law didn't get tossed into the trash like that two-day-old white rice in the fridge. Instead, the industry modified it to make delicious fried rice.

The 1970s-style DRAM chip differs from its microprocessor friends in that it is made up of regular, repetitive memory cells. To get a more sophisticated memory chip, as in one with more capacity, you have to stuff more and more memory cells into the same space. Logic chips, on the other hand, were seen as more of a design challenge than a production challenge, at least back then. As a result, in the 1970s, when people talked about Moore's Law, they were more often referring to memory rather than logic chips. To them, making memory chips was the purest expression of raw production ability. This production ability matters, since memory was a commodity: the first to market with the next generation reaped the most profits.

Japan's rapid catch-up to the American semiconductor industry was built on continuing that cadence of quadrupling memory bit capacity every three years. They were first to the crucial 64-kilobit generation, leapfrogging the Americans, who took almost two years to catch up. And thanks to strict control of
manufacturing processes, Japanese yields soared. The cost of making a Japanese memory IC plunged, undercutting American products. When the DRAM market crashed multiple times throughout the first half of the 1980s, Japan's lower cost structure, among other things, allowed them to survive the market turmoil while many American companies couldn't.

Part of the reason why Japan surged ahead in the early 1980s was that Intel and the rest of the Americans fell behind in Moore's Law. Measurements of Intel's microprocessor densities find a weird flatness in the early 1980s, a period of time when Japan and others surged ahead. This three-year period is because of the company's struggles transitioning to a new generation of lithography equipment.

In his 1975 speech, Moore cited three significant factors contributing to continued increasing complexity. The first was increasing the size of the die. The second he called "technical cleverness": in other words, using computer automation tools to lay out designs that make better use of the chip die canvas. The third, and the one most relevant today, was decreased line dimensions, meaning the depth, width, and density at which we can etch circuit designs onto silicon. Improvements in lithography equipment, encompassing machines that do exposure, resist processing, and etch, have largely driven the upper limits of complexity for semiconductors since the late 1970s. When those lithography advancements stall, it filters down the chain.

In the early 1980s, Intel needed to produce their own 64-kilobit DRAM chip. The standard lithography tool back then was the projection aligner. These tools projected a chip design across a whole wafer. However, the 64-kilobit needed more resolution than the aligner could provide, so Intel had to move on to the newly introduced stepper, a device that moved across the wafer step by step, ergo the name. They struggled with it. Intel's first yields in 1982 came in at 40 percent. It took another two to three years, until 1985 basically, to bring those
production yields up to 75 percent. Intel's previous-generation 16-kilobit DRAM came out in 1979, so that means a five-to-six-year gap between new node generations. This lithography delay contributed to Intel exiting the commodity memory business entirely, reorienting the company towards their growing microprocessor business.

At the same time, the company recommitted itself to manufacturing. In 1983, Dr. Moore instructed his colleagues to accelerate their manufacturing capability to match that of the Japanese: four times every three years, or a doubling every 18 months. That October, in a talk given at the Dataquest semiconductor industry conference, Moore publicly redefined his law once more, revising the complexity doubling cadence to the aforementioned 18 months.

I did a video detailing this before, but throughout the late 1980s, Intel and the Americans recommitted to manufacturing, raising their yields enough to erode the Japanese advantage. When Japan's semiconductor industry fell into overcapacity in their core commodity DRAM market, it triggered a massive crash. The industry consolidated, and Japan lost its mojo to the Koreans. Intel sidestepped this memory disaster with its pivot to microprocessors. This, as well as its revival in manufacturing, returned the company to prominence. To them, this was thanks to renewed adherence to Moore's Law.

In the early 1990s, the U.S. federal government began a slow retreat from direct involvement in technology policy, something started during the first Bush administration and continued by Clinton. This included the sunsetting of the National Advisory Committee on Semiconductors. The U.S. government set this up in 1988 to figure out a long-term roadmap to keep America at least one node generation ahead of the Japanese by 2000.
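The yield figures in Intel's stepper transition (roughly 40 percent in 1982, rising to 75 percent by 1985) translate directly into cost per working chip. A hedged illustration in Python, with made-up wafer costs and die counts:

```python
# Cost per *good* die = wafer cost / (dies per wafer * yield).
# The $1,000 wafer cost and 100 dies per wafer are hypothetical.
def cost_per_good_die(wafer_cost: float, dies: int, yield_rate: float) -> float:
    return wafer_cost / (dies * yield_rate)

print(round(cost_per_good_die(1000, 100, 0.40), 2))  # 25.0  (1982-style yield)
print(round(cost_per_good_die(1000, 100, 0.75), 2))  # 13.33 (1985-style yield)
```

Same wafer, same equipment, nearly half the cost per chip, which is why yield mastery mattered so much in the memory wars.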
This plan was called Microtech 2000, and with the government now stepping back, the Semiconductor Industry Association, or SIA, was asked to fill the void: take over the Microtech plan and run it as an ongoing, living document. Moore jumped in. In the fall of 1991, he convinced the other heads of the semiconductor industry to use Moore's Law and its concept of ever-doubling complexity as the basis of a new unified national roadmap. Together, the companies created an R&D cartel directing the industry's entire future. Since achieving increasing complexity requires massive resources that only large companies have, it favored established incumbents.

This new national roadmap, called the National Technology Roadmap for Semiconductors, received substantial buy-in from the government. Moore's Law became the deciding factor for hundreds of millions of dollars in federal investments, like those from DARPA. Every academic research proposal was judged on whether or not it helped continue the complexity pace of Moore's Law. During this period of time, America's entire semiconductor industry achieved convergence.

In the popular video game Dead Space, convergence happens when the Necromorphs gather together at a Marker, triggering the release of incredible bursts of energy. You, as the video game protagonist Isaac Clarke, have to stop this convergence event. A similar thing happened with America's semiconductor industry in the 1990s. This convergence event standardized all the materials, equipment, and processes, drastically cutting down on cost and releasing incredible bursts of productivity.

In 1995, Gordon Moore gave a speech at an SIA dinner reflecting on three decades of Moore's Law. In that speech, Moore claimed to be as surprised as anyone that the law's tenets, albeit so revised, continued to hold. He said that all exponentials had to end, but that he had given up on trying to predict when that might be. That ending was still far away, for Intel in 1995 was about to come into a burst of lithography equipment
improvements. Deep ultraviolet, or DUV, excimer lasers soon entered the market. These sophisticated lasers replaced the old i-line lamps companies used for lithography, unlocking a vast new greenfield for lithographic improvements. In 1999, the industry started using 248-nanometer krypton fluoride lasers for their leading-edge process node, at the time 180 nanometers. Two years later, in 2001, 193-nanometer argon fluoride lasers (I will always call them ArF) entered the market. These are substantially similar to their 248-nanometer predecessors.

These lasers enabled Intel to rapidly commercialize new technology nodes at three-year intervals, progressively reducing their transistors' smallest dimension by 30 percent each time. Intel's adherence to Moore's Law caused semiconductors to get better and cheaper at the same time. Prices for a particular microprocessor introduced from 1995 to 2000 would rapidly fall, somewhere from 50 to 70 percent one year after its introduction. It was a great time for IT vendors: they automatically received better and cheaper performance every generation without needing any change to their products. The introduction of 193-nanometer lasers allowed Intel to hit its Moore's Law cadence from 2001 to 2003.
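The arithmetic behind that 30 percent figure is worth spelling out: shrinking the smallest linear dimension by 30 percent roughly doubles transistor density, because area scales with the square of the linear dimension. A quick check in Python:

```python
shrink = 0.7                  # new feature size is 70% of the old one
area_ratio = shrink ** 2      # each transistor now occupies ~49% of its old area
density_gain = 1 / area_ratio
print(f"{density_gain:.2f}x density")  # prints "2.04x density"
```

This is why a 30 percent linear shrink per node kept the complexity-doubling cadence alive.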
But those lasers were unsuitable for the next node step coming in 2003, the 70-nanometer node. And unfortunately, the next-generation lithography technology Intel had selected, EUV, was not ready. Furthermore, generations of shrinkage had weakened the gate's control over the flow of electrons, so much so as to cause power consumption issues. Thus, in 2004, Intel shifted to multi-core processors, abruptly ending the streak. It took users by surprise, forcing them to rewrite software, at great expense, to be parallelized. The introduction of 193-nanometer immersion lithography in 2006 allowed Intel to unlock new gains in semiconductor manufacturing. With EUV still seemingly years away, Intel moved ahead, feeling that they could do without it. They were, of course, wrong. Now the company is looking to recover their node step speed with a rapid adoption of EUV and high-NA EUV. Hitting the right lithography technology remains critical to achieving the next leading-edge node and enabling further progress in computing power.

So as you can see, Moore's Law was not only never a law, it was never even remotely predictive of anything. From the very start, that original concept back in 1965 was nothing more than a vaguely defined chart covering a single four-year span in what was then a very young industry, one of those up-and-to-the-right charts they put into VC pitch decks. That original concept died a long time ago. So what is Moore's Law now? It is a mythology, a story that maintains the culture of an entire industry, much like the scriptures of a religion. And like with any sensible mythology, the story has been reinterpreted and redefined over the years to fit the audience of the time. Regardless, the ambitious concept of ever-increasing complexity remains.

You might be thinking to yourself now, "Well gosh, we don't need things to be faster. I've been using the same phone and PC for years now. Why do we need this effort?" I reject this form of technological nihilism. The average consumer might not see it anymore in their phones or
gaming rigs, but other industries are still craving more computing power. Our economy and society still have many complicated technical problems, and throwing more computing at those problems has consistently been the best way to solve them. A recent, very interesting paper out of a team at MIT's Computer Science and Artificial Intelligence Laboratory, led by Neil Thompson, looked at the impact Moore's Law has had on three economically significant areas: weather prediction, protein folding, and oil exploration. Each of these multi-billion-dollar industries has greatly benefited from increased computing power.

For instance, take the weather. From 1956 to 2017, the amount of compute power used by NOAA has increased by a trillion times, compounding at 48.2 percent a year. Over that same time period, the error rate in temperature predictions declined from 5.8 degrees in 1972 to just 3 degrees in 2017. You can attribute up to 94 percent of this improved accuracy solely to the additional compute power, with improved algorithms coming in a distant second. Getting even better results will require yet more computing power. Were it not for the complexity advancements provided by Moore's Law, such improvements would not be economically feasible. And to me, that is why Moore's Law matters.

Alright everyone, that's it for tonight. Thanks for watching. Subscribe to the channel, sign up for the newsletter, and I'll see you guys next time.