Remember how it used to be that we had to worry about whether we had a Verizon or an AT&T iPhone? The AT&T iPhone was the one you could use abroad, while the Verizon one was just for America. Wasn't that weird? We don't have to worry about that anymore, because when we moved to 4G, the world chose a single global wireless standard: LTE. After the bruising battles of 3G, 4G LTE's dominance came far more smoothly. In this video, we look at how 4G LTE won the wireless world.

## Beginnings

Perhaps the problem with 3G had been that 2G GSM was too successful.
Its unexpected worldwide adoption made Europe the center of the mobile telecom world. It also enriched many of the area’s networking companies. So when it came time to go to 3G, a competitive race broke out between two standards. First, there was UMTS/WCDMA. UMTS is the formal name for the whole networking system. WCDMA refers to the air interface connecting the handset and tower. But a lot of people just call it WCDMA, and that is what I will do too to keep it simple.
WCDMA evolved out of 3G work done by a Europe-Japan collaboration. It is maintained by an umbrella organization called the Third Generation Partnership Project, or 3GPP, which also now develops GSM. Thanks to this, WCDMA became the dominant 3G system in the world. But there was also the more America-centric CDMA2000, which evolved from the cdmaOne 2G standard. The majority of its development was driven by the mobile company Qualcomm, which owned several critical patents. The "3G wars", as they were called, intertwined with the larger telecom bubble at the end of the 1990s. Companies took out billions in debt to buy 3G spectrum for services customers had little interest in.
After the bubble popped, telecoms fumbled for ways to monetize their 3G networks. Some thought the killer app was mobile data. Others, texting or mobile internet services like i-Mode. And yet others championed value-added services like mobile banking.

## The iPhone & Android

Then in 2007 came the iPhone. The year after that, in 2008, we got Android. The most popular smartphone before the two was the BlackBerry, and that thing sipped data like it cost a million bucks per bit. iPhone and Android
were definitely not like that, which caught the whole tech world by surprise. The original iPhone ran on an improved data extension to 2G GSM called “Enhanced Data Rates for GSM Evolution”, or EDGE. Since it was faster than 2G but not as fast as 3G, they called it 2.75G. Sounds like how they name semiconductor process nodes. EDGE was slow - up to 384 kilobits per second. But that was relatively okay because the first iPhone itself was pretty limited. The screen wasn't that big. Without apps, most people streamed YouTube,
checked Google Maps, or browsed the web on the fully featured Safari browser. Even so, data usage grew fast. By February 2008, Google said that iPhones did 50 times more searches than any other mobile device. And the iPhone had not even been on the market for a full year.
But then in 2008, Apple released the iPhone 3G, which ran on the WCDMA/UMTS standard. They also spun up the App Store and cut the sticker price (if bought with a two-year contract).

## Dropped Calls

The resulting demand was so great that on launch day, AT&T's and Apple's activation servers crashed - forcing many customers to go home without active iPhones. The App Store was also a "grand slam", as Steve Jobs said. In its first weekend, users downloaded over 10 million apps. Which was a lot back then.
These iPhone users consumed far more data - a single streamed video uses as much bandwidth as 100 phone calls. Unsurprisingly, AT&T's wireless network struggled to keep up. Prior to the iPhone, less than 4% of AT&T's 200+ million subscribers had ever watched a video on their phones. Now that number was far higher, and users got dropped calls, spotty coverage, and slow download speeds. And then there are surge events, where people gather in one place and all download or upload video at once. You get 100 to 5,000 times more connection failures - times when the phone can't reach the tower because the tower is overwhelmed.
You get 7-30 times more blocked or dropped calls. And round-trip times on the data network run 50-70% longer. Famously, in March 2009, thousands of iPhone users came to Austin, Texas for “South by Southwest” - a musical event for hipsters. The network totally fell apart, leaving a lot of very loud people with no signal.
People were apparently just shuffling down the street like zombies, heads down at their phones, looking for a signal. They might have had to walk for a while - such events can affect towers up to 10 miles away. In the end, AT&T had to bring in mobile towers - previously deployed for President Obama’s inauguration - to handle the crush.

## The Coming Storm

It is important to note here that the average iPhone user paid AT&T about 60% more than a normal user over the span of their two-year contract - an additional $2,000 or so, all in all. So customers were hard-pressed to find much empathy for the telecom giant, despite its pleas for patience concerning network upgrades.
And the data boom was not expected to end. The iPhones just kept coming. The iPhone 4 in 2010, which was my first. And then the iPhone 4S. As smartphone adoption grew, data consumption was expected to double each year. Gene Munster, a securities analyst, told the New York Times: > "Whether an iPhone, a Storm or a Gphone, the world is changing, we’re just starting to scratch the surface of these issues that AT&T is facing."
People were even predicting a mobile apocalypse, with all the cellphone networks grinding to a halt by 2013. It was estimated that data consumption in 2015 would be 26 times that of 2010.

## HSPA & EV-DO

It thus became very clear that the telecoms needed a band-aid to solve urgent needs, as well as a longer-term solution for the future.
The immediately available option was the 3G+ solutions. 3GPP extended the WCDMA/UMTS standard to create a new data service called High Speed Packet Access, or HSPA. First released in 2004, HSPA offered peak speeds between 3.6 and 7.2 megabits per second. You would probably call it more of a 3.5G technology. CDMA2000 had its own HSPA-like data extension too: Evolution-Data Optimized, or EV-DO. It offered reduced latency and peak downlink speeds of 2.4 megabits per second.
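To make those peak figures concrete, here is an idealized download-time comparison for a 5 MB file - the file size is my own example, and real-world throughput was always well below these theoretical peaks:

```python
# Idealized time to download a 5 MB file at the peak rates discussed so far.
# Real-world speeds were always well below these theoretical peaks.
rates_mbps = {
    "2.75G EDGE": 0.384,  # up to 384 kilobits per second
    "3G EV-DO":   2.4,    # peak downlink
    "3.5G HSPA":  7.2,    # upper end of the HSPA range
}
file_megabits = 5 * 8  # 5 megabytes = 40 megabits

for name, rate in rates_mbps.items():
    print(f"{name}: {file_megabits / rate:6.1f} seconds")
```

EDGE takes over a minute and a half; HSPA cuts that to under six seconds.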
HSPA was originally envisioned as a mobile broadband product targeting business executives who needed fast internet while traveling away from their offices. You bought these USB modems that you could plug into your laptop. But smartphones needed a different type of data service than what these HSPA dongles were built for. An executive connecting to a corporate VPN is probably going to use it for a brief period - uploading or downloading some documents, say. Apps like Twitter or Facebook do not work like that - they constantly suck down data. And of course, the number of smartphone users far, far outnumbered that of USB modem users. A newer, long-term solution was necessary.

## Convergence

The 3GPP first mentioned 4G as the long term evolution of HSPA back in 2004. Nothing much came of that, however.
It was not until early 2006 that this drive towards 4G picked up steam. Several telecoms in the US and Europe published a joint white paper called "Next Generation Mobile Networks Beyond HSPA & EVDO". Thus began the NGMN Alliance, an open forum of telecoms defining, evaluating, and eventually ratifying a new standard for commercial launch.
Their work pushed 4G development towards concrete performance and economic targets. There were three major issues to address. First, 4G networks must provide way more bandwidth and capacity. Easier said than done. Second, we needed convergence. 2G and 3G operators had to maintain two separate networks:
A Circuit Switched Network for handling voice/SMS and a Packet Switched Network for handling data. This was not only duplicative, but also added roughly 100 milliseconds of lag as packets transferred between the two networks. People figured that we could carry voice and texts as just data. Why not unify the two? And finally, there was a desire - at least on the UMTS side - to clean up a wireless standard that over the years had gotten complicated and unwieldy. Planners decided that it was a good time to sit down with a fresh sheet of paper - Seymour Cray-style.
## Long Term Evolution

So what is LTE? The 3GPP focused on two work items: System Architecture Evolution - which concerns the core network, the part of the cellular network that handles data traffic to and from the wider Internet. And then there is Long Term Evolution - which technically only covered the radio access network, mobile handsets, and the OFDMA air interface for downlink. The 3GPP's official name at first for the whole system end-to-end was Evolved Packet System. Which was a dumb name because EPS already means "earnings per share". So everyone ended up preferring the best name of the bunch: Long Term Evolution, or LTE.
3GPP rolls out regular, iterative "releases" so that ecosystem partners can gradually onboard. Until Release 7 in 2007, the 3GPP mostly focused on HSPA, including an improved version called HSPA+ or HSPA Evolution. LTE began with Release 8, and it represented a big change from what came before. Yes,
it offered much higher theoretical data rates - 100 megabits down, 50 megabits up. And yes that matters. But to the telecoms, other things LTE offered mattered more. Release 8 had far better spectral efficiency than HSPA, meaning that it could transmit more data over the same piece of spectrum. A big deal for telecoms needing to pay billions for spectrum licenses. LTE offered far better latency. Meaning that it took less time for data to travel between the phone and the network. It also helped phones fire up faster to communicate with the station.
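To put "better spectral efficiency" in rough numbers, here is a back-of-the-envelope comparison of peak rate per hertz. The carrier bandwidths are my assumptions, not stated in this video: the Release 8 targets were defined for a 20 MHz carrier, while the 7.2 Mbps HSPA peak runs over a standard 5 MHz WCDMA carrier.

```python
# Rough peak spectral efficiency: peak data rate divided by carrier bandwidth.
# Assumptions (mine, not from the transcript): LTE Release 8's 100 Mbps target
# is for a 20 MHz carrier; HSPA's 7.2 Mbps peak uses a 5 MHz WCDMA carrier.
lte_eff = 100e6 / 20e6    # LTE Release 8 downlink, bits/s per Hz
hspa_eff = 7.2e6 / 5e6    # HSPA downlink, bits/s per Hz

print(f"LTE Release 8: {lte_eff:.1f} bit/s/Hz")   # 5.0
print(f"HSPA:          {hspa_eff:.2f} bit/s/Hz")  # 1.44
```

Even as a crude peak-rate comparison, that is more than a threefold gain per hertz - hence the appeal to operators paying billions by the megahertz.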
LTE was also very flexible. Operators could deploy it on all the existing frequencies, as well as a bevy of new ones. This again accommodated the different telecoms around the world, who had to make do with whatever bands they had. By now, the NGMN Alliance represented 70% of the world's telecoms, including some of the world's largest like China Mobile. In June 2008, the Alliance officially approved LTE as broadly meeting its expectations. This was a major step forward.
In December 2008, the 3GPP froze the feature set of Release 8 - officially releasing LTE to its telecom partners for adoption.

## OFDMA

So how does LTE achieve all this? At the core of this new standard was an air interface technology called "orthogonal frequency division multiple access", or OFDMA. In a prior video, we talked about this type of technology. Spectrum is fundamentally limited
like land. So telecoms spend billions to buy licenses to a slice of spectrum. They then use an air interface technology standard to allocate pieces of that spectrum for towers to communicate with many handsets. One of the challenges with 1G and 2G was how to use those slices to give each individual handset the ability to transmit ever more data without interference. First, 1G networks used FDMA. The "F" means "Frequency", and it involves splitting up the frequencies into slices - with guard bands in between to prevent interference. Each user got exclusive use of a slice.
Then came TDMA. The "T" stands for "Time", and it means splitting up the frequencies and giving users a slice for a certain period of time before jumping to another one. And then CDMA. The "C" stands for "Code", which gave each user a special code and the whole bandwidth. The code keeps everyone from interfering with one another. And now we have OFDMA. Woof, this one is a bit complicated to explain. The "OF" in OFDMA means "Orthogonal Frequency", and that gives you a hint as to how it works. But only if you remember what orthogonal means.
It goes back to the beginning. Just like how FDMA split the bandwidth into slices, OFDMA does the same. Except these slices - or subcarriers - are "orthogonal" to each other. Each subcarrier is a wave, and the many waves are positioned such that when one peaks, its wave-peers are at a zero value. Thusly, the subcarriers can overlap without interfering with one another. We then split up the data stream and transmit the pieces across all of these subcarriers. The receiver reads the subcarriers' pieces
and reassembles the data, with error correction, accordingly. Doing it this way better ameliorates issues with reflections. At high data rates, such reflections can cause small delays in receiving packets, which in turn can cause major interference. Did that all make sense? Let me try one final metaphor. Imagine a group of people in a room reading you a sentence. Each group member simultaneously reads aloud a word in the sentence, but at a different pitch.
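That orthogonality condition is easy to check numerically. Here is a minimal sketch in plain Python - the symbol length and subcarrier indices are arbitrary choices of mine - showing that subcarriers at whole-number cycle counts per symbol correlate to zero against each other, while fractional spacing leaks interference:

```python
import cmath

N = 1024  # samples per OFDM symbol (an arbitrary choice for this sketch)

def subcarrier(k):
    # A complex sinusoid completing k cycles over one symbol period.
    return [cmath.exp(2j * cmath.pi * k * n / N) for n in range(N)]

def correlation(a, b):
    # Magnitude of the inner product of two subcarriers over the symbol.
    return abs(sum(x * y.conjugate() for x, y in zip(a, b)))

# Integer-spaced subcarriers are orthogonal: correlation ~ 0.
print(round(correlation(subcarrier(3), subcarrier(7)), 6))   # 0.0
# A subcarrier against itself correlates fully.
print(round(correlation(subcarrier(3), subcarrier(3))))      # 1024
# Fractional spacing breaks orthogonality: inter-carrier interference.
print(round(correlation(subcarrier(3), subcarrier(3.5))))    # hundreds - clearly nonzero
```

The receiver exploits exactly this: correlating against each subcarrier's frequency picks out that subcarrier's data and nulls out all the others.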
We listen to each specific pitch to receive the words spoken at us, and then assemble the sentence accordingly. OFDMA is also used in later Wi-Fi standards. It works best on wide bands of spectrum. And like I said before, telecoms can use it at virtually any frequency they can get their hands on - a big reason why LTE became so widely adopted.

## Competitors

As with 3G, LTE was not alone - there were two other major competing standards.
There was Qualcomm’s Ultra Mobile Broadband, or UMB. This built on their CDMA2000 work, though it was not backwards compatible with those networks. And there was WiMAX. WiMAX began back in mid-2001, when the WiMAX Forum - a group of commercial Wi-Fi networking vendors - formed to promote wireless broadband access.
The original intent was to challenge DSL and cable modem service with fixed wireless access - 802.16d. But they also had a mobile variant of the standard, 802.16e. WiMAX had a head start on LTE in terms of technical availability. Moreover, backers argued that it could leverage the scale of the existing Wi-Fi supplier base. There were certainly more Wi-Fi devices out there than 3G ones.
In the United States, WiMAX even had a serious telecom backer: Sprint and its majority-owned mobile broadband partner Clearwire. They also had the support of silicon tech giants like Intel, Cisco and so on. So people should not downplay WiMAX's seriousness as a contender for the 4G crown.

## What is "Real" 4G?

It is at this point that I should pause and acknowledge the insane naming system of "4G". 4G is a marketing term - just like 3-nanometer process nodes. And thusly, the telecoms have flexibly stretched those definitions to drive sales. In the prior generation, naming was set by the International Telecommunication Union, a UN body, with the IMT-2000 project. IMT-2000 certified a family of similar
wireless standards to be officially called "3G". For "4G", the ITU in 2008 created IMT-Advanced with the same goal. They set several requirements, like being an all-IP network and offering a peak downlink rate of 600 megabits per second and a peak uplink of 270 megabits per second. Per these 2008 requirements, the first version of LTE - with its peak downlink of 100 megabits and uplink of 50 megabits - did not qualify.
So the 3GPP worked on a new iteration of LTE called LTE-Advanced. The WiMAX Forum also quickly brought out a new iteration that qualified called WiMAX2. These two wireless standards - UMB never made it that far - thus qualified as being the "real" 4G per the ITU.
However, the telecoms did not want to wait for all that nerd stuff. In late 2007, Verizon chose to adopt LTE for its future network buildout and quickly marketed those plans as 4G. Sprint and Clearwire did the same in 2008, announcing and marketing their network as 4G. This put the remaining American telecoms, AT&T and T-Mobile, at a bit of a disadvantage. They had already spent billions on expensive HSPA+ network upgrades and did not want to immediately roll out LTE. But customers would definitely choose 4G over 3.75G, since 4 is higher than 3.75. So they started marketing their own networks as "4G-like" or "4G-ready", in the case of AT&T. Or, in the case of T-Mobile, just plain 4G.
After some time of this consumer confusion, the ITU finally threw up its hands and acknowledged that the damage had been done - allowing the older LTE and WiMAX standards to be called 4G. Essentially, 4G is just whatever provides a "substantial level of improvement" over 3G. And what does "substantial" mean? Who cares. The prior 3G rollout debacle primed customers to expect a big gap between theoretical peak data rates and what is actually experienced out in the wild. In the end, you just have to "feel" it.
## LTE Wins

After the LTE feature freeze, telecoms quickly started adopting it, with the first commercial launches coming from TeliaSonera in Scandinavia. Qualcomm's UMB quickly dropped out of the battle when its major telecom customers chose LTE - including the aforementioned Verizon Wireless, previously a very large CDMA2000 customer. The industry's excitement over a potential world standard showed Qualcomm where the winds were blowing. So in 2009, soon after LTE's release, Qualcomm halted work on UMB and embraced LTE as well. Qualcomm figured that it could still make money with its existing 3G CDMA work and grow in a different category. The rising success of its Snapdragon mobile systems-on-chip - first released in late 2007 - might have helped nudge it along too.
So that just left WiMAX, which remained backed by tech firms and a smattering of telecoms. Sprint was so committed to WiMAX that it left the aforementioned NGMN Alliance after the group backed LTE. In the end, however, the biggest WiMAX backer was Intel, which produced chips and components for networking equipment. They had the most to gain from a WiMAX world.
But Intel was not a telecom. And while WiMAX showed technical promise in trials in South Korea, the world's telecoms did not want to see another split like there was with 3G. It was not only confusing, but also cut them out of lucrative global roaming fees. LTE had broad telecom backing through the NGMN Alliance, backwards compatibility with existing wireless standards, and a smooth on-ramp via HSPA and HSPA+. WiMAX had none of that. Intel was trying to make inroads into a space it did not know and where it was not welcome. In January 2009, Nokia ended production of its only WiMAX device - a sort of internet tablet thing.
Finally, in 2011, Sprint confirmed that it would shut down its WiMAX network and add LTE to its 4G rollout. WiMAX returned to its roots in fixed wireless. Intel's offices for promoting WiMAX as a mobile product went quiet.

## Conclusion

LTE had a bit of a slow start - there were just 120,000 4G LTE users in 2010 and only 8.8 million in 2011. But it really turned the corner after the September 2012 release of the iPhone 5, which supported LTE. Booming sales of this design-refreshed iPhone, with its larger 4-inch screen, triggered the release of yet more LTE-enabled smartphones. America led the way in LTE adoption - a bit of a reversal from prior times. But
major LTE network launches in India and China that same year helped really push 4G past 3G. By 2022, there were an estimated 5.16 billion LTE subscriptions. The path from 1G to 4G had been a bit of a mess, but we had finally ended up with one standard to rule them all. Convergence.