The iPhone was 3G's Killer App


2G GSM was one of the most successful things to ever come out of Europe's technology sector. So when the next generation came around, everyone bought into the wireless hype. Telecoms spent billions of dollars buying spectrum, which led to a big crash and then a slow 3G rollout.

Everyone wondered how the telecoms could ever make back the billions spent on their 3G investments. What could ever be big enough?

In this video, we relive the 3G wars, the 3G crash, and the Jesus phone that brought them all back to life.

## Beginnings

0G wireless networks existed, but I am not going to talk about them.

The first commercially available 1G wireless networks date to the 1980s. In the United States, Bell Labs launched a trial 1G cell network in Chicago in 1977. Then they and Motorola launched the Advanced Mobile Phone System, or AMPS, in 1982.

1G deserves the label - not only because these were the first commercial deployments, but also because they established the tenets of the modern cellular industry as we know it.

What are those tenets? First, the concept of a telecom operator buying a license for a band of spectrum from the local government authority. It basically begins here.

Second, the telecom does not try to cover a massive service area with just one massive tower. Instead, it sets up many smaller base stations that each only have to cover a single cell area.

The third and final tenet of modern cellular systems introduced by AMPS and other 1G networks was seamless handoff. When someone exits a cell and enters another, any in-progress calls are seamlessly transferred between the two.

## FDMA

AMPS did this by splitting the whole licensed band into non-overlapping slices. Each base station in a cell gets its own slice for communicating with handsets in its area.

No two neighboring base stations get the same frequency slice, so as to prevent interference.

Imagine the spectrum as a piece of land. The metaphor makes sense: like land, spectrum is a limited resource. When the base station connects to a handset during a phone call, it is essentially building a road on that land.

But an AMPS base station can't communicate with two or more handsets using the same slice of spectrum. To deal with this, the base station cuts its own slice up into yet smaller subdivisions and gives each person in the cell their own sub-divided slice.

Going back to the metaphor, that means one road per driver. This idea is known as Frequency-Division Multiple Access, or FDMA.

If the metaphor works, then you can see the drawbacks of the approach. Since we need to space out the roads to keep the drivers from potentially driving into one another, we can't fit that many roads on our land.

1G calls sounded bad. The handsets were too big and suffered poor battery life. And the handoff between towers was unstable. Other than that,

everything was awesome. By 1990, there were 11 million wireless subscribers.

## 2G

2G networks introduced named standards. Managed by working groups and organizations, standards help commercialize basic cellular technologies. They offer telecoms an ecosystem that saves them money through economies of scale while also establishing international connectivity norms.

Wireless generations can accommodate multiple standards. The most widely adopted 2G standard

was Europe's GSM. GSM began in 1983 as the Groupe Spécial Mobile committee, which started working on a new standard after some too-expensive 1G deployments in Europe.

GSM was notable for its use of Time-Division Multiple Access, or TDMA, to let the same spectrum carry more data bits.

What is TDMA? As I mentioned, the 1G pioneer AMPS network used Frequency-Division, where we split up the land into many little pieces.

If FDMA is like building a road for each user in the cell, then TDMA is like a vacation house timeshare, or an Airbnb for a digital nomad. Someone visits a house for a weekend or so, and then jumps to the house next door after their allotted time period is up.

With TDMA, we slice up the spectrum just as with FDMA. But each handset - instead of getting its whole slice all the time - only gets its slice for a certain time slot. Time guard bands are left in between each handset's transmission to prevent interference.

Coordinating this round-robin switching behavior can get complicated. But the upside is that we can fit more users per channel slice, and squeeze out more capacity by making the time guard bands smaller.

The first 2G GSM networks went up in Europe in 1991, with commercial operations beginning in 1992. The first non-European telecom to sign up was Telstra.
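The capacity difference between the two schemes can be sketched with toy numbers. The band width, channel width, and slot count below are illustrative assumptions, not real AMPS or GSM channel plans:

```python
# Illustrative FDMA vs. TDMA capacity for one base station's slice.
# All numbers are made up for the sketch - real channel plans differ
# (GSM, for instance, uses 200 kHz carriers with 8 time slots each).

BAND_KHZ = 1000        # hypothetical spectrum slice for one base station
CHANNEL_KHZ = 30       # one FDMA channel, guard band included
SLOTS_PER_CHANNEL = 8  # TDMA users time-sharing a single channel

fdma_users = BAND_KHZ // CHANNEL_KHZ         # FDMA: one channel per user
tdma_users = fdma_users * SLOTS_PER_CHANNEL  # TDMA: 8 users per channel

def slot_owner(time_slot: int) -> int:
    """Round-robin: which of a channel's users owns this time slot."""
    return time_slot % SLOTS_PER_CHANNEL

print(f"FDMA supports {fdma_users} simultaneous users")
print(f"TDMA supports {tdma_users} simultaneous users")
```

Same land, many more drivers - which is why shrinking those time guard bands mattered so much.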

GSM broke out, in part because of its simplicity, its wealth of convenient features like smart cards and SMS texting, and its being first to market.

By 1994, over 100 telecoms had signed up, including several in the US, which GSM entered when it added the 1900 megahertz band.

GSM eventually captured 80% of the 2G market, sweeping aside America's cdmaOne - remember that name - and Japan's Personal Digital Cellular standard.

## The 3G Controversy

As 2G GSM was being finalized, the GSM ecosystem players started pondering what should come next. What eventually emerged was the trend towards better internet access. While

GSM did initially have data access, the speeds were not great - like 9.6 kilobits per second. To put that in perspective, downloading a 3 megabyte MP3 file at 9.6 kilobits per second would take about 44 minutes.

A few years before GSM launched, the European Union began to define a new global standard called the Universal Mobile Telecommunications System, or UMTS. UMTS would not only handle calls but also offer a stepwise improvement in data transfer rates.

But throughout the 1990s, it started to dawn on everyone that mobile was a world-changing technology. By 2005, there would be 1.2 billion wireless subscribers using GSM

technology. And the Europeans sat at the very heart of the new revolution.

That not only conferred immense global prestige, but also financially benefitted Europe's domestic wireless companies like Nokia and Ericsson. After building networks at home, they exported to developing regions that were leapfrogging wired networks and going directly to cellular - Eastern Europe, Southeast Asia, and mainland China, the last of which would become the world's biggest GSM market.

Moreover, 3G was initially marketed to the telecoms in the early 1990s as an entirely new thing. The telecoms were not thrilled about that, considering that many of them

had just finished building their 2G networks and needed to make back their investments.

Thus grew a view within the European body then in charge of GSM - the European Telecommunications Standards Institute, or ETSI - that creating a new standard was not necessary. Rather, they should either build an evolutionary 3G onramp or just extend GSM to handle higher data rates.

So as the Europeans built up UMTS, research insights from that work flowed into GSM. The first data extension was something called General Packet Radio

Service, or GPRS. You can probably call this a 2.5G wireless technology.

GPRS raised GSM data rates to 53.6 and then later 114 kilobits per second, which was good enough to let people send emails and even kind of browse the web in the form of the Wireless Application Protocol, or WAP.

After GPRS came Enhanced Data Rates for GSM Evolution, or EDGE, which offered speeds of up to 384 kilobits per second using the same fundamental GSM technology under the hood.

GPRS and EDGE were good extensions for GSM. However, their limitations also demonstrated the family's fundamental shortcomings.
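These rates are easy to put into perspective with the same back-of-envelope arithmetic as the MP3 example earlier - a rough sketch assuming theoretical peak rates and no protocol overhead:

```python
# Time to move a 3-megabyte MP3 at each GSM-era data rate,
# assuming the theoretical peak rate and no protocol overhead.
FILE_BITS = 3 * 1024 * 1024 * 8  # 3 MiB expressed in bits

rates_kbps = {
    "GSM circuit-switched": 9.6,   # ~44 minutes, as in the text
    "GPRS (early)": 53.6,
    "GPRS (later)": 114,
    "EDGE": 384,
}

for name, kbps in rates_kbps.items():
    minutes = FILE_BITS / (kbps * 1000) / 60
    print(f"{name:22s} {minutes:6.1f} min")
```

Even EDGE at its theoretical best still needed over a minute per song - and real-world throughput was worse.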

Using the TDMA method - as in, the timeshare - would always prevent GSM from matching the higher data transfer rates that another technology could achieve.

## CDMA

So now we should pick back up a latent thread - CDMA. CDMA stands for Code-Division Multiple Access. And the name gives you a hint as to how this accursed thing works.

With TDMA, we still had to divide the spectrum the same as with FDMA, leaving guard bands in between the divisions. As before, this was not efficient. Can we do better?

With CDMA, you can. Each transmitting CDMA user is given a unique codeword and is allowed to broadcast using the whole spectrum band. In other words, everyone can talk at the same time, but because they are essentially speaking different "languages", it works.

Qualcomm introduced the first CDMA networks with the cdmaOne standard back in the 2G days. It did fine in the United States, but the Europeans opted not to adopt it for the GSM standard due to its complexity. And that probably was the right choice.

Though GSM blew ahead in terms of worldwide subscribers, cdmaOne's subscriber count grew quite well into the late 1990s.
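The "different languages" idea can be shown with a toy example using length-4 Walsh codes. This is a deliberate simplification - real CDMA systems add power control, synchronization, and far longer spreading codes:

```python
# Toy CDMA: two users transmit at once over the same "spectrum" using
# orthogonal Walsh codes. The receiver recovers each user's bit by
# correlating the combined signal with that user's code.

CODE_A = [1, 1, 1, 1]    # length-4 Walsh codes: their dot
CODE_B = [1, -1, 1, -1]  # product is zero (orthogonal)

def spread(bit: int, code: list) -> list:
    """Map bit {0,1} to {-1,+1} and multiply by the chip sequence."""
    symbol = 1 if bit else -1
    return [symbol * chip for chip in code]

def despread(signal: list, code: list) -> int:
    """Correlate with the code; a positive sum decodes to 1."""
    return 1 if sum(s * c for s, c in zip(signal, code)) > 0 else 0

# Both users talk at once: their chip streams simply add in the air.
tx_a, tx_b = spread(1, CODE_A), spread(0, CODE_B)
combined = [a + b for a, b in zip(tx_a, tx_b)]

print(despread(combined, CODE_A))  # 1 - user A's bit comes through
print(despread(combined, CODE_B))  # 0 - user B's bit comes through
```

Because the codes are orthogonal, each user's correlation cancels out everyone else's transmission - no frequency slices or time slots needed.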

The CDMA concept was once again presented for inclusion into UMTS. But for whatever reason - most likely a bad case of "Not Invented Here" syndrome - ETSI struggled to adopt it. A faction in Europe instead backed an enhanced version of TDMA, Advanced TDMA, and the debate went back and forth.

## No Global 3G

For 3G, the goal had always been to collaborate on a single global standard. That was the initial dream of the United Nations' International Telecommunication Union, or ITU. But GSM's unexpected success changed the dynamics.

Europe wanted a single global standard, but one strongly influenced by them, just like with GSM.

The Americans felt that they had to keep any standard from being entirely dominated by Europe. GSM winning the 2G era had cost many American wireless companies business. And American consumers looked with jealousy upon their European friends' international roaming powers.

Japan needed to get into the mix too, because its own 2G national standard, PDC, left it with a sort of Galapagos Island effect that hurt its companies' competitiveness.

So in late 1997, the ITU chose not to back a single standard but a family of them, all somewhat compatible and called "3G". Each had its own backers racing for adoption by 2000 - hence the name of the IMT-2000 family.

That same year, ETSI selected an air interface standard known as WCDMA out of five candidates for their UMTS system. Developed as a Japan-Europe collaboration,

WCDMA is a CDMA-based technology known for its efficient use of spectrum.

Japan's biggest wireless telecom, NTT DoCoMo, preferred WCDMA because it had limited amounts of spectrum. And since WCDMA had been a joint Japan-Europe effort, the Europeans felt it was enough their own to move forward with.

I need to note that WCDMA is not exactly the same as UMTS. WCDMA specifically refers to the

air interface - that is, how the handset communicates with the base station. But to keep it simple, I shall refer to the two - UMTS and WCDMA - as just WCDMA.

Anyway, to promote WCDMA, in 1998 ETSI joined with several other standards bodies in the US, Japan, Korea, and China to create the Third Generation Partnership Project, or 3GPP. ETSI and the other standards bodies handed WCDMA and the ongoing GSM data extension work - GPRS, EDGE, and so on - over to 3GPP.

## The Patent Fight

Over in the US, the rather litigious Qualcomm soon came to believe that ETSI - working in cahoots with Ericsson and Nokia - was suppressing the Qualcomm flavor of CDMA behind the scenes.

Ericsson and Qualcomm were also then in the midst of a great patent infringement battle dating back to 1996.

In 1998, Qualcomm announced that they would withhold licenses for their patents - despite widely held norms that they should license them - so long as “politics” continued to play a role in 3G standards development. American politicians backed them.

Qualcomm then sponsored the formation of a new competing standards group in late 1998. This group - confusingly named 3GPP2 - continued developing the cdmaOne standard as a North American one, creating CDMA2000.

Now, WCDMA and CDMA2000 aren't all that dissimilar; the differences are largely superficial. For example, they both depend on CDMA - which led to problems, because Qualcomm held a lot of key CDMA patents. By being unwilling to license them, Qualcomm was essentially holding the 3G ecosystem hostage.

This dragged on until Qualcomm and Ericsson finally settled their lawsuits in March 1999 with a broad cross-licensing patent agreement. It came ahead of

a deadline set by the ITU, which found the whole thing appalling.

But Ericsson more likely chose to settle and embrace CDMA because they realized that Qualcomm was about to hit the titanic mainland Chinese market.

Business newspapers reported that the Chinese government told the United States that they would open up certain key markets if the US backed their accession to the World Trade Organization.

This included dividing the old Chinese phone monopoly into three companies - China Mobile, China Telecom and China Unicom - and then having each of the new companies adopt a standard.

China Telecom used the American CDMA2000 standard. This was seen as a major coup.

## Crazy Spectrum

Outside of the patent fights, the early 2000s saw very slow 3G adoption.

Maybe we can blame it on there being two major but incompatible 3G standards. But as I mentioned, the standards aren't all that different. And Nokia and Qualcomm eventually released chipsets supporting both. So there were other issues at play.

There might have been too much hype over a potential 3G rollout - perhaps tied into the larger dotcom and fiber boom going on at the time. Anticipating a great mobile bonanza ahead,

the European governments charged the telecoms a great deal of money for 3G spectrum.

In April 2000, the British government held a 3G spectrum auction that raised some $30 billion from six telecom bidders. Note that this was just for the spectrum - we haven't even considered the cost of building the actual network.

But people at the time were thinking that the 3G industry would be worth $200 billion. Considering that, $5 billion for a single license made sense, right?

More troublingly, the British government pushed the telecoms to pay for their licenses entirely upfront. The telecoms had no choice but to borrow from the financial markets to make those payments, which had severe consequences down the line.

The Germans and French launched similarly massive auctions. Germany sold a set of

licenses for a staggering $45 billion. One went for $7.7 billion alone, won by a joint venture whose partners included Hong Kong-based Hutchison Telecom, run by the cunning Li Ka-shing.

Shortly after that, Hutchison sold their 50% stake in that JV, saying that the price was too high considering that they would be just one of six different telecoms in the German market. Li told reporters:

> The prospect of 3G (third-generation mobile phone networks) is very good but anything, even the best, should have a price in the commercial society ... I don't believe the license price can go up endlessly

Li Ka-shing should know. His empire includes a telecom in

the super-crowded Hong Kong market. When he sells, everyone should pay attention.

Over in France, France Telecom pulled out of the bidding after the French government set a fixed price of $5 billion for its 3G spectrum.

Later auctions in the Netherlands, Switzerland, Italy, and Poland also turned out to be disappointments - reflecting the crash of the telecommunications bubble and the deflation of 3G hype.

And then, when the 3G networks finally went live, they disappointed consumers. People were promised FAST data speeds of 2 megabits per second, but those turned out to be theoretical. The best thing

available in 1999 was EDGE, which only got to 384 kilobits per second. So the customers stayed away.

## Where is the Money?

Sure, the industry players - from the telecoms to the banks issuing debt for them - could upgrade their networks. But what would be the thing or service that got more people to sign up for these pricey 3G services? Nobody knew at the time.

The best value-added data service that early adopters like the telecom 3 in the United Kingdom could come up with was "video telephony". But that required both sides to have a more advanced handset.

Other telecoms focused on offering business services like corporate VPNs, advertising, and facilitating transactions. Industry analysts at the time projected advertising revenue growing from $600 million in 2000 to $31 billion in 2010 - more than 51 times over. Unrealistic by any measure.

One operator joked that 3G stood for "Games, Girls and Gambling".

Another joke went that UMTS really stood for "Unproven Market, Technology and Services".

Jokes aside, at the very heart of it, 3G needed a killer app. Without one, all you were going to get from the telecoms were vague "statements of intent" but no real commitments.

## Then Came the iPhone

And then came the iPhone, the first of a class of smartphones with large screens, full web browsers, and the compute power of a mini-computer.

The first iPhone was a 2G device - presumably for battery life reasons. And in the United States, it launched exclusively on the AT&T mobile network.

It sold fairly well, taking about 74 days to hit a million devices sold. Even considering how restricted the old 2G iPhone was, its users still ate up 15 times more data than before, and 50% more than what AT&T internally projected.

But the real boom came when Apple released its second-generation iPhone 3G and opened up its App Store to third-party apps. The iPhone became the center of its users' lives,

used to check email, send messages, watch unsavory videos, and more.

Plus, AT&T started subsidizing the $599 handset for users who signed up for a two-year contract. After the subsidy, it cost just $199. Nice.

We also had Android, which rose in the iPhone's wake. And since Android was

supported by several manufacturers and not locked to a single carrier, Android devices grew even faster than the Apple device.

## Data Crunch

Adoption, data consumption, and strain on the networks all accelerated.

iPhone users consumed 24 times more data than the average subscriber. Their old-school unlimited data plans - at just $30 a month, no less! - did not help either.

iPhone usage grew data traffic on the AT&T mobile network by 5,000% from 2006 to 2009. From 2007 to 2010, AT&T mobile data volume grew by 8,000%.

And competing US telecom Verizon, a notable backer of Android, found that Android users also consumed several times more data than ordinary phone users.
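Those headline percentages compound into startling annual rates. A quick sketch, reading "grew by 5,000%" as a 51x increase over three years (one plausible interpretation of the figure):

```python
# Annualized growth rate implied by a multi-year percentage jump.
# "Grew by N%" is read here as final/initial = 1 + N/100.
def annualized(total_pct: float, years: int) -> float:
    ratio = 1 + total_pct / 100
    return ratio ** (1 / years) - 1  # fractional growth per year

print(f"{annualized(5000, 3):.0%} per year")  # AT&T traffic, 2006-2009
print(f"{annualized(8000, 3):.0%} per year")  # AT&T volume, 2007-2010
```

Either way you read the figures, the network was being asked to roughly quadruple its data capacity every single year.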

Locked in a race with Google to capture market share, Apple was in no mood to artificially restrain its world-changing product to accommodate AT&T's pleading requests to help its network.

AT&T saw no choice but to invest billions. From 2007 to 2010, they poured a total of $37 billion into network upgrades. Nevertheless, the ATTFAIL hashtag continued to circulate on Twitter.

It got so bad that analysts were out there predicting a "data crunch" apocalypse looming ahead in 2013 - a time when the wireless networks might grind to a halt, rendering all the beautiful smartphones useless.

## Conclusion

The iPhone turned out to be the Killer App that everyone in the telecom industry had been waiting for.

The wireless industry fought for a glorious 3G future. But when that future finally came, it was nothing like what they thought it would be. The industry scrambled to adapt, with lasting effects on the coming 4G wireless era.
