There is no 4G in the USA (or anywhere else in the world) yet, and why the earliest 4G network is two years away

For those who have been working in the field of wireless communications it's hardly news, but it was entertaining to see PC Magazine declare CES 2011 "The Year Mobile Took Over." I can understand the North American (or at least Western-world) centric view of people attending shows like CES, but considering that it has been almost ten years since we surpassed the one-billion mark for worldwide mobile subscriptions, it's rather amusing to see such declarations. With only 800 million Internet hosts as opposed to 5.5 billion mobile devices as of the end of 2010, I believe the supremacy of mobile technologies as the prime driver of computing and communications was settled a long time ago.

The ongoing marketing wars in North America over which technologies get to be called 3G or 4G are another source of amusement. It is quite fascinating how hard the marketing teams at mobile operators work to achieve that coveted 4G designation. Apart from a small group of people indulging in techno-babble, the majority of the public doesn't care whether a technology is called third or fourth generation. Ultimately customers pick devices, services, and operators based on their perceived value: the benefits of being connected for the amount of money they spend. However, the 4G designation carries four attributes that eventually make a significant difference in the perceived value a technology provides:

  • The least important of these attributes is peak throughput, which nevertheless gets the biggest press
  • Quality that determines factors like latency and duration of service interruption due to mobility (impact of handovers)
  • Adaptability in various environments (such as indoors, outdoors and high mobility)
  • Total system capacity, which determines the unit cost per bit (or per voice call minute) delivered to a user (see the quick illustration after this list)
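
To make the last attribute concrete, here is a minimal sketch of how average cell spectral efficiency drives the unit cost per bit. Every number in it (site cost, load factor, bandwidth) is a hypothetical assumption chosen for illustration, not an operator figure:

```python
# Hypothetical illustration: why total system capacity, not peak
# throughput, sets the unit cost of delivered data.

SITE_COST_PER_MONTH = 5_000.0   # assumed all-in monthly cost of one cell site ($)
SECONDS_PER_MONTH = 30 * 24 * 3600

def cost_per_gigabyte(cell_spectral_efficiency, bandwidth_mhz, load=0.3):
    """Approximate unit cost given an average cell spectral efficiency.

    cell_spectral_efficiency: average bit/s/Hz/cell actually achieved
    bandwidth_mhz: channel bandwidth in MHz
    load: assumed fraction of capacity carrying traffic
    """
    avg_throughput_bps = cell_spectral_efficiency * bandwidth_mhz * 1e6 * load
    gigabytes_per_month = avg_throughput_bps * SECONDS_PER_MONTH / 8 / 1e9
    return SITE_COST_PER_MONTH / gigabytes_per_month

# Doubling average cell spectral efficiency halves the cost of every GB:
print(f"1.5 bit/s/Hz: ${cost_per_gigabyte(1.5, 10):.2f}/GB")
print(f"3.0 bit/s/Hz: ${cost_per_gigabyte(3.0, 10):.2f}/GB")
```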

For the last 15-20 years, ever since 2G systems with data capabilities started appearing in the marketplace, peak throughput has been the prime marketing figure used to explain the prowess of a given wireless technology. Many in the technology press started throwing around numbers like 2 Mb/s or 100 Mb/s as if peak throughput were the sole criterion for a generation designation.

Back in the early 1990s, the International Telecommunication Union (ITU) developed the criteria for IMT-2000. The two widely adopted 3G standards, W-CDMA and CDMA2000, fulfilled the IMT-2000 requirements, including the original throughput objectives (384 kbit/s at full mobility and 2 Mb/s for nomadic use), while making significant improvements in total system capacity, spectral efficiency, and service quality, and ultimately reducing the unit cost substantially.

For the 4G designation (IMT-Advanced) the ITU adopted a similar set of criteria while setting the bar substantially higher. The following are the essential requirements for a technology to be considered 4G by the ITU, as described in Report ITU-R M.2134:

  1. Cell spectral efficiency
  2. Peak spectral efficiency
  3. Bandwidth
  4. Cell edge user spectral efficiency
  5. Latency
  6. Mobility
  7. Handover
  8. VoIP capacity

A more detailed description of the 4G criteria and their specific thresholds is summarized below (values per Report ITU-R M.2134; cell-level figures are for the base coverage urban test environment):

  • Peak spectral efficiency: 15 bit/s/Hz downlink (4*4 MIMO), 6.75 bit/s/Hz uplink (2*4)
  • Cell spectral efficiency: 2.2 bit/s/Hz/cell downlink, 1.4 bit/s/Hz/cell uplink
  • Cell edge user spectral efficiency: 0.06 bit/s/Hz downlink, 0.03 bit/s/Hz uplink
  • Bandwidth: scalable, up to and including 40 MHz (extensions up to 100 MHz encouraged)
  • Latency: control plane under 100 ms, user plane under 10 ms
  • Mobility: support for speeds up to 350 km/h
  • Handover interruption time: 27.5 ms intra-frequency, 40 ms inter-frequency within a band, 60 ms between bands
  • VoIP capacity: 40 active users per sector per MHz (base coverage urban)
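
To make the criteria concrete, here is a minimal sketch that checks a network's downlink figures against the thresholds above. The sample LTE numbers are illustrative 2011-era values under my own assumptions (the cell edge and latency entries in particular), not measurements:

```python
# Check a candidate network against the ITU-R M.2134 downlink
# thresholds listed above (base coverage urban where applicable).

IMT_ADVANCED_DL = {
    "peak_spectral_efficiency":      15.0,  # bit/s/Hz (4*4 MIMO)
    "cell_spectral_efficiency":      2.2,   # bit/s/Hz/cell
    "cell_edge_spectral_efficiency": 0.06,  # bit/s/Hz
    "bandwidth_mhz":                 40.0,  # scalable, up to 40 MHz
}
MAX_USER_PLANE_LATENCY_MS = 10.0            # one-way

def missed_criteria(network):
    """Return the list of criteria the network fails to meet."""
    missed = [k for k, threshold in IMT_ADVANCED_DL.items()
              if network.get(k, 0.0) < threshold]
    if network.get("user_plane_latency_ms", float("inf")) > MAX_USER_PLANE_LATENCY_MS:
        missed.append("user_plane_latency_ms")
    return missed

# Illustrative 2011-era LTE deployment (10 MHz, 2*2 MIMO):
lte_2011 = {
    "peak_spectral_efficiency": 7.3,
    "cell_spectral_efficiency": 1.8,
    "cell_edge_spectral_efficiency": 0.05,  # assumed
    "bandwidth_mhz": 10.0,
    "user_plane_latency_ms": 20.0,          # assumed one-way value
}
print(missed_criteria(lte_2011))  # it misses every one of these criteria
```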

Based on these thresholds, none of the wireless technologies as deployed in 2011, including LTE from Verizon and MetroPCS, HSPA+ from AT&T and T-Mobile, and WiMax from Sprint and Clearwire, can be considered 4G. Here is why:

  • For HSPA+, the main limitations are spectral efficiency, bandwidth, and latency. As of today, no HSPA+ network is paired with publicly available MIMO-capable devices (HSPA+ category 16). The few Dual Carrier networks (such as Telstra's since mid-2010, with T-Mobile's planned for 2011) with category 24 devices provide a peak spectral efficiency of 4.2 bit/s/Hz, while cell spectral efficiency is around 1.5 bit/s/Hz (assuming only category 24 devices); the arithmetic is sketched after this list. Even though the standards (3GPP Release 10) already allow up to 20 MHz of bandwidth, neither AT&T nor T-Mobile has enough suitable spectrum for such large channels in a single band. Instead, as recently proposed by NSN, the HSPA+ standards will be extended to aggregate carriers in two different bands; the feasibility of terminals with that capability remains to be seen. On the latency front, HSPA+ cut user plane latency dramatically, but it is still measured at around 30-40 ms (round-trip delay) on the best networks.
  • LTE is a much newer technology, and its designers improved spectral efficiency through heavy use of MIMO: LTE networks have been deployed with 2*2 MIMO from the start. This boosts peak spectral efficiency to 7.3 bit/s/Hz, while cell spectral efficiency is around 1.8 bit/s/Hz (both in 10 MHz). Even though LTE is defined for channelizations up to 20 MHz, the 700 MHz band isn't large enough for such wide channels to be assigned to a single operator. This is probably the major motivation behind AT&T buying as much lower 700 MHz spectrum as it can in recent months. Based on results from the Verizon network, LTE user plane delay looks comparable to T-Mobile's HSPA+ network.
  • WiMax is fairly comparable to LTE in terms of spectral efficiency and delay capabilities, since they share the same multiple access scheme and use similar modulation and coding schemes. Like LTE, WiMax uses 2*2 MIMO. The WiMax latency profile should be similar to LTE's, thanks to a flatter network architecture than HSPA+'s, but recent experience on the Sprint network (round-trip delays ranging between 100 and 140 ms) tells us it is quite far from the ITU targets.
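
The peak figures quoted above follow from simple arithmetic: peak data rate divided by channel bandwidth. A quick sanity check (the 73 Mbit/s LTE rate is implied by 7.3 bit/s/Hz in 10 MHz; these are the figures from the bullets, not new measurements):

```python
# Sanity check of the peak spectral efficiency figures quoted above:
# peak data rate (Mbit/s) divided by channel bandwidth (MHz).

def peak_spectral_efficiency(peak_rate_mbps, bandwidth_mhz):
    """Peak spectral efficiency in bit/s/Hz."""
    return peak_rate_mbps / bandwidth_mhz

# HSPA+ Dual Carrier, category 24: 42 Mbit/s over 2 x 5 MHz carriers
print(peak_spectral_efficiency(42, 10))   # -> 4.2 bit/s/Hz

# LTE with 2*2 MIMO in 10 MHz: 73 Mbit/s
print(peak_spectral_efficiency(73, 10))   # -> 7.3 bit/s/Hz
```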

Even though none of these technologies fulfills the ITU 4G criteria, each one has an evolution path to get there:

  • HSPA+ is being extended to allow 40 MHz channelization with 4*4 MIMO capability, which would let it achieve a peak spectral efficiency of 16.8 bit/s/Hz (see the check after this list). Details on how to achieve the other targets are all part of Release 11, which will be completed by the end of 2012. Assuming an 18-month development cycle, we may see a network compliant with the ITU 4G criteria based on evolved HSPA+ by mid-2014, provided that operators can find 40 MHz of spectrum across two bands and manage to implement 4*4 MIMO successfully in terminals.
  • LTE is closer to the 4G thresholds than HSPA+, primarily because LTE was designed to incorporate MIMO from the start. If 4*4 MIMO can be implemented in a terminal form factor to produce an LTE category 5 device, then LTE-Advanced, which allows 40-100 MHz channelization, will bring true 4G capabilities. LTE-Advanced is part of 3GPP Release 10 and will be finalized by June 2011. Assuming an 18-month development cycle, we may see a network compliant with the ITU 4G criteria based on LTE-Advanced by the end of 2012, provided that operators can find 40 MHz of spectrum and get a category 5 LTE device (4*4 MIMO).
  • On the WiMax front, IEEE 802.16m development is ongoing. Considering the IEEE process, I believe it is at least a year behind LTE-Advanced. The bigger problem for WiMax, however, is commercialization. Considering the exodus of many operators from the WiMax camp, I find it very unlikely that we will ever see a network compliant with the ITU 4G criteria based on 802.16m.
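
For reference, here is where these evolution paths land against the ITU downlink peak target. The Release 11 HSPA+ rate follows from the 16.8 bit/s/Hz figure above; the LTE-Advanced rate is my assumed ballpark for a 4*4 configuration in 40 MHz, not a standardized figure:

```python
# Compare the evolution paths against the ITU-R M.2134 downlink peak
# spectral efficiency target of 15 bit/s/Hz (see the table above).

ITU_DL_PEAK_TARGET = 15.0  # bit/s/Hz

candidates = {
    # Release 11 HSPA+: 672 Mbit/s in 40 MHz (8 carriers, 4*4 MIMO)
    "HSPA+ Release 11 (40 MHz, 4*4)": 672 / 40,
    # LTE-Advanced: assumed ~600 Mbit/s with 4*4 MIMO in 40 MHz
    "LTE-Advanced (40 MHz, 4*4)": 600 / 40,
}

for name, efficiency in candidates.items():
    verdict = "meets" if efficiency >= ITU_DL_PEAK_TARGET else "misses"
    print(f"{name}: {efficiency:.1f} bit/s/Hz -> {verdict} the target")
```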

Going back to where we started: customers expect great connectivity for the money they pay to wireless operators, and wireless technologies are tools to achieve this objective. The three technologies we analyzed (HSPA+, LTE, WiMax) are pretty similar in terms of current capabilities, and they all fall short of the ITU 4G criteria.

It is unlikely that WiMax will continue to evolve; instead it will slowly stagnate, much like CDMA2000. LTE, as the most widely adopted wireless technology standard, will flourish and become the foundational wireless technology for the next 20 years. However, deep adoption of LTE will take significant time. HSPA+, on the other hand, will be the workhorse technology for this decade and will eventually grab the coveted ITU 4G designation. Even without that designation, though, it will continue to carry the majority of wireless broadband connections throughout this decade.


3 Responses to There is no 4G in the USA (or anywhere else in the world) yet, and why the earliest 4G network is two years away

  1. ric caselli says:

    Very informative assessment of the state of the art. However, does it matter? I see other issues actually decreasing the performance of the network. Pilot pollution and tower hopping are getting worse in high-rise environments. I surveyed a location in New York last month where, with a -60 dBm signal level, you could not make a call.
    Femtocells also often fail to deliver due to inadequate or missing broadband connections (in rural areas, for example).
    I would like to know how you see the evolution of mobile technology in the enterprise, and whether 802.21 is going to be a possible alternative to repeaters.

    • wirelesse2e says:

      Ric,

      Thank you for your comments. Does it really matter that there is no 4G anywhere yet? As I tried to explain in the original post, the generation number doesn't matter, but the attributes required to attain the stamp of a newer generation translate into price/performance improvements for end users. I believe that ultimately impacts a very big business (over $150B per year in the USA alone).

      Indoor locations (especially tall buildings) are more susceptible to the situation where there is no clear cell a device can use. If -60 dBm is the combined signal from five roughly equal cells, it is not much use; a rough calculation follows below. In GSM this could be resolved with frequency planning. In (W)CDMA, things are different due to the frequency reuse of 1. I believe an operator that is rich in spectrum can rely on indoor DAS or femtocell deployments on a distinct carrier, provided the DAS or femto serves only a closed user group. In parallel, I am sure constant network optimization (antenna tilt adjustments, correcting neighbor cell lists) as well as the use of repeaters help. The difficult part is that such problems only start appearing once networks get loaded.
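
      As a rough sketch of that arithmetic (assuming equal-power cells, a pilot carrying about 10% of each cell's power, and a rule-of-thumb usability threshold around -15 dB Ec/Io, none of which are standardized values):

      ```python
      # Why a strong total signal can still be unusable under pilot
      # pollution: the serving pilot drowns in the other cells' power.
      import math

      def cpich_ec_io_db(num_equal_cells, pilot_fraction=0.10):
          """Approximate serving-cell pilot Ec/Io (dB) when several
          cells are received at equal power."""
          return 10 * math.log10(pilot_fraction / num_equal_cells)

      for n in (1, 3, 5):
          print(f"{n} equal cells: Ec/Io = {cpich_ec_io_db(n):.1f} dB")
      # 5 equal cells -> about -17 dB, below the ~-15 dB usability
      # threshold, so calls fail despite a healthy -60 dBm RSSI.
      ```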

      Enterprise deployments come in two types:
      1- The enterprise owns the radio network (the typical femtocell deployment)
      2- The operator owns and operates the radio network

      Model 1 assumes the enterprise provides the broadband backhaul, pays for the NodeB, and has the right to restrict service to a closed user group. I believe this model will require a dedicated carrier instead of macro cell frequencies, to resolve near-far interference scenarios.

      Model 2 is the traditional solution that the operator owns, designs, and deploys. In this case the operator can decide on the form factor (femto, pico, DAS). Furthermore, the operator can either put in new Ethernet backhaul or lease the enterprise owner's IP transport facilities. The essential point in this model is that the radio network will be open to all authorized users of the operator's larger network. This model doesn't require a dedicated carrier and, more importantly, it will help solve the pilot pollution problem.

      Success for 802.21 seems to be closely tied to the adoption of WiMax; many of its original promoters were from the WiMax (WiBro) community. On the other hand, 3GPP came up with IP Flow Mobility (IFOM). Considering that IFOM relies on DS-MIPv6 while 802.21 assumes MIPv4 and MIPv6 for L3 mobility, there is probably not a substantial difference. However, I believe 3GPP's approach is more sound: rely on the most generic Mobile IP mechanism, avoid dealing with layer 2 mobility issues for heterogeneous networks, and leave those to implementation innovation. Take a look at SNR Labs (www.snrlabs.com); they seem to be on the right track.

      Thanks,
      Murat

  2. Ben says:

    I received a letter from my carrier, AT&T, last week informing me in no uncertain terms that I am on a 4G network. That was all the letter said: 4G. Yup, 4G is here right now. The marketing people jumped the gun (I think it was Sprint who called it first), but for all intents and purposes, 4G has arrived. Now, what shall we call the "real 4G"? 5G maybe?
