For those who have been working in the field of wireless communications it’s hardly news: but it was entertaining to see PC Magazine declare CES 2011 “The Year Mobile Took Over.” I can understand the North American (or at least Western-World-centric) view of people attending shows like CES, but considering that it’s been almost 10 years since we surpassed the 1 billion mark for worldwide mobile subscriptions, it’s rather amusing to see such declarations. Considering there were only 800 million Internet hosts as opposed to 5.5 billion mobile devices as of the end of 2010, I believe the supremacy of mobile technologies as the prime driver of computing and communications was settled a long time ago.
The ongoing marketing wars in North America over calling various technologies 3G or 4G are another source of amusement. It is quite fascinating how hard the marketing teams at mobile operators work to achieve that coveted 4G designation. I believe that apart from a small group of people who indulge in techno-babble, the majority of the public doesn’t care whether a technology is called third or fourth generation. Ultimately customers pick devices, services, and operators based on their perceived value: the benefits of being connected for the amount of money they spend. However, the 4G designation carries four attributes that eventually make a significant difference in the perceived value a technology provides:
- Peak throughput, the least important of these attributes, yet the one that gets the biggest press
- Quality that determines factors like latency and duration of service interruption due to mobility (impact of handovers)
- Adaptability in various environments (such as indoors, outdoors and high mobility)
- Total system capacity that determines the unit cost per bit (or per voice call minute) delivered to a user
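To illustrate that last attribute, here is a back-of-the-envelope sketch of how total system capacity drives unit cost per bit. All input figures below (network cost, cell capacity, cell count, utilization) are hypothetical illustration values, not numbers from any real operator or technology; the point is only that doubling capacity at fixed cost halves the unit cost.

```python
def cost_per_gigabyte(annual_network_cost_usd, cell_capacity_bps,
                      num_cells, utilization):
    """Unit cost of delivered data: network cost divided by bits carried.

    All arguments are hypothetical illustration values.
    """
    seconds_per_year = 365 * 24 * 3600
    bits_delivered = (cell_capacity_bps * num_cells
                      * utilization * seconds_per_year)
    gigabytes = bits_delivered / 8 / 1e9
    return annual_network_cost_usd / gigabytes

# Same network cost, same cell count and loading; only capacity changes.
base = cost_per_gigabyte(1e9, 15e6, 10_000, 0.10)
doubled = cost_per_gigabyte(1e9, 30e6, 10_000, 0.10)
print(base, doubled)  # doubling capacity halves the cost per gigabyte
```

This is why cell spectral efficiency, not peak throughput, dominates the economics: it sets how many bits a fixed amount of spectrum and infrastructure can actually deliver.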
During the last 15-20 years, ever since 2G systems with data capabilities started appearing in the marketplace, peak throughput has been the prime marketing figure used to explain the prowess of a given wireless technology. Many in the technology press started throwing around numbers like 2 Mb/s or 100 Mb/s as the sole criterion for designating a technology’s generation.
Back in the early 1990s, the International Telecommunication Union (ITU) developed the criteria for IMT-2000. The two widely adopted 3G standards, W-CDMA and CDMA2000, fulfilled the IMT-2000 requirements, including the original throughput objectives (384 kbit/s with full mobility and 2 Mb/s for nomadic use), while making other significant improvements in total system capacity, spectral efficiency, and service quality, ultimately reducing the unit cost significantly.
For the 4G designation (IMT-Advanced), the ITU has adopted a similar set of criteria while setting the bar substantially higher. The following are the essential requirements for a technology to be considered 4G by the ITU, as described in Report ITU-R M.2134:
- Cell spectral efficiency
- Peak spectral efficiency
- Bandwidth
- Cell edge user spectral efficiency
- Latency (control plane and user plane)
- Mobility and handover interruption time
- VoIP capacity
The key downlink thresholds from Report ITU-R M.2134 (base coverage urban environment) are:
- Peak spectral efficiency: 15 bit/s/Hz (assuming 4x4 MIMO)
- Cell spectral efficiency: 2.2 bit/s/Hz
- Cell edge user spectral efficiency: 0.06 bit/s/Hz
- Scalable bandwidth: up to and including 40 MHz
- User plane latency: below 10 ms (control plane below 100 ms)
- VoIP capacity: 40 active users per sector per MHz
Based on these thresholds, none of the wireless technologies as deployed in 2011, including LTE from Verizon and MetroPCS, HSPA+ from AT&T and T-Mobile, and WiMax from Sprint and Clearwire, can be considered 4G. Here is why:
- For HSPA+, the main limitations are spectral efficiency, bandwidth, and latency. As of today, no HSPA+ network has publicly available MIMO-capable devices (HSPA+ category 16). A few Dual Carrier networks (such as Telstra’s, live since mid-2010, and T-Mobile’s, planned for 2011) with category 24 devices provide a peak spectral efficiency of 4.2 bit/s/Hz, with cell spectral efficiency around 1.5 bit/s/Hz (assuming only category 24 devices). Even though the standards (3GPP Release 10) already allow up to 20 MHz of bandwidth, neither AT&T nor T-Mobile has enough suitable spectrum for such large channels in a single band. Instead, as recently proposed by NSN, the HSPA+ standards will be extended to aggregate carriers in two different bands; the feasibility of terminals with such capability certainly remains to be seen. On the latency front, HSPA+ cut user plane latency dramatically, but it is still measured at around 30-40 ms (round-trip delay) on the best networks.
- For LTE, a much newer technology, the designers improved spectral efficiency through heavy use of MIMO. LTE networks have been deployed with 2x2 MIMO from the start, boosting peak spectral efficiency to 7.3 bit/s/Hz (in 10 MHz) with a cell spectral efficiency of around 1.8 bit/s/Hz (10 MHz). Even though LTE is defined for channelizations up to 20 MHz, the 700 MHz band isn’t large enough for such wide channels to be assigned to a single operator. This is probably the major motivation behind AT&T’s recent moves to buy as much lower 700 MHz spectrum as it can. Based on results from Verizon’s network, LTE user plane delay looks comparable to T-Mobile’s HSPA+ network.
- WiMax is fairly comparable to LTE in terms of spectral efficiency and delay, since the two share the same multiple access scheme and use similar modulation and coding schemes. Like LTE, WiMax uses 2x2 MIMO. WiMax’s latency profile should be similar to LTE’s thanks to a flatter network architecture than HSPA+, but recent experience on Sprint’s network (round-trip delays ranging between 100 and 140 ms) tells us it is still quite far from the ITU targets.
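The peak spectral efficiency figures quoted in the bullets above follow directly from dividing peak data rate by channel bandwidth. A minimal sketch of that arithmetic, using the rates and bandwidths discussed above (42 Mb/s in 2 x 5 MHz for Dual Carrier HSPA+ category 24; 73 Mb/s in 10 MHz for LTE with 2x2 MIMO):

```python
def peak_spectral_efficiency(peak_rate_bps, bandwidth_hz):
    """Peak spectral efficiency in bit/s/Hz: peak rate over channel width."""
    return peak_rate_bps / bandwidth_hz

# HSPA+ Dual Carrier, category 24: 42 Mb/s across two 5 MHz carriers.
hspa_plus_dc = peak_spectral_efficiency(42e6, 10e6)   # 4.2 bit/s/Hz

# LTE with 2x2 MIMO: 73 Mb/s in a 10 MHz channel.
lte_2x2 = peak_spectral_efficiency(73e6, 10e6)        # 7.3 bit/s/Hz

print(hspa_plus_dc, lte_2x2)
```

Note how far both figures fall below the 15 bit/s/Hz peak that the ITU criteria demand, which is why wider channels and higher-order MIMO dominate the evolution paths discussed next.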
Even though none of these technologies fulfills the ITU 4G criteria, each has an evolution path to get there:
- HSPA+ is being extended to allow 40 MHz channelization with 4x4 MIMO, which would let it achieve a peak spectral efficiency of 16.8 bit/s/Hz. The details of how to achieve the other targets are part of Release 11, which will be completed by the end of 2012. Assuming an 18-month development cycle, we may see a network compliant with the ITU 4G criteria based on evolved HSPA+ by mid-2014, provided that operators can find 40 MHz of spectrum across two bands and manage to implement 4x4 MIMO successfully in terminals.
- LTE is closer to the 4G thresholds than HSPA+, primarily because LTE was designed to incorporate MIMO from the start. If 4x4 MIMO can be implemented in a terminal form factor to yield an LTE category 5 device, then LTE-Advanced, which allows 40-100 MHz channelization, will bring true 4G capabilities. LTE-Advanced is part of 3GPP Release 10 and will be finalized by June 2011. Assuming an 18-month development cycle, we may see a network compliant with the ITU 4G criteria based on LTE-Advanced by the end of 2012, provided that operators can find 40 MHz of spectrum and a category 5 LTE device (4x4 MIMO) materializes.
- On the WiMax front, IEEE 802.16m development is ongoing. Given the IEEE process, I believe it is at least a year behind LTE-Advanced. The bigger problem for WiMax, however, is commercialization. Considering the exodus of operators from the WiMax camp, I find it very unlikely that we will ever see an 802.16m network that meets the ITU 4G criteria.
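The scaling behind these evolution paths can be sketched with simple arithmetic: adding carriers widens the bandwidth (raising the peak rate but not the spectral efficiency), while additional MIMO layers raise both. Taking single-carrier HSPA+ with 2x2 MIMO at 42 Mb/s in 5 MHz as the baseline, the 16.8 bit/s/Hz figure above falls out of eight carriers (40 MHz) plus 4x4 MIMO:

```python
def peak_rate_mbps(base_rate_mbps, carriers, mimo_layers, base_layers=2):
    """Scale a baseline peak rate by carrier count and MIMO order.

    Idealized linear scaling; real gains depend on channel conditions.
    """
    return base_rate_mbps * carriers * mimo_layers / base_layers

# Evolved HSPA+: 8 carriers (8 x 5 MHz = 40 MHz) with 4x4 MIMO,
# starting from 42 Mb/s in 5 MHz with 2x2 MIMO.
rate = peak_rate_mbps(42, carriers=8, mimo_layers=4)  # 672 Mb/s
efficiency = rate / (8 * 5)                           # 16.8 bit/s/Hz

print(rate, efficiency)
```

The same arithmetic shows why the carrier-aggregation half of the plan is the easy part: only the MIMO order moves the spectral efficiency, and 4x4 MIMO in a handset form factor is the open question for both HSPA+ and LTE.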
Going back to where we started: customers expect great connectivity for the money they pay wireless operators, and wireless technologies are tools for achieving that objective. The three technologies we analyzed (HSPA+, LTE, WiMax) are pretty similar in their current capabilities, and all fall short of the ITU 4G criteria.
It is unlikely that WiMax will continue to evolve; instead it will slowly stagnate, much like CDMA2000. LTE, as the most widely adopted wireless technology standard, will flourish and become the foundational wireless technology for the next 20 years. However, deep adoption of LTE will take significant time. HSPA+, on the other hand, will be the workhorse technology of this decade. It will eventually grab the coveted ITU 4G designation, but even without that designation it will continue to carry the majority of wireless broadband connections throughout the decade.