
HSDPA Throughput: Do Today’s Devices Really Perform?

January 2007
Part Number 79-000781 Rev. 0 0107


Spirent Communications, Inc.
15200 Omega Drive
Rockville, MD 20850 USA

Spirent Communications
541 Industrial Way West
Eatontown, NJ 07724 USA
T: +1 732.544.8700

Email: sales-spirent@spirent.com
Web: www.spirent.com

North America
T: +1 800.927.2660

Europe, Middle East, Africa
T: +33 1 6137.2250

Asia Pacific
T: +852 2511-3822

Copyright
© 2007 Spirent Communications, Inc. All Rights Reserved.
All of the company names and/or brand names and/or product names referred to in this
document, in particular, the name “Spirent” and its logo device, are either registered
trademarks or trademarks of Spirent plc and its subsidiaries, pending registration in
accordance with relevant national laws. All other registered trademarks or trademarks are
the property of their respective owners.
The information contained in this document is subject to change without notice and does
not represent a commitment on the part of Spirent Communications. The information in
this document is believed to be accurate and reliable; however, Spirent Communications
assumes no responsibility or liability for any errors or inaccuracies that may appear in the
document.

Contents
Introduction
Why Conduct Data Throughput Performance Testing?
Physical Layer Data Throughput Testing
Application Layer Data Throughput Testing
Conclusions
References


Introduction
Today’s mobile applications and services require much more from a mobile device than
the ability to make a voice call. From location-based services to video streaming, these
applications generally have one requirement in common: higher bandwidth. To meet the
need for increased bandwidth, most 3G network operators are turning to the High-Speed
Downlink Packet Access (HSDPA) enhancements from Release 5 of the 3GPP’s
WCDMA / UMTS network specifications.

These enhancements are based on the concept of an optimized “shared” data-pipe. Performance improvements result from the use of adaptive modulation and rate control techniques that depend heavily on UE interaction and feedback. HSDPA allows more efficient use of network resources to maximize total aggregate throughput. These carefully calculated resource trade-offs are intended to enable optimum performance on not just one but all UEs in the network.

More than ever before, thorough testing of these devices is critical to ensuring they will
not adversely impact either the network or the subscriber’s quality of experience (QoE)
of new applications and services. The 3GPP has developed test standards to establish a
UE’s conformance to the requirements of the specifications. However, while
conformance testing may establish a common minimum performance baseline, it does not
provide true real-world characterization metrics that can be used to further optimize both
the network and ultimately the end-user’s QoE.

The key performance metric for HSDPA is data throughput, which is highly dependent
on the RF multipath and interference environment experienced by a UE. While the 3GPP
conformance test specifications include a test that addresses data throughput scenarios,
this test uses a Fixed Reference Channel (FRC) with specified fading and noise. Under
these conditions, the performance of the UE is measured only at the physical layer, and
the test produces a single data point for each set of conditions. While this data point may
enable a simple pass/fail diagnosis, it indicates very little about how well the UE is
performing from an end-user’s perspective.

To truly understand UE performance, additional data throughput testing needs to be conducted, including testing at the application layer. This paper uses a Spirent APEX UMTS Data Performance test system (high-level architecture shown in Figure 1) to analyze the data throughput performance of multiple Category 6 (CAT 6) UEs in the presence of varying RF fading and noise conditions at both the physical and application layers. Comprehensive analysis of the performance test results is carried out to uncover important differences that can significantly impact the network and subscriber QoE.

Figure 1 – Test System Architecture. The HSDPA network emulator’s transmitter output (Ior) is routed through a channel emulator, which applies the fading and interference conditions to produce the signal (Io) delivered to the UE under test; the UE’s uplink connects back to the network emulator’s receiver.


Why Conduct Data Throughput Performance Testing?


When measured in the presence of multipath fading and noise, the data throughput rate is
a fundamental indication of a UE’s HSDPA performance. Since throughput performance
is also a key enabler for the successful launch of new data-centric applications, network
operators and UE manufacturers need to conduct this testing to fully maximize their
return on investment. Failure to find a UE performance flaw prior to launch could end up
costing all involved parties dearly.

Given the highly dynamic configuration of an HSDPA channel, the UE will constantly compete for resources on that shared channel. The data rate will thus continuously vary as a function of the channel quality seen by the UE and will also be affected by the network
configuration. Given the adaptive nature of the technology, operators and UE
manufacturers should conduct testing under a wide range of varying configurations to
ensure the UE, network and application all work together and perform as designed.
While metrics such as data throughput qualify performance, it is QoE, the subscriber’s
perception of performance, that ultimately determines acceptance of the device or
application.

3GPP Release 5 specifies twelve categories for HSDPA UEs. Category 12 (CAT 12)
UEs were used on most early HSDPA deployments. CAT 12 UEs support up to 1.8
Mbps using a Quadrature Phase Shift Keying (QPSK) modulation scheme.
Comprehensive testing of these devices reveals some performance differences between
UEs. However, the less complex modulation scheme combined with the number of
available HS codes typically offers additional coding protection, resulting in an adequate
performance margin.

Observed performance differences between UEs are bound to increase as potential throughput rates increase, especially when combined with the more complex 16-Quadrature Amplitude Modulation (16-QAM) scheme supported by Category 6 (CAT 6)
UEs. Factors such as the Channel Quality Indicator (CQI) algorithm and its estimate of
channel quality will ultimately result in a modulation trade-off analysis (QPSK vs. 16-
QAM) which will have a direct impact on the throughput capabilities of the UE. A
poorly implemented CQI algorithm on one model of UE can have a significant adverse
impact on the efficient allocation of network resources, which will affect all other UEs on
the network. As CAT 6 UEs appear commercially in ever-greater numbers, it is the
performance of this UE category that is currently of greatest interest.

Physical Layer Data Throughput Testing


CAT 6 UEs support data rates up to 3.6 Mbps. There are several contributing factors to
this higher data rate capability including a larger available Transport Block Set (TBS)
size, an increase in the number of soft bits and, most significantly, the ability to support
16-QAM modulation. While these improvements may provide real value for the end-
user, they further increase the complexity in developing valid test methodologies.
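
As a quick sanity check on these headline rates (our own back-of-the-envelope arithmetic, with maximum transport block sizes quoted from the UE capability tables of 3GPP TS 25.306 and worth verifying there), the peak rate is simply the largest transport block a category can receive in one 2 ms TTI divided by that TTI:

```latex
% Illustrative peak-rate arithmetic (ours); maximum TBS values are those
% listed for HSDPA categories 6 and 12 in 3GPP TS 25.306.
\[
R_{\mathrm{CAT\,6}} \approx \frac{7298\ \text{bits}}{2\ \text{ms}} \approx 3.6\ \text{Mbps},
\qquad
R_{\mathrm{CAT\,12}} \approx \frac{3630\ \text{bits}}{2\ \text{ms}} \approx 1.8\ \text{Mbps}
\]
```

Every result below these ceilings therefore reflects the channel, the scheduler configuration or the UE implementation rather than the category limit itself.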

Testing at the physical layer is a good starting point for initial performance evaluation of
a UE or application. Figure 2 shows the results of physical layer downlink data
throughput tests conducted on four CAT 6 HSDPA UEs under two sets of static channel
conditions.


Figure 2 – CAT 6 UE Physical Layer Data Throughput Results (Under Static Conditions). The chart plots data throughput (Kbps) for UEs A–D under two static channel conditions, HS-PDSCH = -3 dB / Ior = -60 dBm and HS-PDSCH = -6 dB / Ior = -80 dBm. It highlights a reduction in maximum throughput of more than 30% on all CAT 6 UEs tested under the weakened conditions, and throughput differences greater than 10% under weakened conditions when UE B is compared to UE D. Configuration: HS-PDSCH = -3 or -6 dB, Ior = -60 or -80 dBm (no noise); TTI = 1, # of H-ARQ = 6, CQI fixed at 22, TBS set per TS 25.214.

The data throughput results shown in Figure 2 were obtained under both favorable and
weakened static channel conditions, with the HSDPA channel configured to allow for
maximum throughput (TTI = 1, number of HARQ processes = 6, and CQI fixed at 22).

The results indicate that under favorable conditions (HS-PDSCH = -3 dB / Ior = -60
dBm) all four UEs perform at close to the maximum expected data throughput rate of 3.6
Mbps. However, under weakened channel conditions (HS-PDSCH = -6 dB / Ior = -80
dBm) there is a significant reduction (> 30% on all of the UEs) in data throughput.

To understand the reasons for this, it is important to remember that the UE must rely on
16-QAM modulation and demodulation to obtain the maximum data throughput rate. 16-
QAM is more complex to successfully decode than the QPSK modulation used by CAT
12 UEs. CAT 6 UE performance is also influenced by code power reduction; these UEs
have little available margin before the median CQI drops below the desired maximum
value of 22.
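
A standard textbook argument (ours, not the paper’s) makes the decoding penalty concrete: packing 16 constellation points into the same average symbol energy Es shrinks the minimum distance between points, so 16-QAM needs roughly 7 dB more symbol energy than QPSK for a comparable raw symbol error rate:

```latex
% Textbook approximation (ours), ignoring coding and HARQ combining:
% minimum constellation distances at equal average symbol energy E_s.
\[
d_{\min}^{\mathrm{QPSK}} = \sqrt{2E_s},
\qquad
d_{\min}^{\mathrm{16QAM}} = \sqrt{\tfrac{2E_s}{5}},
\qquad
\left(\frac{d_{\min}^{\mathrm{QPSK}}}{d_{\min}^{\mathrm{16QAM}}}\right)^{2} = 5 \approx 7\ \text{dB}
\]
```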

Significant variation is apparent between UEs under weakened static conditions. While
physical data throughput may provide an indication of UE performance, it does not
isolate the root causes of poor performance. For this, additional metrics such as the
Median CQI or the Mac-HS Statistics are helpful. Figure 3 shows a histogram of a
typical CQI distribution (Median = 16) under a standard fading profile, recorded during a
standard data throughput test. A histogram that does not approximate a bell-shaped curve
could indicate an issue with the UE’s CQI algorithm or with its receiver.
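
For readers who want to run the same check on their own logs, a minimal post-processing sketch (our illustration, not part of the APEX system; the sample values are hypothetical) is shown below. It computes the median CQI and the relative frequency of each CQI value for a Figure 3-style histogram.

```python
# Minimal post-processing sketch (ours, not part of the APEX system):
# compute the median reported CQI and the relative frequency of each
# CQI value (0-30) for a Figure 3-style histogram.
from collections import Counter
from statistics import median

def cqi_histogram(cqi_reports):
    """Return (median CQI, {CQI value: fraction of reports})."""
    counts = Counter(cqi_reports)
    total = len(cqi_reports)
    return median(cqi_reports), {cqi: counts.get(cqi, 0) / total for cqi in range(31)}

# Hypothetical logged values, for illustration only:
med, dist = cqi_histogram([14, 15, 15, 16, 16, 16, 16, 17, 17, 18])
print(med)        # 16
print(dist[16])   # 0.4
```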


Figure 3 – CAT 6 UE CQI Histogram Example, Under a Standard Fading Profile (Median CQI = 16). The histogram shows the percentage of CQI reports at each value from 0 to 30 for a typical CQI distribution with a median of 16.

Mac-HS Statistics can also play a critical role in determining the root cause of poor
performance. Table 1 contains the Mac-HS Statistics for both static tests shown in Figure
2.

UE Under Test    Throughput (Kbps)    Median CQI    ACK %    NACK %    Stat DTX %

Static Conditions – HS-PDSCH = -3 dB / Ior = -60 dBm (favorable conditions)
A                3583.76              24            99.99    0.1       0
B                3575.08              24            99.75    0.25      0
C                3583.99              24            100      0         0
D                3583.99              24            100      0         0

Static Conditions – HS-PDSCH = -6 dB / Ior = -80 dBm (weakened conditions)
A                2402.26              20            67.03    32.97     0
B                2223.86              21            62.05    37.95     0
C                2459.44              21            68.6     31.4      0
D                2507.98              20            69.97    30.03     0

Table 1 – Mac-HS Statistics (ACK / NACK / Stat DTX Performance) for CAT 6 UEs

While all the UEs experienced a drop in physical layer throughput under weakened channel conditions, further investigation shows that they did not all report the same channel quality. Although two UEs obtained a Median CQI of 21, a higher CQI
does not guarantee a higher throughput rate. For example, while UE B reports a higher
channel quality, it is experiencing a higher percentage of NACKs. This results in a
throughput rate more than 10% lower than that of UE D, which reports a Median CQI of
only 20. The conclusion: UE B is requesting network resources that it does not use very
effectively, resulting in an inefficient resource allocation scenario that can adversely
impact the entire network.
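
A first-order model (our simplification, ignoring HARQ soft-combining gain and scheduler behaviour) makes the effect visible: the data actually delivered is roughly the rate implied by the requested CQI scaled by the fraction of transport blocks acknowledged. The nominal rates below are hypothetical, chosen only to mirror the trend in Table 1.

```python
# First-order illustration (ours): delivered rate ~ nominal rate for the
# requested CQI x fraction of transport blocks acknowledged. This ignores
# HARQ soft-combining gain and scheduler behaviour.
def effective_throughput_kbps(nominal_kbps: float, ack_fraction: float) -> float:
    return nominal_kbps * ack_fraction

# Hypothetical nominal rates (not from the paper), chosen to mirror Table 1:
ue_b = effective_throughput_kbps(3600, 0.62)  # higher requested rate, more NACKs
ue_d = effective_throughput_kbps(3400, 0.70)  # lower requested rate, fewer NACKs
print(f"UE B ~{ue_b:.0f} kbps, UE D ~{ue_d:.0f} kbps")  # UE D wins despite its lower CQI
```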


It is important to note that physical layer testing is conducted using a fixed CQI value,
ignoring the actual CQI reported by the UE. This allows an “apples-to-apples”
comparison across various UEs by removing the CQI algorithm from consideration.
However, since for CAT 6 UEs the CQI value determines the modulation scheme, great
care must be taken when choosing the fixed value used in testing to ensure its potential
impact on the results is clearly understood. For example, under certain channel
conditions, the fixed CQI value used in a test could result in 16-QAM modulation used
for the duration of the test when the actual CQI value reported by the UE indicates the
use of QPSK.

Naturally, this scenario differs from a live network, which would listen and respond to
the CQI reports supplied by the UE and dynamically change the resources allocated to
that UE based on the reported channel conditions. For CAT 6 UEs, this would include a
switch from 16-QAM to QPSK modulation if the reported CQI drops below 15.
Performance on a live network can also be affected by differences in CQI algorithm
implementations between UE manufacturers. A fixed CQI test methodology does not
take into account the dynamic resource allocation nor the contribution that CQI algorithm
variations can have on UE performance in a live network.
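
The sketch below (our illustration) contrasts the two behaviours. The threshold of 15 follows this paper’s statement about CAT 6 fallback; the authoritative CQI-to-transport-format mapping is in the CQI tables of TS 25.214.

```python
# Illustrative sketch (ours) of the distinction being drawn between a live
# scheduler and a fixed-CQI test. The threshold of 15 follows the paper's
# statement about CAT 6 fallback to QPSK.
def live_network_modulation(reported_cqi: int) -> str:
    """A live scheduler follows the UE's CQI reports."""
    return "QPSK" if reported_cqi < 15 else "16-QAM"

def fixed_cqi_test_modulation(fixed_cqi: int, reported_cqi: int) -> str:
    """A fixed-CQI test ignores the UE's reports entirely."""
    return "QPSK" if fixed_cqi < 15 else "16-QAM"

# The UE reports CQI 12 (poor channel), but a test fixed at CQI 22 keeps 16-QAM:
print(live_network_modulation(12))        # QPSK
print(fixed_cqi_test_modulation(22, 12))  # 16-QAM
```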

It should be clear by now that conducting data throughput testing only at the physical
layer with a standards-based fixed CQI test methodology is insufficient and may lead to
an inaccurate picture of actual UE performance on a live network. To truly determine the
performance of a new application or UE, application layer data throughput tests should be
conducted. The result of these tests is also referred to as goodput, the measure of the data
throughput actually available to a mobile application.

Application Layer Data Throughput Testing


Testing at the application layer is more representative of the performance of a UE from
an end-user’s perspective. However, application layer testing also introduces new test
methodology concerns. For example, what should the Radio Link Control (RLC)
Window size be, what transfer protocol does the application require and how will that
impact throughput performance?
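
As an illustration of what an application layer (goodput) measurement amounts to, the sketch below times an FTP download and reports the delivered rate in kbps. The server address, credentials and file name are placeholders, and this is our minimal example rather than the measurement method used by the APEX system.

```python
# Minimal application layer (goodput) sketch (ours): time an FTP download
# through the device's data connection and report the delivered rate.
# Host, credentials and file name below are placeholders.
import time
from ftplib import FTP

received = 0
def count_bytes(chunk: bytes) -> None:
    global received
    received += len(chunk)

ftp = FTP("192.168.1.10")      # test server behind the network emulator (assumed address)
ftp.login("test", "test")
start = time.monotonic()
ftp.retrbinary("RETR payload_5MB.bin", count_bytes)
elapsed = time.monotonic() - start
ftp.quit()
print(f"FTP goodput: {received * 8 / elapsed / 1000:.1f} kbps over {elapsed:.1f} s")
```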

Figure 4 shows the results of application layer downlink data throughput tests conducted
using the File Transfer Protocol (FTP) for the same four CAT 6 UEs under favorable
channel conditions with four standard fade models applied: ITU Pedestrian A Speed
3km/h (PA3), ITU Pedestrian B Speed 3km/h (PB3), ITU Vehicular A Speed 30km/h
(VA30) and ITU Vehicular A Speed 120km/h (VA120).


Figure 4 – CAT 6 UE Application Layer (FTP) Data Throughput (Under Favorable Channel Conditions). The chart plots data throughput (Kbps) for UEs A–D under the PA3, PB3, VA30 and VA120 fade models (per 34.121, Appendix D) and highlights clear performance differences: UE D shows a 30% increase in data throughput rates when compared to the other UEs tested. Configuration: HS-PDSCH = -3 dB, Ior = -60 dBm, Ior/Ioc = 10 dB; TTI = 1, # of H-ARQ = 6, CQI based on UE reports, TBS set per TS 25.214.

As anticipated, testing the UEs at the application layer reveals significant performance
differences between them. The first observation is that actual downlink data throughput
rates are significantly less than the 3.6 Mbps obtained earlier at the physical layer. The
UEs averaged a 57% drop in data throughput for the PA3 fade model compared with the
maximum throughput allowed at the physical layer, and greater than a 70% drop (on
average) for all of the other fade models (PB3, VA30, and VA120) tested.

It is also evident that strong performance with one fade model is not necessarily
replicated with other fade models (even at the same modeled speed). While UE C
performed quite well relative to UEs A and B with the PA3 fade model, with all other
fade models tested (including the PB3 fade model which models the same velocity as
PA3) its performance was comparable to that of UEs A and B.

Although actual throughput rates are lower than those anticipated from the physical layer
test results, it is nonetheless clear that UE D has a significant performance advantage
compared with all other UEs tested. Under certain test conditions (e.g., the PB3 and VA30
fade models), UE D is capable of successfully transferring at least 30% more data than
any of the others under the same channel conditions.

Given the large differences in observed performance, additional statistical data and logs from these test runs were analyzed. Table 2 contains the Mac-HS Statistics obtained for
the VA30 Fading Profile.


UE Under Test    Throughput (Kbps)    Median CQI    ACK %    NACK %    Stat DTX %
A                878.68               14            72.36    27.64     0
B                930.51               15            75.50    24.49     0
C                932.5                15            65.4     34.6      0
D                1344.05              17            71.09    28.91     0

Table 2 – Mac-HS Statistics for CAT 6 UEs under the VA30 Fading Profile

A clear performance difference is apparent in terms of Median CQI even though ACK
and NACK percentages are approximately the same. Under these channel conditions, UE
D outperforms the others in Median CQI level and associated throughput. Results
suggest UE D has a better-performing receiver or a more sophisticated CQI algorithm, or
both.

As previously indicated, available code power and noise level play a significant role in
the data throughput capabilities of a UE or application. Figure 5 shows the results of
application layer downlink data throughput tests using FTP for the four UEs under the
PB3 fade model, with varying code power levels. Figure 6 shows the results of testing
the same four UEs under the PB3 fade model with varying Carrier to Noise (C/N) Ratio.

Figure 5 – CAT 6 UE Application Layer (FTP) Data Throughput Results at Varying Code Power Levels. The chart plots data throughput (Kbps) for UEs A–D at HS-PDSCH levels of -3, -4, -5 and -6 dB under the PB3 fade model (per 34.121, Appendix D) and highlights that code power reduction yields throughput degradation: 16% on average at 1 dB and 29% on average at 2 dB. Configuration: HS-PDSCH = -3 to -6 dB, Ior = -80 dBm, Ior/Ioc = 10 dB; TTI = 1, # of H-ARQ = 6, CQI based on UE reports, TBS set per TS 25.214.

Reducing the code power on the UEs clearly results in a reduction in data throughput. A
1 dB drop in available code power yields, on average, a 16% drop in data throughput,
while a 2 dB drop in available code power results in an average 29% drop. These results
reinforce the idea that allocated code power plays a significant role in the throughput capability of a UE (or application). They also indicate once again that UE D has a
noticeable performance advantage over the other UEs.


Figure 6 – CAT 6 UE Application Layer (FTP) Data Throughput Results at Varying Carrier to Noise Ratios. The chart plots data throughput (Kbps) for UEs A–D at Ior/Ioc values of 10, 5 and 2 dB under the PB3 fade model (per 34.121, Appendix D) and highlights that C/N Ratio reduction yields throughput degradation, although UE D still outperforms the other UEs by 28% even at 2 dB C/N. Configuration: HS-PDSCH = -6 dB, Ior = -80 dBm, Ior/Ioc = 10, 5 and 2 dB; TTI = 1, # of H-ARQ = 6, CQI based on UE reports, TBS set per TS 25.214.

Reducing the C/N Ratio also forces a decrease in overall data throughput rates across all
of the UEs. It is interesting to note the C/N Ratio reduction also impacts UE D’s data
throughput performance advantage over the other UEs. However, even with an 8 dB
reduction in the C/N Ratio, UE D still outperforms the other UEs tested by an average of
28%.

As mentioned earlier, another area of potential impact on data throughput performance is the data protocol used for the file transfer. All examples so far have used FTP, which is typically employed for applications that require acknowledgement of received packets. Some applications, in particular those that are delay sensitive (such as video streaming), cannot tolerate the overhead required by FTP.

These applications typically employ User Datagram Protocol (UDP). Since UDP is not
subject to acknowledgements, data throughput performance using UDP is, in principle,
higher than with FTP. Figure 7 shows the results of application layer downlink data
throughput tests conducted using UDP, under favorable channel conditions and with the
same standard fade models applied.


Figure 7 – CAT 6 UE Application Layer (UDP) Data Throughput (Under Favorable Channel Conditions). The chart plots data throughput (Kbps) for UEs A–D under the PA3, PB3, VA30 and VA120 fade models (per 34.121, Appendix D); UDP file transfer performance is as expected and UE D still has a clear advantage over the other UEs. Configuration: HS-PDSCH = -3 dB, Ior = -60 dBm, Ior/Ioc = 10 dB; TTI = 1, # of H-ARQ = 6, CQI based on UE reports, TBS set per TS 25.214.

When compared with the application layer downlink data throughput results using FTP
(see Figure 4), results are as expected. The UDP downlink data throughput rates slightly
exceeded the FTP downlink data throughput rates (by an average of around 1%) in all
cases. Once again, UE D has a clear advantage over the other UEs tested when it comes
to downlink data throughput performance.
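
For completeness, a receiver-side UDP goodput measurement can be sketched in the same spirit as the earlier FTP example (again our illustration, with a placeholder port and measurement window): the application simply counts the payload bytes that arrive during a fixed interval, since lost datagrams are never retransmitted or acknowledged.

```python
# Minimal receiver-side UDP goodput sketch (ours): count the payload bytes
# that actually reach the application during a fixed window. Lost datagrams
# simply never arrive; nothing is acknowledged or retransmitted.
# The port number and window length are placeholders.
import socket
import time

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", 5001))
sock.settimeout(1.0)

window = 30.0                          # measure for 30 seconds
received = 0
deadline = time.monotonic() + window
while time.monotonic() < deadline:
    try:
        data, _ = sock.recvfrom(65535)
        received += len(data)
    except socket.timeout:
        pass                           # keep waiting until the window closes
print(f"UDP goodput: {received * 8 / window / 1000:.1f} kbps")
```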

Conclusions
The most obvious conclusion is that thorough testing of an HSDPA UE is a complex
process! This paper has focused on just one aspect: determination of downlink data
throughput rate capabilities of a device under varying conditions. It has shown that
physical layer testing alone, while able to provide an initial indication of performance
differences, is not sufficient to obtain an accurate picture of how a UE will perform under
real-world scenarios.

As an example, test results indicate that 3.6 Mbps data rates can only be obtained from
CAT 6 UEs under ideal channel conditions, which will almost certainly never occur in
the real world. Furthermore, data throughput rates at the physical layer (the metrics most
often publicized by manufacturers and network operators) are unlikely to be
representative of throughput rates at the application layer. Yet application layer rates
have the greatest impact on subscriber QoE.

Each layer of the protocol stack has multiple interactions, any or all of which can have an
impact on the final application layer throughput delivered to the end-user. As a result,
application layer testing needs to be conducted under a range of environmental conditions
and file transfer protocols to reveal the true throughput capabilities of the UE or
application. Network operators and UE manufacturers should use this data to ensure
subscriber QoE for a new feature implementation or application launch.


Rather than rely on a simple pass/fail result from a conformance test, network operators
and UE manufacturers should determine the “headroom” for an application from data
throughput rates obtained during testing. From these UE test results, it is clear there is a
large degradation in maximum application layer data throughput rates under
representative real-world conditions.

The results also indicate that variables such as code power levels, C/N ratios and channel
configurations can all significantly impact the throughput potential of a UE or
application. Other factors also shown to impact performance include the TBS size, the number of HARQ processes allocated and the number of HS codes assigned.
For example, the greater the number of codes allocated, the more coding protection
provided to a particular UE. This may result in an improvement in the UE’s ability to
absorb errors caused by adverse channel conditions.

One of the more important conclusions stems from the clearly differentiated performance
of UE D. The fact that this UE uses the exact same chipset as UE A raises the question of
whether conformance testing alone adequately evaluates an HSDPA UE’s performance.
Should UE manufacturers and application developers rely on the assumption that
technology providers have adequately tested the performance of their chipsets under a
wide range of representative conditions? Given the complexity of HSDPA devices and
networks and the applications they are expected to support, it appears that the final UE
implementation itself should be more thoroughly tested under the constantly changing
conditions and resource allocations that are part of the real HSDPA environment.

While a thorough test process for an HSDPA device may start with the standards (i.e.
3GPP), it really needs to be extended beyond these minimum conformance-based
requirements to reflect the dynamic WCDMA/HSDPA environment which is built and
implemented based on trade-offs. Failure to extend the scope of testing beyond
conformance can lead to inefficient network resource allocation and performance issues –
thus negatively affecting both the network and end-user QoE.

References
1. 3GPP Technical Specification 34.121-1, V7.3.0 (2006-12) – 3rd Generation Partnership Project; Technical Specification Group Radio Access Network; User Equipment (UE) conformance specification; Radio transmission and reception (FDD); Part 1: Conformance Specification (Release 7).

2. 3GPP Technical Specification 25.214, V7.3.0 (2006-12) – 3rd Generation Partnership Project; Technical Specification Group Radio Access Network; Physical layer procedures (FDD) (Release 7).
