Written by Jens Berger | November 22, 2021
It's hard to imagine our world without mobile communications, and smartphones are much more than just telephones: for many people they are the primary access to internet services. When talking about network performance, analyses should therefore cover all typical use cases to gain the most benefit when optimizing the network experience.
Network performance is more than data rate testing, important as that is. Networks can react differently when faced with more complex demands for data transmission and when connecting to content servers. Benchmarking, drive tests and walk tests therefore typically involve applications such as video streaming, messaging or web browsing to obtain an end-to-end view of performance.
End-to-end tests of popular services are called ‘app tests’. However, they do not test the app itself but the service behind it. The question is how the network performs relative to the requirements of the individual app.
Test case classification
Classifying test cases is a useful first step. Classification should not follow app or brand names but the user needs served and the data transmission requirements: OTT messaging applications also offer telephony or data transfer, and social media platforms provide streaming, telephony or push-to-talk services along with the posting of pictures.
When assigning a test case to a class, group it by the service provided and the network requirements.
Under such a classification, WhatsApp® telephony is completely different in terms of network usage, and has completely different KPIs, from WhatsApp® messaging, Facebook® Post Picture or video streaming in Facebook® Watch. They may share a brand name, but the individual services serve different user needs, present different challenges to networks and result in different KPIs to quantify performance.
This means that YouTube® video streaming and Facebook® Watch should be grouped together, WhatsApp® messaging with other OTT messengers, and WhatsApp® telephony with other telephony services, even VoLTE. The same set of KPIs applies to similar applications, such as ‘call setup time’ or ‘speech quality’ for telephony, or ‘time to post a picture’ for messenger services and social media portals. Using the same KPIs allows a direct comparison of individual services and their end-to-end performance.
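The grouping above can be pictured as a simple mapping from test case to service class, where each class carries its own KPI set. The following sketch uses invented class names and KPI lists for illustration; it is not an official taxonomy from any standard:

```python
# Illustrative sketch: grouping app test cases by service class, not brand.
# Class names, KPI names and test-case lists are assumptions for illustration.
SERVICE_CLASSES = {
    "telephony": {
        "kpis": ["call setup time", "speech quality", "call drop rate"],
        "test_cases": ["WhatsApp telephony", "VoLTE call"],
    },
    "video_streaming": {
        "kpis": ["time to first picture", "stalling ratio", "video quality"],
        "test_cases": ["YouTube streaming", "Facebook Watch"],
    },
    "messaging": {
        "kpis": ["time to post a picture", "message delivery time"],
        "test_cases": ["WhatsApp messaging", "Facebook Post Picture"],
    },
}

def service_class_of(test_case: str) -> str:
    """Return the service class a test case belongs to."""
    for name, cls in SERVICE_CLASSES.items():
        if test_case in cls["test_cases"]:
            return name
    raise KeyError(f"unclassified test case: {test_case}")

# Two services sharing a brand name land in different classes:
print(service_class_of("WhatsApp telephony"))   # telephony
print(service_class_of("WhatsApp messaging"))   # messaging
```

Organized this way, every member of a class is benchmarked with the same KPI set, which is what makes direct cross-service comparison possible.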
Rohde & Schwarz mobile network testing organizes its solutions by grouping tests by subscriber service and network resource use, from test definition and KPI alignment to reporting and documentation. This keeps together what belongs together.
Figure 2: Rohde & Schwarz mobile network testing organizes solutions by grouping tests by subscriber service and network resource use, from test definition and KPI alignment to reporting and documentation
Efficient data collection for mobile network performance measurement
Data collection should be efficient: equipment and resources are expensive, and moving drive or walk tests must deliver high measurement resolution in both time and location. The sparser the results, the more gaps in time and space go unobserved.
Efficient data collection means gathering as much network insight as possible in a given period of time. This does not necessarily mean transmitting as much data as possible in a short time, as in speed tests, or covering as many apps as possible. The task is to obtain as many different types of transmission as possible in a given period of time, so that performance can be specified for the entire network or a portion of it.
Typically, the focus is more on network performance and less on comparing different services of the same type. Applications with identical or similar network resource use belong to the same group: they generate similar data patterns, place the same demands on networks and have similar KPIs. These services may use different compression schemes, time-outs or side information, and temporary bad connectivity is also possible. However, the basic data transfer scheme and load patterns are the same. Therefore, running different applications from the same group provides almost no additional insight from the network perspective. If a network smoothly delivers YouTube® video streams, it can probably do the same for other video streaming services. Likewise, if WhatsApp® messaging works well, other OTT messenger apps will not be treated substantially differently during transmission.
Data speed tests are not enough on their own; they only provide information about the network under full load in a single transmission scheme. The network may behave differently when less data is transmitted, when transport is bursty or when other protocols are used.
When focusing on overall network performance, different applications should be combined in a test cycle, placing different demands on the network and producing the varied data patterns and protocols of a typical user. In line with the classification above, the tested services should represent the individual application and service classes based on their use of network resources. This should include popular services such as retrieving web content, streaming media and posting information, along with simple data downloads, uploads and telephony.
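Such a cycle could be sketched as one representative test per service class rather than many apps from the same class, so that each transfer adds new network insight. The step names and their order below are illustrative assumptions, not a prescribed sequence:

```python
# Illustrative test cycle: one representative per service class.
# Step names and descriptions are assumptions for illustration.
TEST_CYCLE = [
    ("http_download", "simple full-load data transfer"),
    ("http_upload", "uplink full-load data transfer"),
    ("web_browsing", "retrieval of typical web content"),
    ("video_streaming", "adaptive media stream"),
    ("picture_post", "small upload with app-style signalling"),
    ("telephony", "voice call, e.g. VoLTE"),
]

def run_cycle(execute):
    """Run each step via a supplied execute(step_name) callable; collect results."""
    return {name: execute(name) for name, _desc in TEST_CYCLE}

# With a stub executor, the cycle yields one result per service class:
results = run_cycle(lambda step: "ok")
print(sorted(results))
```

In a real campaign, `execute` would trigger the actual measurement for each step; the point of the structure is that every pass through the cycle exercises all the transmission patterns of a typical user.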
Integrative performance testing
ITU-T E.804.1, ETSI TR 103 559 and ETSI TR 103 702 also reflect this spread across many services. The scoring methodology in ETSI TR 103 559 specifically compiles points for individual service classes and their dedicated KPIs, and reports an overall network performance that covers different network usage aspects split by region. The following scheme shows the aggregation methodology. The example uses ‘highways’ as the region, but the scheme is identical for all other regions in the campaign, which are compiled to form an overall network score.
Figure 3: Integrative performance testing
Note: Rohde & Schwarz supports the Network Performance Score methodology in all of its mobile network testing products and solutions, with pre-configured measurement campaigns and dedicated views in post-processing tools.
Returning to the individual service classes such as web content delivery or streaming, ETSI TR 103 559 provides reasonable examples of the importance of individual service and app classes. Originally designed to generate an integrative network performance score, it can also be read as a guideline for optimization tasks that target the best network performance for daily smartphone use.
Figure 4: Typical weight for individual data services in performance scoring
The example is clearly weighted toward the delivery of web content as files (web pages, pictures, maps, …) and toward audio and video media streams. It is a rough estimate of the current situation.
Even though new use cases will be created under 5G, the picture may not change much, but the services behind each component will become more interactive. Cloud gaming is a good example: a media stream is still delivered, but it reacts to the player’s actions. In the same vein, interactive social media platforms will arise and deliver reactive web content.
Latency and interactivity will become just as important as data speed and transfer completeness are today. Latency and interactive applications are already at the heart of ETSI TR 103 702: QoS parameters and test scenarios for assessing network capabilities in 5G performance measurements.
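To give a flavour of what interactivity measurement works with, the sketch below condenses a series of round-trip-time samples into summary figures (mean, jitter, high percentile). The sample values and the nearest-rank percentile are assumptions for illustration, not the evaluation method defined in ETSI TR 103 702:

```python
import statistics

def interactivity_summary(rtt_ms):
    """Summarize round-trip-time samples: mean, jitter and a high percentile."""
    rtt = sorted(rtt_ms)
    p95 = rtt[int(0.95 * (len(rtt) - 1))]  # simple nearest-rank 95th percentile
    return {
        "mean_ms": statistics.fmean(rtt),
        "jitter_ms": statistics.pstdev(rtt),  # std. deviation as a jitter proxy
        "p95_ms": p95,
    }

# Hypothetical samples from a periodic small-packet exchange (milliseconds):
samples = [21.0, 19.5, 22.3, 35.8, 20.1, 19.9, 24.2, 21.7]
summary = interactivity_summary(samples)
print(summary["p95_ms"])
```

For interactive applications such as cloud gaming, the tail of this distribution matters as much as the mean: a single late packet is what the player perceives as lag.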
The Rohde & Schwarz mobile network testing product line already includes realistic transport latency and network interactivity measurements in its Interactivity Test.