Evaluating visual quality and video MOS based on subjective perception

Written by Jens Berger | December 21, 2016

Video services are very popular and now appear in a wide range of applications and media. YouTube and Netflix are merely the pioneers of video streaming; many other services have followed. Broadcasting stations move more and more content online, videos are shared via social media, and peer-to-peer applications are on the rise, for example video chat services such as Skype or the private broadcasting of live-captured video as offered by Periscope. But what is their visual quality like?

For an answer, we have to dig deeper and ask: how do customers subjectively perceive visual quality? Are they satisfied and will they keep using the service, or will they switch to another one? The quality of experience (QoE), the user-centric metric capturing the overall acceptability of a service, has to be monitored and optimized. For that, it must be compared with competing services and with earlier versions of the app itself.

The specifics of visual quality

"How to measure the video service's quality and user's quality of experience" sounds like a simple question, but it is not. At first, an obvious answer would be: just measure the video quality. Is the recording smooth and without interruptions? Are the images vivid and crisp? Is the display size in line with the screen resolution?

These are all valid questions. However, in a broader context, the video's actual quality is only one of many factors determining the customer's satisfaction with the service. Other factors to be considered include: how is the service structured, can I find what I am looking for, how long does it take for the video to appear on my display, and do I experience long freezing periods or even "lose" the video entirely? Finally: what are my expectations? The answer again depends on my experience: have I seen better or worse quality with competing services? Did I pay for the service, or was it free? Am I checking the weather on the go, or am I planning to watch a movie at home?
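To make such experience factors measurable in practice, client-side test tools typically log playback events and derive simple indicators from them. The sketch below is purely illustrative and not tied to any particular tool: it assumes a hypothetical event log with request, first-picture and stall events, and computes time to first picture plus stall statistics from it.

```python
# Illustrative sketch (not a specific product implementation): deriving simple
# video QoE indicators from hypothetical playback events logged on the client.
from dataclasses import dataclass

@dataclass
class PlaybackEvent:
    timestamp_s: float   # seconds since the user requested the video
    kind: str            # "request", "first_picture", "stall_start", "stall_end"

def qoe_indicators(events: list[PlaybackEvent]) -> dict:
    """Return time to first picture plus stall count and total stall duration."""
    first_picture = next(e.timestamp_s for e in events if e.kind == "first_picture")
    stalls, stall_time, stall_start = 0, 0.0, None
    for e in events:
        if e.kind == "stall_start":
            stalls, stall_start = stalls + 1, e.timestamp_s
        elif e.kind == "stall_end" and stall_start is not None:
            stall_time += e.timestamp_s - stall_start
            stall_start = None
    return {"time_to_first_picture_s": first_picture,
            "stall_count": stalls,
            "total_stall_time_s": stall_time}

# Hypothetical session: first picture after 2.3 s, one stall of about 1.3 s.
events = [PlaybackEvent(0.0, "request"), PlaybackEvent(2.3, "first_picture"),
          PlaybackEvent(14.8, "stall_start"), PlaybackEvent(16.1, "stall_end")]
print(qoe_indicators(events))
```

Indicators like these complement the visual quality score itself when judging the overall QoE of a video service.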

Best practice: measuring video quality

Video quality is based on experience and perception, so video is typically scored by human viewers. There are standardized experiments in which the rating scale and test structure are defined, and defined display settings and lighting conditions apply during viewing. In principle, viewers are placed in a lab and instructed to behave as if they were video chatting or watching movies in their natural environment.

During the viewing session, many video sequences are presented to the viewer, who is asked to score each sequence's visual quality individually on a defined scale, usually from 1 to 5. The presented videos have previously been captured from different services, transmission channels, or other processing conditions such as coding.

During the session, each viewer sees a wide range of quality, from very sharp to very blurry images to jerky presentations, so as to correctly anchor perception. To avoid context effects, the order of the video sequences is varied between individual viewers.

At the end of the viewing session, each presented video has received a score from each viewer. The average of these individual scores is taken as the video sequence's quality, usually called the MOS (mean opinion score). The MOS places the sequence on a common quality scale and allows differentiation between different videos, tested services, or underlying processing conditions.
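As a minimal illustration of this bookkeeping, the following sketch uses hypothetical sequence names and viewer scores (not data from a real test): it shuffles the presentation order per viewer to avoid context effects and averages the 1-to-5 opinion scores per sequence into a MOS.

```python
# Minimal sketch of MOS bookkeeping in a subjective test, with hypothetical
# sequence names and opinion scores on the standard 1-to-5 scale.
import random
from statistics import mean

sequences = ["service_A_720p", "service_B_1080p", "service_C_480p"]

def presentation_order(viewer_id: int) -> list[str]:
    """Shuffle the sequence order per viewer to avoid context effects."""
    rng = random.Random(viewer_id)   # reproducible per-viewer order
    order = sequences.copy()
    rng.shuffle(order)
    return order

print(presentation_order(viewer_id=1))   # a per-viewer presentation order

# scores[sequence] = opinion scores (1..5) given by the individual viewers
scores = {
    "service_A_720p": [4, 5, 4, 4, 3],
    "service_B_1080p": [5, 5, 4, 5, 5],
    "service_C_480p": [2, 3, 2, 3, 2],
}

# The MOS of a sequence is simply the mean of all individual opinion scores.
mos = {seq: mean(ratings) for seq, ratings in scores.items()}
print(mos)   # {'service_A_720p': 4.0, 'service_B_1080p': 4.8, 'service_C_480p': 2.4}
```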

Reference measurements

Experiments with human viewers require pre-captured video material and, of course, the presence of test subjects. This type of experiment is primarily used for defining new codecs or formats and as a reference for calibrating automated measurement tools.

For video quality measurements in the field, experiments with human viewers are not practical; here, automated video quality measurement algorithms are used instead. These algorithms analyze the video signal and try to model the MOS scores that would be obtained in viewing experiments.
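As a toy illustration of the general idea, and explicitly not the algorithm used in the Rohde & Schwarz solution, the sketch below compares a degraded clip with its reference frame by frame using PSNR and maps the result onto the 1-to-5 opinion scale with an assumed logistic curve; real models are fitted and validated against subjective test data.

```python
# Toy full-reference example (not the R&S algorithm): estimate a MOS-like score
# from the per-frame PSNR between a reference clip and its degraded version.
import numpy as np

def psnr(ref: np.ndarray, deg: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio of one frame pair, in dB."""
    mse = np.mean((ref.astype(np.float64) - deg.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def psnr_to_mos(psnr_db: float) -> float:
    """Map PSNR onto the 1..5 opinion scale with an assumed logistic curve.
    The curve parameters are illustrative; real mappings are fitted to
    subjective test results."""
    return float(1.0 + 4.0 / (1.0 + np.exp(-0.3 * (psnr_db - 30.0))))

# Hypothetical 8-bit grayscale frames: a reference plus a noisy "transmitted" copy.
rng = np.random.default_rng(0)
ref_frames = rng.integers(0, 256, size=(10, 72, 128), dtype=np.uint8)
deg_frames = np.clip(ref_frames + rng.normal(0, 5, ref_frames.shape), 0, 255)

frame_scores = [psnr_to_mos(psnr(r, d)) for r, d in zip(ref_frames, deg_frames)]
print(f"predicted MOS ~ {np.mean(frame_scores):.2f}")
```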

Since 2014, Rohde & Schwarz mobile network testing solutions have offered a real-time video quality analysis application running on all state-of-the-art Android smartphones, including the latest phone models that exceed full-HD display resolution and support 1440p and UHD video.

Learn more about testing mobile video quality and video MOS on our dedicated web page.
