Real-time communication is the communication technology of our time. WebRTC’s dominance over all other forms of communication is indisputable, and yet many of the variables that determine WebRTC media quality are beyond our control.
WebRTC’s availability across all modern browsers along with native clients for all major platforms means media quality is subject to massive variance depending on, amongst other things, bandwidth, frame size, frame rates, codecs, bitrates, latency, and transport protocols.
In this blog, we will take a look at how best to manage the unmanageable and how to subdue the more compliant and predictable WebRTC elements to help guarantee a more satisfying user experience.
The key players
There are a number of protagonists playing key roles in determining the quality of real-time communications media. Some of these key players are controllable, provided we employ the right framework of operations, while others are of the more slippery variety. This latter group is problematic because we cannot predict how and when they will affect the quality of media output.
Let’s take a look at the media influencers we must grapple with to gain as much control as possible over WebRTC media quality.
Bandwidth
First, from the outside-of-our-control bucket, we have bandwidth.
For the uninitiated, bandwidth is the maximum level of data transfer over an internet connection in a given time period – and there is a frustratingly long list of variables making it practically impossible to influence it:
- We cannot predict how far the user might be from their access point.
- We cannot control the quality of the user’s internet reception.
- We cannot vouch for the caliber of the user’s hardware – CPU/RAM resources may be insufficient, for example.
- We cannot forecast with complete accuracy how many people may use the same access point at the same time.
- We cannot know how users have configured their firewalls and whether they are intentionally or unintentionally throttling bandwidth.
- We have no way of knowing the number of background uploads and downloads that are taking place at any given time.
These are just some of the issues that make bandwidth a particularly tricky player in the overall media quality drama. It’s not all doom and gloom though. There are some simple measures that can help to bend bandwidth to our will.
- We can position our servers closer to our users, or add more servers to achieve this purpose.
- We can encourage our users to turn off their automatic update processes.
- We can inform our users that certain router QoS settings can prioritize device bandwidth use for more demanding applications, most commonly upstream flows.
And most importantly – we can attempt to accurately estimate the available bandwidth using WebRTC’s bandwidth estimation tools.
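One practical way to do this from the application side is to read the sender-side estimate that the browser exposes through getStats(). Below is a minimal sketch, assuming an already-connected RTCPeerConnection; the estimate lives on the nominated candidate-pair stats as availableOutgoingBitrate (in bits per second).

```typescript
// Minimal sketch: read the browser's outgoing bandwidth estimate from getStats().
// Assumes `pc` is an already-connected RTCPeerConnection.
async function getEstimatedOutgoingBitrate(pc: RTCPeerConnection): Promise<number | undefined> {
  const report = await pc.getStats();
  let estimateBps: number | undefined;
  report.forEach((stat) => {
    // The nominated candidate pair carries availableOutgoingBitrate (bits per second).
    if (stat.type === "candidate-pair" && stat.nominated && stat.availableOutgoingBitrate) {
      estimateBps = stat.availableOutgoingBitrate;
    }
  });
  return estimateBps;
}
```

Polling this every few seconds gives you a running estimate to feed into your bitrate decisions, which we come back to below.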
Transport protocol
In a nutshell, transport protocols are communication mechanisms that use the internet to carry application information from one computer to another, ensuring the right information is sent, in the right order, to the right computer.
Data is usually carried across networks using either UDP or TCP transport protocols. In our experience, prioritizing TCP-based connections (TURN over TCP or ICE-TCP) over everything else does media transmission and quality a disservice. Yes, TCP comes with its delivery and retransmission guarantees, but these guarantees mean little for data that requires real-time transmission. When packets are lost, TCP holds everything behind them until the retransmission arrives, and that stall is far more disruptive to live media than the loss itself – your media quality is seriously degraded. Stick with UDP and decrease overall latency – happy days!
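In practice this mostly comes down to how you configure your ICE servers. The sketch below lists STUN and UDP relays ahead of a TCP relay, so the browser only falls back to TCP behind very restrictive firewalls; the server URLs and credentials are placeholders, not real endpoints.

```typescript
// Minimal sketch: an ICE server configuration that favours UDP.
// All URLs and credentials below are placeholders for your own infrastructure.
const pc = new RTCPeerConnection({
  iceServers: [
    { urls: "stun:stun.example.com:3478" },
    { urls: "turn:turn.example.com:3478?transport=udp", username: "user", credential: "secret" },
    // Last resort only: relayed TCP adds head-of-line blocking and retransmission delay.
    { urls: "turn:turn.example.com:443?transport=tcp", username: "user", credential: "secret" },
  ],
});
```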
Now on to the controllable. First, to those handled within our own infrastructure.
Bitrate
Bandwidth dictates how much data the network can send or receive and is beyond our purview; bitrate – what we actually send or receive – is within our sphere of influence.
The goal is to achieve the optimal bitrate for our needs. We hit this sweet spot by accurately estimating the available bandwidth and using as much of this as we can, mindful at all times of the quality of the media output. The rule of thumb is to maintain a bitrate that’s lower than or equal to our estimate.
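On the sending side, one way to enforce this rule of thumb is to cap the video sender with maxBitrate whenever a new estimate comes in. A minimal sketch, assuming `sender` is the RTCRtpSender for your video track and `estimateBps` comes from your own estimation logic (for example the getStats() helper above):

```typescript
// Minimal sketch: keep the send bitrate at or slightly below the current estimate.
async function capBitrate(sender: RTCRtpSender, estimateBps: number): Promise<void> {
  const params = sender.getParameters();
  if (params.encodings.length > 0) {
    // Leave a small safety margin below the estimate.
    params.encodings[0].maxBitrate = Math.floor(estimateBps * 0.9);
    await sender.setParameters(params);
  }
}
```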
Latency
Often called lag, latency refers to the time it takes to capture, transmit, and process data through multiple devices and channels. Obviously, we are constantly striving for low-latency environments; however, real-time communication is a constant trade-off between latency and quality, and the trick is to stack the deck in our favor right from the start.
The right infrastructural design means that, while we cannot control where our users are, we can control where we are. By placing media servers closer to our users, we can greatly reduce latency issues. For larger group calls, distributing media servers so that each user connects over a shorter distance, rather than routing everyone through a single server, reduces latency and increases media quality.
Latency analysis must become second nature: measure round-trip time (RTT) regularly, and determine server placement by looking at latency together with bandwidth, packet loss, and jitter.
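RTT is available from the same getStats() report used for bandwidth estimation. A minimal sketch, assuming a connected RTCPeerConnection; currentRoundTripTime is reported in seconds on the nominated candidate pair, so it is converted to milliseconds here.

```typescript
// Minimal sketch: sample the round-trip time of the active connection.
async function sampleRttMs(pc: RTCPeerConnection): Promise<number | undefined> {
  const report = await pc.getStats();
  let rttMs: number | undefined;
  report.forEach((stat) => {
    if (stat.type === "candidate-pair" && stat.nominated && stat.currentRoundTripTime !== undefined) {
      rttMs = stat.currentRoundTripTime * 1000; // seconds -> milliseconds
    }
  });
  return rttMs;
}
```

Sample this on a timer and feed it, together with bandwidth, packet loss, and jitter figures, into your own server-selection and quality logic.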
Codecs
Codecs, the encoders and decoders that compress and decompress media for transmission across devices and networks, have a significant impact on media quality and are a huge time drain if you fall down the rabbit hole. There are quite a number of codecs to choose from and, while WebRTC is codec agnostic, there are still some things you must consider before you make your selection (a sketch of steering the negotiated codec follows the list below).
- VP8 and H.264 are popular and work well across all modern browsers. H.264 has more hardware acceleration available, while VP8 offers temporal scalability.
- At the same bitrate, VP9 can provide higher quality than VP8 or H.264, but it is not as widespread and its SVC support may require some reverse engineering.
- AV1 has the potential to produce the best quality of them all, but it is still new, not yet well understood, and a real CPU guzzler.
- HEVC support is effectively limited to Apple platforms. For group sessions, you may need simulcast or SVC, which may not be available with HEVC.
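If you want to nudge the negotiation toward a particular codec, you can reorder the codec list on the transceiver before creating the offer. A minimal sketch using setCodecPreferences(); the choice of "video/VP9" is purely illustrative.

```typescript
// Minimal sketch: put one codec at the front of the negotiation order.
// Assumes `transceiver` is a video RTCRtpTransceiver created before the offer.
function preferCodec(transceiver: RTCRtpTransceiver, mimeType: string): void {
  const capabilities = RTCRtpReceiver.getCapabilities("video");
  if (!capabilities) return;
  const preferred = capabilities.codecs.filter((c) => c.mimeType.toLowerCase() === mimeType.toLowerCase());
  const rest = capabilities.codecs.filter((c) => c.mimeType.toLowerCase() !== mimeType.toLowerCase());
  // Codecs listed first are negotiated first; the rest remain available as fallbacks.
  transceiver.setCodecPreferences([...preferred, ...rest]);
}

// Usage (illustrative): preferCodec(videoTransceiver, "video/VP9");
```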
Let’s move to the controllable elements related to our users’ devices.
CPU
Media, especially video, is a CPU hog. It is important to ensure that media doesn’t eat too much of your user’s CPU, which can cause the device to overheat and the encoder to drop frame rates or skip frames altogether. The result is poor media quality, reduced battery life, and network congestion.
Monitor CPU directly and continually, and adjust bitrates when usage is too high (a sketch of spotting CPU-limited encoding follows the list). Our top tips in this area are to:
- Look at your code; don’t run logic that switches up UI elements too often
- Shave bitrates using simulcast, which sends different quality streams to different users in a session.
- Where possible, mute conference participants who are not contributing – take care to do this in a manner that doesn’t affect the overall satisfaction of the users.
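Browsers do not expose CPU usage directly to web apps, but the outbound-rtp stats report when the encoder is being throttled by the CPU. A minimal sketch, assuming `pc` is your RTCPeerConnection and `lowerSendQuality` is a placeholder for your own downgrade logic:

```typescript
// Minimal sketch: react when the video encoder is CPU-limited.
// `lowerSendQuality` is a hypothetical callback that reduces bitrate/resolution.
async function checkCpuLimitation(pc: RTCPeerConnection, lowerSendQuality: () => void): Promise<void> {
  const report = await pc.getStats();
  report.forEach((stat) => {
    // qualityLimitationReason is "cpu" when the device cannot keep up with encoding.
    if (stat.type === "outbound-rtp" && stat.kind === "video" && stat.qualityLimitationReason === "cpu") {
      lowerSendQuality();
    }
  });
}
```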
Media type and placement
Bandwidth estimators, as we have discussed, tell us the expected range we can ‘safely’ transmit our media within. It is up to us, however, to decide the parameters within which our media will transmit. The key questions to ask here are:
- What resolution do I want my media to display at?
- What frame rate does my media need to render motion smoothly?
While WebRTC ultimately decides the resolution and frame rate based on available bitrates, you must nudge it in the right direction. WebRTC needs you to fill in the following blanks:
- Is sharpness or motion more important to you?
- What resolution is your user’s screen?
- Which content elements are more or less important? This will determine where you invest your bitrate.
It is up to you to complete the puzzle and set the necessary limitations/restrictions on your media behavior and quality.
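You state most of these intentions at capture time: constraints set the resolution and frame rate you want, and contentHint tells the encoder whether sharpness or motion matters more. A minimal sketch for a screen-share track follows; the concrete numbers are illustrative, not recommendations.

```typescript
// Minimal sketch: capture a screen share with explicit constraints and a content hint.
async function captureForScreenShare(): Promise<MediaStreamTrack> {
  const stream = await navigator.mediaDevices.getDisplayMedia({
    // Illustrative values: high resolution, low frame rate suits mostly static content.
    video: { width: { ideal: 1920 }, height: { ideal: 1080 }, frameRate: { ideal: 5 } },
  });
  const track = stream.getVideoTracks()[0];
  // "detail" favours sharpness (slides, text); use "motion" for camera or video playback.
  track.contentHint = "detail";
  return track;
}
```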
The WebRTC holy trinity
We have already discussed the impact bitrate, frame rate, and resolution have on media quality individually, but it’s important to understand how their symbiotic relationship affects WebRTC output.
Bitrate is arguably the cardinal component of the communications triumvirate. Changes to bitrate immediately affect both frame rate and resolution. Start with bitrate, know your limitations here, and determine your available frame rate and resolution options.
Now that you know your bitrate, you can tend to frame rate and resolution concerns. Changes to one of these players will have immediate consequences for the other.
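A simple way to operationalize this is a rule-of-thumb ladder that maps the current bitrate estimate to a target resolution and frame rate. The thresholds below are illustrative assumptions, not values from any spec; tune them against your own content, codecs, and audience.

```typescript
// Minimal sketch: derive a resolution/frame-rate target from the available bitrate.
// All thresholds are illustrative assumptions.
interface VideoTarget {
  width: number;
  height: number;
  frameRate: number;
}

function pickVideoTarget(bitrateBps: number): VideoTarget {
  if (bitrateBps >= 2_500_000) return { width: 1280, height: 720, frameRate: 30 };
  if (bitrateBps >= 1_000_000) return { width: 960, height: 540, frameRate: 30 };
  if (bitrateBps >= 500_000) return { width: 640, height: 360, frameRate: 25 };
  return { width: 320, height: 180, frameRate: 15 };
}
```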
The following guidelines will help when tweaking frame rates and resolutions (a sketch of hinting this trade-off to the browser follows the list):
- For static content, aim for higher resolution at a lower frame rate. Where possible, choose VBR encoding to optimize transmission.
- For more dynamic content, choose a higher frame rate. 30fps is fine here, but if the bitrate is low, you must lower the frame rate to match it.
- When sharing video content from YouTube or another video streaming platform, the frame rate is always more important than resolution.
- When video/image sharpness is your priority, bump up resolution; don’t worry so much about the frame rate.
- For large group calls, it’s perfectly fine to drop the frame rate to 15fps or less.
- Always check that you are not receiving video at resolutions higher than what you are displaying.
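When the bitrate can no longer sustain both resolution and frame rate, you can hint to the browser which one to sacrifice first. degradationPreference comes from the WebRTC extensions spec and browser support varies, hence the type widening in this sketch; `sender` is assumed to be the RTCRtpSender for your video track.

```typescript
// Minimal sketch: tell the browser whether to keep resolution or frame rate
// when bandwidth runs short. Support for degradationPreference varies by browser.
type DegradationPreference = "maintain-framerate" | "maintain-resolution" | "balanced";

async function setDegradation(sender: RTCRtpSender, preferSharpness: boolean): Promise<void> {
  const params = sender.getParameters() as RTCRtpSendParameters & {
    degradationPreference?: DegradationPreference;
  };
  // Keep resolution for static/sharp content; keep frame rate for motion-heavy content.
  params.degradationPreference = preferSharpness ? "maintain-resolution" : "maintain-framerate";
  await sender.setParameters(params);
}
```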
And that’s it. Hopefully, you can now set about achieving a comfortable, low-latency real-time communication experience. Remember the media quality mantra: measure frequently, adjust accordingly, and update appropriately.