New WebRTC Tools Break Barriers to Multi-Camera Live Streaming

Why is multi-camera live streaming still missing from the vast majority of sports productions?

The answer is simple: efforts to shoehorn multiple simultaneous camera feeds into existing live streaming operations don’t work. As a result, searches for better approaches to this much-needed enhancement have reached the mission-critical stage across a broad range of consumer, government, and enterprise use cases.

While, in the M&E realm, producers have aspired to make watching live events from multiple camera angles a part of the viewer experience going back to the early days of cable TV, nothing has taken hold even at this late date in the over-the-top (OTT) era. Nor is support for multi-camera live streaming doing much to facilitate the shift to dispersed collaboration in high-value content production.

And when it comes to capitalizing on what real-time multi-camera streaming can mean to emergency responses, crowd control, traffic management, military operations, and other non-entertainment applications, there’s no solution to be found with reliance on traditional approaches to managing surveillance feeds. This is where mission critical translates to matters of life and death.

The Red5 TrueTime Breakthrough to Live Multi-Camera Streaming

Fortunately, there’s every reason to expect that this anomalous state of affairs will soon end. For anyone contemplating how to address the need for multi-camera streaming flexibility in any of these use cases, the good news is there’s no longer any reason to be stymied by these dead ends.

In fact, that’s been the case for some time, but it’s truer now than ever.

Over the past few years Red5 has leveraged its Experience Delivery Network (XDN) architecture to help customers in many fields support end users’ seamless view changes across multiple camera angles delivered simultaneously in real time. Now we’ve streamlined the process with the introduction of TrueTime MultiView™ toolsets designed for three categories of multi-camera live streaming use cases: fan engagement, live production monitoring, and surveillance.

As part of a suite of recently introduced TrueTime tools, MultiView leverages the combination of standards and Red5 innovations best suited to optimum performance in each of these market segments, with SDKs that enable rapid yet highly customizable implementations of multi-camera live streaming in all the leading OS environments, including web, Android, iOS, macOS, Windows, and Linux. Adding to the application versatility, all the tools in the TrueTime suite rely on a common standards-based foundation with multi-OS compatibility, which allows customers to use them in any combination they deem appropriate to creating market-moving solutions and services.

With more to come, the other toolsets in the TrueTime lineup include:

  • TrueTime DataSync™ – By ensuring frame-accurate data synchronization with live real-time video streaming, DataSync enables data overlays that can be used to enhance images, embed key information, and add predictive modeling.
  • TrueTime Studio™ – This is a production tool that facilitates creation of interactive video content and the ability to bring in remote guests and other collaborators in real-time streaming scenarios.
  • TrueTime WatchParty™ – Watch parties with the unlimited scalability and feature-rich functionalities enabled by XDN architecture are now easier to implement with this tool.

Live Multi-Camera Streaming in the Consumer Market

The significance of a viable approach to multi-camera live streaming in the consumer market for providers of streamed sports and other live events seeking to draw and retain audiences can’t be overstated. Such enhancements have become especially important to engaging Gen Z users, whose comparatively lower interest in mainstream sports has become a major concern to rights holders.

The dimensions of the Gen Z drop-off in engagement were underscored by two surveys conducted by researcher Morning Consult. One found that 60% of adults born since the late ’90s watch live sports, compared to 72% of all adults; the other found that only 53% of those Gen Z adults describe themselves as sports fans, compared to 69% of millennials.

Of course, multi-camera live streaming isn’t just a key element in the pursuit of Gen Z viewers; its appeal extends to people of all ages. In a 2023 survey of 3,000 U.S. sports fans aged 14 and over, Deloitte found strong demand for more immersive streamed sports viewing experiences, with over a third of respondents across all ages ranking control over multiple camera angles as one of the two most sought-after features, along with more advanced replay controls like slow-motion activation. An earlier Verizon Media-commissioned survey of 5,000 sports fans in the U.S. and four European countries produced nearly identical results.

Yet support for multi-camera streaming is rare in live-stream sports and other productions, with occasional exceptions like NASCAR and Formula 1 races and some experimentation elsewhere. But major pro sports producers like Major League Baseball, the National Football League and the National Basketball Association that have worked hard to create innovative live streaming experiences have yet to make viewing from multiple camera angles widely available.

Here it’s important to note a distinction between two scenarios: full-screen access to different live video streams selected from a simultaneous display of thumbnail videos, versus applications where the options are different viewing angles provided by multiple cameras at a single event. In the former case, a delay of a second or two in switching from one stream to another is tolerable, whereas a change of viewing angles within the same streamcast must occur without perceptible delay.

In either case, the problem with trying to support instantaneous access to multiple live streaming options on Hypertext Transfer Protocol (HTTP) streaming platforms stems from the fact that the only way to eliminate lag time is to deliver all the full-screen renderings together in a single stream. The resulting high levels of per-user bandwidth consumption can choke bandwidth availability across the local broadband service area, degrading quality of service for everyone in the area regardless of what they’re watching.

But this isn’t the case when a service provider implements TrueTime MultiView for Fans on the XDN platform. In such instances, all the live video streams are ingested at XDN Origin Nodes and relayed directly or through intermediate Relay Nodes to intelligent Edge Nodes serving a given segment of the end user population, which varies in size depending on the proximity of each Edge Node to the service group. Users’ choices of camera angles or featured alternative programming are delivered over unicast live streams from the Edge Nodes at imperceptible latencies.

This real-time distribution allows whichever camera angle or separate game feed a viewer wants to access to be displayed in full-screen resolution as soon as the user clicks on one of the thumbnail videos that display the viewing options in an adjoining multi-screen window. These thumbnails, each streaming a low-resolution version of the video at one frame per second, are packaged in a low-bitrate data stream that goes out to all session users with no meaningful impact on bandwidth consumption.

Unlike HTTP streaming-based multiview attempts that offer just two or three camera options to curtail bandwidth consumption, TrueTime MultiView for Fans imposes no limit on how many camera feeds can be supported. It’s just a matter of how many thumbnail videos a provider wants to fit into the MultiView selection space.
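The bandwidth contrast described above can be made concrete with some back-of-the-envelope arithmetic. The bitrates below are illustrative assumptions, not Red5 specifications: the point is that bundling every full-resolution angle into one HTTP stream scales linearly with the number of angles, while the thumbnail-plus-unicast approach stays nearly flat.

```python
# Assumed example bitrates for illustration only.
FULL_STREAM_MBPS = 5.0   # one full-screen HD feed
THUMBNAIL_KBPS = 50.0    # one low-res, 1 fps thumbnail

def bundled_http_mbps(num_angles: int) -> float:
    """Per-user bandwidth when all angles ship full-resolution in one stream."""
    return num_angles * FULL_STREAM_MBPS

def multiview_mbps(num_angles: int) -> float:
    """Per-user bandwidth for one full-screen unicast stream plus a
    low-bitrate mosaic of thumbnails covering every angle."""
    return FULL_STREAM_MBPS + num_angles * THUMBNAIL_KBPS / 1000.0
```

With eight angles, the bundled approach costs 40 Mbps per user under these assumptions, while the multiview approach costs about 5.4 Mbps, and adding more angles barely moves the latter figure.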

The audience-drawing benefits of multi-camera live streaming could also be introduced as a natural enhancement to in-venue smartphone viewing experiences on offer at an expanding array of sports centers worldwide. In the U.S., for example, Verizon, the leading provider of enhanced in-venue 5G services, now has in-venue 5G links operating in 25 NFL stadiums and 35 other sports and concert venues, according to the Groupe Speciale Mobile Association (GSMA).

There have been some attempts at supporting multi-camera live streaming over in-venue links in the U.S. and elsewhere. For example, in Germany Vodafone is working with the Bundesliga, Germany’s largest soccer league, to deliver a multiview and replay app to Sky Sports customers attending league games. Other carriers experimenting with in-venue multi-camera streaming and other advanced features include Orange at a stadium testing site in France and U.K. MNO Three at Premier League games in London.

But a serious problem with delivering in-venue streamed viewing experiences, whether from one or multiple camera feeds, arises when audio from sound systems reaches users ahead of the streamed audio, resulting in a disorienting echo effect. Red5 has been directly involved with tests of TrueTime MultiView over in-venue connections at recent major sports events which we’re not authorized to name.

These engagements have proved beyond any doubt that it’s now possible to offer in-venue multiviewing experiences in real-time sync with onsite sound systems. Stay tuned for what comes next as these trials move to commercial rollouts.

Multi-Camera Live Streaming in Production Operations

Meanwhile, there are other barriers imposed by conventional streaming that are holding things back in the evolving realm of production and postproduction. Here multi-camera streaming in real time is critical to the movement toward dispersed collaboration in live productions.

While the ability to switch from one camera angle to another has long been a part of the live TV production process, there’s now a need to lower live-content production costs by reducing workloads at event venues to camera crew operations while locating final production at core studios or dispersed locations. Moreover, there’s a need to include camera feeds from remotely positioned commentators and influencers in the live production process.

These requirements can’t be met without synchronized reception and shared workflow engagement in real time at all workstations wherever they might be. TrueTime MultiView for Production Monitoring makes this possible.

Here the primary difference from MultiView for Fans is the inclusion of Red5’s Mixer Node technology in MultiView for Production Monitoring, which facilitates mixing any combination of audio and video feeds into the stream going out over the XDN infrastructure to end users. This gives producers collaborating over any distance in real time great latitude in determining what end users see moment to moment, ranging from the content of a single A/V feed to split-screen displays to composites of multiple video streams or single image captures.

In cases where distributors make use of Red5’s transcoding support for delivering multiple bitrate profiles in emulation of the adaptive bitrate (ABR) approach to accommodating variations in bandwidth conditions, the Mixer output is fed into Red5’s Caudron transcoder for multi-profile distribution. All of this is done while maintaining end-to-end latency at or below 400ms.
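One way to picture the ABR emulation described above is the rendition-selection step: once a transcoder has produced a ladder of bitrate profiles, the profile delivered to a client is the highest one that fits the available bandwidth. The ladder values and headroom factor below are illustrative assumptions, not Red5’s actual configuration.

```python
# Hypothetical bitrate ladder, ordered high to low. Values are assumptions.
LADDER = [
    {"name": "1080p", "kbps": 5000},
    {"name": "720p", "kbps": 2500},
    {"name": "360p", "kbps": 800},
]

def pick_profile(measured_kbps: float, headroom: float = 0.8) -> dict:
    """Choose the highest-bitrate profile that fits within the measured
    bandwidth, leaving some headroom; fall back to the lowest rung."""
    budget = measured_kbps * headroom
    for profile in LADDER:
        if profile["kbps"] <= budget:
            return profile
    return LADDER[-1]
```

A client measuring 7 Mbps of throughput would get the top rung, while one on a constrained mobile link would drop to the lowest, all without the segment-boundary switching delays of conventional HTTP ABR.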

Multi-Camera Live Streaming in Surveillance Monitoring

As for exploiting the potential of synchronized multi-camera streaming in surveillance applications, there’s a need to assemble camera feeds across the broad sweep of unfolding emergencies, crowd scenes and traffic flows into perfectly synched matrices for real-time delivery to control centers. This applies as well to video feeds going from a battlefield to regional command centers.

Red5 has made it possible for surveillance operators in multiple use cases around the world to meet this challenge. For example, one of the latest video monitoring applications utilizing Red5’s real-time multi-directional XDN streaming architecture can be found in San Diego County, CA. There the sheriff’s department and partner agencies are operating a drone streaming command center where they can control and simultaneously view a virtually unlimited number of drone camera feeds through an easy-to-use interface provided through Red5 partner Nomad Media’s content management system.

While in most instances TrueTime MultiView for Surveillance relies on the XDN architecture’s ability to implement highly scalable uses of WebRTC, users can also take advantage of the fact that the Real-Time Transport Protocol (RTP) underlying WebRTC supports the Real-Time Streaming Protocol (RTSP) widely used in video surveillance cameras and smartphones. With no need to establish WebRTC connectivity, those cameras’ outputs can be fed directly into the Red5 Pro servers positioned at the multi-stream aggregation points.

These servers, consisting of Red5 software running on commodity appliances, ingest and package any number of camera feeds for synchronized distribution over XDN infrastructure to one or more monitoring posts, including any cloud locations where analytics engines are in play. Real-time performance can be achieved using the RTSP protocol end to end, provided that receiving devices are running a client player that supports the protocol.

If clients are not natively equipped to support RTSP, the XDN platform repackages content ingested from RTSP streams for distribution via WebRTC without adding latency. XDN infrastructure is also well suited to streaming video from surveillance cameras that use Secure Reliable Transport (SRT), an open-source protocol that has gained traction as a real-time alternative to RTSP for cameras streaming high-resolution video.
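The delivery-path decision the two paragraphs above describe can be sketched as a simple dispatch: RTSP runs end to end when the receiving client supports it, and everything else is repackaged for WebRTC playback. This is an illustration of the routing logic as described, not Red5’s actual server code.

```python
# Ingest protocols named in the surrounding text. The function itself is a
# hypothetical sketch of the routing decision, not Red5's implementation.
SUPPORTED_INGEST = {"RTSP", "SRT", "RTMP", "MPEG-TS", "ZIXI"}

def delivery_path(ingest_protocol: str, client_supports_rtsp: bool) -> str:
    """Decide how a camera feed reaches a monitoring client."""
    proto = ingest_protocol.upper()
    if proto == "RTSP" and client_supports_rtsp:
        return "RTSP end-to-end"       # no repackaging needed
    if proto in SUPPORTED_INGEST:
        return "repackage to WebRTC"   # browser playback, no plug-ins
    raise ValueError(f"unsupported ingest protocol: {ingest_protocol}")
```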

A Common Framework for Mixing and Matching TrueTime Applications

As explained in the foregoing discussion, given the different challenges associated with activating multi-camera live streaming in each use case category, there are significant elements differentiating the TrueTime Multiview Fan, Production Monitoring, and Surveillance toolsets. But they all have in common reliance on the unique capabilities that distinguish XDN architecture, not only from HTTP streaming architecture but also from other platforms that rely on WebRTC as the primary streaming mode.

WebRTC, the most commonly used XDN transport mode, is ideal because client-side support for the protocol has been implemented in all the major browsers, including Chrome, Edge, Firefox, Safari, and Opera, eliminating the need for plug-ins or purpose-built hardware. XDN architecture has been designed to overcome the scaling issues widely associated with WebRTC, enabling real-time streaming at any distance to any number of end users at end-to-end latencies below 400ms, and often below 200ms when transcontinental distances aren’t in play.

In addition, along with supporting real-time transport via RTSP and SRT as dictated by the types of devices in use, the XDN platform can ingest video formatted to other leading protocols used with video playout, including Zixi, Real-Time Messaging Protocol (RTMP), and MPEG-TS. RTMP and MPEG-TS streams can also be packaged for streaming on the RTP foundation with preservation of the original encapsulations in cases where clients compatible with those protocols can’t tap browser support for WebRTC.

XDN instantiations can also be configured to hand off content for conventional streaming over HTTP Live Streaming (HLS) in rare instances when there’s no other way to deliver the content to end user devices. And, as noted, XDN-based Caudron transcoders can be used to replicate ABR profiles with the real-time streamed content, which applies with any XDN-supported transport protocol.

TrueTime MultiView, along with leveraging WebRTC as the transport foundation for building multiviewing applications in the consumer, production, and surveillance segments, takes advantage of other key standards to streamline implementations. These standards, which are used in all TrueTime toolsets, include:

WebRTC-HTTP Ingestion Protocol (WHIP) and WebRTC-HTTP Egress Protocol (WHEP) – These are the default modes used in TrueTime-enabled use cases to expedite, respectively, ingress and egress of content streams on the XDN platform. They greatly simplify mass scaling of WebRTC.

On the origination side, WHIP defines how to convey the Session Description Protocol (SDP) messaging that describes and sets up sessions allowing content streamed over WebRTC from individual devices to be ingested across a topology of media servers acting as relays to client receivers. On the distribution end, WHEP defines the SDP messaging that sets up the media server connections with recipient clients.
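The WHIP exchange described above boils down to a single HTTP transaction: the publisher POSTs its SDP offer to the WHIP endpoint as `application/sdp`, and the server replies with `201 Created`, carrying the SDP answer in the body and the session resource URL in the `Location` header (used later to tear the session down with DELETE). The sketch below shows only the shape of that transaction, assuming a placeholder endpoint; it is not a Red5 SDK API.

```python
def build_whip_post(endpoint, sdp_offer, token=None):
    """Return (url, headers, body) for a WHIP ingest request.
    The endpoint URL is supplied by the media server operator."""
    headers = {"Content-Type": "application/sdp"}
    if token:
        headers["Authorization"] = f"Bearer {token}"
    return endpoint, headers, sdp_offer.encode("utf-8")

def parse_whip_response(status, headers, body):
    """Extract the SDP answer and session resource URL from a 201 reply."""
    if status != 201:
        raise RuntimeError(f"WHIP ingest failed with status {status}")
    return body.decode("utf-8"), headers.get("Location")
```

WHEP follows the same pattern in the other direction: the player POSTs an offer to the egress endpoint and receives the server’s answer the same way.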

Key Length Value (KLV) – KLV is a globally used SMPTE standard activated in TrueTime DataSync to ensure synchronization between primary WebRTC-streamed content and ancillary data that’s delivered over the standardized WebRTC data channel. KLV defines how metadata that’s used to describe the data elements is formatted, thereby enabling automated, frame-accurate association of data with the relevant content.

Notably, with delivery of real-time video and metadata over a single connection, the automated tagging makes it easier to search and manage video and other repositories, including with the aid of AI, to ensure that time-sensitive contextual data is always displayed as intended with the video. This opens the way to an unlimited array of possibilities for building advanced features into real-time streaming applications, from personalization of user experiences and advertising in consumer services to telemetry streams that enrich collaborative capabilities in production, surveillance, and any other sphere of enterprise, institutional and government operations.
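The KLV triplet structure underlying the tagging described above is straightforward: each metadata item is a fixed-size key (a 16-byte SMPTE Universal Label), a BER-encoded length, and the value bytes. The minimal parser below is an illustration of that wire format under those assumptions, not Red5’s DataSync decoder.

```python
def parse_klv(buf: bytes):
    """Yield (key, value) pairs from a buffer of concatenated KLV triplets,
    assuming 16-byte keys and BER-encoded lengths."""
    i = 0
    while i < len(buf):
        key = buf[i:i + 16]          # 16-byte SMPTE Universal Label
        i += 16
        first = buf[i]
        i += 1
        if first < 0x80:             # BER short form: length fits in 7 bits
            length = first
        else:                        # BER long form: next N bytes hold length
            n = first & 0x7F
            length = int.from_bytes(buf[i:i + n], "big")
            i += n
        yield key, buf[i:i + length]
        i += length
```

Because each triplet is self-describing, a receiver can skip keys it doesn’t recognize, which is what makes automated, frame-accurate association of metadata with video frames practical.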

For the first time, it’s now possible to realize the full potential of multi-camera live streaming in all the use cases where its absence has become a growing source of concern. In fact, just what that full potential is has been greatly expanded with the introduction of TrueTime MultiView.

As immediate needs for live multiviewing capabilities drive the implementation of MultiView across multiple market segments, users of the technology will find they are well positioned to take advantage of XDN-based real-time multi-directional streaming for any use case at any scale over any distance. To learn more about the possibilities, contact us or schedule a call.