How to Support the 2 Major Approaches to Multi-View Live Streaming



Consumers’ exposure to multi-viewing experiences is raising the bar for providers of live-streamed sports, esports, music performances and other events who want to attract viewers by delivering services that look nothing like broadcast TV.

Maturing technologies along two tracks are finally antiquating the one-view status quo: one approach allows viewers to choose from multiple camera feeds as they watch; the other enables them to look at whatever they choose across a seamlessly captured 360° panoramic view, in some cases requiring use of virtual reality (VR) head-mounted devices (HMDs) but in many not.

Presently, commercial momentum centers on the former. While the 360° approach was a focus of activity across major sports leagues three years ago, deficiencies in the quality of experience led to a sharp drop in support last year.

Nonetheless some providers have stayed the course, and others have just begun to get involved as the technology has improved. The trend is sure to accelerate as 5G takes hold, as reflected in a 2019 Amdocs survey that found 63% of the top 100 network operators worldwide plan to support VR-based 360° viewing and/or augmented reality (AR)-aided features with their 5G services in the years ahead.

Meanwhile, support for viewers’ ability to choose from multiple camera feeds is taking off after years of failed attempts dating back to the early days of cable TV. Now that it can be done cost effectively in the streaming era, ever more service providers and event producers worldwide are jumping on the opportunity.

Consumers Want Multi-Viewing Options

And no wonder. As recorded in numerous consumer surveys, multi-viewing offered on connected TVs and devices of every description is proving to be a major draw along with other advanced features that were never available to traditional TV viewers. For example, a 2020 Verizon Media survey of sports fans worldwide found that 30% want the ability to switch viewing angles while they’re watching live sports, on par with the shares citing time shifting (30%) and the ability to skip ads (30%), while easier access to game highlights ranked even higher (42%).

Similarly, a survey of 15,000 sports fans conducted by Deloitte in 2018 found that, beyond quality of service, device diversity and support for time shifting, the ability to customize viewing options was a top priority for more than 25% of respondents. “While streaming platforms should first focus on getting picture quality right, the introduction of these customizations can help drive users from casual fans to fanatics, increasing their platform stickiness, and generating significant potential returns,” Deloitte said in its survey report.

As a 2020 report from Singula Decisions on the results of another consumer survey notes, there remains a sameness to user experience (UX) that often reflects disregard for these preferences. In a statement accompanying that survey’s results, Singula CEO Bhavesh Vaghela said the findings showed that “OTT brands must think differently about how they build a service and experience that best suits the needs of their customers – and do a better job to emotionally connect with their customers to build trust and loyalty.”

Whichever approach to multi-viewing service providers want to pursue, a big factor in the opportunity today has to do with streaming infrastructure. The heavy lifting once required to implement multi-viewing is rapidly becoming a thing of the past thanks to the availability of production platforms built on real-time interactive streaming infrastructures that overcome the limitations of traditional CDNs.

As a result, support for multi-viewing options is likely to become essential for everyone involved in producing live-streamed content. The more consumers gain exposure to such viewing experiences, the harder it will be for providers to adhere to the old norms.

The Increasing Momentum Behind Multi-Viewing

Developments with both approaches to multi-viewing tell the story. Here are some examples of how service providers are responding to consumer demand for such options, not only in sports but in esports and music as well:

  • Formula 1 and NASCAR – Online coverage of car races was one of the first arenas where multi-viewing became a routine aspect of streaming services. Both of the leading producers in professional racing deliver a customizable viewing experience featuring live streams from in-car and pitstop cameras.
  • NBA – Starting with the 2020-21 season, the NBA Digital service made it possible for viewers to select alternate “Rail Cam” and “Courtside Cam” angles as they watch games.
  • Fox Sports – In a preview of things to come as 5G takes hold, Fox Sports teamed with Samsung during the 2020 MLB National League Championship Series, the entirety of which was staged at the Texas Rangers’ stadium in Arlington, to deliver an Android app that offered Samsung 5G phone owners multi-viewing options from five cameras positioned around the playing field.
  • Verizon – With New York area coverage of a NY Giants-Tampa Bay Buccaneers game in November 2020, Verizon kicked off rollout of 5G-streamed NFL games enabling viewing from multiple camera angles along with some augmented reality (AR) features with plans to expand the offering in tandem with availability of 5G Ultra Wideband service.
  • Blizzard Entertainment – The producer of esports events tied to its Call of Duty and Overwatch titles has taken a different approach to multi-viewing in conjunction with the commentary on gaming action provided by multiple professional observers. Taking turns at delivering their commentary during the frenetic action, observers can choose from five different feeds to pick the one they want viewers to see as they talk about the state of play.
  • Big Hit Entertainment – Starting in 2020 this South Korea-based concert producer began offering viewers freedom to choose from multiple camera angles as they stream live events. The kick-off to BHE’s “Multi-view Live Streaming” occurred with a performance by boy band BTS that allowed fans to choose viewing angles from six cameras.
  • Les Ballets de Monte-Carlo – This ballet production company recently introduced multi-viewing options allowing online viewers of its performances to switch back and forth among camera feeds tracking individual dancers and the orchestra as well as the show as a whole.
  • FuboTV and DAZN – Both of these OTT sports service providers are taking another, increasingly common, tack in the multi-viewing trend by enabling subscribers using Apple TV to switch back and forth for full-screen viewing among any group of four channels they choose to run on their displays.
  • 360° viewing – Not to be confused with immersive VR experiences, 360° viewing is a 2D planar approach that lets viewers pan across the field of view (FOV) with screen swipes on a handheld device or a turn of the head when wearing an HMD. Unlike immersive VR, services supporting the 2D mode frequently allow viewers to zoom in on points of interest. Current offerings from some leading providers include:
  • BT – The U.K. operator, which has been offering 360° viewing of select soccer competitions with smartphones and HMDs since 2017, has stayed the course with more frequent coverage.
  • NBA – Another long-time provider of 360° viewing, the league, now partnered with Facebook’s Oculus and the VR production platform supplied by Verizon’s RYOT unit, has offered seven games with a VR viewing option in its 2021 season. The experience is more immersive than 2D panning but stops short of the full immersion known as six degrees of freedom.
  • WNBA – Another Oculus partner, the women’s professional basketball league has featured seven games supporting 360° viewing during the 2021 season.
  • Verizon – The carrier has been aggressively promoting 5G Ultra Wideband service by offering 360° streaming from multiple event venues, including the Indianapolis 500, Liga MX soccer games, the 2021 Oscars and Live Nation music clubs throughout the U.S.
  • VR Master League Esports competition – This young producer of VR game competitions has streamed 360° viewing of six championship events to Oculus Quest HMD owners in 2021.

The Right Approach to Multi-View Streaming Opens Many Doors

Surging worldwide adoption of multi-viewing with live-streamed content, as evidenced by these many examples, makes clear the need for real-time interactive streaming infrastructure. That’s not just because there’s no better way to support either of the two approaches discussed here. Equally important, consumer demand for these options is part of a broader demand for the kinds of features enumerated in the previously cited Verizon Media and Deloitte surveys.

Moreover, as discussed at length in a previous blog, consumers want to socialize around live event viewing with synchronized real-time viewing and video-streamed interactions at scales beyond many of the watch-party features now on offer. The only way to incorporate all these capabilities into a compelling live-event streaming service is to move beyond the limitations of one-way CDN infrastructures.

For example, in the case of multi-viewing supported by simultaneously streaming individual camera feeds for selection by each user, delivering all those options at once to each end user is a costly and unnecessary use of bandwidth. A much better approach utilizes the capabilities embodied in real-time multi-directional streaming architecture as instantiated by Red5 Pro’s Experience Delivery Network (XDN) technology.

XDNs rely on a server software stack hierarchically deployed and orchestrated in three-tiered clusters that can be scaled without limit across one or more public or private clouds. Each cluster consists of one or more core origin nodes where just one version of each live stream encode is ingested and streamed out to relay nodes and from there to intelligent edge nodes, which deliver a stream matched to each end user in the node serving area.

When multiple camera feeds are in play, all streamed views arrive in perfect sync at the edge nodes, enabling each user to choose the view they prefer. Typically the viewer makes the selection from a low-bandwidth mosaic of options that can be called up on demand to run as a sidebar alongside the currently viewed screen. As with all content streamed over the XDN, the selected view reaches each user within a 200-400ms end-to-end latency window, which is perceptually tantamount to real time.
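
To make the bandwidth saving concrete, the selection logic can be sketched as follows. This is an illustrative model only, not the Red5 Pro SDK surface: the client holds exactly two subscriptions, one full-resolution stream for the chosen camera and one low-bandwidth mosaic of all feeds, and switching cameras swaps only the full-resolution stream.

```typescript
type FeedId = string;

interface Subscription {
  feed: FeedId;
  quality: "full" | "mosaic";
}

// Hypothetical client-side session model: one full-res feed plus a single
// low-bandwidth mosaic, instead of every camera feed at full quality.
class MultiViewSession {
  private current: FeedId;

  constructor(private feeds: FeedId[], initial: FeedId) {
    if (!feeds.includes(initial)) throw new Error("unknown feed: " + initial);
    this.current = initial;
  }

  // The subscriptions the client keeps open at any moment.
  plan(): Subscription[] {
    return [
      { feed: this.current, quality: "full" },
      { feed: "mosaic", quality: "mosaic" },
    ];
  }

  // Switching cameras swaps the full-res stream; the mosaic is unchanged.
  switchTo(feed: FeedId): Subscription[] {
    if (!this.feeds.includes(feed)) throw new Error("unknown feed: " + feed);
    this.current = feed;
    return this.plan();
  }
}
```

However the selection UI is built, the point is that total downstream bandwidth stays roughly constant no matter how many cameras the production adds.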

Real-time streaming is also essential to any service offering the 360° approach to multi-viewing. Here a different process is in play, one that avoids the massive per-user bandwidth consumption that would occur if the entire 360° field of view (FOV) were delivered to each device. Instead, most 360° viewing platforms now in commercial use deliver at high resolution the portion of the FOV the viewer is focused on, while the rest of the FOV visible on screen at that instant is delivered at lower resolution. These techniques require an instantaneous response to each hand swipe or turn of the head.
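
A minimal sketch of that viewport-adaptive idea: divide the panorama into equal-width tiles and fetch at high resolution only the tiles overlapping the viewer's current FOV. The tile count and FOV width here are illustrative assumptions, not parameters from any specific platform.

```typescript
// Given the viewer's yaw (degrees) and a horizontal FOV, return which
// equal-width tiles of the 360° panorama should be fetched at high
// resolution; all remaining tiles stay at the low-resolution base layer.
function highResTiles(yawDeg: number, fovDeg: number, tileCount: number): number[] {
  const tileWidth = 360 / tileCount;
  const half = fovDeg / 2;
  const tiles: number[] = [];
  for (let i = 0; i < tileCount; i++) {
    const center = i * tileWidth + tileWidth / 2;
    // Shortest angular distance between tile center and viewing direction.
    let d = (((center - yawDeg) % 360) + 360) % 360;
    if (d > 180) d = 360 - d;
    // Include any tile whose span overlaps the FOV.
    if (d <= half + tileWidth / 2) tiles.push(i);
  }
  return tiles;
}
```

Each head turn or screen swipe changes the yaw, which changes the high-resolution tile set; this is why the round trip to request new tiles must complete in real time, before the viewer notices the blur at the edge of the old viewport.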

When an XDN is streaming live content supporting either of these multi-view options, service providers are positioned to support the other advanced features mentioned earlier, including access to highlights, graphics and data related to what the user is looking at or whatever else interests them as well as watch parties. This is exactly how Singular.Live, one of the leading enhanced feature platform providers, is using XDN architecture to support the per-camera feed approach to multi-view streaming for some of the entities listed above, along with any of the other features providers might choose.

Singular, a sister company to graphics and interactive applications developer Reality Check Systems (RCS), supports delivery of these enhancements as independently streamed overlays that can be rendered in recipients’ browsers in sync with the primary video, allowing every user to have their own customizable viewing experience on any type of connected device.
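
One common way to keep separately streamed overlays in step with the video is to timestamp each overlay event against the video's presentation clock and render whichever event is current for the playback position. The sketch below illustrates that general technique with hypothetical field names; it does not reflect Singular's actual data model.

```typescript
interface OverlayEvent {
  pts: number;     // presentation timestamp in ms, shared with the video clock
  payload: string; // e.g. a scoreboard update to render in the browser
}

// Return the latest overlay event at or before the current playback time,
// or null if playback has not yet reached the first event. Because both
// streams reference the same clock, the overlay stays in sync even though
// it travels independently of the video.
function activeOverlay(events: OverlayEvent[], playbackMs: number): OverlayEvent | null {
  let active: OverlayEvent | null = null;
  for (const e of events) {
    if (e.pts <= playbackMs && (active === null || e.pts > active.pts)) {
      active = e;
    }
  }
  return active;
}
```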

Along with requiring synchronization of the selected screen view and overlay feeds on every device, an effective UX with live-streamed sports, esports and other events requires the kind of ultra-low, imperceptible latency attained through streaming on the Red5 Pro platform, says RCS president Andrew Heimbold. “We’ve been working with Red5 Pro because they have an amazing scalable infrastructure that allows you to synchronize distribution of the video and overlay streams at ultra-low latency to millions of end users,” he notes.

The Multi-Protocol Flexibility of XDN Architecture

The XDN relies on the real-time communications capabilities of the Real-time Transport Protocol (RTP), which underlies IP-based voice communications and is the foundation for both WebRTC (Web Real-Time Communication), originally developed for peer-to-peer video communications, and RTSP (Real Time Streaming Protocol), a streaming control protocol widely used with mobile devices as an alternative to HTTP-based delivery.

WebRTC, the most commonly used XDN transport mode, is ideal because client-side support for the protocol has been implemented in all the major browsers, including Chrome, Edge, Firefox, Safari and Opera, eliminating the need for plug-ins or purpose-built hardware. XDN architecture has been designed to overcome the scaling issues widely associated with WebRTC, enabling real-time streaming at any distance to any number of end users. For streaming live content to mobile devices, however, Red5 Pro SDKs enable use of RTSP as the simplest way to reach those devices.

Along with ingesting any content delivered via WebRTC or RTSP, the Red5 Pro XDN can ingest video formatted to all the other leading protocols used with video playout, including Real-Time Messaging Protocol (RTMP), Secure Reliable Transport (SRT) and MPEG Transport Stream (MPEG-TS). These are packaged for streaming on the RTP foundation with preservation of the original encapsulations for egress to clients that can’t be reached via WebRTC or RTSP.

The XDN also preserves the benefits of adaptive bitrate (ABR) streaming without incurring the multi-second latencies of HTTP-based CDNs. Origin nodes ingest the full ABR ladder of profiles and stream them over the RTP-based transport system in push mode to edge nodes. From there the content is streamed in the profile matched by node intelligence to each session in accord with client device characteristics and access bandwidth availability.
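
The edge-node profile matching described above amounts to a ladder-selection step per session. The sketch below illustrates that step; the ladder rungs and function name are assumptions for illustration, not part of any Red5 Pro API.

```typescript
interface Rung {
  name: string; // e.g. "720p"
  kbps: number; // encoded bitrate of this ladder rung
}

// Pick the highest ABR ladder rung whose bitrate fits within the client's
// measured access bandwidth, falling back to the lowest rung when even
// that exceeds what the connection can carry.
function matchProfile(ladder: Rung[], availableKbps: number): Rung {
  const sorted = [...ladder].sort((a, b) => a.kbps - b.kbps);
  let chosen = sorted[0]; // worst case: lowest rung
  for (const rung of sorted) {
    if (rung.kbps <= availableKbps) chosen = rung;
  }
  return chosen;
}
```

Because the selection happens at the edge per session, only one rendition crosses the last mile to each viewer, while the origin ingests each encode exactly once.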

Initiatives underway worldwide are demonstrating that service providers who leverage recent advances linking cloud-based feature enhancement with real-time streaming can enable far more compelling engagements with live-streamed content than ever before. Multi-viewing is now an essential component of UX. So, too, is personalization and support for highly scalable video-based interaction during watch parties.

To learn how all these ingredients to next-gen UX can be readily implemented with recourse to XDN technology, contact us or schedule a call.