5 Big Use Case Categories for Developing VR Live Streaming Experiences

After years of hype and dashed expectations, the growing presence of virtual reality in entertainment and other fields is opening opportunities for providers who can surmount the challenges of delivering live VR live streaming experiences over fixed and mobile networks.

Critically, much better viewing experiences, often delivered through untethered computerized head-mounted devices (HMDs), are supplanting the stomach-churning, low-resolution visuals that have long plagued VR. Along with higher resolution, technical advances have led to more accurate responses to user actions through improved motion tracking, which, in some cases, can register eye movement as well as head and hand motion.

Issues remain, of course. Good HMDs, priced anywhere from $300 to well over $1,000, are still too pricey for many consumers. Set-up can be complicated with some models. And there’s a lot of room for reductions in form factors, as evidenced by recent prototype demonstrations featuring eyewear not much bigger than sunglasses.

VR technology will continue to get better, but the fact that it is now good enough to merit attention by producers and service providers who want to be first movers with network delivery of live VR experiences seems beyond dispute. The point is buttressed by recent research studies projecting rapid growth across multiple fields in the years ahead.

For example, IDC predicts global spending for VR content, equipment and services across all consumer and enterprise segments will hit $10.5 billion in 2020 and grow to $103 billion in 2023. Fortune Business Insights predicts the spend will top $120 billion by 2026, marking a 42% CAGR from the 2019 total.

These projections are in accord with broad recognition that the technology is taking off as a tool in design, training and other non-entertainment applications. But spending on the consumer side is on an ascending curve as well, averaging about a 40% share of predicted spend. In the U.S., the number of consumers using VR jumped from 43.1 million at yearend 2018 to 52.1 million a year later.

During the long time it has taken for VR technology to reach commercial viability in everything from game playing to design engineering, worker training and much else on the industrial front, the networking challenges have grown less daunting, thanks to progress on compression, rendering techniques, latency compensation and support for real-time streaming.

These gains are fueling confidence among mobile carriers that VR and extended reality (XR) modes in general will contribute to demand for 5G connectivity. Qualcomm, banking on a big 5G market for its Snapdragon 855 and 865 chipsets, has taken a leading role in fostering development of eyewear and XR functionalities with launch of an XR Optimized Certification Program aimed at certifying interoperability among 5G devices using its chipsets.

Most of the carriers involved in the program have not committed to specific dates and services, but some have made clear they are moving forward on a fast track to overcome the proverbial chicken-and-egg problem by ensuring there will be VR content available to induce HMD owners to sign up for 5G. Deutsche Telekom, which has already experimented with VR coverage of music events, is one case in point with an initiative aimed at delivering sports, gaming and other VR content over its 5G network.

South Korea’s KT, too, is on an aggressive VR track. Having already launched a “Super VR TV” IPTV service, the carrier has signaled it will be among the first to introduce VR on 5G networks.

In the U.S., Verizon has taken a leading role in VR development with launch of Envrnmt, a seedbed for developers at its New Jersey facilities, and acquisition of RYOT, an L.A.-based immersive content studio. The carrier is also partnered with Walt Disney Studios to explore the possibilities of 5G connectivity for VR entertainment and has been engaged with various partners in delivery of live VR content over its own and other fixed broadband networks.

But the question remains, what will ignite the shift to mainstream usage? Evidence of VR’s growing appeal and the opportunities for networked support is not hard to find. Following is a summary of some of the more salient cases in point.


Most content developed for VR, ranging from games to documentaries and even full-length dramas and movies, has been made available for downloading over networks without a live streaming requirement. The exceptions, all areas of intensifying activity, include multiplayer gaming, sports, concerts and other live events.

Multiplayer Gaming and Socialization

The migration of multiplayer fast-action competition to the VR space is supported in many cases by networking a handful of players over premises LANs. But ever more developers are targeting mass engagement over Internet connections.

In the latter case, parties have designed sophisticated synchronization mechanisms that compensate for sub-second latency delays in gaming interactions among competing users. Judging by reviews from online game commentators, several titles are drawing sizeable audiences.

At the same time, more socially oriented multiplayer environments like The Playroom VR and Rec Room, both offered as free apps on various HMD platforms, are taking hold as well. Such platforms allow players in avatar mode to interact with each other in settings where they can play darts, paintball, laser tag and other games. VRChat, another popular app, supports immersive interactions among people watching game-playing by professional streamers.

The Evolving Sports Scenario

Sports have become a major area of VR development by leagues and distribution affiliates who are exploring delivery of 180° or 360° immersive viewing experiences. Prior to the pandemic, VR coverage was beginning to appear intermittently on season schedules in most major sports and was prominently featured during the 2018 Winter Olympics and the FIFA World Cup.

So far, the NBA has shown the biggest commitment to the technology with dozens of VR broadcasts over the past three seasons. During this year’s Covid-aborted season, the NBA relied on technology supplied by Verizon’s RYOT unit after the previous supplier, NextVR, was acquired by Apple. The RYOT-powered broadcasts introduced a new wrinkle to the viewing experience: support for communications among viewers “seated” near each other.

There’s also been growing support for non-immersive (2D) 360° viewing of live sports events, which allows users to view playing fields from any angle with screen swipes on smartphones and other devices. BT, for example, has been offering such viewing experiences with select soccer matches since 2017.

Across all sports, there have been stumbles along the way, with complaints about bad viewing experiences leading some leagues to back off even as others doubled down. But the user experience keeps improving, as evidenced by an article published by Fast Company following last year’s NBA playoffs with this headline: “I watched the NBA Playoffs in VR, and it’s going to change how you watch sports.”

Non-Entertainment Applications

Collaboration in Design, Research & Everyday Business Operations

VR, spurred by the pandemic, is gaining wide traction for remote collaboration in business, government and institutional operations. Often, the collaboration involves participation in VR-based design applications. For example, a platform built by Iris VR is used by architects not just for their own internal work but also to provide clients immersive walkthroughs before moving to construction.

In aircraft design, Airbus, using what’s known as RAMSIS (Realistic Anthropological Mathematical System), is able to create immersive 3D renderings of aircraft cabin designs that allow developers to better understand ergonomic implications of their concepts and to review component installation and maintenance processes. Similarly, Boeing has employed VR for designing and simulated testing of aircraft, utilizing specially trained development teams to look at issues related to human comfort in the preproduction phase.

VR is prominent in auto design as well. Ford, which has long used VR technology in some design work, recently began using Oculus Rift headset technology to expand the role of VR for reviewing every facet of proposed designs down to the minutest details.

Training & Education

VR has captured wide interest as an education tool across academic disciplines at all grade school and college levels, as well as for business training in multiple industries. So far, applications are taking hold faster in business and health than in academia, where headgear costs and cultural resistance are major impediments.

But the experience of remote learning during the pandemic could lower such barriers. As costs fall and VR goes into wider use, the technology could emerge as an important learning tool that puts students in one-on-one interactions with virtual mentors whose approaches to teaching and responses to student input can be individualized through use of AI.

In one of the first studies devoted to analyzing whether VR learning aids recall, University of Maryland researchers discerned a nearly 10% improvement in learning by students using VR compared to students studying the same material without headsets. The more important driver behind educators’ interest, however, is likely to be the prospects for augmenting learning through the individualized mentoring made possible by VR in today’s crowded classrooms.

Meanwhile, VR-based training is cropping up with increasing frequency across the business world. For example, Walmart has used VR to prepare employees for Black Friday sales by immersing them in a virtual environment with big crowds and long lines. In Houston, people looking to work in the HVAC sector can get VR-based training in the trade at the Training Center of Air Conditioning and Heating.

A lot of what had been VR training involving large facilities, like the simulated flight centers used in commercial and military aviation, is moving to platforms that rely on use of headsets to make learning more portable with availability to more people at one time. The emergence of platforms supporting 6DoF (six degrees of freedom, which is to say, movement in all directions) has made the headset approach more palatable in situations where trainees have to move around, as in the case of the 200 law enforcement agencies that are reported to be using the VR training environment supplied by VirTra Systems.

As the technology-driven need for continuous workforce training intensifies, VR offers an alternative to allocating physical space for on-premises training. In a recent survey of marketing and sales professionals, the online training site HubSpot Academy found that 57% of respondents were interested in learning something new in a VR environment.


Healthcare

In healthcare, VR is proving useful in many areas, including surgery, diagnostics, pain control, injury rehabilitation and treatment for conditions affecting mental health. And, as in other fields, the medical profession is also using VR for training and basic research.

As networked telemedicine gains more traction, some of these utilizations of VR technology will be available for remote treatment. Emergencies requiring pain mitigation offer a case in point.

For many institutions VR pain treatment began with use of “SnowWorld,” a virtual-reality game developed by University of Washington researchers Hunter Hoffman and David Patterson in the early 2000s to mitigate burn patients’ discomfort by immersing them in a polar world where they can throw snowballs at penguins and snowmen. Now such programs are widely available from multiple suppliers like AppliedVR, which says its VR kit with therapeutic content is in use at 100 hospitals nationwide.

VR treatment supplementing opioids in the early phases of high-pain situations has compiled a history of success stories that include over 2,500 cases administered at Cedars-Sinai Medical Center in Los Angeles since 2016 and treatments extending back to 2004 at nearby Children’s Hospital. As quoted by the Washington Post, Jeffrey I. Gold, director of the Pediatric Pain Management Clinic at Children’s Hospital, said, “Virtual reality is part of our culture now, so it’s not as alien of a technology as it once was. I think people look at it as an opportunity to deliver better patient care.”

Some of the most dramatic medical uses of VR involve surgical procedures that rely on guidance from pre-surgery virtual modeling based on CT, ultrasound and MRI scans. For example, surgeons at Masonic Children’s Hospital in Minneapolis employed body models of conjoined infant twins to plan a successful separation. In another case involving a baby born with one lung and half a heart, surgeons at Nicklaus Children’s Hospital in Miami used VR to shape an operation that saved the child’s life.

Such capabilities are spawning a VR supply chain. For example, Bioflight VR supports automated rendering of MRI and CT data in virtualized models of real patients’ anatomies.

With networking support, specialists in VR techniques could be called on remotely to help surgeons who don’t have that expertise.

Meeting the Network Distribution Challenge

VR introduces new challenges when it comes to streaming live, interactive immersive content. But they are no longer insurmountable in an era characterized by high fixed bandwidth connectivity, 5G and real-time streaming as enabled by Red5 Pro.

That’s the case even though the closer OEMs get to enabling a first-class VR experience, the bigger the distribution challenge becomes. These days it’s generally agreed that the optimal experience calls for delivering two 4K feeds, one for each eye, with at least 10-bit coding at 60 fps or better. Using HEVC encoding, that gets into 40+ Mbps territory.

But that assumes only the pixels comprising the user’s field of view (FOV) within the HMD’s viewport (display parameters) are being transmitted at any moment, leaving open the question of what happens when the viewer looks in another direction. Providing all the data needed to update the entire volumetric panorama on a frame-by-frame basis would run into multiple gigabits per second.

To keep the transmission rate at or below the 40 Mbps range requires transmission of whatever is essential to updating the FOV based on where the viewer is looking at each split second. That’s doable provided latency in delivering the new data is low enough to avoid unrealistic and potentially disorienting delays between the instant the brain expects to see what has entered the FOV and when the image is actually rendered on the HMD display.
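The arithmetic behind those figures can be sanity-checked in a few lines. In this sketch the HEVC compression ratios and the full-field resolution are illustrative assumptions chosen for the exercise, not measured values:

```python
# Back-of-envelope VR bitrate estimates. The compression ratios below are
# illustrative assumptions, not measured values.

def raw_bps(width, height, bit_depth, fps, chroma_factor=1.5):
    """Uncompressed bits per second for 4:2:0 video
    (1.5 samples per pixel across luma plus chroma)."""
    return width * height * chroma_factor * bit_depth * fps

# Two 4K feeds (one per eye), 10-bit, 60 fps
both_eyes = 2 * raw_bps(3840, 2160, 10, 60)        # roughly 15 Gbps raw

# Assume HEVC reaches about 350:1 on this content (illustrative)
streamed_mbps = both_eyes / 350 / 1e6
print(f"Viewport feed, compressed: ~{streamed_mbps:.0f} Mbps")

# Updating the full panorama frame by frame at comparable pixel density
# (say 12K x 6K per eye) stays in gigabit territory even at 100:1
full_field_gbps = 2 * raw_bps(11520, 5760, 10, 60) / 100 / 1e9
print(f"Full-field update, compressed: ~{full_field_gbps:.1f} Gbps")
```

With these assumptions the viewport-only feed lands in the low-40s of Mbps, consistent with the 40+ Mbps figure above, while full-field updates remain orders of magnitude heavier.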

Viewport-Independent Streaming

So far, as in the case of sports coverage, most network-based distribution of live VR experiences involves what’s known as viewport-independent transmission, in which the entire 360° or 180° field is transmitted rather than just what’s seen at any moment within the HMD viewport. Developers have made it possible to use this approach to deliver a reasonable viewing experience over high-capacity broadband networks through techniques that reduce bandwidth consumption to some extent.

For example, distributors can load the entire field incrementally into buffers, assuming users have sufficient storage capacity in their hardware configurations. This causes delays at the outset of rendering the first volumetric frame but allows the experience to flow fairly smoothly with continuous buffering of succeeding frame segments. The ongoing amount of bandwidth consumed during the viewing session is further reduced by traditional approaches to video compression where only motion-induced changes in the scene need to be transmitted.
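The incremental buffering described above can be sketched in miniature as follows; the buffer depth and segment naming are illustrative, not drawn from any particular player implementation:

```python
# Sketch of incremental segment buffering: segments of the full panorama
# are fetched ahead of the playhead, so rendering flows smoothly after an
# initial start-up delay. Depth of 3 segments is an illustrative value.
from collections import deque

class SegmentBuffer:
    def __init__(self, target_depth=3):
        self.segments = deque()
        self.target_depth = target_depth  # segments to keep buffered ahead

    def needs_fetch(self):
        return len(self.segments) < self.target_depth

    def push(self, segment):
        self.segments.append(segment)

    def pop_for_render(self):
        return self.segments.popleft() if self.segments else None

buf = SegmentBuffer()
t = 0
while buf.needs_fetch():      # the initial fill is the start-up delay
    buf.push(f"seg-{t}")
    t += 1
first = buf.pop_for_render()  # rendering begins once the buffer is full
print(first)
```

In a real player the fetch loop would run continuously alongside rendering, keeping the buffer near its target depth as frames are consumed.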

But even then, and at the low resolutions and frame rates associated with VR broadcasts to date, the need to deliver the full 360° or 180° field frame by frame imposes a bit-load penalty that compromises quality and limits service reach to users on high-capacity links, especially when variables introduced with 6DoF come into play. The problem is exacerbated by the fact that some bandwidth is unnecessarily consumed transmitting changes in the full field that fall outside the user’s viewport at any moment in time.

Viewport-Dependent Streaming

The alternative approach, as embodied in MPEG’s Omnidirectional Media Format (OMAF) for 360° VR and the OMAF-based specifications issued by the Virtual Reality Industry Forum (VRIF), involves viewport-dependent transmission, also known as “tiling.” Tiling was adopted for 2D compression in MPEG’s HEVC (High Efficiency Video Coding) Main 10 profile and has now been adapted for VR to provide a means by which only the content needed to fill the user’s viewport is transmitted.

Further contributing to bandwidth savings, the segmenting of the full viewing panorama into tiles allows the tiles assembled for each viewport to be compressed and delivered at varying degrees of resolution in tandem with how the eye registers different parts of the FOV in real life. Researchers have found that use of tiling with VR produces bitrate savings in the range of 40%-65% compared to viewport-independent methods.
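As a rough illustration of how tiling trims the bit load, the sketch below splits an equirectangular frame into a fixed grid and sends only the tiles overlapping the viewport at full quality. The grid size, FOV and per-tile bitrates are assumptions chosen for the exercise, not values mandated by OMAF:

```python
# Sketch of viewport-dependent tile selection over an equirectangular frame.
# Grid dimensions, FOV and per-tile bitrates are illustrative assumptions.

def tiles_in_viewport(yaw_deg, pitch_deg, hfov=90, vfov=90, cols=8, rows=4):
    """Return (col, row) indices of tiles overlapping the viewport."""
    tile_w, tile_h = 360 / cols, 180 / rows
    selected = []
    for c in range(cols):
        for r in range(rows):
            # Tile centers: yaw in [-180, 180), pitch in [-90, 90)
            t_yaw = -180 + (c + 0.5) * tile_w
            t_pitch = -90 + (r + 0.5) * tile_h
            d_yaw = (t_yaw - yaw_deg + 180) % 360 - 180  # wrap-around
            if (abs(d_yaw) <= hfov / 2 + tile_w / 2
                    and abs(t_pitch - pitch_deg) <= vfov / 2 + tile_h / 2):
                selected.append((c, r))
    return selected

HIGH, LOW = 8.0, 1.0  # Mbps per tile at full vs. background quality
view = tiles_in_viewport(yaw_deg=0, pitch_deg=0)
total_tiles = 8 * 4
tiled_rate = len(view) * HIGH + (total_tiles - len(view)) * LOW
full_rate = total_tiles * HIGH
print(f"{len(view)} of {total_tiles} tiles at full quality; "
      f"saving ~{1 - tiled_rate / full_rate:.0%}")
```

Even this crude model lands in the 40%-plus savings band the research cites; real encoders do better by also grading resolution across the tiles inside the viewport.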

It’s also been shown that the method maximizes quality of experience. In a paper delivered at the 2017 IEEE International Conference on Image Processing, researchers from Trinity College Dublin reported that viewport-dependent VR streamed over MPEG DASH compared favorably to other, higher bandwidth-consuming transmission methods “in terms of PSNR [peak signal-to-noise ratio] and SSIM [structural similarity index measure] inside the viewport.”

Adding to the potential bandwidth savings in tiling applications, some suppliers are exploring encoding processes akin to the update process used with the viewport-independent mode. Here the techniques rely on initial transmission of the viewport in a given session to provide the full volumetric view, which is then retained in the client so that succeeding transmissions only carry elements that are changing frame to frame within any viewport.

Developers anticipate such solutions will significantly reduce the bitrates required for delivering high-resolution feeds to each eye. Equally important, this approach will simplify use of tiling technology for 6DoF-level experiences, which has been a challenge in early application development.

Meeting New Latency Requirements

Whatever approach to streaming VR experiences is taken, support for sub-second latency is critical, especially in the case of multiplayer gaming. The speed at which the brain registers a scene with the turn of a head imposes latency restrictions that would be impossible to meet without the clever real-time manipulations alluded to earlier.

Depending on the game producer, these might include predictive intelligence, using methods known as extrapolation or interpolation, which bring an estimate of what is happening at a remote location into the viewer’s present; and lag compensation, which does the opposite, shaping the user’s present around how the server rendered the scene at a moment in the past equal to the lag time between server and user. Balancing these methods across all users achieves the smoothest and most accurate rendering of competitive action, such as the shooting of another avatar, sustaining an illusion of real-time interaction on every client.
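A minimal sketch of the two techniques, with illustrative numbers rather than any particular engine’s implementation:

```python
# Illustrative sketch of extrapolation (dead reckoning) and server-side
# lag compensation. Latencies and positions are made-up example values.

def extrapolate(pos, vel, latency_s):
    """Dead reckoning: project a remote player's last known state forward
    to estimate where they are 'now', hiding network latency."""
    return tuple(p + v * latency_s for p, v in zip(pos, vel))

def lag_compensate(history, shooter_latency_s, now_s):
    """Server-side rewind: evaluate an action against the world state the
    shooter actually saw, i.e. shooter_latency_s in the past."""
    target_t = now_s - shooter_latency_s
    # Pick the stored snapshot closest to the moment the shooter saw
    return min(history, key=lambda snap: abs(snap[0] - target_t))[1]

# Remote avatar last reported at (10, 0), moving 3 m/s on x; 120 ms of lag
est = extrapolate((10.0, 0.0), (3.0, 0.0), 0.120)   # roughly (10.36, 0.0)

# Server keeps timestamped position snapshots for rewind checks
history = [(0.00, (10.0, 0.0)), (0.05, (10.15, 0.0)), (0.10, (10.3, 0.0))]
seen = lag_compensate(history, shooter_latency_s=0.05, now_s=0.10)
print(est, seen)
```

Real engines blend both continuously per client, reconciling the extrapolated estimates against authoritative server snapshots as they arrive.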

Such techniques have made VR multiplayer gaming possible over real-time streaming infrastructures like those enabled by Red5 Pro, which can be architected to achieve end-to-end latency as low as 150-200 ms. In non-gaming applications, like live sports viewing and most of those in the non-entertainment categories, these and even the higher 200-400 ms latency levels experienced with some instantiations of the Red5 Pro platform are sufficient in themselves to deliver a superior user experience.

Every VR service strategy will be formulated in accord with all the variables related to quality of viewing experience, bandwidth consumption, latency tolerance and other requirements of the chosen use categories. To begin a discussion about how the Red5 Pro platform can be utilized in support of any given VR live streaming experience, contact info@red5.net or schedule a call.