
What’s Next For RTMP Servers: 4 Considerations


RTMP (Real-Time Messaging Protocol) has a long-established history as one of the original methods for live streaming. Originally developed by Macromedia and now owned by Adobe, RTMP was designed for delivering on-demand and live media (i.e., live audio, video, and data) over the Internet between a Flash player and a media server. However, big changes are coming for Flash: it's going away.

The year 2020 marks the last year of official Flash support. With Flash gone, we also lose the ability to run RTMP in web browsers. That raises the question: what's next for RTMP servers?


Latency is Critical

Quite simply, RTMP will be replaced by modern protocols. But just as RTMP provided a delivery method for low-latency video streaming, its replacements will need to focus on latency as well. This makes latency a critical consideration.

Furthermore, low latency is necessary for building high-value streaming applications. As we've discussed before, live streaming should actually be live, and that means delivering media as fast as possible.

The only way to have truly interactive experiences with natural conversation is with the lowest possible latency. Additionally, ever-increasing smartphone adoption is further fueling the demand for real-time latency. As more and more users expect high speeds, the products they use will reflect that expectation.

Now that we’ve established the need, how do we actually achieve sub-second latency? We can approach it from both sides of a live stream: ingest (broadcast) and egress (subscribe).


Ingest

Both SRT (Secure Reliable Transport) and WebRTC are good options for ingesting live streams with minimal latency.

SRT is an open-source video transport protocol and technology stack that optimizes live stream delivery through firewalls and across unreliable networks. However, because SRT was designed primarily for hardware encoders and has not been adopted as a Web standard, it is not available in modern browsers.

On the other hand, WebRTC does work in browsers. Its tech stack allows for camera and microphone access on a wide range of devices. By using SRTP, an efficient UDP-based transport, WebRTC can deliver video with the lowest latency currently possible, while still maintaining high-quality video even in less-than-ideal network conditions.
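To make the ingest side concrete, here is a minimal sketch of publishing a camera and microphone from a browser over WebRTC. The `signalOffer` callback is a placeholder for whatever signaling channel your server uses (WebSocket, HTTP POST, etc.); the STUN server URL is a commonly used public default, not a requirement.

```javascript
// Build the RTCPeerConnection configuration. The STUN URL here is a
// widely used public default; swap in your own ICE servers as needed.
function buildPeerConfig(stunUrl = "stun:stun.l.google.com:19302") {
  return { iceServers: [{ urls: [stunUrl] }] };
}

// Publish the local camera and microphone over WebRTC (browser-only).
// `signalOffer` is a placeholder: it must deliver our SDP offer to the
// ingest server and return the server's SDP answer.
async function publishStream(signalOffer) {
  // Ask the browser for camera and microphone access.
  const media = await navigator.mediaDevices.getUserMedia({
    video: true,
    audio: true,
  });

  const pc = new RTCPeerConnection(buildPeerConfig());
  // Hand each captured track to the connection; it travels over SRTP.
  media.getTracks().forEach((track) => pc.addTrack(track, media));

  // Standard offer/answer exchange.
  await pc.setLocalDescription(await pc.createOffer());
  const answer = await signalOffer(pc.localDescription);
  await pc.setRemoteDescription(answer);
  return pc;
}
```

The heavy lifting (encoding, encryption, congestion control) happens inside the browser's native WebRTC stack; the application code only wires up media capture and signaling.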

In short, there are two replacements for RTMP: SRT for hardware encoders and WebRTC for browsers.

Technically, HTTP-based protocols such as HLS or CMAF could be considered replacements, but their comparatively high latency makes them a poor fit for real-time live streaming video.


Egress

WebRTC not only performs the work of broadcasting (ingest) but handles receiving video (egress) as well. Since WebRTC works natively in browsers, you can connect to an egress WebRTC server and consume a video stream over SRTP. All of the decoding (and encoding, for that matter) is performed in native code, so the stream can be rendered directly in the browser.
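The egress side mirrors ingest: a sketch, again with `signalOffer` standing in for your signaling exchange. The connection is receive-only, and the resulting `MediaStream` can be assigned to a `<video>` element's `srcObject` for playback.

```javascript
// Subscribe to a remote stream over WebRTC (browser-only).
// `signalOffer` is a signaling placeholder; `onStream` receives a
// MediaStream suitable for a <video> element's srcObject.
async function subscribeStream(signalOffer, onStream) {
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: ["stun:stun.l.google.com:19302"] }],
  });

  // Receive-only: ask the server to send audio and video, nothing back.
  pc.addTransceiver("audio", { direction: "recvonly" });
  pc.addTransceiver("video", { direction: "recvonly" });

  // Fired once per incoming track; event.streams[0] groups the tracks
  // into a single playable MediaStream.
  pc.ontrack = (event) => onStream(event.streams[0]);

  await pc.setLocalDescription(await pc.createOffer());
  const answer = await signalOffer(pc.localDescription);
  await pc.setRemoteDescription(answer);
  return pc;
}
```

Usage would look like `subscribeStream(mySignaler, (s) => { videoEl.srcObject = s; })`, with `mySignaler` and `videoEl` supplied by your application.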

Again, HTTP-based streaming protocols such as HLS or MPEG-DASH are capable of video egress, but they are not a practical choice for low-latency streaming.

For more detail on how RTMP works, take a look at our other blog post.


Scaling Servers

No matter what protocol is used, it will need to handle multiple broadcasters and subscribers. For the best performance and distribution of streams, multiple server instances are necessary.

There is a common misconception that WebRTC is not scalable because it establishes peer-to-peer connections. Under traditional modes of thinking, this was technically true. However, creative restructuring of the scaling infrastructure allowed the cloud-based Red5 Pro Autoscaling Solution to support millions of concurrent connections, all with under 500 milliseconds of latency.

Red5 Pro reimagined the entire architecture from back-end server to front-end browser and mobile app integration.

By leveraging cloud infrastructure, Red5 Pro’s Autoscaling Solution automatically scales the system up or down in response to current conditions. Under the operating logic of a Stream Manager (a Red5 Pro server application that manages traffic and monitors server usage), clusters, or NodeGroups (groups of one or more active server nodes), are established in the geographic regions where the streaming will be happening.
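From the client's perspective, this architecture means one extra step before the WebRTC handshake: ask the Stream Manager which node should serve the stream, then signal with that node. The sketch below is illustrative only; the endpoint path and the `serverAddress` response field are assumptions for this example, not the exact Red5 Pro API (see the Red5 Pro documentation for the real one).

```javascript
// Build the URL a client would use to ask a Stream Manager-style service
// which node should handle a given stream. Path and parameters are
// hypothetical stand-ins for this sketch.
function buildNodeRequestUrl(host, scope, streamName, action) {
  return `https://${host}/streammanager/api/event/${scope}/${streamName}?action=${action}`;
}

// Resolve a node, then proceed with the normal WebRTC offer/answer
// exchange against the returned address. `serverAddress` is a
// hypothetical field name for the chosen node.
async function resolveNode(host, scope, streamName, action = "subscribe") {
  const res = await fetch(buildNodeRequestUrl(host, scope, streamName, action));
  const info = await res.json();
  return info.serverAddress;
}
```

Broadcasters would make the same request with an `action` of `broadcast` to be routed to an origin node, while subscribers are routed to an edge near them.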

More detail can be found in the Red5 Pro Documentation. If you have any questions about scaling or other aspects of low latency live streaming video, please send an email to info@red5.net or schedule a call.

Red5 Team

