Ruby on Rails is an open-source web application framework, written in the Ruby programming language, with a large following. According to its website, it optimizes “for programmer happiness with Convention over Configuration.” No wonder that some very successful platforms such as Twitch, Basecamp, GitHub and Square are built using Ruby on Rails.
Live streaming has enjoyed exponential growth, especially over the last few months. From chatting with friends to live entertainment, and even conducting essential business services, more and more people are looking to create live streaming applications. Naturally, developers well versed in Ruby on Rails will be among those tasked with creating new applications or modifying existing ones.
So, how do you go about using Ruby on Rails for live video streaming?
Building your own application from absolute scratch is certainly an option. You could follow a guide on integrating WebRTC into a Ruby on Rails app to create a bare-bones signaling solution using ActionCable. While this is a viable option, it may not be the most realistic given the amount of time required to build even the most basic functionality.
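To illustrate the role a signaling layer plays, here is a minimal, framework-free sketch of the relay logic: each peer's SDP offers/answers and ICE candidates are simply forwarded to the other peers in a room. In a Rails app this job would fall to an ActionCable channel; the class and method names below are illustrative, not a real Rails or ActionCable API.

```ruby
# Minimal in-memory sketch of WebRTC signaling relay logic.
# A real app would back this with ActionCable subscriptions and
# broadcasts; the relay itself stays this simple.
class SignalingRoom
  def initialize
    @subscribers = {} # peer_id => callback invoked on delivery
  end

  # A peer joins the room and supplies a block to receive messages.
  def subscribe(peer_id, &on_message)
    @subscribers[peer_id] = on_message
  end

  # Relay a signaling message (e.g. an SDP offer) to every other peer.
  def broadcast(from:, message:)
    @subscribers.each do |peer_id, callback|
      callback.call(message) unless peer_id == from
    end
  end
end

room = SignalingRoom.new
received = []
room.subscribe("alice") { |msg| received << msg }
room.subscribe("bob")   { |msg| received << msg }
room.broadcast(from: "bob", message: { type: "offer", sdp: "..." })
# only alice receives bob's offer
```

Note that signaling is the easy part: even with this working, the hard problems of WebRTC (media capture, codecs, NAT traversal, scaling the media path) all remain.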
A better option could be using a live streaming platform or service to help jumpstart your development. Look for solutions that offer SDKs for HTML5 and mobile platforms. Rather than going through the arduous and time-intensive process of building everything from scratch, an SDK gives developers a springboard for advanced functionality.
Speeding up the development process is particularly important when dealing with a complex integration such as adding live video streaming. The process of encoding and decoding video and audio codecs, packaging them into transport protocols and ensuring a smooth (not to mention fast) delivery takes a variety of complex components. Like many things in software coding, these components must be seamlessly spliced together. With so many moving pieces particular to the challenge of live streaming media, there are many places things can go wrong. Any missteps will result in dropping connections or degrading the stream into a garbled video with blocky artifacts. Furthermore, the resulting live stream should be scalable so that as many people can interact with the stream as possible.
Of course, for live content no matter how high quality a stream is, it will still need to stream with low latency. Latency as it relates to live video streaming is the time it takes between the video being captured to someone viewing it. Many factors add to latency including encoding and decoding, the network overhead, and most important, the transport protocol. Even with a smooth stream comes the risk of high latency if the proper protocols are not used. High latency degrades the user experience with unnatural delays in conversation and synchronization problems which leave users vulnerable to spoiler alerts among other issues.
This post will cover different services or platforms available for Ruby on Rails live video streaming.
Kurento is open-source software written to integrate with WebRTC, which makes it a solid choice for live streaming. Considering that its codebase is completely open source, Kurento will certainly be among the cheapest options. Its Java source code (note: not Ruby) is pretty well documented, and Kurento has a nice set of features focused on WebRTC streaming.
Being completely free means that you won’t have to spend anything on direct software costs. However, Kurento is not a stand-alone solution in and of itself. Rather, it is a DIY pre-solution meaning that you will need to spend a good amount of time to create a working application capable of standing on its own.
In other words, while it won’t require upfront software costs, it will require time and effort to build a full-featured application from the ground up. Oftentimes, those indirect costs can supersede the price of a more built-out solution. That said, for those with the knowledge and time resources to use it, Kurento provides a good backbone to create a live streaming application.
There are many other open-source WebRTC options available as well, such as Jitsi, mediasoup, and Janus. While these can work well too, Kurento is one of the best known and most widely used of the many WebRTC-based media servers.
Another open-source option is NGINX. Initially created as a web server for delivering websites, NGINX has grown into a multifaceted solution for proxying web content in general. It can now be used for media streaming, load balancing, caching, web serving, reverse proxying, and more. This versatility has elevated NGINX to the most widely deployed web server in the world.
To address the inherent scaling limitations of plain RTMP streaming found in solutions like Wowza and Flash Media Server, RTMP support was reimplemented on top of NGINX (via the nginx-rtmp-module), bringing better stability and performance.
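As a sketch of what that looks like in practice, a minimal RTMP ingest configuration using the third-party nginx-rtmp-module might look like the following (the application name, paths, and fragment length here are conventional examples, not requirements):

```nginx
rtmp {
    server {
        # RTMP's conventional (non-standard) port
        listen 1935;

        # Broadcasters push to rtmp://<host>/live/<stream_key>
        application live {
            live on;
            record off;

            # Optionally repackage the stream as HLS for browser
            # playback, at the cost of multi-second latency
            hls on;
            hls_path /tmp/hls;
            hls_fragment 3s;
        }
    }
}
```

Even with this in place, you still need a player, access control, and delivery infrastructure around it, which is where the limitations discussed below come in.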
Despite addressing the scalability issues of plain RTMP, the protocol itself still has many limitations. RTMP was originally designed for delivering on-demand and live media (i.e. live audio, video, and data) over the Internet between a Flash player and an RTMP media server. Newer protocols and streaming methods have often relegated RTMP to fallback support. Especially considering that Flash reaches end-of-life status in 2020, RTMP is falling further and further from being the primary choice for a streaming protocol.
While RTMP is not completely ineffective for transporting internet content, newer innovations, such as the new web standard WebRTC, are quickly replacing it. Designed for low latency, WebRTC has proven itself to be a very effective protocol for browser-based live streaming.
After examining the current shortcomings of RTMP, it is reasonable to conclude that a platform relying solely on RTMP will be limited in terms of sustainability and performance. RTMP lacks HTML5 support, which means it will not work in the browser, unlike WebRTC, which runs directly in the browser without a plugin. Of course, you can stream RTMP over HTTP, but that negatively affects performance.
Latency is a concern as well. As a TCP-based protocol, RTMP will typically introduce more latency than protocols that can use UDP, such as WebRTC. TCP (Transmission Control Protocol) is a transport-layer protocol used for sending bits of data, known as packets, over the Internet. Since TCP is a connection-oriented protocol, it requires frequent communication exchanges to order packets and ensure their complete delivery. This constant back-and-forth messaging can back packets up, introducing latency in many networking scenarios.
Lastly, there are issues with firewalls and encryption because most RTMP content is sent over the non-standard port 1935 rather than the standard HTTP port 80. This can be worked around by tunneling RTMP over HTTP (RTMPT), but that comes at a performance cost on both the server and the client.
We recommend looking for solutions that add modern real-time protocols like WebRTC. Though NGINX also supports HLS as a streaming protocol, HLS creates streams with latencies measured in multiple seconds. That means NGINX currently does not provide truly interactive live streams, which will negatively affect the user experience.
Dacast is a convenient hosted solution with an HTML5 player that supports broadcasting and subscribing through internet browsers. One of their standout features is expanded VOD support with the useful ability to insert ads.
Browser support covers both laptops and mobile devices, since both can run internet browsers. However, mobile browsers do not always provide the best user experience due to how different browsers implement various methods and elements. Dacast lacks native SDK support for both Android and iOS, which means you cannot build a dedicated mobile app to ensure the best performance. UI and feature performance can suffer, meaning a bad UX for your customers.
Dacast also suffers from high latency due to the tech stack they implemented. They use CMAF, which requires segmenting the stream into small chunks and inevitably produces high latency. Their website states a latency of 10 seconds, which is especially problematic since CMAF should actually be able to get down to around 3 seconds. Any platform that measures latency in seconds cannot, by definition, produce low-latency video streaming.
For those looking to get a streaming application up and running in a short amount of time, Mux’s easily managed service might be a good fit. Regardless of any specific live streaming video experience, developers with a basic understanding of software code can use Mux’s detailed documentation to create a simple video streaming platform.
Mux’s most impressive feature is their approach to CDN hosting. Their multi-CDN, cross-cloud approach helps optimize stream distribution across different regions and avoids some of the concerns of being locked in to a specific CDN. However, you are still unable to stream outside of their network; while it’s certainly a larger network, it is not a completely wide-open one, which could still be confining.
Though the multi-CDN approach is quite impressive, it still relies on CDN infrastructure. This results in inherent limitations, because CDNs use high-latency, HTTP-based protocols for stream delivery. Furthermore, Mux uses HLS, which is the worst protocol in regards to latency: HLS produces latency anywhere between 10 and 30 seconds. Even if they were using low-latency HLS (LHLS), that would still produce a latency of 2–5 seconds. With a delay that high, any sort of live, interactive experience is impossible.
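A quick back-of-the-envelope model shows why segmented delivery lands in that range: players typically buffer a few segments before starting playback, so latency grows with segment duration. The buffer count and encode/network overhead below are illustrative assumptions, not measurements of any particular service.

```ruby
# Rough latency model for segmented HTTP streaming (HLS/CMAF).
# Players commonly buffer ~3 segments before playback begins, so
# glass-to-glass latency scales with segment duration.
def segmented_latency(segment_seconds:, buffered_segments: 3, encode_and_network: 2.0)
  segment_seconds * buffered_segments + encode_and_network
end

segmented_latency(segment_seconds: 6) # classic 6 s HLS segments => ~20 s
segmented_latency(segment_seconds: 2) # aggressively tuned 2 s segments => ~8 s
```

Even with very short segments, the math bottoms out in whole seconds, which is why sub-second latency requires a non-segmented protocol like WebRTC.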
If one is to believe the rumors, Phenix is a well-functioning streaming service with reliable performance. Prominent in their feature set is sub-500 ms latency at scale, as well as a very fast time to first frame.
One issue is that we have not been able to confirm this ourselves, as they do not have any exposed example code, free trials, or live demos. A few of our customers who have tried it have also mentioned that HD streams lack the expected clarity and look blurrier than they should. Without any way of testing, you might just have to purchase it to see how it works for yourself.
It’s also said that they are expensive, but there is nothing posted on their site to either confirm or deny that claim. If you really want to find out more, you will have to contact them. One would assume you could do that through their website, but with all their secrecy, they might need to run a background check on you first.
Led by the outspoken Dr. Alex, Millicast is a hosted solution built on WebRTC, which it uses to deliver a low latency of 200–500 milliseconds. Thus, Millicast combines high streaming performance driven by WebRTC with the convenience of pre-packaged hosting. This means that you can stream relatively easily while still getting real-time interactivity.
Of course, convenience comes with its own price. Since you will be completely relying on third party architecture, any changes that third party makes will affect your application. If those changes have a negative impact on you, you may be faced with limited recourse due to the lack of flexibility and customization. This is known as a service trap since you are essentially bound to the service (in this case, Millicast) that your application was built on.
Another problem with Millicast’s hosted model is that your app’s infrastructure is shared with other companies’ apps. For some, this might not seem like a major issue. However, we have heard reports of Millicast servers being blocked by network-based child content protection filters, due to many Millicast customers being in the gambling space and/or other non-kid-friendly businesses. If your app is geared towards education, for example, this may be a show stopper for you.
Furthermore, the back-end infrastructure is inaccessible, so customization options will be limited. Without the ability to implement your own server-side logic, useful and/or necessary features such as transcoding, custom authentication, and pushing your stream out to other processes will all be out of reach. Rather than creating an application specifically tailored to your needs, you will be confined to a pre-built sandbox. You can avoid this issue by using hosting-agnostic solutions, such as Red5 Pro.
TokBox is one of the best among all the hosted solutions on this list. Featuring a quick setup time and relatively easy to use interface, they are as close to plug and play app development as it gets. With fully-functional and well-built SDKs, TokBox excels at point-to-point communication. These features make TokBox ideal for quickly building a POC to meet an investor deadline or testing new concepts during a hackathon.
Over two years ago, TokBox was acquired by Vonage, so its official name is the “Vonage API”. However, most people still refer to it as TokBox, so we continue to use that name.
As we mentioned above, convenience comes at a cost: customization potential, extra features, and scaling costs are all affected.
TokBox’s pricing model charges per stream, per minute. Normally, the intention is to grow your application to more and more users; that’s the path to building a successful application. Without tiered pricing, every single user costs money, and your TokBox expenditures will continue to climb. This creates a situation where success can actually hurt and cause you to lose money. On the other hand, if your application is designed around a small number of predetermined users, then TokBox could be an excellent choice.
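To make that concrete, here is a toy cost model for per-participant, per-minute billing. The rate is a hypothetical placeholder, not TokBox’s actual price; the point is only how linearly costs track usage.

```ruby
# Hypothetical per-minute rate; real pricing will differ.
RATE_PER_PARTICIPANT_MINUTE = 0.004 # USD

def monthly_streaming_cost(participants:, minutes_per_participant:)
  participants * minutes_per_participant * RATE_PER_PARTICIPANT_MINUTE
end

monthly_streaming_cost(participants: 1_000, minutes_per_participant: 300)   # =>   $1,200
monthly_streaming_cost(participants: 100_000, minutes_per_participant: 300) # => $120,000
```

Under this kind of model there is no volume plateau: a 100x jump in users is a 100x jump in cost, which is exactly the scaling trap described above.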
Furthermore, “advanced” features such as recording and interactive broadcast (which notably makes no mention of expected latency) come at an additional cost. Not only that, anything over 2,000 connections will switch over to CDN delivery. As we’ve covered before, CDNs mean higher latency.
Not only are there additional costs with scaling, but there are performance tradeoffs as well. After 3,000 viewers, TokBox switches to HLS delivery. Since HLS adds many seconds of latency, the stream will be sluggish and the user experience degraded. TokBox clearly cannot deliver real-time streaming at any sort of real scale.
The fact that TokBox is simple to install has negative consequences as well. Namely, it will limit the options for customization. If you are looking to expand functionality or add new features, you may be forced to wait on TokBox’s roadmap rather than implementing it yourself.
At first glance, ANT Media is a really great deal: sub 500 milliseconds of low latency, WebRTC-based live streaming at a cheap price. However, the real-world results are a little less promising.
Right away, ANT Media suffers in regards to scalability. Their Large Enterprise Instance hosted package limits you to 8 concurrent broadcasters and 400 HLS (high-latency) subscribers, while WebRTC subscribers are confined to only 300 clients. This low threshold will greatly throttle any future growth of your application. However, they do offer a Custom Scalable Cloud with a “flexible” number of viewers and publishers.
Scalability is far from the only issue, however. Maintaining relevance in a constantly evolving sector like live streaming requires continually adding the features consumers demand and making technical adjustments to follow industry trends. Like Red5 Pro, ANT Media built their platform on top of the Red5 open-source software. However, there is an important distinction: the team at Red5 Pro still retains control of the open-source Red5 code. This means that open-source Red5 updates in step with Red5 Pro as new features and optimizations are added, ensuring that both systems maintain parity and seamlessly sync together to minimize regression bugs and conflicts.
Additionally, this lack of ownership has a negative effect on customer service considering ANT Media doesn’t have the same detailed understanding as Red5 Pro has. The act of creating something from the ground up will inherently benefit your understanding of the product. Thus it is fair to argue that we have a more complete understanding regarding the capabilities and possible configurations for custom feature development.
This lack of direct contributions to Red5 means ANT Media cannot ensure entirely consistent functionality. More importantly, innovation is stifled as they simply copy the features that Red5 Pro adds to its product. Without progressive development, this lack of originality results in slow product features and/or updates at best and outright dysfunction at worst.
However, if price is your main concern and the scalability needs of your app are minimal, ANT Media could be a good option. Their pricing options offer good choices for hobbyist developers and those doing quick MVPs on a budget.
Another company with a long history in the live streaming industry is Limelight. Setting up Limelight’s RTS (Real-Time Streaming) is fairly straightforward, as they are a managed solution: they handle the server infrastructure for you. Limelight also provides mobile support through SDKs. With their consistent and reliable content delivery, it is a well-functioning platform.
Limelight’s introduction of their RTS (Real-Time Streaming) feature boosted them into the category of what can be considered true low latency. This gives your streamers the ability to respond to events in real time and facilitates interactive experiences.
In order to achieve such low latency, Limelight built their RTS platform with Red5 Pro’s software. Despite unlocking live streaming with sub-500 millisecond latency, there are still some limitations. Not all of Red5 Pro’s current features have been enabled by Limelight, which limits the complete functionality. Further limiting things is the fact that Limelight doesn’t allow its customers to deploy their own server-side apps, which means you can only use what they choose to expose.
Lastly, Limelight is a CDN, which means the delivery of their content depends on a series of fixed data centers. CDNs were designed around the HTTP-based infrastructure of the internet and have historically served an important role in content delivery. However, this is starting to change as new, flexible, cloud-based systems replace fixed bare-metal data centers thanks to their elastic scalability, which makes cloud-based systems somewhat more performant than bare-metal servers. As such, the rise of cloud hosting is a big problem for Limelight, considering their entire business model is built around constantly running data centers.
Wowza is a common choice for live streaming due to their long history in the live streaming industry. This has led them to enjoy a good amount of name recognition. A big part of their success is that they can support a variety of ingest types.
Accordingly, a large percentage of their clients use Wowza for a single purpose: serving as an ingest point on their origin server. After taking in a multitude of streaming protocols, the origin then converts the streams for CDN-based delivery. This repackaging of streams for CDN delivery is a good use of Wowza’s software.
While the large size of Wowza denotes their ability to create a viable product, it can work against them as well. The tech industry is always innovating, which means the most successful platforms need to be flexible and responsive to shifts in their industry. In live streaming, examples include new and more efficient protocols that can drastically improve streaming quality and performance. Live streaming software (and those using it) must anticipate these changes to continue delivering a successful product.
As we mentioned before, the end of Flash means that new standards, such as WebRTC, are poised to replace or at least supplant the old RTMP protocol. With this ever-encroaching deadline, Wowza’s efforts to fully implement WebRTC have apparently stalled. If you have the time, you should certainly give Wowza’s WebRTC implementation a try and see the results for yourself. You will find that the lack of transcoding between codecs, the absence of a client-side SDK, and missing core features like packet resending for subscribers and ABR can be show stoppers for any production application. This doesn’t even get into the scale limitations stemming from the fact that Wowza has no clustering deployment for WebRTC streams.
Without a real-world-ready, scalable WebRTC integration, Wowza still relies on CDN-based video delivery, causing them to suffer from a high latency of around two or three seconds at the very lowest (using CMAF, for example). This is nowhere near the sub-500 ms delivery that WebRTC has produced on other streaming solutions. Even Wowza’s latest attempt at Apple’s newest streaming protocol, LL-HLS, comes up short on latency. Two seconds is far too high for any sort of live, interactive experience, even if you can find a player and CDN that support this new protocol.
Also, if you go down the path of using Wowza cloud, it could mean getting locked in to the Wowza network. Without an easy way of porting your existing application to a different hosting provider, you would be forced to adopt any changes they make to their network, like what recently happened when Wowza dropped support for their Ultra Low Latency SDKs.
Lastly, there is a limitation on customization. Wowza created a general-purpose platform with basic functionality which made their platform very accessible. However, that simple functionality means custom feature development or even modifications can be hard to implement or receive support on. Customization is the key to ensuring your product stands out and does everything needed.
As our tagline states, we provide live streams to millions of concurrent users with milliseconds of latency. Similar to others in this list, we use WebRTC to deliver sub 500 milliseconds of end-to-end latency. Where we differentiate ourselves is the fact that we support millions of users at that same real-time latency. Furthermore, our hosting-agnostic solution supports a variety of cloud providers, thus avoiding any lock-in concerns. With the ability to port an existing application over to another hosting provider, you have a great degree of flexibility over your back-end architecture. With full-featured SDKs for iOS, Android, and soon Linux, you can get the same performance on mobile devices.
As a highly customizable solution, there may be some additional configurations to get everything working as needed, but we’ve been told that our documentation is pretty useful for addressing that. Our knowledgeable and responsive technical support helps ensure that you can make the most of our software.
Though we don’t currently provide a hosted solution, we are working on one so keep an eye out for a future announcement about that. For those looking for a hosted solution right now, we also offer an Enterprise Plan where we can set up and manage your account for you. That way you still have the flexibility and security of running your own servers without having to do any additional maintenance.
To explore everything we can do, check out our ultra-low latency video streaming demo and sign up for a 30-day free trial.
Of course, if you think we’re leaving something out (or are just plain wrong), please let us know! Send an email to email@example.com or schedule a call. We are always looking to improve.