Best Sports Streaming App Features and the Future of Sports Media



Discussion with Mike Downey, Principal Program Manager, Sports at Microsoft

Our CEO Chris Allen joined Mike Downey, the Principal Program Manager for Sports at Microsoft, for an easy-going conversation about the current evolution of live streaming sports, and how the current need for real-time, interactive experiences has never been more important.

As pioneers of live streaming, they reminisce about the early days of Flash video, Adobe’s attempts to monetize the Flash player with a server offering, and how that led to the Red5 open-source project, spearheaded by our founders, to reverse-engineer the Adobe Flash Media Server. From YouTube, HTML5 video, and smartphones, they trace how Flash founder Jonathan Gay envisioned the future of live streaming and laid some of the groundwork for modern browser-based live streaming. Through an examination of the split between HTTP protocols and WebRTC, they reveal how Microsoft and Red5 Pro are proving you can have both real-time latency and scalability.

Check out their conversation about how we can “make virtual as realistic as possible.”

Full Transcript:

Mike: All right, hello, everyone. Mike Downey back again. If you don’t know me, I work at Microsoft; I focus on sports technology and entertainment, but primarily sports. Currently, I’m dedicated to our partnership with the NBA. I’ve been working on that since, gosh, January of 2019, so I’ve been working with them for almost two years. We announced our partnership in April, I think, of this year, so it took a while to put that all together. I’m in an engineering organization called Commercial Software Engineering, and we work with Microsoft’s top strategic customers and partners on building technology with them. So we work with their engineering teams and we write software together.

So my world has been very heavily focused on streaming media over the last nearly 11 months now, I guess probably a little bit longer, because that’s the focus of what we’re doing with the NBA. It’s all about streaming media. And in fact, it’s specifically what they refer to as their direct to consumer business, and that’s taking live games and distributing them over the top, OTT, which it just means digitally, over the internet versus the traditional cable or satellite delivery. And so in doing that work I’ve really had to go back to an area of technology that I spent a lot of my career working on which is streaming video technologies, going all the way back to my early days at Macromedia which became Adobe later and some stuff we do with Flash.

And so as I’ve been digging back into that old world and diving back into the technology and learning more about it I’ve been reaching back out to some of my old friends, and today I have a guest on, Chris Allen who I’ve been friends with for years, we go all the way back to the early Flash days. And I’m going to bring Chris here into the stream. There he is. Okay, Chris.

Chris:  Hey, everyone.

Mike: And Chris is the co-founder and CEO of a company called Red5 which does low latency live video streaming. And so Chris, I’ll let you go ahead and introduce yourself, tell us a little bit about Red5 and then we’ll go back to the early days and we’ll talk a bit about you and I and the work that we did on Flash early on and then talk about kind of how that video technology has evolved over the years. But please, go ahead and introduce yourself.

Chris:  Yeah, fantastic. Thanks for having me on this show, I like this setup. This is really fun. So I’m Chris Allen, I’m a technical co-founder and CEO of Red5. We focus on live real time video streaming and doing that at scale. And when we say scale we kind of mean both the one-to-many kind of use case where you have hundreds of thousands of people or millions of people watching the stream all at once, but also doing what Mike and I are doing right now where you’re accessing somebody’s camera on their device and then getting all of those into a stream. It could also be IoT devices too like IP cameras, traffic cameras, drone streams, all kinds of other crazy stuff.

We’re very much a developer-focused platform so we enable developers to build applications on Red5 and Red5 Pro. And I mean, we’re seeing a lot of uptick in sports streaming especially around the use cases like sports gambling which is really hot right now and kind of it’s just starting to emerge, especially here in the US. Other countries have had it for a while but …

Mike: Well, there’s new laws recently that have been making that available, right? So there’s been a new interest in sports gambling.

Chris:  That’s exactly right. And then I think the other aspect that’s really accelerating the need for real-time latency in streaming with sports is that interactivity level. And Covid has obviously accelerated the need for this because everybody’s stuck at home, they can’t go party with their friends and watch an NFL game or an NBA game or whatever, and they’re lacking that kind of interactive experience. And then the same thing, like what Microsoft Teams is doing with the NBA with its Together mode, bringing people into the stadiums virtually. And I think the drive right now is really around this: let’s make virtual as realistic as possible. And the cool thing about it is the Red5 software stack is really helping create that.

Mike: This is exciting stuff. It’s fun, like when I … throughout my career I’ve kind of gotten in and out of different technologies or areas of focus, and like I said early on, a lot of my time was spent on video and internet video, streaming video, including my time when I first joined Microsoft and started working on Silverlight. But then as I shifted more into sports a lot of the work I did was more in the data and analytics and using data to optimize performance and those types of things.

And then now I’m kind of coming back into the media world and seeing kind of this latest phase of streaming video and learning more about it and thinking through these scenarios where getting super, super low latency of the broadcast out to the fans so that you can account for betting scenarios and stuff like that or trying to … there’s this scenario we talk about a lot in the projects I’m working on around even just synchronizing the playback experience across connected users who are all remote from each other so that they can watch a game together. It’s kind of a Covid thing in many ways. If my buddies and I can’t get together in the same room anymore then how can we use some more modern streaming technologies to enable us to synchronize that experience? And things like that. So it’s been really interesting to learn more about WebRTC and the products and services out there like what you guys are doing.

So before we get into that I thought it would be fun to kind of step back a bit, because we’ve known each other for a while. And the timing was funny: this morning there was a thread on Reddit that referenced Flash, and it was a pretty funny thread about the death of Flash, which happened a while ago.

Chris:  The official end of life is actually, like, what, in 2020 or something like that, it’s like …

Mike: I don’t even know, yeah.

Chris:  I think there’s like some definite … well, at least Google Chrome, they’re totally killing it.

Mike: Oh, that sounds right, yeah. But anyway, so I was reading this thread and people were asking questions about the history of Flash, so I was able to kind of jump in and be like, “Yeah, I know some things about that. I can share some stories.” So I got really nostalgic about it. But to be specific to our topic here in video, I think historians of the internet and people who are particularly interested in this world of streaming media really need to understand where it all started and the impact that Flash specifically had on how we experience video on the internet.

So in the pre-Flash days, before we had video support in Flash, the way you got video through the internet was primarily through a plug-in like RealPlayer or QuickTime or Windows Media Player, and the problem with those technologies is that they were kind of an out-of-browser experience. Or, if you could load it within the browser chrome, what you got was the kind of branded RealPlayer look and feel or the QuickTime look and feel, and so it felt very much like a disconnected experience. And then those players themselves were quite large because they endeavored to do a lot of things: support a lot of codecs and, especially, a lot of content protection, a lot of capabilities that were heavy during the dial-up days of the internet.

And so the opportunity was like, “How do you deliver video through a website when you don’t want to force people to do these really slow downloads before they can even watch the video?” And that’s where Flash came in. So at the time Flash was the most pervasive piece of software on the internet. And I won’t go into all the details, but every version of Windows and Mac shipped pre-installed with the Flash player and it was always a really, really small download itself so when new versions came out they got pushed out really quickly.

And you may not know this, and again, we’re getting into the more nerdy history of video online, but the founder of Flash, Jonathan Gay, at Macromedia, took a sabbatical in … I think it was the Flash 4 to 5 transition period. So after Flash 4 came out he went off to prototype adding video support to the Flash player. And that was a tough challenge because the number one thing about Flash was how small it was; that was the whole point, like it’s a small plug-in, small file.

Chris:  It was crazy. Yeah, I mean, just the engineering alone to keep it that small was incredible, which actually made it easier for us to reverse-engineer it with our team.

Mike: Yeah.

Chris:  It was a lot of reuse of the same stuff. So anyway, that’s a whole other kind of aspect to it.

Mike: Yeah, so he went off on a sabbatical and he prototyped something that he code-named “SALSA,” which was the code name for a messaging server. And the point of that was to enable what we’re doing right now. So he put video support into Flash because he envisioned doing browser-based communications, like a Skype type of scenario, but the way we’re doing it right now, which is all integrated into the browser so we can talk to each other. And that’s how it all started.

And so he had prototyped a server, and then he put a very lightweight H.263 video codec, the lightest one he could get, into the runtime so that he could try to create something like what we’re using right now. Streamyard, for those of you who can’t see it, is this hosted application that’s enabling us to broadcast and record all of this.

Chris:  That’s right. And we’re technically using WebRTC to communicate with each other right now.

Mike: Right.

Chris:  But back in the day this would have been Flash, yes, absolutely.

Mike: Yeah, exactly, exactly. So he created a server and used kind of a variation of I think RTMP originally to do that.

Chris:  That’s what it was, yeah. And Mike, not to go too far down the rabbit hole here, but do you know if AMF, that messaging format, had already been out by then?

Mike: That’s a good question. I think that might have been part of what he prototyped, and that was …

Chris:  I think so too. The whole thing was like …

Mike: It was a protocol for the like chat messaging and stuff like that.

Chris:  That’s right.

Mike: Because they called it Flash Communication Server when the product finally came out. And the business objective was to have a high-priced server, because at Macromedia and Adobe one of the biggest challenges we always had was that we couldn’t really effectively monetize Flash. The main way we made money off of it was the authoring tool, the design tool, which is a tool like Photoshop, right? Like, it’s just packaged software. But we had this really, really, really pervasive runtime and we were always trying to figure out how to make more money off of it. We never did a very good job of it, by the way. We tried a lot of different things; Flash Communication Server was one of the ways we tried to do that.

Chris:  Well, we kind of screwed that up for you guys too, because …

Mike: We transitioned to what you guys have. So innovative entrepreneurs out there figured out like, “Hey, we can reverse engineer some of this and kind of create our own solutions on it,” which happened throughout the entire history of Flash – there were third-party compilers, there were all kinds of things that went outside of our ecosystem to enable people to do great stuff with Flash.

And so to button that story up, out of that we ended up getting a video player into the Flash runtime, super small file size that everybody had, and the internet was forever changed because of that. Because now you had YouTube come into existence. That only happened because of Flash.

Chris:  Yeah, exactly. It was crazy.

Mike: And then everything went from there. And ultimately, you got the video tag in HTML5; that wouldn’t have happened without Flash bridging between those moments.

Chris:  Yeah, I don’t think people would have necessarily thought about it that way, without it really happening. So I think that’s right. And the Flash Communication Server, speaking about making money off of it, you guys were charging, I think it was $50,000 a CPU to run it.

Mike: That sounds about right. It was probably something like that, yeah. It was going to change the world, Chris.

Chris:  Yeah, I know. Enough of us got frustrated with that that we ended up, a group of guys and me, reverse engineering it and figuring out how this RTMP protocol worked. And like I said, part of what made it easy was that some other people had figured out the AMF object format, which is actually what was being transported over RTMP. So once we figured out the transport with Wireshark and all that stuff, the pieces fell together and then we created an open-source alternative. That was, I think, fall of 2005.
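
For the curious, the AMF0 format Chris mentions is publicly documented and its basic types are simple to sketch. This is not Red5’s code, just an illustrative Python encoding of two AMF0 values:

```python
import struct

def amf0_number(value: float) -> bytes:
    # AMF0 number: type marker 0x00 followed by an 8-byte big-endian double
    return b"\x00" + struct.pack(">d", value)

def amf0_string(value: str) -> bytes:
    # AMF0 string: type marker 0x02, a 2-byte big-endian length, then UTF-8 bytes
    encoded = value.encode("utf-8")
    return b"\x02" + struct.pack(">H", len(encoded)) + encoded

# An RTMP "connect" command begins with the command name and a transaction
# ID, each serialized as an AMF0 value like these.
payload = amf0_string("connect") + amf0_number(1.0)
```

Patterns like these, visible in a raw Wireshark capture, are what made the format recognizable in the first place.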

Mike: Oh, right. So right after the Adobe acquisition. Yeah, right around that time frame. That was 2005. So that’s really the origins of Red5.

Chris:  That is the origin of Red5.

Mike: So why don’t you walk us through, for people watching who may not know this technology space much, so get into streaming for a minute. So you had HLS, and then kind of later the emergence of smooth streaming from Microsoft, which was designed kind of with Silverlight in mind, based on IIS. But I think the early ones from Apple were MPEG-2 transport stream-based; I think they called it QuickTime Streaming Server or something like that.

Chris:  No, that was RTSP actually, the QuickTime.

Mike: RTSP, okay.

Chris:  So there’s all the … it was a mishmash of all these different things. Yeah, and HLS is actually TS-based. That’s all it is, it’s like TS segments delivered over HTTP, right? Yeah, I mean, all of those kind of came out to replace Flash, because Flash was being phased out: plug-ins were nasty, they don’t work on the phone, the notorious Steve Jobs letter, which was kind of like the nail in the coffin, right?
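
To make that concrete: an HLS stream really is just a plain-text .m3u8 playlist pointing at a series of .ts segments, all fetched over ordinary HTTP. A minimal sketch of generating such a playlist (the helper function and segment names here are ours, purely for illustration):

```python
def hls_media_playlist(segments, target_duration=6):
    """Build a minimal HLS media playlist from (duration_seconds, uri) pairs.

    A player fetches this .m3u8 over HTTP, downloads each listed MPEG-TS
    segment in order, and re-polls the playlist for newly published segments.
    """
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{target_duration}",
        "#EXT-X-MEDIA-SEQUENCE:0",
    ]
    for duration, uri in segments:
        lines.append(f"#EXTINF:{duration:.3f},")
        lines.append(uri)
    return "\n".join(lines)

playlist = hls_media_playlist([(6.0, "seg0.ts"), (6.0, "seg1.ts"), (6.0, "seg2.ts")])
```

Because every piece is a static HTTP resource, CDNs can cache it trivially; that is exactly the scalability-versus-latency trade-off discussed below.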

Mike: It was really the phones. It was just the emergence of the smartphone, I think, that really drove that, yeah.

Chris:  Yeah. I think that’s fair enough. I mean, the funny thing is now that those phones are way more powerful, like our phones today are way more powerful than anything that Flash ran on back in the day that we’re talking about, but anyway, that’s a whole other discussion.

The thing that was happening kind of in parallel though was WebRTC, right? So the broadcast industry and people like YouTube and all of these different groups kind of went into this camp of recreating television on the internet, right? I would say that’s not the best description of it, but Netflix is another one like that, which actually we were just talking about before we got on this; they were using Silverlight, which was the Flash competitor, for quite some time.

But I guess WebRTC started to emerge to replace this video communication thing, and the funny thing about RTMP is it could do both. And I think we lost that ability, and these kinds of really creative applications, which would involve lots of real-time stuff with the live broadcast and everything, started to go away, and people just thought, “Oh, well, now that’s not possible,” right? And one of the projects we did with Adobe pretty early on with you guys, I think it was 2008, it was with Adobe Consulting; they got a gig with the NFL and Sunday Night Football with NBC Universal. And that was a pretty crazy project because we ended up creating this … it was pre-smartphone; obviously we had cellphones back then.

Mike: Yeah, right. Not modern, yeah.

Chris:  Right, but people would use their laptop, right? And so the idea with the app is you’d sit and watch TV with your laptop. So this is one of the first second-screen experiences. And they could pick the camera on the field and watch it on their laptop while watching the game on TV and be able to switch between them and stuff. And we targeted … we had to cap it at a million concurrent viewers on that setup. And that was using FCS, or Flash Communication Server.

It was one of the big massive distributions of Flash for a live event. And we were getting latencies within the one-to-two-second range, before you would even see it on the TV, which was kind of interesting. But I guess where am I going with this? Like, I think what ended up happening with all these HTTP-based protocols is everybody went down that path, and then the latencies grew to be like 20 to 50 seconds, sometimes over a minute, in latency: the time between when it’s captured on a camera on one side and somebody sees it on the other. And then at the same time, all the WebRTC people were thinking, “Oh, yeah, this is only for point-to-point communication, or replacing essentially Skype,” right?
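
The latency numbers Chris quotes fall straight out of segment arithmetic: a segment can’t be published until it’s fully encoded, and players typically buffer a few full segments before starting playback. A back-of-the-envelope sketch, where the buffer depth and encode/CDN overhead are assumed values, not measurements:

```python
def estimated_hls_latency(segment_seconds, buffered_segments=3,
                          encode_and_cdn_seconds=4.0):
    # Rough glass-to-glass delay for segmented HTTP streaming: each segment
    # must be fully written before it can be fetched, and the player waits
    # for several segments of buffer before it starts playing.
    return segment_seconds * buffered_segments + encode_and_cdn_seconds

classic = estimated_hls_latency(10)  # classic 10-second segments
shorter = estimated_hls_latency(2)   # shorter segments help, but only so much
```

With 10-second segments the estimate lands in the tens of seconds behind live; even 2-second segments still leave roughly 10 seconds, which is why segment tuning alone never reaches real time.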

And I think what we’re seeing now and what we’ve really been pushing and we saw this probably five or six years ago when we launched the Red5 Pro, the pro version of the product and went down this path of really bringing it back essentially to the Flash experiences days where we knew, we had all these customers that we knew of that needed that real-time latency and they also needed the scalability of it which was kind of lacking from everything. So you either had high latency and scalability or really low latency and no way to scale it. And so that’s what we put together is to figure that out, and then on top of that be able to create the synchronization of the data streams and everything else that are going with it.

And I think we’re now back to a point, as you were talking about, the stuff you guys are working on with the NBA and other experiences you’re seeing out there are now requiring the real-time latency and the video communication aspect kind of becoming a hybrid experience.

Mike: Yeah. And it’s good, we should talk a bit about kind of where the mainstream is, like what are the technologies that have been in place for the last say … I’d say 10 years, but even more recent, the last five years, the standard. And right now with our project we’re looking at kind of what’s the current standard for how you stream video online, but then obviously we’re looking at scenarios like yours and thinking, “Uh, how can we think about using WebRTC-based streaming in the work that we’re doing?” And we haven’t figured that out yet.

Chris:  Right.

Mike: But to continue this kind of education for anyone who’s watching on how streaming works: so you had HLS, I think largely because of the success of the iPhone; HLS was baked into the web player in iOS and it was pretty easy to do it that way, so that kind of became the standard. It’s arguably maybe still the most commonly used, but in the middle of that, kind of over the last say five to eight years, MPEG-DASH was being advocated as more of an industry standard, and it was kind of a hybrid.

Chris:  It was done by the regulator … the groups, like the …

Mike: The MPEG.

Chris:  Yeah, exactly. As opposed to one company, meaning Apple, right?

Mike: Apple, yeah. And early on in my years at Microsoft I worked with the team; for example, when I first joined Microsoft one of my first projects was an open-source video player for Silverlight, for HTML5, for Windows apps. There was a whole thing we called the Player Framework, a creative name. And actually, first it was called the Silverlight Media Framework and then we renamed it when we went beyond Silverlight. But even then we were looking at smooth streaming, which is Microsoft’s technology, and then HLS.

Chris:  And now that we have TCP socket-based thing too, right? Oh, no, smooth streaming, was that HTTP?

Mike: Smooth streaming was more like DASH; it was fragmented MP4, yeah, all over HTTP. And Apple, Microsoft, and other companies all joined in the MPEG effort to say, “Let’s create a standard.” And I remember, I haven’t thought about this in a long time, but remember even when we’re … we, I don’t mean me, I mean smarter people than I at Microsoft who were participating in the definition of the standard, that even with DASH, you had DASH and then you had DASH-TS, which was the transport stream variant of DASH, which was more in line with HLS and MPEG-2 TS. And then there was the fragmented-MP4-based DASH spec. And I honestly don’t even know how that all ended up. But I think the fragmented-MP4-based DASH is kind of what has become closer to a standard nowadays, so I would imagine it’s now a combination of HLS and DASH which is most popularly used.

Chris:  And now there’s Apple’s new low-latency one. I said low latency; it’s like still three or four seconds. Lower latency, right? And then there’s CMAF, which is another approach, to pre-send or pre-fetch the requests coming in from the HTTP stuff. But anyway, it’s a mishmash of technologies. And like I said, WebRTC was always around during these too, in various stages, right?

Mike: Yeah, right.

Chris:  But it was like just a totally different camp of people just focusing on it. And I think now this is where we’re starting to see everything kind of merge again. In our view the HTTP protocols are really I think they’ll be phased out because they’re not really the best way to do stuff, right? You already see Quick being another replacement for it which is UDP-based which is kind of trying to mimic the behavior of HTTP over UDP-based protocol. WebRTC uses a transport called SRTP or real-time protocol, the secure version of that. And I think that those are going to be the things that will replace it. And ultimately, we’re not going to have this high latency type of thing for any kind of broadcast. It’s just going to be a thing in the past, we’re going to remember back, “Oh, do you remember when you had to watch the game and your friend would see it on the phone but then you would text him and then he would get it like a few minutes later,” or whatever.

Mike: Right, that’s the problem, yeah.

Chris:  I mean, and then I think everything … it’s funny, they even use the term OTT, really that’s going to be TV for us, like there’s not going to be anything else. I think it’s going to be all internet forever.

Mike: That’s the default.

Chris:  Yeah.

Mike: It’s all internet, yeah.

Chris:  It would be like us talking about landlines now. It’ll be like, “Oh, yeah, you still have one of those?”

Mike: Right, you’re totally right about that, yeah. But I mean, the big limiter was always bandwidth, right? And now that’s less of a problem.

Chris:  Yeah. And there’s 5G; the promise of that has just accelerated it even more. I guess, like I was saying, the other problem I think is that the CDNs have really focused on HTTP. It’s great for static content or for VOD content or something like that, but it’s really terrible for live. But it’s super easy for them to scale it because they’ve got this whole caching mechanism, they can get everything distributed out to their edge nodes, and it works well. So they’ve got a lot riding on making that work, because they have all the infrastructure to do it. So this is where I think the cool thing about what we’re doing is leveraging cloud. We can deploy our clustering architecture, and all the nodes that are delivering WebRTC are our cloud instances, so if we deploy on Azure, for example, we’ve got a huge amount of data centers we can leverage with you guys in particular, Microsoft, right?

Mike: Yeah.

Chris:  And a tremendous backbone, and then we can deliver everything in real-time without any downside really. I guess the only downside is how you guys are charging for gigabyte transfer right now, the bandwidth charges. But I think that’s just a matter of economics and getting it to work out.

Mike: Yeah. So, I mean, it does beg the question, right? Like, you’ve built a product around the technology with really impressive capabilities, low latency. And as you said, WebRTC is not new; it’s been around for a while. And a lot has happened in streaming media over the last 10 years while WebRTC has been out there, so why now, all of a sudden, is this an option, and even something that you advocate as being the best option? What changed in 10 years? Did nobody realize it, or was there something about the cloud and the economics of the cloud, certain things you can do now that are obviously less expensive than they were five years ago because of Azure and AWS and GCP and all those options? But what created this environment?

Chris:  It’s a great question. Look, there were a few things going on during that time, that kind of 10-year span. One is WebRTC finally got adopted by everybody. It was only recently that it was added to iOS Safari, like in the last two or three years, right? And so before that you kind of were stuck with HLS, right? Just no matter what, you’re like, “Oh, well, if you want to get it on an iPhone you’re going to have to use HLS,” right? That’s no longer the case; everybody’s adopted it, it’s standard, it’s ratified. It’s not going away, kind of thing. So that aspect had to happen.

The other part, and you’re absolutely right, is that the cloud infrastructure needed to expand enough to make this viable and not have to rely on CDNs, who have their own data centers and basically, I mean, if you think about it, their own cloud, right? It’s just that they don’t have it for general compute, that’s the only thing. And now the ironic thing is I’m actually speaking at a conference next week called edge hog day, and it’s about edge computing, but edge computing is basically just taking CDNs and turning them into cloud.

Mike: That’s true.

Chris:  You’re kind of … like it’s really weird. I guess, yeah, so cloud infrastructure needed to be there. And now, in terms of making the price right and making it work for everybody, there’s the alternative clouds like the smaller guys like Digital Ocean, Linode, and a lot of these others are pricing it effectively to be able to do live streaming over them. Like, Digital Ocean is incredible in terms of their pricing, it’s like they give you … like on a two-CPU instance they’ll give you four terabytes of free data transfer per month for that, and then the overage is at one cent per gigabyte. So that’s like now you’re in the CDN range in terms of the price for delivering the video and you’re good to go, you know what I mean?

Mike: Right.

Chris:  Azure, GCP, and AWS they all have higher rates but I think when you start dealing with volumes I’m sure the business people at these places are going to make that work. You know what I mean? It’s to their advantage to do so, right?

Mike: Yeah. Well, often I think it takes a demand for a specific use case like yours and business people going, “Oh, that’s a business opportunity that we’re not taking advantage of yet. Let’s put together a pricing model that works so that we can go after that business.” And I think we’re probably just still in the early stages for that.

Chris:  That’s exactly right, yeah. And I think the smaller guys have gotten it.

Mike: Right, which is often …

Chris:  That’s the way it goes, right?

Mike: We all watch and see which one of you guys wins. Yeah, exactly, yeah. So outside of the low latency, are there other benefits, maybe just thinking more from a content creator, content producer point of view? If you’re a brand who has a live product, outside of low latency, because that seems obvious … now, I would argue that low latency is only important in a subset of the live streaming scenarios, but probably a pretty big subset; you’ve got like the gambling scenarios.

Chris:  You have auctions.

Mike: Yeah, so there’s lots of those. So outside of that, what other things should people be thinking about when they’re looking at WebRTC as an option versus DASH or HLS or something else?

Chris:  Well, I mean, our view on it is everything is becoming, at least in live, is becoming more interactive. So whenever you add some kind of interactive component to the experience all of a sudden the latency matters a whole hell of a lot, right?

Mike: Sure, yeah.

Chris:  And that usually has to do with some kind of other piece of data going along with it. Otherwise, you’ve got to get into the whole business of synchronizing it, right? And that’s why Net Insight’s business that they sold off to Amazon Prime, that was their whole thing: they were synchronizing HLS streams and DASH streams. That was pretty cool, because you’d get the sports score to go along with it and it wasn’t going to go off and it was going to be very consistent, but that was only one aspect of it. So our view is the latency makes a huge difference whenever you’re synchronizing with other people or you’re collaborating … the other thing that happens too is you’ve got other things that are happening at the same time.
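
One straightforward way to do the kind of synchronization Chris is describing is to buffer the timestamped data feed and release each event only once the viewer’s delayed video has caught up to it. A hypothetical sketch, not Red5’s or Net Insight’s actual mechanism:

```python
import heapq

class SyncedDataFeed:
    """Hold timestamped events (e.g. score updates) until the viewer's
    delayed video stream has caught up to the moment they were captured."""

    def __init__(self, stream_latency_seconds):
        self.latency = stream_latency_seconds
        self._pending = []  # min-heap ordered by capture timestamp

    def push(self, capture_time, event):
        heapq.heappush(self._pending, (capture_time, event))

    def due(self, wall_clock_now):
        # Everything captured at or before (now - latency) has already been
        # shown on this viewer's screen, so it is safe to release.
        ready = []
        while self._pending and self._pending[0][0] <= wall_clock_now - self.latency:
            ready.append(heapq.heappop(self._pending)[1])
        return ready

feed = SyncedDataFeed(stream_latency_seconds=20)
feed.push(100.0, "GOAL")     # captured at t=100
feed.push(105.0, "PENALTY")  # captured at t=105
```

With real-time delivery the latency parameter shrinks toward zero and the buffering becomes unnecessary, which is the point Chris is making.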

Twitter is a good example of that, right? People are tweeting in just about real-time. I mean, it’s pretty close to it, right? When you send a tweet out pretty much somebody sees it right away. And you’ve got these delays in a live stream that makes it completely inconsistent for people. And that’s why people are getting spoilers and all that kind of stuff. So I think that’s another aspect.

But I think it’s this new hybrid multi-directional stuff that’s really making … so I guess what I’m saying is there’s the just straight data stuff and then there’s the video stuff like what we’re doing now and collaborating. Imagine all the people that are watching us on LinkedIn right now; as you noticed, in the beginning there is a huge delay because they’re using HLS for this, right?

Mike: Yeah, right.

Chris:  And what if it were actually delivered in real-time and then we could through LinkedIn’s browser-based interface be like, “Oh, let’s take a caller now. We would love to hear from you.”

Mike: Yeah, that’s a good point.

Chris:  Right? And it’s like, boom, they’re into the video with us and part of the experience.

Mike: Yeah, this whole Streamyard service that I’m using to host these live streams – you’re right, like there’s no reason that couldn’t be baked into each of the platforms and just doing this straight through LinkedIn.

Chris:  Absolutely.

Mike: Because, yeah, right now it's a clunky experience. There's a 15-second-plus lag between what you and I are saying right now and when they see it. So letting them ask questions in the chat and then responding to that sucks, because they're way behind.

Chris:  Hugely, right?

Mike: And it's not a great experience. Even in the more mainstream consumer scenarios, like the work I do day to day thinking about the future of sports apps and solutions where you're watching live sports, a lot of our ideation around what hasn't been done yet, or hasn't been done well, in sports comes back to bringing fans together: letting them collaborate and communicate with each other while they're watching a game, and host their own little viewing sessions together.

Chris:  Or unique content per team, or whatever, and making it …

Mike: Oh, that’s a good point too, yeah.

Chris:  That stuff requires a pretty good amount of low latency and server-side activity.

Mike: Yeah, that's a good point, that's a good point. So I think in these consumer media scenarios, that's where most of the opportunities for innovation are. It's these live communications, low-latency scenarios, and then content personalization: how do you get more signal and less noise for people consuming media? And that's not so much something specific to the streaming technology as it is the overall app experience you're creating.

Chris:  Well, it can be. I mean, think about personalization. One thing I think a lot about is, if you're watching a sports game and you have a favorite player, wouldn't it be cool to have a view where you could always watch that person, or get a special custom view? That's actually possible with WebRTC-based streaming, as opposed to an HLS stream where all the segments are coming from different servers; that's the strategy for doing it with CDNs, right? When it's coming as a straight UDP connection down to you, the edge that's delivering that UDP stream could customize it however you want.
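[Editor's note: the per-viewer customization Chris describes is possible because a WebRTC edge holds one connection per viewer, so it can decide which feed to forward for each of them, unlike an HLS CDN serving identical segments to everyone. A hypothetical sketch of that selection logic; the names and feed IDs are illustrative:]

```typescript
// Hypothetical sketch: a WebRTC edge forwarding a different camera feed per
// viewer, something a segment-based CDN (same files for everyone) can't do.
type FeedId = string;

interface ViewerPrefs {
  favoritePlayer?: string; // e.g. "Player 23"
}

// Available isolated-camera feeds, keyed by the player each one follows.
const playerCams = new Map<string, FeedId>([
  ["Player 23", "cam-23"],
  ["Player 7", "cam-7"],
]);

// Decide which feed the edge forwards over this viewer's connection.
function selectFeed(prefs: ViewerPrefs, broadcastFeed: FeedId): FeedId {
  if (prefs.favoritePlayer && playerCams.has(prefs.favoritePlayer)) {
    return playerCams.get(prefs.favoritePlayer)!;
  }
  return broadcastFeed; // fall back to the main program feed
}
```

[In a real deployment the same per-viewer hook is where targeted overlays or ads would be applied.]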

Mike: Interesting.

Chris:  There's a company called Skreens whose whole view on this is that you can get a different … they call it an encoder per person, which is true: in the cloud you're using encoders on CPUs, or they could be GPUs, that are generating this stuff. And everybody's experience could be totally different. Obviously, advertising can be extremely targeted that way, along with all kinds of other stuff. We could even get into sleazier stuff: you might be able to influence somebody to bet more if you're targeting the player you know that person really likes.

Mike: Oh, right, yeah.

Chris:  This stuff gets a little eerie, but …

Mike: Yeah, that’s interesting. And you could really micro-target advertising too.

Chris:  Oh, absolutely. Because you could really [cross talking 37:49] exactly who that person is, and it's down to …

Mike: And get really creative with the video technology in doing that and green screening and bugs and things like that. That’s interesting. Okay, I get it. There’s some really interesting things you could do with that. So cool, so what’s next on the horizon? So you guys, like what are you focused on right now? Are you focused on kind of scaling and hardening and those types of things or is there a feature set and a use case that you’re really looking at towards the future?

Chris:  Yeah, so we've really got the one-to-many down right now: getting real-time latency out to tons of people at once. We've also got a pretty good solution for getting a lot of ingest in. But the problem we're trying to solve right now is how you take all of the video streams people are sending in and turn them into a single output that people can then watch in near real time, right? I think Together Mode, what the NBA is doing with Microsoft right now, is a perfect early example of that. It's pretty limited in what it can do, but imagine being a comedian and being able to see hundreds of thousands of fans all at once. I mean, hundreds of thousands is probably overkill for a view like that, but you get the idea …

Mike: There’s a point where it’s not, yeah.

Chris:  Right, it's not practical, but I think … so we've got some pretty interesting mixing technology that we're going to be releasing in the first quarter of next year. It allows you to combine just about as many streams as you want into a single view, like a grid, or you can lay it out however you want. And we're using browser technology to do the layout, which is pretty cool, because you can use CSS, HTML5, whatever you want to configure it.
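[Editor's note: using browser technology for layout means the mixer's composition can be described as ordinary HTML and CSS. A rough, hypothetical sketch of generating such a grid layout for N incoming streams; the function and markup are illustrative, not Red5's actual API:]

```typescript
// Sketch (assumed names): lay out N incoming streams in a CSS grid the way a
// browser-based mixer might, so composition is just HTML/CSS rather than a
// dedicated video compositing pipeline.
function buildGrid(streamUrls: string[]): string {
  // Near-square grid: e.g. 4 streams -> 2 columns, 9 streams -> 3 columns.
  const cols = Math.ceil(Math.sqrt(streamUrls.length));
  const cells = streamUrls
    .map((url) => `<video src="${url}" autoplay muted></video>`)
    .join("\n  ");
  return `<div style="display:grid;grid-template-columns:repeat(${cols},1fr)">
  ${cells}
</div>`;
}
```

[The mixer would render this page headlessly and encode the result as a single output stream; any CSS the browser supports becomes a layout option.]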

Mike: Okay, that kind of sounds like some old Flash stuff we used to do, right? I mean, that's how we used to treat … when we did Flash 8, I think it was. Yeah, it was Flash 8, because I remember I was a product manager for Flash at the time and we were out interviewing users during the product planning process. And I can't remember who it was, but at one point we had a meeting with, I want to say, TBWA\Chiat\Day, the big ad agency that did all of Apple's most famous ads, down in Southern California. We met with one of their interactive teams. This is a long time ago, but I think that might have been it.

And they were talking about how they would love to be able to do alpha channels in video, so that in Flash you could green-screen, cut out the presenter, and then animate content behind them, with those portions transparent. And I remember us on the team thinking about that and going, "Man, that actually would be really cool." Because then you could do animated, talking-head-style presentations where, once you've shot the video, the designer has full control over what happens with the content, even being able to composite things live during playback.

Chris:  That's exactly it, man. This stuff is starting to happen too. There's another company, I don't know if you know them, that kind of does the graphics overlay.

Mike: I think so, yeah.

Chris:  It's an insanely cool product. It's basically like After Effects running in the cloud, and then it does the overlay in a browser, and you can do whatever you want, so it can be tickers and scores. And they have a lot of big customers. I think the NBA is actually a customer of theirs too. So they're replacing the traditional in-studio graphics packages with a cloud-based offering.

But what's more interesting is using their stuff in combination with ours: all of a sudden you can have really customizable, interactive graphics too. And that's more aligned with what you're talking about, where all of that starts to become possible. It gets a bit crazy when you start thinking about all the different things you can do with it.

Go ahead, Mike, what?

Mike: I think when you give people creative control, give them more creative options of what they can do, and then just stand back and let them … that's what I learned from my days working on Flash. We were always faced with, "Do we make the product easier to use, or do we …" There are always these different directions you can go with features, but we almost always decided to lean towards, "Let's just put the capabilities in the runtime, let's make it possible, and then see what people do with it." Because we felt like the new creative stuff, the capability side, always ended up with more momentum, more adoption, and more excitement around it.

And it's a fine line, because over time the tool got way too hard to use since we kept focusing on that. But we felt like that was the virtuous path: just add capabilities.

Chris:  Oh, man. It was pretty amazing, because the Flash community was one of the most creative like …

Mike: Oh, no doubt, no doubt.

Chris:  … groups ever. You know what I mean? That’s one of the things I really miss about it. I think we’ve become very fragmented as technologists. And there was a real community around like just these crazy experiences that people could do.

Mike: Everyone knew all the standout, most talented people; the community knew each other, and they rewarded creativity. Everyone was always trying to come up with the next cool, interesting, creative thing, and then someone else would see that and build on top of it. And it was worldwide: it was everywhere. And you're right, I haven't seen another developer and designer community like that since the Flash days. It's out there, but it's just different now.

Chris:  Yeah. Well, I mean, that's been our goal, and hopefully we don't get too bloated. At least we can try to just allow the features to be there and see what people do with them, because I think that's where it gets really exciting. We could have taken the path of focusing on one vertical, but that just doesn't excite me as much.

Mike: Yeah.

Chris:  I’d rather see what people do with this stuff and make something out of it.

Mike: Same here, yeah. All right, man, well, hey, I really appreciate it. This was fun. I love catching up with old friends and talking about tech. What you guys are doing at Red5 looks super interesting, I love it. I wish you nothing but the best of luck, and thanks for your time today. I appreciate it.

Chris:  Yeah, thanks, Mike.