MPEG DASH: the new kid in HTTP delivery


In my experience working with hundreds of handset vendors, Apple has been the most challenging. This is not surprising, since they took the smartphone market into a new era, leaving the competition light years behind. Market leaders such as Nokia had to leave Symbian behind and join forces with Microsoft, and handset manufacturers such as HTC, LG, and Samsung saw Google's Android coming to their rescue.

Until Apple introduced the HTTP Live Streaming (HLS) protocol with iOS 3.0, RTSP was the only live streaming option supported on feature phones and early smartphones. It was built for real-time delivery, with all the tools needed to optimize fast delivery to end devices: the RTP payload travelled over UDP, while session management ran over TCP using the HTTP-like RTSP protocol. It was a nightmare for over-the-top providers who did not control the mobile network, with NATing and firewalls in the way. But traditional telcos built their networks around these challenges.

But Apple levelled the playing field with HTTP Live Streaming. The protocol is simple: you chunk the live stream into small segment files, create a running playlist pointing to those segments, and put a CDN in front to cache both the playlists and the segments. Everything rides on network-friendly HTTP; a minimal sketch of the idea follows.
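To make that concrete, here is a rough Python sketch of the rolling playlist a live packager would republish as new segments arrive. The tag syntax (#EXTM3U, #EXTINF, and friends) is the real HLS playlist format; the segment names, the 10-second duration, and the three-segment window are illustrative assumptions, not anything specific from this post.

```python
# Minimal sketch of an HLS live playlist generator. The tags are from the
# HLS playlist format; segment names, the 10-second duration, and the
# three-segment window are illustrative choices.

SEGMENT_SECONDS = 10  # duration of each media chunk
WINDOW = 3            # how many segments the live playlist advertises


def render_playlist(first_seq: int, segment_urls: list[str]) -> str:
    """Render a rolling media playlist pointing at the newest segments."""
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{SEGMENT_SECONDS}",
        f"#EXT-X-MEDIA-SEQUENCE:{first_seq}",  # lets players track the moving window
    ]
    for url in segment_urls:
        lines.append(f"#EXTINF:{SEGMENT_SECONDS:.1f},")
        lines.append(url)  # a plain HTTP URL, so any ordinary CDN can cache it
    return "\n".join(lines) + "\n"


# A live packager would re-publish this file every SEGMENT_SECONDS,
# dropping the oldest segment and appending the newest one:
print(render_playlist(42, [f"seg{n}.ts" for n in range(42, 42 + WINDOW)]))
```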

It was great until Microsoft built Smooth Streaming and Adobe built HDS. The technology fragmented into device-specific deliveries, which became a nightmare for us: supporting three different HTTP delivery formats, each with its own layers of DRM technology.

Now the industry is changing, with MPEG releasing a standard under the acronym MPEG DASH (Dynamic Adaptive Streaming over HTTP).

It is an interesting time to be in the media delivery world, and I am very interested to see how big guns like Microsoft, Apple, and Google play their cards.

4 thoughts on “MPEG DASH: the new kid in HTTP delivery”

  1. Live HLS streaming has too much latency to do any real kind of live broadcasting.
    Those little IP cameras for home security and child monitoring are showing up everywhere, and most only support RTSP and MJPEG.
    This makes it difficult for companies that develop for these devices; manufacturers are not likely to add HLS to the camera firmware anytime soon.
    So we find we have to build our own RTSP frameworks based on open-source decoders like those provided by FFmpeg. Some of us wonder if the world really needed HLS.

    • Agree, HLS isn’t good for real-time streaming: it has a minimum delay of around 30 seconds, since players typically buffer three segments before starting playback and the default segment length is 10 seconds. This has been an issue for us as well, especially for sports content providers with lots of betting.

      However, Smooth Streaming is a bit better than HLS: with a 2-second fragment size we have brought the delay down to around 15-20 seconds.

      I don’t think Android will stop supporting RTSP anytime soon, and there are a couple of streaming apps available on iOS that support RTSP.

      You can always convert these RTSP streams to HLS via Wowza or even FFmpeg; see the FFmpeg sketch after this thread.

        • I remember when we first started working on streaming apps. mooncatventures has been around in one form or another since 2009, so it’s about as old as the iPhone. We originally did progressive download, because back then that was all you could do. Then we got involved with the ffmpeg4iphone project; it was like the Wild West, and nobody knew if Apple would approve the apps.

        It’s so much easier today. The biggest issue new developers, or even seasoned ones, run into is getting FFmpeg to build, but we developed a framework that makes that drag and drop. We work with security cam companies (hey, they make baby monitors), so we had to get the latency down, and we have it quite low: it varies by device, but it is around 2-3 seconds, and a lot of that tends to be the fault of the cameras themselves.

        As for Android, we’ve heard numerous reports that RTSP is broken in newer versions. We haven’t verified that, but clients of ours who tested the onboard RTSP found it unacceptable and asked us to build custom players with FFmpeg.
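Picking up the Wowza/FFmpeg suggestion from the thread above, here is a rough sketch of the FFmpeg route, driven from Python. The camera URL is hypothetical, and the segment length and window size are just plausible defaults, not values from the discussion.

```python
# Rough sketch: repackage an RTSP camera feed as a rolling HLS playlist
# by shelling out to ffmpeg. The rtsp:// URL is hypothetical; the
# 4-second segments and five-entry window are illustrative defaults.
import subprocess

CAMERA_URL = "rtsp://192.168.1.20:554/stream1"  # hypothetical camera feed

subprocess.run(
    [
        "ffmpeg",
        "-rtsp_transport", "tcp",  # carry RTP over TCP to get through NAT/firewalls
        "-i", CAMERA_URL,
        "-c", "copy",              # repackage only; assumes the camera already
                                   # emits H.264 (re-encode with -c:v libx264 if not)
        "-f", "hls",
        "-hls_time", "4",          # 4-second segments
        "-hls_list_size", "5",     # advertise a rolling window of five segments
        "camera.m3u8",
    ],
    check=True,
)
```

Remuxing with -c copy avoids a transcode, so the added latency is dominated by the segment length times the player's buffer depth, the same arithmetic as in the HLS latency discussion above.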
