Live streaming has become an inseparable part of the Internet, and platforms such as Twitch, YouTube Live, and Facebook Live make it possible to share content with the world in real-time. But delivering a smooth streaming experience to millions of users is immensely difficult. In the debate of “RTMP vs. New Streaming Protocols,” the protocols behind live streaming have changed dramatically over the past two decades to keep pace with those rising demands.
RTMP: The Original Live Streaming Protocol
Real-Time Messaging Protocol (RTMP) enabled the first generation of live streaming by providing a standardized way to send audio and video between a media server and a player client with low latency. Developed by Macromedia (later acquired by Adobe) in the early 2000s, RTMP was designed when typical internet connections offered just 10-100 Kbps of bandwidth.
It set the stage for the ongoing RTMP vs. New Streaming Protocols comparison by offering:
- Server-client architecture: An encoder pushes packetized audio/video to an RTMP media server, which relays the stream to RTMP clients (players) that reassemble the packets and decode the stream. This keeps the heavy lifting off the viewer’s device.
- TCP transport: Using TCP instead of UDP allows lost packets to be retransmitted, ensuring reliable delivery.
- Interleaved streaming: Audio and video packets are interleaved within the stream, allowing the audio to be delivered with lower latency while maintaining sync.
- Compressed delivery: Video is compressed using Sorenson Spark or On2 VP6 codecs for efficient transmission.
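To make the publish side of this workflow concrete, here is a minimal Python sketch that pushes a local file to an RTMP ingest point by invoking ffmpeg as a subprocess. The ingest URL and stream key are placeholders, and modern ingests generally expect H.264/AAC rather than the legacy Flash-era codecs mentioned above.

```python
import subprocess

# Hypothetical ingest endpoint and stream key -- replace with your provider's values.
INGEST_URL = "rtmp://live.example.com/app"
STREAM_KEY = "your-stream-key"

def publish_rtmp(source_file: str) -> None:
    """Push a local file to an RTMP ingest point using ffmpeg (must be on PATH)."""
    cmd = [
        "ffmpeg",
        "-re",              # read input at its native frame rate, simulating a live source
        "-i", source_file,
        "-c:v", "libx264",  # H.264 video, which most RTMP ingests expect today
        "-preset", "veryfast",
        "-c:a", "aac",
        "-b:a", "128k",
        "-f", "flv",        # RTMP carries streams in the FLV container
        f"{INGEST_URL}/{STREAM_KEY}",
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    publish_rtmp("sample.mp4")
```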
When Adobe Flash Player gained widespread adoption in the early 2000s, RTMP became the de facto standard for delivering Flash video and ushered in a wave of live streaming providers. However, RTMP’s limitations became more apparent as video quality and viewer expectations increased.
The Need for Speed and Scale
By the late 2000s, RTMP was maxed out, not in terms of bandwidth, but due to its architectural limitations when faced with higher resolution cameras, faster internet speeds, and the demand to stream to thousands of viewers simultaneously. The single-threaded nature of RTMP capped the resolution and frame rates of the stream.
Moreover, RTMP had no native support for adaptive bitrates, which is critical for adapting to bandwidth fluctuations.
Furthermore, RTMP required dedicated media servers that could not be scaled cost-effectively. The protocol was also tightly coupled with Adobe Flash, which limited player options.
As alternatives emerged, RTMP became relegated to niche use cases, though its core techniques still influence modern protocols.
HLS and DASH: Scalable Adaptive Streaming
To achieve scalability, the next generation of streaming protocols leveraged standard web servers and HTTP-based delivery. This provided a content delivery network (CDN)-friendly architecture capable of reaching massive audiences.
HTTP Live Streaming (HLS)
Developed by Apple in 2009, HLS splits the video and audio into small file segments, with each segment delivered sequentially using conventional web servers.
Key HLS capabilities:
- Works with standard web servers/CDNs for scalability
- Small file segments enable adaptive streaming
- Compatible across devices without proprietary plugins
HLS ushered in the era of adaptive streaming, allowing the player to adjust the stream’s resolution to match fluctuations in bandwidth and network conditions. This provides an optimal viewing experience regardless of the viewer’s device or internet speed.
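To illustrate the adaptive mechanism, the rough Python sketch below does what an HLS player does when choosing a rendition: it reads the variant entries from a master playlist and picks the highest bitrate that fits the measured throughput. The playlist text and the throughput figure are purely illustrative.

```python
import re

# Illustrative master playlist; a real player fetches this over HTTP from the origin/CDN.
MASTER_PLAYLIST = """\
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
low/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
mid/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080
high/index.m3u8
"""

def parse_variants(playlist: str) -> list[dict]:
    """Collect (bandwidth, uri) pairs from #EXT-X-STREAM-INF entries."""
    variants = []
    lines = playlist.splitlines()
    for i, line in enumerate(lines):
        if line.startswith("#EXT-X-STREAM-INF"):
            match = re.search(r"BANDWIDTH=(\d+)", line)
            if match and i + 1 < len(lines):
                variants.append({"bandwidth": int(match.group(1)), "uri": lines[i + 1]})
    return variants

def choose_variant(variants: list[dict], measured_bps: float, safety: float = 0.8) -> dict:
    """Pick the highest-bitrate rendition that fits within a safety margin of throughput."""
    affordable = [v for v in variants if v["bandwidth"] <= measured_bps * safety]
    if not affordable:
        return min(variants, key=lambda v: v["bandwidth"])  # fall back to the lowest rendition
    return max(affordable, key=lambda v: v["bandwidth"])

variants = parse_variants(MASTER_PLAYLIST)
print(choose_variant(variants, measured_bps=3_500_000))  # -> the 2.5 Mbps / 720p rendition
```

Real players re-run this decision continuously as their throughput estimate changes, which is what lets a stream step down gracefully on a congested network instead of stalling.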
HLS initially worked only on Apple devices, however, which led to a parallel adaptive streaming standard with broader device support.
Dynamic Adaptive Streaming over HTTP (DASH)
Standardized by the Moving Picture Experts Group (MPEG) in 2012, DASH, like HLS, breaks the video into small HTTP-based file segments. It leverages many of the same adaptive streaming techniques but works across a wider range of web browsers and devices.
Key DASH capabilities:
- Unified standard for cross-platform adaptive streaming
- Codec-agnostic delivery over standard HTTP/TCP
- Integrates digital rights management
With support for every major browser and devices ranging from smart TVs to game consoles, DASH remains a ubiquitous standard for adaptive streaming video delivery.
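Because a DASH manifest (MPD) is plain XML, listing the available representations takes only a few lines. The sketch below parses an illustrative MPD fragment with Python’s standard library; the element names follow the MPEG-DASH schema, but the manifest content itself is made up.

```python
import xml.etree.ElementTree as ET

# Illustrative MPD fragment; a real manifest is fetched from the origin/CDN.
MPD = """\
<MPD xmlns="urn:mpeg:dash:schema:mpd:2011" type="dynamic">
  <Period>
    <AdaptationSet mimeType="video/mp4">
      <Representation id="360p" bandwidth="800000" width="640" height="360"/>
      <Representation id="720p" bandwidth="2500000" width="1280" height="720"/>
      <Representation id="1080p" bandwidth="5000000" width="1920" height="1080"/>
    </AdaptationSet>
  </Period>
</MPD>
"""

NS = {"dash": "urn:mpeg:dash:schema:mpd:2011"}

def list_representations(mpd_xml: str) -> list[dict]:
    """Return the id, bandwidth, and resolution of every Representation in the MPD."""
    root = ET.fromstring(mpd_xml)
    return [
        {
            "id": rep.get("id"),
            "bandwidth": int(rep.get("bandwidth")),
            "resolution": f"{rep.get('width')}x{rep.get('height')}",
        }
        for rep in root.findall(".//dash:Representation", NS)
    ]

for rep in list_representations(MPD):
    print(rep)
```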
CTC and LL-HLS: Pushing Live Latency Limits
HLS and DASH are both highly scalable, but they were designed with video on demand in mind. Their segment-based approach introduced 10-30 seconds of latency, which made interactive live streaming difficult.
To reduce latency, streaming providers developed two innovative transport techniques:
Chunked Transfer Coding (CTC)
Employed by services like YouTube Live and Facebook Live, CTC builds upon HLS and DASH fundamentals while optimizing round-trip time:
- Small media chunks (~0.5 sec) minimize packaging overhead
- TCP and TLS connection reuse reduces handshake time
- Concurrent chunked transfer allows instant playback
With CTC streaming, services can now achieve sub-4 second latency for live video. For comparison, traditional HLS streaming latency could exceed 60 seconds.
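HTTP chunked transfer itself is straightforward to demonstrate: the server begins writing the response before it knows the total length, flushing each chunk as soon as it is ready. The sketch below serves a stand-in “media” stream this way using only Python’s standard library; a real packager would emit CMAF/fMP4 chunks instead of text.

```python
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

class ChunkedHandler(BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"  # chunked transfer coding requires HTTP/1.1

    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "application/octet-stream")
        self.send_header("Transfer-Encoding", "chunked")
        self.end_headers()
        # Emit ten small chunks as they become "ready"; a packager would emit
        # ~0.5 s CMAF chunks straight out of the encoder instead of text.
        for i in range(10):
            payload = f"chunk-{i}\n".encode()
            # Chunked framing: hex length, CRLF, data, CRLF.
            self.wfile.write(f"{len(payload):X}\r\n".encode() + payload + b"\r\n")
            self.wfile.flush()
            time.sleep(0.5)
        self.wfile.write(b"0\r\n\r\n")  # zero-length chunk terminates the response

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), ChunkedHandler).serve_forever()
```

Because the client can start decoding the first chunk while later chunks are still being encoded, end-to-end latency shrinks to roughly the chunk duration plus network overhead rather than a full segment length.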
Low-Latency HLS (LL-HLS)
Introduced by Apple in 2019, LL-HLS pairs partial media segments with tighter client-server coordination to further optimize round-trip time; the early drafts relied on HTTP/2 server push, which later revisions replaced with blocking playlist reloads and preload hints.
LL-HLS enables sub-3 second latency by:
- Pushing video chunks from the server to the client proactively
- Using drastically smaller partial segments (~0.3 sec)
- Letting the client request future chunks ahead of playback
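One concrete piece of that client-server coordination is the blocking playlist reload. The sketch below, assuming a hypothetical CDN endpoint, builds the request URL a client would use: the _HLS_msn and _HLS_part query directives defined by the LL-HLS specification ask the server to hold its response until the named segment and partial segment are available.

```python
from urllib.parse import urlencode, urlsplit, urlunsplit

def blocking_reload_url(playlist_url: str, next_msn: int, next_part: int) -> str:
    """Build an LL-HLS blocking playlist reload URL.

    The _HLS_msn/_HLS_part query directives ask the server to hold its response
    until the named media sequence number and partial segment have been published.
    """
    scheme, netloc, path, query, frag = urlsplit(playlist_url)
    extra = urlencode({"_HLS_msn": next_msn, "_HLS_part": next_part})
    query = f"{query}&{extra}" if query else extra
    return urlunsplit((scheme, netloc, path, query, frag))

# Hypothetical endpoint; a real client derives msn/part from the previous playlist it received.
print(blocking_reload_url("https://cdn.example.com/live/stream.m3u8", next_msn=271, next_part=2))
# -> https://cdn.example.com/live/stream.m3u8?_HLS_msn=271&_HLS_part=2
```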
While LL-HLS and CTC have slightly different technical approaches, they highlight the streaming industry’s shift toward ultra-low-latency protocols.
SRT and WebRTC: Opening New Possibilities
Alongside the maturation of adaptive streaming, two additional protocols are carving out specialized roles in the live streaming tech stack:
Secure Reliable Transport (SRT)
Initially created by Haivision in 2013, SRT maintains streaming performance across unstable network conditions by layering the following safeguards on top of low-latency UDP transport:
- Encryption and authentication safeguards
- Packet loss recovery through retransmission and optional redundancy
- Congestion and jitter mitigation
This makes SRT ideal for streaming from remote locations via bonded cellular connections or the public internet.
SRT is now an open-source project backed by the SRT Alliance, a consortium of industry leaders. Services such as AWS and Microsoft Azure have integrated SRT as a robust transport-layer option.
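Many encoders and tools expose SRT as just another output URL. The sketch below, assuming an ffmpeg build compiled with libsrt and a hypothetical listener address, pushes a file over SRT with encryption and a fixed latency budget; the option names follow ffmpeg’s libsrt protocol, so check your build’s documentation for exact semantics and units.

```python
import subprocess

# Hypothetical SRT listener; mode=caller means this side initiates the connection.
SRT_URL = (
    "srt://ingest.example.com:9000"
    "?mode=caller"
    "&latency=200000"                # receive buffer for loss recovery (microseconds in ffmpeg's libsrt protocol)
    "&passphrase=change-me-16chars"  # enables AES encryption; the passphrase must be 10-79 characters
)

def publish_srt(source_file: str) -> None:
    """Push a local file over SRT using an ffmpeg build that includes libsrt."""
    cmd = [
        "ffmpeg",
        "-re", "-i", source_file,
        "-c:v", "libx264", "-c:a", "aac",
        "-f", "mpegts",  # SRT payloads are most commonly MPEG-TS
        SRT_URL,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    publish_srt("sample.mp4")
```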
WebRTC
Web Real-Time Communication (WebRTC) is an open framework that allows browser-to-browser video, audio, and data transmission without plugins.
Initially focused on enabling Voice over IP directly between browsers, WebRTC capabilities now include:
- Ultra-low-latency video streaming
- Screen sharing
- File transfers
- Video conferencing
- P2P data routing
With native browser integration across billions of devices, WebRTC removes friction for real-time peer-to-peer communication, opening possibilities for innovative applications.
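The same signaling handshake that browsers perform can also be scripted outside the browser. The sketch below, assuming the third-party aiortc library (pip install aiortc), creates a peer connection, attaches a local media file as outgoing tracks, and produces the SDP offer that would be sent to the remote peer over whatever signaling channel the application chooses.

```python
import asyncio

from aiortc import RTCPeerConnection
from aiortc.contrib.media import MediaPlayer  # third-party: pip install aiortc

async def build_offer(source_file: str) -> str:
    """Create a peer connection, add local media tracks, and return the SDP offer."""
    pc = RTCPeerConnection()
    player = MediaPlayer(source_file)  # decodes the file into audio/video tracks
    if player.video:
        pc.addTrack(player.video)
    if player.audio:
        pc.addTrack(player.audio)

    offer = await pc.createOffer()
    await pc.setLocalDescription(offer)
    sdp = pc.localDescription.sdp
    # The SDP is exchanged over an application-defined signaling channel
    # (WebSocket, HTTP, etc.); WebRTC deliberately leaves signaling to the app.
    await pc.close()
    return sdp

if __name__ == "__main__":
    sdp = asyncio.run(build_offer("sample.mp4"))
    print(sdp.splitlines()[0])  # e.g. "v=0", the first line of the SDP offer
```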
The Future: Democratization and Personalization
Continued acceleration of 5G and edge computing will push live latency toward the sub-1 second mark while expanding possibilities for immersive live streaming.
Democratizing Streaming Creation
Faster broadband and the ubiquity of camera-enabled smart devices have democratized video publishing. Going forward, expect advancements like light field capture to enable holographic-quality live streaming from prosumer devices.
Companies like Lightship and Immersive Labs are pushing the boundaries of volumetric video. As costs decline and tools simplify production, more creators will broadcast in augmented and virtual reality.
Personalized Streaming Experiences
Advancements in computer vision, AI recommendation engines, and data-driven customization will tailor live streams to individual interests. Features like interactive polls, integrated eCommerce, and targeted promotions will enable more personalized viewer experiences.
Conventional entertainment and gaming are not the only content set to expand the streaming landscape. A retail brand, for instance, might offer shoppable live streams that tie promotions to televised events, while sports fans could direct their own viewing through multi-camera stadium feeds.
Since its inception more than 20 years ago, RTMP has kicked off an evolution that continues to accelerate with each new wave of innovation. The ongoing debate of RTMP vs. New Streaming Protocols reflects not only the limitations of the old but also the vast possibilities opened up by new technologies. Live streaming has always been about possibilities, and each new generation of protocols brings viewers a little closer to the action.