Streaming media

Streaming media is multimedia that is constantly received by and presented to an end-user while being delivered by a provider. Its verb form, "to stream", refers to the process of delivering media in this manner; the term refers to the delivery method of the medium rather than the medium itself.

A client media player can begin playing the data (such as a movie) before the entire file has been transmitted. The distinction between the delivery method and the media distributed applies specifically to telecommunications networks, as most other delivery systems are either inherently streaming (e.g., radio, television) or inherently nonstreaming (e.g., books, video cassettes, audio CDs). For example, in the 1930s, elevator music was among the earliest popularly available streaming media; nowadays Internet television is a common form of streamed media. The term "streaming media" can apply to media other than video and audio, such as live closed captioning, ticker tape, and real-time text, which are all considered "streaming text". The term "streaming" was first used in the early 1990s as a better description for video on demand on IP networks; at the time such video was usually referred to as "store and forward video", which was misleading nomenclature.

Live streaming, which refers to content delivered live over the Internet, requires a camera for the media, an encoder to digitize the content, a media publisher, and a content delivery network to distribute and deliver the content.

History
In the early 1920s, George O. Squier was granted patents for a system for the transmission and distribution of signals over electrical lines which was the technical basis for what later became Muzak, a technology streaming continuous music to commercial customers without the use of radio.

Attempts to display media on computers date back to the earliest days of computing in the mid-20th century. However, little progress was made for several decades, primarily due to the high cost and limited capabilities of computer hardware. From the late 1980s through the 1990s, consumer-grade personal computers became powerful enough to display various media. The primary technical issues related to streaming were:
 * having enough CPU power and bus bandwidth to support the required data rates
 * creating low-latency interrupt paths in the operating system to prevent buffer underrun.

However, computer networks were still limited, and media were usually delivered over non-streaming channels, such as by downloading a digital file from a remote server and then saving it to a local drive on the end user's computer or storing it as a digital file and playing it back from CD-ROMs.

New technologies
During the late 1990s and early 2000s, Internet users saw:
 * greater network bandwidth, especially in the last mile
 * increased access to networks, especially the Internet
 * use of standard protocols and formats, such as TCP/IP, HTTP, HTML
 * commercialization of the Internet.

"Severe Tire Damage" was the first band to perform live on the Internet. On June 24, 1993, the band was playing a gig at Xerox PARC while elsewhere in the building, scientists were discussing new technology (the Mbone) for broadcasting on the Internet using multicasting. As proof of their technology, the band was broadcast and could be seen live in Australia and elsewhere.

RealNetworks was also a pioneer in the streaming media market; in 1995 it broadcast a baseball game between the New York Yankees and the Seattle Mariners over the Internet.

The first symphonic concert on the Internet took place at the Paramount Theater in Seattle, Washington, on November 10, 1995. The concert was a collaboration between the Seattle Symphony and various guest musicians such as Slash (Guns N' Roses, Velvet Revolver), Matt Cameron (Soundgarden, Pearl Jam), and Barrett Martin (Screaming Trees).

When Word Magazine launched in 1995, it featured the first-ever streaming soundtracks on the Internet. Featuring local downtown musicians, the first music stream was "Big Wheel" by Karthik Swaminathan, and the second was "When We Were Poor" by Karthik Swaminathan with Marc Ribot and Christine Bard.

Business developments
Microsoft developed a media player known as ActiveMovie in 1995 that allowed streaming media and included a proprietary streaming format; it was the precursor to the streaming feature later included in Windows Media Player 6.4 in 1999. In June 1999 Apple also introduced a streaming media format in its QuickTime 4 application, which was later widely adopted on websites along with the RealPlayer and Windows Media streaming formats. The competing formats required each user to download the respective application, and many users ended up needing all three applications on their computer for general compatibility.

Around 2002, interest in a single, unified streaming format and the widespread adoption of Adobe Flash prompted the development of a video streaming format through Flash, which is the format used in Flash-based players on many popular video hosting sites today, such as YouTube. Increasing consumer demand for live streaming prompted YouTube to implement a live streaming service for users. The company also offers a (secured) link that returns the user's available connection speed.

Consumerization of streaming
These advances in computer networking, combined with powerful home computers and modern operating systems, made streaming media practical and affordable for ordinary consumers. Stand-alone Internet radio devices emerged to offer listeners a no-computer option for listening to audio streams. In general, multimedia content has a large volume, so media storage and transmission costs are still significant. To offset this somewhat, media are generally compressed for both storage and streaming.

Increasing consumer demand for streaming of high definition (HD) content has led the industry to develop a number of technologies such as WirelessHD or ITU-T G.hn, which are optimized for streaming HD content without forcing the user to install new networking cables.

Today, a media stream can be streamed either live or on demand. Live streams are generally provided by a means called "true streaming". True streaming sends the information straight to the computer or device without saving the file to a hard disk. On-demand streaming is provided by a means called progressive streaming or progressive download. Progressive streaming saves the file to a hard disk, and the media is then played from that location. On-demand streams are often kept on hard disks and servers for extended periods of time, while live streams are only available at one time (e.g., during a football game).

Streaming media is increasingly being coupled with use of social media. For example, sites such as YouTube encourage social interaction in webcasts through features such as live chat, online surveys, etc. Furthermore, streaming media is increasingly being used for social business and e-learning.

Streaming bandwidth and storage
A broadband speed of 2.5 Mbit/s or more is recommended for streaming movies, for example to a Roku, Apple TV, Google TV or a Sony TV Blu-ray Disc Player, and 10 Mbit/s or more for high-definition content.

Streaming media storage size is calculated from the streaming bandwidth and length of the media using the following formula (for a single user and file):
 * storage size (in megabytes) = length (in seconds) × bit rate (in bit/s) / (8 × 1024 × 1024)

Real-world example:

One hour of video encoded at 300 kbit/s (a typical broadband video, usually encoded in a 320 × 240 pixel window size) requires:

 * (3,600 s × 300,000 bit/s) / (8 × 1024 × 1024) = around 128 MB of storage
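The storage formula and the one-hour example above can be checked with a short calculation (a sketch; the helper function name is illustrative):

```python
# Storage size for a single streamed file, per the formula above:
# storage (MB) = length (s) * bit rate (bit/s) / (8 * 1024 * 1024)

def storage_size_mb(length_seconds: float, bitrate_bps: float) -> float:
    """Return the storage required in megabytes (MiB-based, as in the text)."""
    return length_seconds * bitrate_bps / (8 * 1024 * 1024)

# One hour of video at 300 kbit/s:
mb = storage_size_mb(3600, 300_000)
print(f"{mb:.1f} MB")  # 128.7 MB, i.e. "around 128 MB"
```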

If the file is stored on a server for on-demand streaming and this stream is viewed by 1,000 people at the same time using a unicast protocol, the requirement is:

 * 300 kbit/s × 1,000 = 300,000 kbit/s = 300 Mbit/s of bandwidth

This is equivalent to around 135 GB per hour. Using a multicast protocol the server sends out only a single stream that is common to all users. Therefore such a stream would only use 300 kbit/s of serving bandwidth. See below for more information on these protocols.
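The unicast and multicast figures above can be reproduced in a few lines (variable names are illustrative):

```python
# Server bandwidth for 1,000 concurrent viewers of a 300 kbit/s stream.

STREAM_KBPS = 300
VIEWERS = 1_000

# Unicast: one copy of the stream is sent per viewer.
unicast_kbps = STREAM_KBPS * VIEWERS                        # 300,000 kbit/s
unicast_gb_per_hour = unicast_kbps * 1000 * 3600 / 8 / 1e9  # bits -> gigabytes

# Multicast: a single shared stream regardless of audience size.
multicast_kbps = STREAM_KBPS

print(unicast_kbps / 1000, "Mbit/s")       # 300.0 Mbit/s
print(round(unicast_gb_per_hour), "GB/h")  # 135 GB per hour
print(multicast_kbps, "kbit/s")            # 300 kbit/s
```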

The calculation for live streaming is similar.

Assumptions: the speed at the encoder is 500 kbit/s.

If the show lasts for 3 hours with 3,000 viewers, then the calculation is:

 * number of MB transferred = encoder speed (in bit/s) × number of seconds × number of viewers / (8 × 1024 × 1024)
 * number of MB transferred = 500 × 1024 bit/s × 3 × 3,600 s (= 3 hours) × 3,000 viewers / (8 × 1024 × 1024) = 1,977,539 MB
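The live-streaming calculation above can be sketched the same way (the helper function name is illustrative):

```python
# Total data transferred for a live unicast broadcast, per the formula above:
# MB = encoder speed (bit/s) * duration (s) * viewers / (8 * 1024 * 1024)

def live_transfer_mb(encoder_bps: float, seconds: float, viewers: int) -> float:
    """Return the total megabytes sent to all viewers of a live stream."""
    return encoder_bps * seconds * viewers / (8 * 1024 * 1024)

# 500 kbit/s encoder (500 * 1024 bit/s), 3-hour show, 3,000 viewers:
total = live_transfer_mb(500 * 1024, 3 * 3600, 3000)
print(f"{total:,.0f} MB")  # 1,977,539 MB
```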

Protocols
The audio stream is compressed using an audio codec such as MP3, Vorbis or AAC.

The video stream is compressed using a video codec such as H.264 or VP8.

Encoded audio and video streams are assembled in a container bitstream such as MP4, FLV, WebM, ASF or ISMA.

The bitstream is delivered from a streaming server to a streaming client using a transport protocol, such as MMS or RTP. Newer technologies such as HLS, Microsoft's Smooth Streaming, Adobe's HDS and finally MPEG-DASH have emerged to enable adaptive bitrate streaming over HTTP as an alternative to using proprietary transport protocols.
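As a rough sketch of the adaptive-bitrate idea mentioned above: the client measures download throughput for each short segment and requests the next segment at the highest bitrate the connection can sustain. The bitrate ladder and safety factor below are illustrative assumptions, not part of any specific protocol.

```python
# Illustrative sketch of adaptive bitrate selection (not a real HLS/DASH client).

RENDITIONS_KBPS = [400, 800, 1600, 3200]  # hypothetical encodings of one video

def pick_bitrate(measured_kbps: float, safety: float = 0.8) -> int:
    """Choose the highest rendition below a safety fraction of throughput."""
    budget = measured_kbps * safety
    candidates = [r for r in RENDITIONS_KBPS if r <= budget]
    # Fall back to the lowest rendition if even that exceeds the budget.
    return max(candidates) if candidates else min(RENDITIONS_KBPS)

print(pick_bitrate(1000))  # 800  (80% of 1000 kbit/s allows up to 800 kbit/s)
print(pick_bitrate(5000))  # 3200
print(pick_bitrate(300))   # 400  (falls back to the lowest rendition)
```

Real clients add smoothing of the throughput estimate and buffer-level heuristics, but the rate-switching decision follows this pattern.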

The streaming client may interact with the streaming server using a control protocol, such as MMS or RTSP.

Protocol problems
Designing a network protocol to support streaming media raises many problems, such as:
 * Datagram protocols, such as the User Datagram Protocol (UDP), send the media stream as a series of small packets. This is simple and efficient; however, there is no mechanism within the protocol to guarantee delivery. It is up to the receiving application to detect loss or corruption and recover data using error correction techniques. If data is lost, the stream may suffer a dropout.
 * The Real-time Streaming Protocol (RTSP), Real-time Transport Protocol (RTP) and the Real-time Transport Control Protocol (RTCP) were specifically designed to stream media over networks. RTSP runs over a variety of transport protocols, while the latter two are built on top of UDP.
 * Another approach that seems to incorporate both the advantages of using a standard web protocol and the ability to be used for streaming even live content is adaptive bitrate streaming. HTTP adaptive bitrate streaming is based on HTTP progressive download, but unlike that approach, the files here are very small, so that they can be compared to the streaming of packets, much as with RTSP and RTP.
 * Reliable protocols, such as the Transmission Control Protocol (TCP), guarantee correct delivery of each bit in the media stream. However, they accomplish this with a system of timeouts and retries, which makes them more complex to implement. It also means that when there is data loss on the network, the media stream stalls while the protocol handlers detect the loss and retransmit the missing data. Clients can minimize this effect by buffering data for display. While delay due to buffering is acceptable in video on demand scenarios, users of interactive applications such as video conferencing will experience a loss of fidelity if the delay that buffering contributes to exceeds 200 ms.
 * Unicast protocols send a separate copy of the media stream from the server to each recipient. Unicast is the norm for most Internet connections, but does not scale well when many users want to view the same television program concurrently.
 * Multicast protocols were developed to reduce the server/network loads resulting from duplicate data streams that occur when many recipients receive unicast content streams independently. These protocols send a single stream from the source to a group of recipients. Depending on the network infrastructure and type, multicast transmission may or may not be feasible. One potential disadvantage of multicasting is the loss of video on demand functionality.  Continuous streaming of radio or television material usually precludes the recipient's ability to control playback.  However, this problem can be mitigated by elements such as caching servers, digital set-top boxes, and buffered media players.
 * IP Multicast provides a means to send a single media stream to a group of recipients on a computer network. A multicast protocol, usually Internet Group Management Protocol, is used to manage delivery of multicast streams to the groups of recipients on a LAN. One of the challenges in deploying IP multicast is that routers and firewalls between LANs must allow the passage of packets destined to multicast groups. If the organization that is serving the content has control over the network between server and recipients (i.e., educational, government, and corporate intranets), then routing protocols such as Protocol Independent Multicast can be used to deliver stream content to multiple Local Area Network segments.
 * Peer-to-peer (P2P) protocols arrange for prerecorded streams to be sent between computers. This prevents the server and its network connections from becoming a bottleneck. However, it raises technical, performance, quality, and business issues.
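As an illustration of how a receiving application can detect loss on a datagram stream (as noted for UDP above), the sketch below uses RTP-style sequence numbers to spot gaps; the function and the list-of-integers packet representation are hypothetical, not an actual RTP header parser.

```python
# Detect packet loss in a datagram stream by finding gaps in the
# sequence numbers of the packets that arrived, in order.

def detect_gaps(received_seq_numbers):
    """Return the sequence numbers missing from an in-order packet stream."""
    missing = []
    expected = received_seq_numbers[0]
    for seq in received_seq_numbers:
        while expected < seq:      # a gap: these packets were lost in transit
            missing.append(expected)
            expected += 1
        expected = seq + 1
    return missing

print(detect_gaps([10, 11, 13, 14, 17]))  # [12, 15, 16]
```

A real receiver would then conceal the dropout or request retransmission with an error-correction scheme; the detection step is what the sketch shows.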

Applications and marketing
A useful and typical application of the "streaming" concept is, for example, the long video lecture delivered online over the Internet. An advantage of this presentation is that these lectures can be very long, yet they can always be interrupted or replayed at arbitrary points.

There are also new marketing concepts. For example, the Berlin Philharmonic Orchestra sells Internet live streams of whole concerts, instead of several CDs or similar fixed media, through its so-called "Digital Concert Hall", using YouTube for trailer purposes only. These "online concerts" are also shown in cinemas at various places around the globe. A similar concept is used by the Metropolitan Opera in New York.