From: Rohit Khare (rohit@uci.edu)
Date: Wed Feb 16 2000 - 15:20:13 PST
[Unfiltered marketing hooey. Exactly as suspected, it uses
higher-rate networks intelligently. Doable with range retrieval and,
well, buffering. It's hard to imagine 8 patents... Rohit]
=================================================
One feature that sets Burstware® apart from real-time streaming
solutions is its ability to cache data to client disk buffers in
Faster-Than-Real-Time. Servers "burst" multimedia data across the
network into configurable client buffers at a rate faster than the
play rate. Client-side players read the data from their local
buffers, enjoying images and sound that are insulated from network
disruptions.
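As a rough illustration of the faster-than-real-time idea, here is a
minimal Python sketch (not Burstware's actual code): one thread stands
in for the network and fills a local buffer faster than the play rate,
while the player drains the buffer and never touches the network
directly. All rates, sizes, and the toy single-file player are
assumptions made up for the example.

    import threading, time
    from collections import deque

    PLAY_RATE = 500_000      # bytes/sec the player consumes (assumed)
    BURST_RATE = 2_000_000   # bytes/sec the network delivers (assumed)
    CHUNK = 100_000          # bytes per simulated transfer
    FILE_SIZE = 5_000_000    # total media size for this toy example

    buffer = deque()         # stands in for the client-side disk buffer
    lock = threading.Lock()

    def fetch():
        """Fill the buffer faster than real time (stands in for the server's bursts)."""
        sent = 0
        while sent < FILE_SIZE:
            time.sleep(CHUNK / BURST_RATE)      # simulated transfer time
            with lock:
                buffer.append(CHUNK)
            sent += CHUNK

    def play():
        """Drain the buffer at the play rate; playback never waits on the network."""
        played = 0
        while played < FILE_SIZE:
            with lock:
                chunk = buffer.popleft() if buffer else 0
            if chunk == 0:
                time.sleep(0.05)                # underrun: the player would stall here
                continue
            time.sleep(chunk / PLAY_RATE)       # simulated decode/playback time
            played += chunk

    threading.Thread(target=fetch, daemon=True).start()
    play()
    print("playback finished entirely from the local buffer")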
HTTP streaming similarly caches video and audio data in local
buffers, delivering data at rates faster than the play rate. But the
similarity between bursting and HTTP streaming ends here. HTTP
streaming falls far short of Burstware in the areas of network
management, cost savings, quality of service, and reliability, all of
which are crucial for networked video applications.
HTTP Streaming Defined
In the simplest sense, HTTP streaming is the process of downloading
data across a network and into client buffers on the other end using
the Hypertext Transfer Protocol (HTTP). This is the protocol used to
download HTML pages to web browsers.
Designed to transport small chunks of information, such as the
contents of a single web page, HTTP is a quick and dirty way to move
data across the network. Web programmers looking for a cheap
alternative to proprietary video streaming servers, such as those
offered by RealNetworks and Microsoft, turned to the familiar HTTP
download as a streaming solution and coined the term "HTTP
streaming".
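In this simplest sense the mechanism really is just an ordinary HTTP
GET whose response body is spooled into a local file while a player
reads behind the write position. A minimal sketch using Python's
standard library (the URL, filename, and chunk size are placeholders):

    import urllib.request

    URL = "http://example.com/movie.mpg"        # placeholder media URL

    # Plain HTTP download: the server pushes the file as fast as the
    # connection allows; the "client buffer" is simply the growing file.
    with urllib.request.urlopen(URL) as response, open("movie.mpg", "wb") as out:
        while True:
            chunk = response.read(64 * 1024)    # read in 64 KB chunks
            if not chunk:
                break
            out.write(chunk)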
Right Solution, Wrong Problem
HTTP is designed to service requests that tend not to strain
available network bandwidth. For example, a company may offer a free
software update through its download site. Many thousands of users
can request to download the data, but typically the download requests
do not arrive simultaneously and the files requested are fairly small.
Moreover, because HTTP was designed to handle data that is not
time-based, the server can simply resend the entire file from the
beginning if some kind of problem arises during the download. If the
server itself fails, the user can download the file from another
server, or wait until the downed server comes up again.
In this world of "timeless" content, minimal pressure on available
bandwidth, and small files, HTTP is free to deliver data in small,
inefficient chunks without regard for bandwidth conservation and
without a failover scheme.
HTTP Feeds Greedy Clients, One At A Time
HTTP, like FTP and other simple download protocols, is
client-centric, rather than network-centric. The server attempts to
deliver to the client as much data as possible as quickly as possible
until delivery is complete.
In this greed-based model, the client with the biggest pipe and the
fastest processor gets the best service, without regard for the
overall network bandwidth picture, or the impact on all clients of
servicing additional clients. The server does not track the status of
individual clients to determine who is running out of data and who is
not, nor does it adjust service accordingly.
Network Resources Wasted
Another consequence of a greed-based model is wasted network
resources. Without an overall picture of the network, an HTTP server
is unable to optimize bandwidth usage. Available bandwidth is either
monopolized by HTTP clients, regardless of their actual bandwidth
need, or goes unused and wasted. The resulting network inefficiencies
raise infrastructure costs.
Failure Means Starting Over
HTTP streaming has no failover solution. Delivery is not coordinated
across multiple backup servers. Should the network or the server
fail, the user must request that the server resend the entire file.
Larger Files Mean Slower Service, Bigger Storage Requirements
HTTP streaming's quick and dirty delivery works reasonably well for
small files. Under heavy load conditions, however, delivery becomes
sluggish.
Larger files consume much more bandwidth, which slows the rate at
which HTTP streaming drops files into client buffers. Because HTTP is
not monitoring bandwidth and adjusting delivery accordingly, the
large files create a bottleneck that results in poor service to
clients.
Moreover, HTTP employs an unsophisticated buffering scheme, loading
the client's hard disk with the entire multimedia file. A client
machine must have plentiful storage space available in order to view
a large file.
Multimedia Delivery Needs A Smarter Solution
The Burstware® architecture is tailored to address these specific
problems, offering sophisticated bandwidth management, reliable
failover, and delivery optimized for large files.
Burstware Manages The Whole Network
The Burstware® architecture manages the network system as a whole,
not just individual client-server relationships. Burstware® tracks
bandwidth usage across all of its servers and distributes client
requests accordingly. Because Burstware® monitors bandwidth
availability across the whole network, it can optimize allocation of
network resources, resulting in greatly increased network
efficiencies. These efficiencies allow Burstware® to service more
users for the same cost.
Burstware® Servers apply a need-based model, tracking the buffer
levels of each client they service and divvying up bandwidth based on
need. Clients whose buffers are running low are serviced before
clients whose buffer levels are higher.
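That need-based allocation is essentially a scheduling policy: each
round, send the next burst to whichever client's buffer is closest to
running dry. A toy sketch of such a policy (the client names, rates,
and single shared play rate are invented for illustration and are not
Burstware's actual algorithm):

    import heapq

    PLAY_RATE = 500_000    # bytes/sec each client consumes (assumed uniform)
    BURST = 1_000_000      # bytes delivered per scheduling round (assumed)

    # Each entry: (seconds of playback left in the client's buffer, client id).
    # The heap keeps the neediest client -- the lowest buffer level -- in front.
    clients = [(1.5, "client-A"), (8.0, "client-B"), (0.4, "client-C")]
    heapq.heapify(clients)

    def schedule_round():
        """Send the next burst to the client nearest an underrun."""
        seconds_left, client = heapq.heappop(clients)
        print(f"bursting {BURST} bytes to {client} ({seconds_left:.1f}s buffered)")
        heapq.heappush(clients, (seconds_left + BURST / PLAY_RATE, client))

    # A real scheduler would also subtract what every client plays back
    # between rounds; this loop only shows the priority ordering.
    for _ in range(5):
        schedule_round()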
Burstware® applies optimized connection acceptance criteria to
guarantee the highest quality viewing experience for all clients. If
taking on an additional client connection will reduce available
bandwidth enough to impact the quality of viewing experience for
existing clients or the new client, the connection is refused.
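That acceptance criterion is a form of admission control: accept a new
stream only if the remaining bandwidth still covers every stream's play
rate with some slack. A toy version (the capacity, play rates, and 20%
headroom factor are all assumptions, not Burstware's published criteria):

    LINK_CAPACITY = 10_000_000    # bytes/sec available to the server (assumed)
    HEADROOM = 1.2                # keep 20% slack above the bare play rates

    active_play_rates = [500_000, 500_000, 2_000_000]    # current clients

    def admit(new_play_rate):
        """Refuse a connection that would degrade existing (or the new) streams."""
        demand = (sum(active_play_rates) + new_play_rate) * HEADROOM
        if demand > LINK_CAPACITY:
            return False                       # connection refused
        active_play_rates.append(new_play_rate)
        return True

    print(admit(2_000_000))    # True: fits within capacity
    print(admit(8_000_000))    # False: would squeeze everyone's quality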
Time-Based Data Requires Reliable Failover
Multimedia files are isochronous, or time-based. This means that if
data is lost during transmission, the application cannot simply
resend the file from the beginning. Imagine how unhappy users would
be if they had to restart Gone with the Wind from the beginning
because a few bytes were lost or a network leg went down halfway into
the movie!
Burstware® offers the reliable failover that time-based data demands,
guaranteeing uninterrupted service should a server, conductor, or
network component go down. Using backup servers and conductors, and
synchronizing all delivery components, Burstware® guarantees that a
video or audio file will continue playing uninterrupted should any
single component fail.
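The document does not spell out how the backup servers and conductors
coordinate. With plain HTTP, the nearest equivalent (per the note at
the top about range retrieval) is to remember how many bytes have
already been buffered and resume from a backup server with a Range
request instead of restarting the file. A sketch under those
assumptions (the URLs are placeholders and the servers are assumed to
honor Range):

    import urllib.request

    MIRRORS = ["http://primary.example.com/movie.mpg",
               "http://backup.example.com/movie.mpg"]    # placeholder URLs

    def fetch_with_failover(out_path):
        """Pull the file, resuming on a backup server if the current one fails."""
        offset = 0
        with open(out_path, "wb") as out:
            for url in MIRRORS:
                try:
                    req = urllib.request.Request(url)
                    if offset:
                        # Resume where the broken transfer stopped, not from byte 0.
                        req.add_header("Range", f"bytes={offset}-")
                    with urllib.request.urlopen(req) as resp:
                        while True:
                            chunk = resp.read(64 * 1024)
                            if not chunk:
                                return            # transfer complete
                            out.write(chunk)
                            offset += len(chunk)
                except OSError:
                    continue                      # try the next server from the same offset
        raise RuntimeError("all servers failed")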
Burstware® Is Designed To Handle Large Files
Video and audio files are large because it takes a lot of
information to encode (that is, describe the contents of) a series of
images or sounds. The larger the file, the more bandwidth required to
deliver the file quickly enough.
Burstware® is optimized to handle large files. Sending data in
regulated bursts, Burstware® varies the size of the burst according
to bandwidth availability at a particular moment. Because
Burstware®'s buffer size is configurable and not tied to the size of
the media file, the client machine is not required to accommodate the
entire media file, easing storage requirements.
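The regulated bursts can be pictured as a pacing decision made each
tick: send no more than the bandwidth currently free, and no more than
the client's fixed-size buffer can still hold. A toy sketch (the
capacity, rates, and the idea of probing free bandwidth once per tick
are assumptions):

    BUFFER_CAPACITY = 4_000_000   # configurable client buffer, independent of file size

    def next_burst_size(available_bandwidth, buffer_fill):
        """Size each burst to current free bandwidth and remaining buffer headroom."""
        headroom = BUFFER_CAPACITY - buffer_fill
        return max(0, min(available_bandwidth, headroom))

    # Plenty of bandwidth but a nearly full buffer: the burst shrinks to fit.
    print(next_burst_size(available_bandwidth=3_000_000, buffer_fill=3_800_000))  # 200000
    # Scarce bandwidth: the burst shrinks to what the network can spare.
    print(next_burst_size(available_bandwidth=300_000, buffer_fill=1_000_000))    # 300000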
Summary
Although HTTP streaming plays out of local buffers, serving up a
higher quality image or sound than real-time streaming, it fails to
guarantee that high quality for all users.
All contents ©2000 burst.com, Inc. All Rights Reserved.