It’s Bandwidth, Stupid
Everything digital boils down to bandwidth –
- how much you have
- how much of it you can use
- how fast the data can move through it
Bandwidth comes in several forms. The network connection is the obvious one, since we already use the term bandwidth to describe it: how fast one computer can communicate with another over a network.
Storage space is another form of bandwidth. Even streaming files over the network requires some local buffering, and files saved to your device require space. There’s the raw space, and also the read/write speed of the storage volume – both are forms of bandwidth.
There are plenty more places to measure bandwidth inside, and plugged into, the computer: the motherboard buses between the various chips, the ports in and out of the machine, and the video output. All of these have a known bandwidth, and engineers must take it into account when designing circuits.
The entire digital audio format debate boils down to bandwidth. How much sound bandwidth can your body pick up?
37 years ago, when Philips & Sony were working on the audio CD, they knew that bandwidth would be a major issue. Digital audio generated very large files and required lots of bandwidth to reproduce accurately. 50 MB was HUGE in 1978 – and that’s just one 5-minute song on CD. This is a time when $500 hard drives were 10 MB! The draw of the optical disc was the huge storage space it provided on cheap plastic discs.
Which brings us to the bandwidth of the disc and the file format selected. The new CD design could hold roughly 650 MB of data. What resolution to store the audio at became the driving question in finishing the standard, with engineers deciding a nice compromise was a 44.1 kHz sample rate stored as 16-bit samples, allowing for about 74 minutes of runtime per disc, or just enough to hold the president of Sony’s favorite symphony (a rumored requirement of the new format).
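The arithmetic behind those numbers is easy to check. Here’s a rough sketch of the CD math; note that real discs add error-correction overhead (and audio mode uses a lighter layer of it than data mode, which is how audio runtime stretches past what the raw data capacity suggests), so treat these as ballpark figures.

```python
# Back-of-the-envelope CD audio math. Real discs carry
# error-correction overhead, so these are rough figures.

SAMPLE_RATE = 44_100   # samples per second (44.1 kHz)
BIT_DEPTH = 16         # bits per sample
CHANNELS = 2           # stereo

# Raw data rate of uncompressed CD audio, in bytes per second
bytes_per_second = SAMPLE_RATE * (BIT_DEPTH // 8) * CHANNELS

# Size of a 5-minute song at CD quality
song_mb = bytes_per_second * 5 * 60 / 1_000_000

# Runtime if a 650 MB data capacity were filled with raw 16/44.1 audio
# (audio mode's lighter error correction pushes real discs to ~74 min)
runtime_min = 650_000_000 / bytes_per_second / 60

print(f"Data rate: {bytes_per_second} B/s")             # 176400 B/s
print(f"5-minute song: {song_mb:.1f} MB")               # ~52.9 MB
print(f"650 MB holds about {runtime_min:.0f} minutes")  # ~61 min
```

That ~53 MB per song is exactly the “50 MB was HUGE” figure above.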
This is the thing: bandwidth = cost. More money gets you more of it, especially in components. Want a motherboard with higher bandwidth? Costs more. Want a chip with higher bandwidth? Costs more. A port and cable that can move more data? Costs more.
So the engineers and designers of the CD knew there were better-quality resolutions than 16/44, but the overall cost of making a player for higher resolutions, and the total storage bandwidth they required, just wasn’t there in 1980s tech. Early digital production systems did use 20-bit audio with sample rates from 40 to 88 kHz, but they were expensive, specialized tools, not for the consumer.
By the 1990s the price of higher-bandwidth components had come down enough to attempt a format upgrade, but like many things in the ’90s, the internet changed everything. Instead of consumers moving to a new optical disc holding higher-quality files played through better players (SACD), the trend was toward smaller, mobile files that could be moved around the internet and played on ever-smaller devices.
The engineers of the MPEG video-compression group – a sibling of the JPEG still-image effort – stepped in and put together an audio specification for shrinking CD-quality files down to something ’90s-era computers could handle. This became known as MP3, and at first it seemed magical. How could that 50 MB song from a CD become 5 MB and play back almost perfectly from my hard drive? Impressive. Overall sound quality was deemed “good enough” because of the huge boost in convenience MP3 provided.
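The 50 MB → 5 MB trick is just bitrate arithmetic. A quick sketch, assuming a fixed 128 kbps encode (a typical ’90s choice – actual encoders and bitrates varied):

```python
# Why a 50 MB CD track shrinks to roughly 5 MB as an MP3.
# 128 kbps is an assumption here; encodes varied in practice.

CD_RATE_BPS = 44_100 * 16 * 2   # 1,411,200 bits/s, uncompressed stereo
MP3_RATE_BPS = 128_000          # 128 kbps MP3

song_seconds = 5 * 60
cd_mb = CD_RATE_BPS * song_seconds / 8 / 1_000_000
mp3_mb = MP3_RATE_BPS * song_seconds / 8 / 1_000_000

print(f"CD:    {cd_mb:.1f} MB")                          # ~52.9 MB
print(f"MP3:   {mp3_mb:.1f} MB")                         # ~4.8 MB
print(f"Ratio: {CD_RATE_BPS / MP3_RATE_BPS:.1f}:1")      # ~11:1
```

An 11:1 reduction with “almost perfect” playback is why it felt magical at the time.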
As we lived with MP3 and listened closer, many consumers were less than impressed. But time marched on: Napster was built to trade MP3s illegally, the iPod shipped, then smartphones and tablets, and MP3 became the new consumer format in the early 2000s.
This, of course, is not the first time we consumers have taken a quality downgrade in the name of convenience.
Now is now. Almost all the bandwidth limits of the last 30 years are gone, as is evident from Netflix streaming everywhere and people running very fast computers packed with memory and fast storage on broadband connections. There are now millions of servers talking to hundreds of millions of devices, each little device packing more bandwidth than a $50,000 computer from 1980.
The bottom line – we no longer need to reduce the art to fit the distribution. If an artist makes a record at 24/192, you should be able to buy it, store it, and play it at 24/192. If you want a lesser version for a lesser device or use, you can easily make it yourself. If the artist makes the record at 16/44, that’s fine too – buy that one.
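And 24/192 really is a modest ask by today’s standards. A rough comparison of raw stereo data rates (the 45-minute album length is an assumption for illustration):

```python
# Raw uncompressed PCM data rates: 24/192 vs. CD's 16/44.1.
# The 45-minute album length is an illustrative assumption.

def rate_bps(sample_rate, bit_depth, channels=2):
    """Raw data rate of uncompressed PCM audio, in bits per second."""
    return sample_rate * bit_depth * channels

cd = rate_bps(44_100, 16)        # 1,411,200 bits/s
hires = rate_bps(192_000, 24)    # 9,216,000 bits/s

album_min = 45
cd_gb = cd * album_min * 60 / 8 / 1e9
hires_gb = hires * album_min * 60 / 8 / 1e9

print(f"16/44.1: {cd_gb:.2f} GB per 45-minute album")  # ~0.48 GB
print(f"24/192:  {hires_gb:.2f} GB per album")         # ~3.11 GB
print(f"Factor:  {hires / cd:.1f}x")                   # ~6.5x
```

A ~3 GB album would have been unthinkable in 1980; today it’s a few minutes of downloading and a sliver of a cheap drive.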
The point is that reducing from the audio master was only done in the name of bandwidth restrictions that are now gone.