Topic: When would you say the push for loudness first started to reduce sound quality?

When would you say the push for loudness first started to reduce sound quality?

Was it 1982, when the CD was introduced? But people were pushing for more loudness in pop records before then, going back to the 40s and 50s. I'm curious about when the push for loudness first started reducing audio quality.

Re: When would you say the push for loudness first started to reduce sound quality?

Reply #1
I would say the 1950s. It got bad enough that jukeboxes put in automatic volume controls - compressors - to even things out. Radio has had it about as long.

Re: When would you say the push for loudness first started to reduce sound quality?

Reply #2
I would say the 1950s. It got bad enough that jukeboxes put in automatic volume controls - compressors - to even things out. Radio has had it about as long.
I'd say with the introduction of mass-market radios, or shortly thereafter.

Early radios struggled with bandwidth and with reproducing anything other than voice. Compressing music down ensured that most of it actually reached the listener. These radios were also often built down to a price, so they used very sub-par amps and speakers, with as few components as you could get away with. Early AM radios had a rather limited dynamic range (and those that still exist still do), so it was actually quite beneficial to compress music like that.

Re: When would you say the push for loudness first started to reduce sound quality?

Reply #3
In reality, sometime around the mid-90s.

Re: When would you say the push for loudness first started to reduce sound quality?

Reply #4
Of course there has been a loudness war since the 1950s, but the limitations of analog tape and playback media kept it in check. Many people would say that Oasis' (What's the Story) Morning Glory? (1995) CD was when things really started pumping up. Funnily enough, that 1995 loud mastering is tame by today's standards.

Re: When would you say the push for loudness first started to reduce sound quality?

Reply #5
Was it 1982, when the CD was introduced? But people were pushing for more loudness in pop records before then, going back to the 40s and 50s. I'm curious about when the push for loudness first started reducing audio quality.

Early CDs were mastered so the *album* peak -- usually a single peak across the entire album -- approached or reached 0 dBFS. No compression was added (I don't think digital compression had been invented yet). As a result, the full dynamic range of the analog tape source -- there were few all-digital recordings then -- was preserved. According to Wikipedia (with citations), "the average level of the average rock song during most of the decade [the 1980s] was around −16.8 dBFS."
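
For context on those dBFS figures: 0 dBFS is digital full scale, and "average level" numbers like that −16.8 dBFS are RMS measurements relative to full scale. Here's a rough Python sketch of how you'd measure both on a track (my own illustration, not the methodology behind the Wikipedia figure):

import numpy as np

def peak_and_rms_dbfs(samples):
    # Peak and RMS level relative to digital full scale (1.0).
    x = np.asarray(samples, dtype=float)
    peak = np.max(np.abs(x))
    rms = np.sqrt(np.mean(x ** 2))
    return 20 * np.log10(peak), 20 * np.log10(rms)

# A full-scale 440 Hz sine: peak is about 0 dBFS, RMS about -3 dBFS.
t = np.arange(44100) / 44100.0
print(peak_and_rms_dbfs(np.sin(2 * np.pi * 440 * t)))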

Also: "In 1994, the digital brickwall limiter with look-ahead (to pull down peak levels before they happened) was first mass-produced."  And that accords with my recollection of when it all really started going to hell.

It was a terrible shame because it overlapped a positive trend underway since the late 80s: companies were finally doing due diligence and sourcing remasters from original master tapes (not vinyl EQ'd production masters). Adding hard digital compression to a flat transfer of a classic original master tape is akin to adding googly eyes to the Mona Lisa.

Re: When would you say the push for loudness first started to reduce sound quality?

Reply #6

I'd say with the introduction of mass-market radios, or shortly thereafter.

Early radios struggled with bandwidth and with reproducing anything other than voice. Compressing music down ensured that most of it actually reached the listener. These radios were also often built down to a price, so they used very sub-par amps and speakers, with as few components as you could get away with. Early AM radios had a rather limited dynamic range (and those that still exist still do), so it was actually quite beneficial to compress music like that.
No, that's not right. Mass-market radios were introduced in the late 1920s, and by the 1930s they'd already improved a lot. They were not narrow-band devices yet, as stations were few and selectivity wasn't a big issue, and the RF limiter circuit hadn't been invented yet either. Dynamic range was not limited by the radio; it was limited by the medium being highly affected by electromagnetic noise, and by the fact that broadcast audio processing had not been invented, so average modulation levels were quite low. When peak limiters (actually invented by the film industry for optical sound) appeared, stations could raise modulation levels without peaks taking the transmitter off the air from an overload, and the S/N ratio got better. But the real loudness war on radio didn't start until rock and roll really took off and stations began some serious competition. I'd put the start at the late 1950s. But loud mastering on singles was already a huge problem by then, hence the jukebox compressors.

Re: When would you say the push for loudness first started to reduce sound quality?

Reply #7

Early CDs were mastered so the *album* peak -- usually a single peak across the entire album -- approached or reached 0 dBFS. No compression was added (I don't think digital compression had been invented yet). As a result, the full dynamic range of the analog tape source -- there were few all-digital recordings then -- was preserved. According to Wikipedia (with citations), "the average level of the average rock song during most of the decade [the 1980s] was around −16.8 dBFS."
There were actually quite a few early all-digital recordings, many pre-dating the CD by several years (using the Soundstream system, for one). In fact, the transfer of an analog master to CD involved the same gear you'd use to record digitally in the first place: the Sony PCM-1600/1610/1630 (those are different models) working with a slightly modified U-Matic video deck. There were strict guidelines for CD mastering published by the big CD houses - Matsushita in particular - that dictated you put the highest peak of the entire CD at or below 0 dBFS (it wasn't called that then), and that any audible transients be clearly logged and identified by the corresponding time code so they wouldn't be mistaken for errors in the CD master. Somebody at the plant would actually QC these things by listening!

Mastering for CD (and digital editing, actually) was handled via the Sony DAE-1100 editor, a crude device that controlled up to three time-code-locked U-Matic machines and could accomplish digital cross-fade edits and apply simple digital gain control, fades, etc. The final tape was sent to the CD plant, where it was transferred to a glass master with no data changes, but with PQ subcode added and formatted for CD.

Digital processing really began with reverb (the Lexicon 224 and the Ursa Major Space Station, both in 1978), but dynamics processing wouldn't become practical until the late 1980s.
Also: "In 1994, the digital brickwall limiter with look-ahead (to pull down peak levels before they happened) was first mass-produced."  And that accords with my recollection of when it all really started going to hell.

It was a terrible shame because it overlapped a positive trend underway since the late 80s: companies were finally doing due diligence and sourcing remasters from original master tapes (not vinyl EQ'd production masters). Adding hard digital compression to a flat transfer of a classic original master tape is akin to adding googly eyes to the Mona Lisa.
Yup. All true. However, loudness processing in the analog domain was already brutal in some areas decades before that. Radio in particular, where multi-band limiting and deliberate clipping were already going on in the late 1960s, but also 45 rpm singles, which were cut hot almost universally for several decades. The pop hit "Go All the Way" by the Raspberries (1972) has blatantly audible peak-limiting artifacts (rapid attack and release) all over it, and it was one of the loudest singles of that year.

Movie companies were brick-wall processing trailer soundtracks then too, even back to the Academy mono optical days. The "loud trailer" problem became epic for a couple of decades, and it got really bad when digital film tracks became common in the 1990s. With so many patron complaints, a program was instituted to pre-qualify trailers to the Leq(m) 85 standard. Trailers would be run and metered on a system made by Dolby that would return an Leq(m) figure. Trailers that didn't pass were "rejected", but it's not clear to me how much impact this actually had, since we still have loud trailers today. Theater projectionists would respond to complaints by turning down the fader for trailers, then forgetting and leaving it low for the feature, which generated the inverse complaints. It was/is a mess.
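
For reference, Leq is just the energy-average level over the whole run: 10 times the log of the mean-square signal relative to a reference pressure. Leq(m) additionally runs the signal through an M-weighting filter (and the trailer spec caps the result at 85), which this unweighted sketch leaves out:

import numpy as np

def leq_db(pressure_pa, p_ref=20e-6):
    # Unweighted Leq: energy-average sound level over the whole measurement,
    # in dB re 20 micropascals. Leq(m) would first apply an M-weighting
    # filter to the pressure signal; that step is omitted here.
    p = np.asarray(pressure_pa, dtype=float)
    return 10 * np.log10(np.mean(p ** 2) / p_ref ** 2)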

We think of brick-walled CDs as the beginning of the problem, but in reality it had already been going on for 30-40 years.  Yes, digital processing made it easier to make it really bad.