
TV audio and compression. (Or lack thereof.)

Back in the old days of television, programs were broadcast with extreme audio compression, presumably due to RF transmission limitations. But now that we're trying out Hulu, programs are played with full dynamic range. So if I want to watch Kojak, for example, late at night, I have to literally ride the remote so that the music, car chases, and gunshots don't blast at full volume while I'm still able to hear normal conversation. Upon further investigation, I see that many people are having this issue, the biggest complaint being the difference in volume between commercials and programs. Which brings me to the point of this mini rant: since DSP is as common as hamburgers, I'm surprised that soundbars and the like aren't made with built-in compression or other DSP effects. There seems to be a market for it.

I've got a little stereo Alesis Nanocompressor that I'm going to try connecting between the TV and a soundbar. But a dedicated TV compressor (with optical or HDMI connections) would be ideal. Has anyone else dealt with this problem? If so, what kind of solutions have you come up with?
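For the DSP-curious: the processing a "night mode" box would need is not much code. Here's a minimal sketch of a feed-forward compressor in plain Python (mono float samples in [-1, 1]; the threshold, ratio, and attack/release numbers are just illustrative assumptions, not what any particular soundbar or the Nanocompressor uses):

```python
import math

def compress(samples, threshold_db=-30.0, ratio=4.0,
             attack=0.01, release=0.2, sample_rate=48000):
    """Very simple feed-forward dynamic range compressor."""
    # One-pole smoothing coefficients for the envelope follower
    att = math.exp(-1.0 / (attack * sample_rate))
    rel = math.exp(-1.0 / (release * sample_rate))
    env = 0.0
    out = []
    for x in samples:
        level = abs(x)
        # Fast attack when the signal rises, slow release when it falls
        coeff = att if level > env else rel
        env = coeff * env + (1.0 - coeff) * level
        level_db = 20.0 * math.log10(max(env, 1e-9))
        if level_db > threshold_db:
            # Reduce gain above threshold: a 4:1 ratio means 4 dB of
            # input increase yields only 1 dB of output increase
            gain_db = (threshold_db - level_db) * (1.0 - 1.0 / ratio)
        else:
            gain_db = 0.0
        out.append(x * 10.0 ** (gain_db / 20.0))
    return out
```

Quiet dialogue below the threshold passes through untouched, while gunshots and car chases above it get pulled down, which is exactly the ride-the-remote behavior automated.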

Thanks;
Artie


Re: TV audio and compression. (Or lack thereof.)

Reply #1
Pretty sure my TV has a feature to level out ad volumes and it's over 5 years old.

Re: TV audio and compression. (Or lack thereof.)

Reply #2
> Pretty sure my TV has a feature to level out ad volumes and it's over 5 years old.
Yes, most TVs have a "speech" mode or similar setting somewhere in the menu which helps to some degree.

However, I find this is more of an issue with many recent productions, particularly documentaries, where the background sound dominates and often masks the dialogue. Or perhaps my ears are just getting old.

Re: TV audio and compression. (Or lack thereof.)

Reply #3
I did find a menu item on the TV called "Clear Voice" that seems to help somewhat. I need to play with it some more to see how effective it is.

Re: TV audio and compression. (Or lack thereof.)

Reply #4
My audio receiver has a feature called Audyssey Dynamic Volume for dynamic range compression. It works very well. I depend on it for all TV and movie watching.

Re: TV audio and compression. (Or lack thereof.)

Reply #5
> Quote from Artie: "Back in the old days of television, programs were broadcast with extreme audio compression. I assume due to RF transmission limitations. [...] Has anyone else dealt with this problem? If so, what kind of solutions have you come up with?"



Sometimes it wasn't compression for compression's sake. Network TV in the US until the early 1980s (AFAIR) didn't send the audio along the same path as the video. In the early days, the video was a challenge for AT&T Long Lines, and they didn't even send the full 4.2 MHz of video at first; the narrower bandwidth was easier to transmit. Then they improved the AT&T Long Lines network so that video got better and better until it was good enough for color. (A lot of the magic was a specific vacuum tube that got iteratively improved over the years.)
Anyway -- the audio was sent along a special leased, conditioned telephone line, and the delays between the video and audio were dealt with by various means. HOWEVER, that leased audio line was of limited quality, and often didn't pass much more than 4-5 kHz, if even that (I forget the specs).
So, outside of NYC and Washington DC, the real-time transmission of the TV networks to the outlying areas was a challenging affair, and the audio quality REALLY sucked.   I remember the day when Walter Cronkite's voice became clearer here in Indianapolis.  The other thing was that when I visited my grandma in the Washington DC area, I noticed that there was much more clarity in the network broadcast audio.  It is amazing the things that 'just happen' and we don't really know much about.
Part of the audio compression on early TV resulted from low expectations of audio quality (which is why some of the older film-based shows had slightly deficient audio -- why bother?), and I am pretty sure the origination of the network material had a fair amount of compression applied before it was handed to AT&T. The SNR of that conditioned phone line was probably not very good.

Maybe I diverged a little, but there were certainly MANY contributing reasons in the '40s, '50s, '60s and '70s why both audio and video quality weren't as good as we would expect today.

John

Re: TV audio and compression. (Or lack thereof.)

Reply #6
All the AV receivers I have owned had dynamic range compression options that you could switch on, all with different names. Some were called "night mode."

 

Re: TV audio and compression. (Or lack thereof.)

Reply #7
Hi folks, I worked for ABC TV for 36 years, designing audio systems. I started there when, as stated above, the audio was sent on twisted-pair equalized "radio lines". The AT&T tariff had specs and pricing for 5 kHz, 8 kHz, and 15 kHz. The longer the line, the more amplifiers and equalizers had to be cascaded. The higher-frequency-response lines required the amplifier/equalizer pairs to be closer together, as losses increase with frequency in any analog copper wire system. The college radio station I was involved with had two amplifier/equalizer pairs mid-span to cover the roughly 5-mile path (as the telephone cables ran) from the studio to the transmitter. A properly adjusted analog 15 kHz radio line sounded fine; if used between the studio and the transmitter, it had to pass pretty stringent proof-of-performance tests at least annually. But it became almost impossible to get any real distance at 15 kHz, particularly if you wanted lines to match in stereo. WGBH in Boston paid AT&T a fortune in extra charges to get a pair of lines from the Boston Symphony's summer home at Tanglewood in western Massachusetts, somewhat over 100 miles, to their studios in Boston.

Therefore, the lines used for network TV audio were mostly 5 kHz lines, with some of the main trunks 8 kHz, all mono. The main NY-to-Chicago-to-Washington-DC-back-to-NY circuit was a loop; we called it the round robin. That made it easy for us to QC it routinely. Lines west from Chicago and branches to affiliates were single-ended (although many had provision for reversing direction so the affiliate could send programming, particularly news and sports, back to the network if required). Some stations also depended on privately owned microwave links from another station for their network feed. The end result was that the frequency response and signal-to-noise in the boonies were pretty poor. Also, don't forget that before (and actually well into) the videotape era, prime-time programming was mostly 35 mm film, with its own signal-to-noise and frequency response issues. And finally, the three-hour delay for the West Coast on live shows before videotape was a kinescope recording on film, processed and played back. All of this severely limited audio quality on the network, except at the originating city (usually NY). Therefore, the networks limited the dynamic range to make the best of a pretty poor system. Also, the local stations, in their own version of the radio "loudness wars," tended to heavily process the audio at the transmitter. The end result, well documented at the time, was very little dynamic range.

Now move into the late '70s and early '80s. As mentioned above, AT&T had improved the quality of its video transmission to pass better than 6 MHz of bandwidth. Broadcasters, particularly ABC, pressured AT&T to use some of that bandwidth to send full-bandwidth audio embedded in the video to all the affiliates fed by AT&T. This was accomplished by a technique called diplexing, or multiplexing, similar to the way stereo (and more) is broadcast over a single FM station. AT&T installed diplex transmitters at each origination point and diplex receivers at each receive point in the network distribution system. This took some bandwidth out of the video to provide two 15 kHz audio circuits carried along with the video. The television networks now had full-bandwidth 15 kHz audio. But most local stations still heavily processed their audio at their transmitters.

In the mid-to-late 1980s the television networks transitioned from AT&T landline distribution to analog satellite distribution. They maintained the same diplexed audio system, except now the network owned the diplexers, not AT&T. You will note above that I said the diplexer (as usually provisioned) had two audio channels. Well, that is how the television networks were able to transmit stereo audio to the affiliates. I was heavily involved in modifying ABC TV's facilities for stereo in that period. But still, the audio was typically heavily processed, with little actual dynamic range.

What eventually changed that was the coming of digital TV, starting in 1998. The specs for broadcasting digital TV audio required the local stations to stop the heavy dynamic compression/limiting they had been using. Then the FCC mandated loudness monitoring/control at every broadcast TV station and cable system (the CALM Act), in response to complaints about loud commercials.

Now, the ATSC broadcast system used to encode digital audio and video in the USA actually has an optional provision for the broadcaster to transmit metadata to TV receivers describing the dynamic range of the program, so the viewer can select how much compression the receiver automatically applies. But I know that ABC never implemented that capability, and I doubt many other networks did. So any dynamic range processing in your TV/receiver is single-ended.
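For the curious, that metadata scheme (the gain words carried in the AC-3 audio stream) boils down to the encoder sending a per-block gain value in dB, which the receiver can apply in full, in part, or not at all, depending on the viewer's setting. A rough illustrative sketch of the receiver side, not the actual AC-3 decoder logic:

```python
def apply_drc(sample_blocks, drc_gains_db, drc_scale=1.0):
    """Apply broadcaster-supplied per-block DRC gain words (in dB).

    drc_scale is the viewer's chosen amount of compression:
    0.0 = ignore the gain words (full dynamic range),
    1.0 = apply them fully (maximum compression).
    """
    out = []
    for block, gain_db in zip(sample_blocks, drc_gains_db):
        # Scale the encoder's suggested gain, then convert dB to linear
        g = 10.0 ** ((gain_db * drc_scale) / 20.0)
        out.append([s * g for s in block])
    return out
```

The key point is that the *broadcaster* computes the gain words from the actual program dynamics, so the receiver gets well-informed compression for free -- which is exactly the capability that goes unused when networks don't populate the metadata.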

 