
Topic: Pitch shift during music fades? (Read 7336 times)

Pitch shift during music fades?

Apologies if this is not the ideal location for this question - I don't think it's related to the encoders being used, but maybe others have heard this and can comment.  I can't believe I'm the only one who has heard it.

Anyway, I've been listening to a lot more music on my iPod lately, encoded from multiple sources (details shortly).  On some tracks, often orchestral but sometimes not, when the track fades or decays at the end, the apparent pitch of the sound rises progressively as the volume decreases.  It's quite unnerving and I thought at first there was a defect with the encoding process - until I heard it in tracks I am sure I encoded myself.  Some of the tracks, I believe, are purchased MP3s, others encoded myself with LAME or iTunes AAC (I switched formats a while back, not sure of the exact date).

I am getting ready to start writing the tracks down and going back to the source material when possible to compare them (explanation: I do own all the original CDs except for a few purchased downloaded tracks, but in some cases I have loaded an MP3 onto the iPod, purchased the CD because I liked it, then been a slacker about replacing the iPod tracks).  But if anyone else has encountered this, perhaps I can save some time by knowing what to look for.  I have more CDs than I'd like to admit, so identifying them and grouping them by MP3, AAC, 192K, 128K, etc. will be a chore.

I can tell you right now one of the CDs that exhibits the phenomenon, because I'm listening to it - The Return of the King, Howard Shore.  I honestly don't know if these specific tracks were ripped from my CD - this is something I'm going to check as soon as I get home and pull it out of the rack.  Another example I need to double-check is Dave Grusin's GRP Live in Session, one I'm sure I ripped myself and was listening to yesterday.  I can't imagine how the encoder would introduce such a dramatic pitch shift, nor can I imagine that such a drastic defect exists on very many CDs without it being well known...  Please tell me I'm not imagining this?  It's not a psychoacoustic effect I'm aware of. 


Pitch shift during music fades?

Reply #2
But this excerpt - "Below 60 dB an increase in loudness will cause sounds below 2 kHz to be perceived as getting sharp and sounds above 2 kHz perceived as going flat" - seems to indicate the opposite effect from what I'm noticing; i.e., the sounds are definitely below 60 dB on the fade-out or decay, and upon a decrease in loudness I perceive an increase in pitch.

Pitch shift during music fades?

Reply #3
It sounds like poor power supply regulation. As the volume decreases, the supply voltage increases and the local oscillator increases in frequency. All of this assumes that you only hear this effect on the iPod.

Pitch shift during music fades?

Reply #4
Exactly why I want to go back to the CDs for reference.  Hopefully I'll get a chance to check tonight.

Pitch shift during music fades?

Reply #5
I think you're listening too loud.

My ears hear exactly what you describe if I listen quite loud.


I've read that "below 60dB..." quote before - don't know where the official study to justify it was published, but in any case music is more complex than that.

Cheers,
David.

Pitch shift during music fades?

Reply #6
It's almost impossible to pitch-shift just part of the file, so pdq's hypothesis that there's something wrong with the iPod makes the most sense to me. If this is the case, I would guess that you hear the same effect with the volume control as you get with the fades.


It could be a perceptual phenomenon, but I think you would have noticed it before, or you would have become so accustomed to it that you no longer notice.


---------
There are (at least) three ways to pitch-shift. None of these should be caused by ripping, encoding, or transcoding:

The easiest way is to change the clock (oscillator) frequency. Sometimes people notice a pitch shift when they record on one computer and play back on another (or record with one soundcard and play back with another). In that case, one computer or the other (or both) has an inaccurate clock. Likewise, if your soundcard clock is off, the ripped file will play back at a different speed than the CD does on a CD player: the software thinks it's recording (or playing back) at 44,100 samples per second, but it isn't, because the clock frequency is off. You can get a similar effect if the file header is wrong: if the header says 44.1 kHz but the file is actually 48 kHz, it will play back slower. (An incorrect header is very rare.) These sample-rate problems can happen accidentally, but they should affect the whole file. If you get this problem with a file ripped from a CD (i.e., you're not recording through an analog stage), then the ripped file itself is fine, and the playback problem lies with your soundcard or MP3 player.

Another way is to resample: you add or subtract samples, interpolating, to slow down or speed up playback. This should never happen accidentally.

Both of the above shift speed and pitch together. Pitch-shifting without changing speed requires frequency-domain processing such as the FFT (Fast Fourier Transform), which is mathematically complex and cannot happen accidentally.
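As a rough sketch of the clock and header cases above (all numbers are illustrative, not from any specific device): a sample-rate mismatch shifts both speed and pitch by the rate ratio, and the shift in semitones is 12·log2 of that ratio.

```python
import math

def rate_mismatch_shift(true_rate, assumed_rate):
    """Pitch/speed shift in semitones when audio sampled at true_rate
    is played back as if it had been sampled at assumed_rate."""
    return 12 * math.log2(assumed_rate / true_rate)

# A 48 kHz file whose header claims 44.1 kHz plays back slower and flatter:
print(round(rate_mismatch_shift(48000, 44100), 2))  # about -1.47 semitones

# A typical crystal-clock error (~100 ppm) is far too small to hear:
print(round(rate_mismatch_shift(44100, 44100 * 1.0001), 4))
```

Note that either mechanism applies uniformly to the whole file, which is why none of them can explain a shift that appears only during a fade.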


Pitch shift during music fades?

Reply #7
With all due respect, I'm not listening that loud - one of the nice things about having IEMs is that I can play music at half volume or less and still hear everything above the ambient noise.  These aren't overwhelmingly loud discs, either.

I wasn't able to dig out the LOTR disc this evening, unfortunately, but I did listen to the track in iTunes. It's hard to be sure, because I didn't have the IEMs with me, but I couldn't hear the pitch shift while listening in iTunes. I find it a little difficult to believe that the iPod has such a poor battery that a difference in audio output could cause major problems with the playback circuitry; I need to test more. Like I have much time for that. Oh well, science demands it. Maybe I'll try while powering it off the AC adapter, or the computer USB port.

Pitch shift during music fades?

Reply #8
I think you're listening too loud.

My ears hear exactly what you describe if I listen quite loud.


I've read that "below 60dB..." quote before - don't know where the official study to justify it was published, but in any case music is more complex than that.

Cheers,
David.


I was thinking the same thing, but then I saw the last post.

OP: Are you using those in-ear buds? I am thinking they might have more of a Doppler-like effect than over-the-ear headphones -- the distance from the driver to the ear's membrane... the shorter the distance, the more Doppler-like the effect (just guessing).

I just relate it to the sound-pressure and pitch effect from late-night mixing.
"Something bothering you, Mister Spock?"

Pitch shift during music fades?

Reply #9
Well, I still think it's your ears.

But it's trivial to find out for sure - just record the iPod's output via the input of your sound card.

Then compare that with the original .wav in an editor. The lengths should be similar - not identical, but close enough. Also a single timestretch value should match them near as damn it perfectly.

Whereas if the playback really was slowing down or speeding up enough to cause an audible pitch shift, the lengths would be quite different, and a single timestretch value wouldn't be able to make the two versions match.
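That comparison can be made quantitative. A sketch, assuming speed and pitch move together during playback (the durations below are made-up numbers, not measurements):

```python
import math

def implied_pitch_shift_cents(original_sec, recorded_sec):
    """Average pitch shift, in cents, implied by a playback-speed error,
    assuming speed and pitch change together (no time-stretching DSP)."""
    return 1200 * math.log2(original_sec / recorded_sec)

# Hypothetical values: a 240.0 s original that comes back 240.1 s long
# implies well under a cent of average shift -- inaudible.
print(round(implied_pitch_shift_cents(240.0, 240.1), 3))
```

An audible shift of even half a semitone (50 cents) would change the length of a four-minute track by several seconds, which is why matching lengths rules the speed explanation out.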

Cheers,
David.

Pitch shift during music fades?

Reply #10
Quote
I find it a little difficult to believe that the iPod has such a poor battery that a difference in audio output could cause major problems with the playback circuitry...
I agree. This would be unusual, but your particular iPod may be defective.

Quote
I am thinking they might have more of a Doppler-like effect than the over-the-ear headphones -- the distance from the driver to the ear's membrane... the shorter the distance the more Doppler effect (just guessing).
No. The Doppler effect requires motion. As long as the distance between the headphone/earbud and the eardrum remains constant, there is no Doppler effect. As a train comes toward you, the frequency of its whistle is raised; as it moves away from you, the frequency is lowered. If you're riding inside the train, the distance between you and the whistle remains constant and there is no pitch shift. (You also get a Doppler shift if you are moving and the train is stationary.)
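For reference, the textbook formula for a stationary listener and a moving source is f' = f · c / (c − v). A quick sketch with made-up numbers (speed of sound c ≈ 343 m/s):

```python
def doppler_observed(f_source, v_source, c=343.0):
    """Observed frequency for a stationary listener.
    v_source > 0 means the source approaches; v_source < 0 means it recedes."""
    return f_source * c / (c - v_source)

# A 440 Hz whistle on a train moving at 30 m/s:
print(round(doppler_observed(440, 30), 1))   # approaching: higher
print(round(doppler_observed(440, -30), 1))  # receding: lower
print(doppler_observed(440, 0))              # no relative motion: no shift
```

The zero-velocity case is the point being made above: with no relative motion between driver and eardrum, the formula gives exactly the source frequency, regardless of how far apart they are.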

Pitch shift during music fades?

Reply #11
I can't explain why this phenomenon happens, but I can tell you that it has nothing to do with iPods, MP3s, digital audio, or anything else except human weirdness. I've been hearing this effect as my old LPs faded out since I was a teenager. (I'm fifty-something now.)

It's just one of those weird psychoacoustic effects.

Pitch shift during music fades?

Reply #12
It's just one of those weird psychoacoustic effects.



The brain interprets the decrease in volume as meaning that the source of the sound is moving away, and applies a correction for a Doppler shift that isn't actually there - hence the apparent increase in pitch.

Pitch shift during music fades?

Reply #13
I agree. Notice my post said Doppler-like effect, not the actual physical Doppler effect.

What about in-ear buds vs. headphones? That never got addressed. Maybe the distance between the element and the eardrum seems small, but I'll wager a few millimeters toward or away makes a big difference in many aspects.

edit:
Quote
one of the nice things about having IEMs is that I can play music at half volume or less
Ok, maybe this means in-ear membranes. Sorry to have overlooked that.
"Something bothering you, Mister Spock?"

Pitch shift during music fades?

Reply #14
The brain interprets the decrease in volume as meaning that the source of the sound is moving away, and applies a correction for a Doppler shift that isn't actually there - hence the apparent increase in pitch.


Can you tell me where you got this info from? I'm not saying you're wrong; I'm just trying to understand this better, given that there's evidence that doesn't support it. For example: how would the brain know whether something is a Doppler shift or just a genuine change in pitch, like tuning an oscillator on a synth? Secondly, if the brain has this capability, why doesn't it apply it to the infamous "passing train" effect? Apparently the brain doesn't compensate during those events. Why any other?

Just asking.   

Pitch shift during music fades?

Reply #15
IIRC, both Moore and Fastl/Zwicker say it's a psychoacoustic illusion caused by loudness compression in the human hearing system (which causes a change in the excitation pattern).

Here is an example created in Audition. Play it such that the beginning of the file is relatively loud.

What I hear:

1. During the decay of the loud tone, the pitch increases.
2. During the quiet tone afterwards, it does not increase.
3. The pitch of the second tone is higher than the initial pitch of the first tone.

Given point 3, I tend to disagree with Wook's explanation. Would like to see a source as well.

The 60 dB figure makes sense. It's the maximum dynamic range an individual hair cell is able to "code". See also http://www.amazon.com/Introduction-Psychol.../dp/0125056281/ and http://www.mp3-tech.org/programmer/docs/asa-s96.zip.

Chris

P.S.: Forgot to mention: Of course, the frequency in the example is absolutely constant (441 Hz). The second tone is identical to the first, just reduced to 6.25% of the original amplitude.
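The test file described above is easy to reproduce with the Python standard library. The 4-second durations, the linear fade, and the 1-second gap below are guesses -- the post gives only the frequency (441 Hz) and the 6.25% amplitude ratio, not the exact envelope of the Audition file:

```python
import math
import struct
import wave

RATE = 44100
FREQ = 441.0  # one cycle per exactly 100 samples at 44.1 kHz

def tone(seconds, amp_start, amp_end):
    """A sine tone whose amplitude fades linearly from amp_start to amp_end."""
    n = int(seconds * RATE)
    return [(amp_start + (amp_end - amp_start) * i / n)
            * math.sin(2 * math.pi * FREQ * i / RATE) for i in range(n)]

# Loud tone decaying to silence, a 1 s gap, then the same tone
# scaled to 6.25% of the original amplitude.
loud = tone(4.0, 1.0, 0.0)
samples = loud + [0.0] * RATE + [0.0625 * s for s in loud]

with wave.open("pitch_illusion_test.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)  # 16-bit PCM
    w.setframerate(RATE)
    w.writeframes(b"".join(struct.pack("<h", int(32767 * s)) for s in samples))
```

Since both tones are numerically identical apart from a constant scale factor, any pitch difference heard between them has to originate in the listener, not in the file.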
If I don't reply to your reply, it means I agree with you.

Pitch shift during music fades?

Reply #16
IIRC, both Moore and Fastl/Zwicker say it's a psychoacoustic illusion caused by loudness compression in the human hearing system (which causes a change in the excitation pattern).


Listening to the Saint-Saëns Third on my IE8s today, I noticed this effect very clearly at the end of the gigantic final chord. The hall decay sounded at a clearly different pitch than the chord itself. The recording was Dutoit with the Montreal Symphony.
Ed Seedhouse
VA7SDH

Re: Pitch shift during music fades?

Reply #17
Hi,
I know this post is ancient, but is anyone still interested in this volume-related pitch-bend effect?
Best wishes,
Simon

Re: Pitch shift during music fades?

Reply #18
I, too, have noticed this effect for decades, but only on "some" recordings, and then, consistently on those recordings, and not at all on most others.

Personally, I do not believe it is any kind of "psychoacoustic illusion." To me it is too noticeable, and too consistent to be anything other than some kind of recording process issue. I have no experience/knowledge of the recording process, so I cannot offer a suggested cause.

If the situation were a "psychoacoustic illusion", then why would that same illusion not show up repeatedly, over and over again throughout a recording whenever the volume fades?

Putting a frequency meter on a speaker playing a constant tone that fades away would answer the question definitively: either the measured frequency changes along with the perceived pitch, or it doesn't.

It is an interesting issue. I will have to identify and report a specific public recording next time I happen to catch the effect happening.

Interesting topic.

Re: Pitch shift during music fades?

Reply #19
I've given this topic a little more thought, and I am interested in offering a theory as to why this "phenomenon" exists.

Caveat: I am not a recording professional. I really know very little about the music recording and production process; just enough to propose uneducated theories. But, I believe I have, at least, a plausible explanation for why this phenomenon exists.

If you consider how vinyl records were produced, it is essentially the reverse of the process a phonograph turntable uses to play them back. According to www.furnacemfg.com: "In the cutting studio, the mastering engineer places a blank lacquer onto the lathe and the machine transforms the auditory energy of the recording into the physical movement of a needle on the lacquer. In this sense, the lathe is the opposite of a turntable, turning sound into movement instead of movement into sound."

My theory is that while the cutting lathe is in the middle of a song, the platter spins at a constant rate. Then, when the end of the song comes, another motor kicks in to rotate the cutting mechanism arm to the next track. When that second motor kicks in, there is a momentary voltage drop while that motor spins up. If both the platter motor and the armature motor are on the same power circuit, that voltage drop could also affect the platter motor which causes the motor to slow down ever-so-slightly. By the platter slowing down, needle movements are more compressed along the length of the track. When the final vinyl product is played back in the home, that compressed information results in an increase in the pitch of the original music since the playback turntable platter continues to rotate at a constant speed. When the compressed information is reached, the information causes the needle to translate at a higher frequency than the original "sound".
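Whatever the merits of the theory, the arithmetic behind it is straightforward: if the platter ran slow by some fraction while cutting, playback at the correct speed raises the pitch by the inverse factor. A sketch (the 1% slowdown is purely illustrative, not a measured lathe figure):

```python
import math

def pitch_rise_cents(slowdown_fraction):
    """Cents of pitch rise on playback if the cutting platter ran slow
    by slowdown_fraction (e.g. 0.01 for a 1% slowdown)."""
    return 1200 * math.log2(1 / (1 - slowdown_fraction))

print(round(pitch_rise_cents(0.01), 1))  # a 1% slowdown: about 17 cents
```

For comparison, a semitone is 100 cents, so even a 1% speed sag would produce only a subtle shift; the fade-out effect people describe is typically far more pronounced than that.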

Now, this theory only relates to vintage recording equipment. Imagine the old days of recording, when equipment was simpler and less like the clean-room equipment used today. If my theory is correct, and a voltage drop caused the cutting-lathe platter motor to slow down ever-so-slightly, then surely someone along the way identified the defect, and the next generation of cutting lathes was modified to correct the problem. Did this actually happen? Only recording engineers very close to the early cutting process would know. Why would we still hear that effect today? Could it be due to modern CDs being sourced from the master lacquer plate ("an aluminum plate covered with cellulose nitrate, a coating similar to nail polish") cut with the old defective cutting lathes? I don't know.

Well, that's my theory, and I'm sticking to it. I would be interested in hearing any rebuttal to this theory.

Thanks