Topic: is WAV normalization lossless?

is WAV normalization lossless?

---

is WAV normalization lossless?

Reply #1
WAV normalizing is *lossy*.

The reason for this is that all of the WAV sample data *does* have to be changed for the volume to be changed, unlike MP3, where a stored gain value can be adjusted without touching the audio data.

is WAV normalization lossless?

Reply #2
---

is WAV normalization lossless?

Reply #3
Basically, no.

But for what you are doing, the loss is so small that I surely wouldn't worry about it.

is WAV normalization lossless?

Reply #4
Normalizing WAV files is "lossy" in the same sense that being charged 4.5% tax on a $1.00 purchase is lossy (since you can't have half a penny or half a bit).

There is a PEAK chunk that can be added to a WAV file.  The PEAK chunk stores the peak amplitude value, and thus the file could be normalized by the decoder.  Normally it is used for floating point audio.

is WAV normalization lossless?

Reply #5
Saying that normalization is 'lossy' is way overstating the facts. There are rounding errors: every sample must be a discrete 16-bit value. To normalize, you multiply every sample by the normalization factor. In most cases the result of the multiplication is not an exact 16-bit value (like multiplying an arbitrary number by pi), so the remainder is thrown away. It's a quantization error, the same as creating a 16-bit sample from analogue in the first place. Any lossy compression makes far greater changes to the data.

These errors could possibly add up to something audible if you do enough operations on 16-bit samples (noise reduction, pop and click removal, compression, hard limiting, reverb, EQ, etc.), but for a single normalization the error is more theoretical than actual, in that it will never approach audibility.
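
A minimal sketch of the arithmetic described above, assuming plain 16-bit integer samples and an arbitrary gain factor (the sample values and gain here are made up for illustration):

Code
gain = 1.2589  # roughly +2 dB; any factor that is not a power of two behaves this way

samples = [-12345, 0, 1, 2047, 26000]        # original 16-bit sample values
scaled = [s * gain for s in samples]          # exact products, not representable in 16 bits
stored = [round(s) for s in scaled]           # what actually lands in the normalized file

for s, exact, st in zip(samples, scaled, stored):
    print(f"{s:7d} -> {exact:12.4f}, stored as {st:7d} (error {st - exact:+.4f})")
# Each error is at most half a least-significant bit, i.e. ordinary quantization error.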

is WAV normalization lossless?

Reply #6
Quote
Saying that normalization is 'lossy' is way overstating the facts.


Lossless is a well-defined term.

Normalizing a WAV is not lossless.

is WAV normalization lossless?

Reply #7
It's true that normalization isn't exactly a two-way mapping, but I did not write anything suggesting that it is, although it almost is, relative to most other things one might do to an audio file. I don't know what numerical value to place on its impact in terms of errors, but I suspect it would come out to at least an order of magnitude less than the operations usually referred to as "lossy." I repeat that a single normalization transform on a file is insignificant in terms of anything audible, except of course that playback will be at a different volume.

I believe I understand the way in which you are using the word, but I don't recall seeing it applied to normal transforms. These, such as normalization, are not really comparable to the information-reduction processes where the term is most commonly used. For my enlightenment, can you point to an accepted definition that includes these other processes?

is WAV normalization lossless?

Reply #8
And within that 'lossy' concept, mp3gain is also effectively lossy. The 'loss' is applied at playback time rather than destructively to the file itself, but its result is surely not a whit less.

is WAV normalization lossless?

Reply #9
Quote
My goal is to burn a standard Audio-CD with CDDA (WAV) tracks of mixed music/artists, but each track has its own volume level. Burning the lossless WAV of course does not alter the data, but the volume levels for each track are all different. Is there any special process one could apply to level the tracks equally?
You could:

1) encode the WAV files with a lossless audio codec (FLAC, WavPack, etc.)
2) use Foobar to apply ReplayGain
3) use Foobar or Burrrn to write the CD with the ReplayGain values

That should do the trick.


is WAV normalization lossless?

Reply #10
It depends on the source material, but normalization will add no more than 6 dB of quantization noise, and will add 3 dB of quantization noise on average.

I would define an operation f(x) as lossless if there exists a function f⁻¹(x) such that f⁻¹(f(x)) = x. Clearly this is not the case with normalization unless the gain is an exact factor of two (a 6 dB step).
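
One way to probe that definition numerically, assuming 16-bit-style integer samples and ignoring clipping (a throwaway sketch with made-up values, not taken from any real file):

Code
def gain(samples, g):
    # Scale every sample and round back to an integer, as a destructive gain change would.
    return [round(s * g) for s in samples]

samples = list(range(-10, 11))

# A gain below 1 maps several distinct inputs to the same output,
# so no inverse function can exist for it.
down = gain(samples, 0.9)
print(len(set(samples)), "distinct inputs ->", len(set(down)), "distinct outputs")

# An exact doubling is just a bit shift and round-trips perfectly (ignoring clipping).
up = gain(samples, 2.0)
print(gain(up, 0.5) == samples)   # True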

is WAV normalization lossless?

Reply #11
Quote
And within that 'lossy' concept, mp3gain is also effectively lossy. The 'loss' is applied at playback time rather than destructively to the file itself, but its result is surely not a whit less.


You are simply wrong: the playback is lossy. mp3gain is not. Depending on the decoding setup, this does or does not make a difference, but it certainly is something different from doing it with a WAV.

You can reverse the change 100%. This is not possible with WAV. This is why mp3gain is lossless, and gaining a WAV is not.

Meaning, you can put any mp3 file into mp3gain, run it through, undo it, and end up with the same file. No information is lost; the original is 100% recoverable.

You cannot do this with WAV except in degenerate circumstances.

Is this important? It depends on what you want to do. Consider switching between album and track gain 100 times. No loss with mp3gain. With a WAV, you might have issues.
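
For context, mp3gain works (roughly speaking) by adding a constant to the integer global-gain field stored in each MP3 frame, where one step is 1.5 dB, rather than by recomputing samples. A toy sketch of why integer field adjustment is exactly undoable while sample rescaling generally is not (the field values and samples below are illustrative, not a real MP3 parser):

Code
# mp3gain-style change: integer addition on a per-frame gain field (1.5 dB per step).
frame_global_gain = [155, 160, 158, 152]              # illustrative values, one per frame
boosted = [g + 4 for g in frame_global_gain]          # +4 steps, i.e. +6 dB
print([g - 4 for g in boosted] == frame_global_gain)  # True: exactly undone

# WAV-style change: rescale and round every sample.
samples = [-7, 3, 12001, -25000]
quieter = [round(s * 10 ** (-6 / 20)) for s in samples]   # -6 dB
louder = [round(s * 10 ** (6 / 20)) for s in quieter]     # +6 dB again
print(louder == samples)                                   # False: rounding left its mark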

is WAV normalization lossless?

Reply #12
Quote
Consider switching between album and track gain 100 times. No loss with mp3gain. With a WAV, you might have issues.


It seems to me at least, that this is a rather theoretical oversimplification of the problem.
The original poster wanted to know if normalising a wav was lossy, to which you, Garf, have said "yes" and AndyH has said "no".
I agree with Andy.
No, it's not lossy.
Yes, as AndyH suggested, there are minute rounding errors at the bottom end of the dynamic range, but not enough for you to hear it after just one transform.
Garf, I appreciate your theory that normalising a wav up and down 100 times might introduce errors, but that's not a real world example, is it?
The original poster wanted to normalise some wavs to burn to CD, which only amounts to one transformation of the wavs, not hundreds.
So, what we're really talking about, in this instance, is... is it lossy after one transformation, to which I'd say again, "no".
And to the human ear (which is analogue), the same distortion (if you want to be a real stickler for the numbers) introduced by normalising a wav prior to burning to CD will also be introduced by applying ReplayGain to an MP3 on playback, because it's still just a 16-bit source, and if you normalise it on playback there is still real-time "distortion" happening to the bits at the bottom end of the dynamic range.
Cheers,
Bruce.
www.audio2u.com
The home of quality podcasts
(including Sine Language, a weekly discussion on all things audio)

is WAV normalization lossless?

Reply #13
Just out of curiosity I took a CD track and normalized it. In this case that was a fairly substantial 5.71 dB increase in amplitude. Then I reversed the process by amplifying by -5.71 dB and mix-pasted the inverted original into the result. As a rough and ready measure, the Average RMS across the final outcome of these two normalization operations is -95.3 dB, almost certainly very difficult to detect by ear.

For comparison I converted that same track to a CBR 320 mp3, then decoded it. The Average RMS on the result of mix-pasting the original back into that result is -63.4 dB. This is also very small, but it is an order of magnitude larger than for the normalization operations.

I'm not saying this is the best way of comparing the two processes but it does give some idea. Also, normalization is only one operation, there is no additional change at playback, so the actual error is less than indicated here. MP3 playback requires both operations.
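
For anyone who wants to try the same kind of null test, here is a rough sketch of the procedure, using random data in place of a real track and assuming numpy is available; the residual you get depends on the material and on exactly how your editor rounds or dithers, so it will not match the figure above:

Code
import numpy as np

rng = np.random.default_rng(0)
original = rng.integers(-16000, 16000, size=1_000_000).astype(np.int64)

# Normalize: scale so the peak hits full scale, rounding to 16-bit values.
up_gain = 32767 / np.abs(original).max()
normalized = np.rint(original * up_gain)

# Reverse it the way a user would, typing in the rounded dB figure the editor displayed.
shown_db = round(20 * np.log10(up_gain), 2)
restored = np.rint(normalized * 10 ** (-shown_db / 20))

# "Mix-paste invert" is just subtraction; express the residual RMS in dBFS.
residual = (restored - original).astype(float)
print(20 * np.log10(np.sqrt(np.mean(residual ** 2)) / 32768), "dBFS")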

I did not get my request fulfilled for a definition of 'lossy' that includes transformations such as normalization. Its more common usage refers to some process that actually reduces resolution rather than one that simply has small errors.

The process esa372 described is, unfortunately, just another, rather long-winded, way of applying normalization. It cannot result in any less error.

The rebuttal to my comment about mp3gain did no more than restate my position; I did say 'effectively.' I understand the difference if we consider the file as a starting point for further software transformations, but the result to the listener of the music is the same. In the sense that the inquirer asked the question (does it 'degrade' my music?), mp3gain is no more 'lossless' than is normalization.

I don't think we really have any disagreement here except on the definition of one word. There doesn't seem to be any great confusion about what actually happens, no?

is WAV normalization lossless?

Reply #14
I don't think that can be reproduced. (edit: If it can, it's not a fair comparison.)

Wav normalization is certainly not lossless, therefore it must be lossy.

is WAV normalization lossless?

Reply #15
-

 

is WAV normalization lossless?

Reply #16
Not really. Your old tracks will be the same, but the new tracks will be different from the originals; they will all have been altered to have a similar volume level.

edit:

Quote
But for what you are doing, the loss is so small that I surely wouldn't worry about it.

Well said.

Here's another vote for ReplayGain, which is better than peak normalization.

is WAV normalization lossless?

Reply #17
The Japanese term for "lossless compression" is (edit: literally) "reversible compression." The Japanese term for "lossy compression" is "irreversible compression."

If it's reversible, it's lossless. If it's lossless, it's reversible.
If it's irreversible, it's lossy. If it's lossy, it's irreversible.

Peak normalization is irreversible; therefore, it is lossy.
On the other hand, mp3gain is reversible; therefore it is lossless.

--- --- ---

I don't think I've ever heard people say "lossless" to mean "perceptually virtually lossless."

is WAV normalization lossless?

Reply #18
Quote
I did not get my request fulfilled for a definition of 'lossy' that includes transformations such as normalization. Its more common usage refers to some process that actually reduces resolution rather than one that simply has small errors.

The term "Lossy" of course just reffers to the fact that a loss will occure, no matter how small or how imperceptible it is... You cannot say that just because the loss is neglible then the process sudently isn't lossy anymore...
Quote
Wav normalization is certainly not lossless, therefore it must be lossy.

Absolutely

Edit: Beaten by kjoonlee

is WAV normalization lossless?

Reply #19
Quote
Quote
My goal is to burn a standard Audio-CD with CDDA (WAV) tracks of mixed music/artists, but each track has its own volume level. Burning the lossless WAV of course does not alter the data, but the volume levels for each track are all different. Is there any special process one could apply to level the tracks equally?
You could:

1) encode the WAV files with a lossless audio codec (FLAC, WavPack, etc.)
2) use Foobar to apply ReplayGain
3) use Foobar or Burrrn to write the CD with the ReplayGain values

That should do the trick.


If you look at what Burrrn is actually doing, you will find that steps 1 and 2 are completely unnecessary. Burrrn does not use the Foobar ReplayGain values; it takes the wav files (after decoding if necessary) and applies WaveGain to them.

is WAV normalization lossless?

Reply #20
Maybe I overstated reality when I said that everyone knows what is actually happening. This process of "write the CD with the ReplayGain values" first normalizes the file, then writes it to the CD.

We have two aspects of the difficulty here. One is that some of you have decided to apply the term 'lossy' to quantization error. I don't dispute that there is an irreversible change, but I suspect you can't find your definition of lossy in use anywhere else in the world. Normalization is definitely not compression, of any sort. Lossy just isn't the correct word in that context. Show me that I'm wrong and I'll acknowledge it. OK, I'll even acknowledge it here: it is the slang usage this group has adopted as its own in-group speak; I should recognize the sloppy terminology in the future without comment.

The other aspect is that my original post was simply an objection to that technically incorrect use of the term (lossy), because I saw it giving the wrong impression to the person asking the question. I sort of assumed Bourne understood the lossy concept and was just unaware of its involvement, or non-involvement, in the operation he asked about. That he understands the term may be assuming too much, I don't know.

Surely the only useful reason for wanting to know about the results when making a compilation CD, which is just something to listen to, is to know whether or not it might degrade the audible quality. With any lossy compression there is always that possibility, but with normalization there isn't, therefore the difference has some meaning.
Quote
I don't think that can be reproduced. (edit: If it can, it's not a fair comparison.)
Wav normalization is certainly not lossless, therefore it must be lossy.

Bananas have a yellow skin. That sour, oval-shaped fruit has a yellow skin, therefore it must be a sour banana? In the usual use of the term lossy, your application to compression is correct; your application to normalization is not. However, I don't think that horse can even twitch a muscle any more, so I think I'll quit beating it.

I don't know what you mean either by reproduced or fair. If you think I can't do it again and get the same result, or that someone else can't do it and get the same result, you are wrong.

For some kinds of situations, the mix-paste inverted is very good for presenting a picture of what has, or has not, happened. It might even give you some idea about the differences you hear (or don't hear).

When used to compare an original file and its perceptually encoded, then decoded, version, it is a much less useful tool. It really isn't any kind of a guide at all to differences in the sound of playback, which is what one is usually interested in. It does give some insight into how much the file is physically changed by the process but not to the relevance of the changes.

Again out of curiosity, I did the same test but converted the file to 32 bit before doing the amplification and de-amplification, then converted back to 16 bit to compare with the original. This time the Average RMS measurement on the result was -117 dB, rather well below perception I suspect. Common advice is to do all transforms in 32 bit, but when the straight operation gives -95 dB, is there any reason to strive for -117 dB? It depends on how many operations you are going to do.

I also tried the experiment of simply converting to 32 bit and then back to 16 bit, then did a mix-paste invert with the original. All measurements were either minus infinity or zero, as appropriate. There might be rounding errors, but they are clearly well below 16-bit resolution. However, if changing the bit depth simply changes the data type of the number, then I think that probably means there is actually no change that could possibly affect the final result, i.e. no errors. Only applied transforms could make any difference.
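
A rough sketch of why the higher-precision intermediate helps: quantizing back to 16-bit integers after every step accumulates more error than carrying full precision through a chain of gain changes and quantizing once at the end. The data and gain values below are arbitrary examples, not taken from the tests above; numpy is assumed:

Code
import numpy as np

rng = np.random.default_rng(1)
original = rng.integers(-16000, 16000, size=1_000_000).astype(np.float64)
gains = [1.93, 0.518, 1.1, 0.95]              # an arbitrary short chain of level changes

ideal = original.copy()                        # full-precision reference
stepwise = original.copy()                     # "16-bit workflow"
for g in gains:
    ideal = ideal * g
    stepwise = np.rint(stepwise * g)           # round to integer sample values each time

def rms_dbfs(err):
    return 20 * np.log10(np.sqrt(np.mean(err ** 2)) / 32768)

print("quantize after every step:", round(rms_dbfs(stepwise - ideal), 1), "dBFS")
print("quantize once at the end: ", round(rms_dbfs(np.rint(ideal) - ideal), 1), "dBFS")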

is WAV normalization lossless?

Reply #21
You're using straw man tactics to refute what I haven't said.

Also, you're still not doing a fair comparison.

1. Pick an MP3 file. Decode it to .wav.
2. Decrease the volume of the original MP3 file
3. Increase the volume of the MP3 file from step 2. Decode it to .wav
4. Decrease the volume of the .wav file from step 1.
5. Increase the volume of the .wav file from step 4.

Files from step 1 and step 5 will be different, so the wavgain process can be proved to be lossy.

Files from step 1 and step 3 will be the same, if nothing goes wrong. In such cases, MP3Gain would have been lossless.

is WAV normalization lossless?

Reply #22
A fair comparison of what? I did a comparison of the amount of change due to errors (normalization) vs the amount of change due to a lossy process (least lossy encode to mp3). The point of it is that the amount of change due to normalization, which is not technically a lossy process, does not compare to that from even the least mp3 compression, which is a lossy process.

Technically, a lossy process is one which deliberately discards data (data that is 'expendable' by the criteria of that particular process). Normalization definitely does not do that. Lossless is simply used as a contrast for a process, similar to the lossy one, that does not discard data. Compression is the main data manipulation to which the terms correctly apply. That a lossy process isn't exactly reversible is simply a measurable characteristic, the result rather than the cause.

Someone borrows money from you. You, being a sharp operator, get an agreement for 3.25% interest compounded every day. After a number of weeks the person returns with the principal and some extra to pay your interest charges. You make the calculation and exact your due, down to the last penny (the smallest measure of physical money in the U.S., 0.01 dollar, in case you are unaware of U.S. currency terms).

But wait, the calculation that gave you the amount to charge wasn't exact to the penny. One bank I did business with explained that to duplicate their results I had to carry out all my calculations to 12 decimal places. Whatever number of decimals you choose, rounding will be involved. It is extremely unlikely your final result will come out to the even cent. You will round up or down in order to know exactly what to charge. Do you call that rounding lossy? If so, I say you are really stretching the language, making up your own definitions as you go.

Your suggested test is between destructive and non-destructive processes. As applied to audio editing and similar activities, a destructive process alters the source data, a non-destructive one alters the data as it is being used in real time, without changing the source. What's to argue about there? What's to test? Whether or not some particular process is destructive or non-destructive? Who raised that question?

Perhaps it was raised in the suggestion to use ReplayGain (a non-destructive process?) within a CD writing application. ReplayGain may be non-destructive, not altering the original source, but its effect on the data that is written to CD will of necessity be destructive (though not lossy, since no data is discarded); the file written to the CD will contain altered data. This is normalization by another route. I don't, however, see how that relates to the tests I ran. I also don't know how many ways I can say that my posts had nothing whatsoever to do with any claims that data is not altered during normalization. I never even slightly suggested that normalization is non-destructive, but of course it can be.

You can't write files to an audio CD that are both normalized and unchanged from pre-normalization, but you can do normalization non-destructively. Normalization is another word for amplification. It can be positive (increase) or negative (decrease). Normalization, as it is most commonly used in digital audio, is just a special case of amplification where the amount is (usually) automatically calculated by a program and applied across whatever audio is selected.

When levels are adjusted in a multi-track editor, perhaps also in some simpler editors, the process is non-destructive on the sources. It is only applied destructively on a mix-down file. During editing, all processes, such as the level adjustments, are applied in real time as the data is played. This is exactly what is accomplished in any non-destructive process, such as the MP3Gain you suggested for testing. There isn't any difference.

is WAV normalization lossless?

Reply #23
Quote
Normalization is definitely not compression, of any sort. Lossy just isn't the correct word in that context.

Technically, every process that involves a loss, whether from rounding errors or anything else, is simply a lossy process, and hence WAVE normalization is definitely lossy (as it isn't lossless). The fact that lossy compression involves a bigger loss than WAVE normalization does is entirely beside the point, as we are not talking about lossy compression here, but simply a lossy process in general...

is WAV normalization lossless?

Reply #24
This is truly broken-record country. That has been said a dozen times in this thread, but "technically" the declaration is not correct. In this group's mindset it seems to be, but try to find a genuine, generally accepted definition that says any such thing. Lossy refers to the process, primarily compression. It is lossy if it deliberately discards data. One consequence of lossy compression, one particular detail, is that it isn't completely reversible. The data that has been discarded, with deliberation and forethought, doesn't come back when the file is decompressed.

People in these parts have latched onto that one consequence of the process and declared it to be not only the defining cause, but have then applied it to anything they consider to have an even remotely similar consequence. Therefore, by that view, if I open a file in an editor, then save it, it is lossless. If I open it and change the value of one sample by increasing or decreasing it a tiny amount, then save it, it is lossy. If I add reverb, then save it, it is lossy. Certainly those are destructive changes, but aside from the fact that they aren't compression, which may be the only place the word is "technically" used, no data has been discarded, which is the defining characteristic of a lossy process.

It seems to me much like the use of "RIP" around here. This started out as a slang term for DAE. Asking why people want slang is pointless, but at least in most other places I've read anything, people keep their heads straight as to what they are talking about. I can only imagine the evolutionary process that has gone on here.

Someone who has no idea what goes on uses the word in an incorrect way. Instead of educating him/her, the majority sloppily incorporate that new view into their way of talking about whatever is the immediate topic. Eventually, case by case, it reaches the current exceedingly vague state of being used for: just what? Virtually any kind of data movement, it seems to me, making no distinction between processes that are really quite different from each other. It is almost a 1984 newspeak kind of thing. Reduce the vocabulary so people can't think clearly about differences and it becomes easier to channel their limited thoughts onto controlled paths.

I am not with the language police and I don't propose to pursue people who continue down that degenerate path, but I hardly accept it either. It is hard enough to think clearly without introducing more impediments. A perusal of this thread's contents, could anyone stomach making it, would show there has never been any serious dissension about what happens to the data (other than a few incorrect details by a few people).

It started with me pointing out that there is an order of magnitude difference in the consequences of two different processes, where lossy was used in a "technically" incorrect way. It developed into a debate as to whether or not I will adopt this special definition of a standard word. It's fun for a while, but isn't enough enough?