Topic: lossyWAV 1.4.2 Development (was 1.5.0) (Read 123005 times)

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #150
@Nick.C What's our next step here? What should be improved here, LossyWAV or Vorbis?
• Join our efforts to make Helix MP3 encoder great again
• Opus complexity & qAAC dependence on Apple is an aberration from Vorbis & Musepack breakthroughs
• Let's pray that D. Bryant improves WavPack hybrid, C. Helmrich updates FSLAC, and M. van Beurden teaches FLAC to handle non-audio data

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #151
@Nick.C What's our next step here? What should be improved here, LossyWAV or Vorbis?
In my opinion Vorbis should be fixed to properly handle chunks of odd length, in line with the Microsoft/IBM RIFF specification, based on the EA/Commodore IFF specification that preceded it.

A quick and nasty solution would be for lossyWAV to increase the length and stated size of the FACT chunk, if required, to ensure that it is always even; however, this would perpetuate support for applications that don't handle WAV files correctly.
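The RIFF rule at issue can be sketched as follows (a minimal illustration; `padded_size` is a hypothetical helper, not lossyWAV code):

```python
def padded_size(chunk_size: int) -> int:
    """RIFF/IFF alignment rule: a chunk whose data size is odd is
    followed by one pad byte, so the space it occupies on disk is
    rounded up to even.  The pad byte is NOT counted in the size
    field written in the chunk header."""
    return chunk_size + (chunk_size & 1)

# An odd-sized chunk occupies one extra byte on disk:
assert padded_size(13) == 14
assert padded_size(12) == 12
```

A reader that skips chunks by their stated size, without stepping over the pad byte after an odd-sized chunk, will misparse every chunk that follows - which is the failure mode being discussed here.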

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #152
Please find attached (superseded) a new beta release of lossyWAV.

lossyWAV beta 1.4.3c, 02/05/2024 (expires 31/12/2024)
  • Major bug identified after @guruboolez discovered that lossyWAV would not successfully convert large files in foobar2000, i.e. those which exceed 4GiB of uncompressed data in length, using the --ignore-chunk-sizes option. Many thanks to @Case for answering my questions on foobar2000, which made identifying the bug much easier. The calculation of padding bytes to write after each chunk incorrectly assumed that the number of bytes of processed WAV file would not exceed 4GiB, which caused the program to fail when attempting to write exabytes of padding...
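A hypothetical reconstruction (not lossyWAV's actual code) of this class of bug: if a byte count is truncated to 32 bits, a position past 4 GiB wraps around, and a padding amount derived from "target minus bytes written" becomes astronomically large when reinterpreted as unsigned.

```python
GIB = 1 << 30

def padding_buggy(target_size: int, bytes_written: int) -> int:
    """Illustrative only: the target is computed in 32 bits, so past
    4 GiB it wraps to a tiny value; the difference, taken modulo
    2**64, comes out as an absurdly large padding amount."""
    target32 = target_size & 0xFFFFFFFF        # unintended 32-bit wrap
    return (target32 - bytes_written) % (1 << 64)

def padding_fixed(target_size: int, bytes_written: int) -> int:
    """Full-width arithmetic gives the sane answer."""
    return target_size - bytes_written

pad = padding_buggy(4 * GIB + 2, 4 * GIB + 1)
assert pad > (1 << 60)                          # "exabytes of padding"
assert padding_fixed(4 * GIB + 2, 4 * GIB + 1) == 1
```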

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #153
Complete n00b question that I didn't find much answer to. I see no -m in the signature immediately above, at least.

Reducing bit depth not by zeroing out both stereo channels, but by averaging bits - so that they are zeroed in a side/difference channel; is that -m?
Which might not be exploited in a way that gives any bang for the buck really?

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #154
Complete n00b question that I didn't find much answer to. I see no -m in the signature immediately above, at least.

Reducing bit depth not by zeroing out both stereo channels, but by averaging bits - so that they are zeroed in a side/difference channel; is that -m?
Which might not be exploited in a way that gives any bang for the buck really?
The -m, --midside parameter only works with stereo content. It determines bits to remove through analysis of mid and side channel data; the calculated bits-to-remove value is then removed from each of the stereo channels in the WAV data. This means that the bits-to-remove value is the same for each channel (which is not normally the case), so the overall bits removed for the processed data will likely be lower.
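The behaviour described can be sketched roughly like this (illustrative only: the analysis step that chooses bits-to-remove is stood in by a constant here, whereas lossyWAV's real decision uses spectral analysis, dither and clipping checks):

```python
def remove_bits(sample: int, bits: int) -> int:
    """Round a PCM sample to a multiple of 2**bits, so its low bits
    become zero - this is what creates 'wasted bits' for the codec."""
    step = 1 << bits
    return (sample + step // 2) // step * step

left  = [1000, -1003, 502, 7]
right = [ 998, -1001, 498, 9]

# Mid/side views used only for the analysis step:
mid  = [(l + r) // 2 for l, r in zip(left, right)]
side = [l - r for l, r in zip(left, right)]

bits = 2   # stand-in: assume the mid/side analysis allowed 2 bits

# The shared value is applied identically to BOTH stereo channels:
new_left  = [remove_bits(s, bits) for s in left]
new_right = [remove_bits(s, bits) for s in right]
assert all(s % (1 << bits) == 0 for s in new_left + new_right)
```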

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #155
the calculated bits to remove are then removed from each of the stereo channels in the WAV data.
From each ...
So there is then this in-principle-possible way to save "up to half a bit" extra at likely-small fidelity penalty: If we decimate each channel down to N bits, then design the dither so that Nth bit is common to both channels - then reducing the "side" channel by one bit, except for frames where FLAC uses dual-mono encoding?
Or does it already?

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #156
The bit removal process is carried out on each channel separately, using the calculated bits-to-remove value for each channel, and may remove fewer bits than desired (depending on whether too many new clips are encountered or whether any of the feedback [if selected] breaches limits) - shown in the output as "bits lost". Note that if --midside or --linkchannels has been selected, the bit removal process is repeated for any channel where the actual bits removed is higher than the minimum for that codec block.

This would be further complicated if adaptive noise shaping was in use (which it is by default) as the filters are channel specific.

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #157
I ran a few experiments with lossyWAV (default settings). I hope the results are not deleted again. Maybe I can add this mode to HALAC 0.2.9, but I still think it's early. The results are interesting: WMA Lossless did a really good job, and all the codecs on the list perform at a high level, so I didn't take processing speeds into account - they can already be predicted.

I made a small change to HALAC: I disabled the predictor, leaving only a simple filter, which is why it was faster than the "-fast" mode. I did not treat 16-bit data as 8+8, because that would have required extra handling. The following results were obtained with the Order-0 entropy coder; with Order-1, a slight loss of speed could probably yield better results.

But as I said before, lossyWAV runs quite slowly (at least for me). As the converter, I used "fre:ac".


Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #158
I made a few experiments with LossyWAV (I used default setting). I hope the results are not deleted again.
As this thread is not dedicated to a lossless codec other than your own, and lossyWAV output is designed to be stored in a lossless codec (one that makes use of the "wasted bits" feature, as FLAC does), there's little likelihood of your results being deleted, IMO.

Unexpected that WMA often beats FLAC.

Also please confirm that, where possible, the lossless codec's block size has been set to the appropriate size for lossyWAV output, e.g. 512 for 44.1/48 kHz, 1,024 for 88.2/96 kHz, etc.
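The rule of thumb above (codec block size matching lossyWAV's analysis block, doubling with the sample rate) can be expressed as a small helper (hypothetical, for illustration):

```python
def lossywav_block_size(sample_rate: int) -> int:
    """Codec block size matching lossyWAV output, per the guidance
    above: 512 samples for 44.1/48 kHz, doubling with each
    doubling of the sample rate."""
    size = 512
    while sample_rate > 48000:
        sample_rate //= 2
        size *= 2
    return size

assert lossywav_block_size(44100) == 512
assert lossywav_block_size(96000) == 1024
assert lossywav_block_size(192000) == 2048
```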

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #159
This is what I use for WMA. It works losslessly.
I used lossyWAV in the default way (lossyWAV.exe filename.wav).
If you mean the lossless codec HALAC, I have not made any block-size adaptation; I don't know whether that matters for the results. However, the block size I tested contains 4,096 samples regardless of sample rate.

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #160
@Hakan Abbas

Nick meant passing -b 512 to flac encoder, which you didn't:

Code:
13,749,585 '01 Riot (Feat. Damian Jr. Gong Marley).lossy.flac'
11,713,235 '01 Riot (Feat. Damian Jr. Gong Marley).lossy.b512.flac'
15,951,686 '02 Entertainment 2.0 (Feat. Juicy J, 2 Chainz & Nicki Minaj).lossy.flac'
13,103,138 '02 Entertainment 2.0 (Feat. Juicy J, 2 Chainz & Nicki Minaj).lossy.b512.flac'

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #161
Block size 512 can be set with

flac -b 512
wavpack --blocksize=512
Takc.exe -fsl512

Not much use trying ALAC as it doesn't support wasted bits, but if you want to experiment with block size: CUETools.ALACEnc.exe -b 512 (remember the whitespace). Also Monkey's doesn't support wasted bits.
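The "wasted bits" mechanism referred to above is detectable with a simple OR over a block: if every sample in a block shares b low-order zero bits, a codec such as FLAC, WavPack or TAK can code that block b bits narrower - which is exactly what lossyWAV's bit removal produces. A minimal detector (illustrative, not any codec's actual code):

```python
def wasted_bits(block: list[int]) -> int:
    """Number of low-order bits that are zero in EVERY sample of the
    block - the width reduction a wasted-bits-aware codec can apply."""
    acc = 0
    for s in block:
        acc |= s            # OR preserves any bit set in any sample
    if acc == 0:
        return 0            # digital silence: nothing meaningful to count
    count = 0
    while acc & 1 == 0:
        acc >>= 1
        count += 1
    return count

assert wasted_bits([8, 16, -24]) == 3   # all multiples of 2**3
assert wasted_bits([8, 16, 5]) == 0     # one odd sample spoils it
```

ALAC and Monkey's Audio have no such field in their bitstreams, which is why lossyWAV processing buys them comparatively little.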

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #162
Thanks both, that's exactly what I was getting at.

@2012's results look much more like it.

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #163
I had never considered letting users play with block sizes to compress according to the content, because I always try to make things as adaptive as possible and find a middle ground based on the context. Otherwise it amounts to generating dozens of different results from a few parameters offered for testing and picking the best one; normal users aren't interested in that in everyday use, and it's also time-consuming and very situation-dependent. Of course, for lossyWAV this may be necessary by nature.

Below is the case where I simply halved HALAC's block size (from 4096 to 2048). There is an improvement in the results. As I said, in this form no prediction is even being applied yet, and it is unnecessarily fast. The block size could be reduced further, but then the amount of data available to the entropy coder shrinks, which hurts compression. It will be more efficient to group blocks before coding, and it will also be much more accurate to process the data as 8+8 bits. The only problem for me right now is the processing speed of lossyWAV.

Sean Paul (Block Size: 4k -> 2k)
Code:
01 - Riot : 15,802,208 -> 15,555,518
02 - Entertainment : 18,723,000 -> 18,365,454
03 - Want Dem All : 15,340,527 -> 15,120,997
04 - Hey Baby : 14,494,502 -> 14,403,168

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #164
I have never considered the idea of compressing according to the content by giving the right to play with block sizes.
Of course, for LossyWav, this may be necessary by nature.
To achieve the best reduction, i.e. where bits to remove changes within a larger block size, it is indeed necessary.
The only problem for me right now is the processing speed of LossyWav.
While it is readily acknowledged that lossyWAV is not particularly fast, it is what it is - and as each file need only be processed once, it's not the worst downside, IMO.

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #165
N00b question (again!)
1.4.3c  and flac.exe 1.4.3, is the latter supposed to return
WARNING: RIFF chunk size of file file.lossy.wav does not agree with filesize
file.lossy.wav: ERROR reading foreign metadata: invalid WAVE file: unexpected EOF (010)

?
Edit: OptimFROG also rejects it. WavPack and TAK are happy with it.



Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #166
The "ERROR reading foreign metadata" message probably relates to the addition of a "fact" chunk to the processed file, which contains information relating to how it was processed. You could try adding the "--keep-foreign-metadata" parameter to the FLAC command line to see if that improves the situation.

The file size warning is unexpected. Can you please process the original WAV file with lossyWAV 1.4.2 and see if that works? That would indicate to me whether the error is new (since I re-started development) or very old.

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #167
1.4.2 works. Problem shows up in FLAC only when using --keep-foreign-metadata
I wonder, is it so simple that the "c" in "1.4.3c" makes for wrong length of the FACT chunk?

Both report wrong length in foobar2000:
233031632wk 5d 14:12:05.632 (6 215 345 139 887 599 616 samples)

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #168
Thanks very much for taking the time to test, much appreciated. It reads like the fix I put in place for calculation of chunk padding didn't actually work.


Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #170
Please find attached (superseded) a new beta release of lossyWAV.

lossyWAV beta 1.4.3d, 22/05/2024 (expires 31/12/2024)
  • Major bug identified (in Beta 1.4.3c) after @Porcus pointed out that attempts to encode processed WAV files with FLAC would result in an error. The size of the 'fact' chunk added to the WAV file during processing (if piped output is not used) was not added to the size in the RIFF header.
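The invariant the fix restores can be sketched as follows (hypothetical layout; a real WAV writer must also account for LIST chunks, etc.):

```python
def riff_header_size(chunk_payload_sizes: list[int]) -> int:
    """Size stored in the RIFF header: the 4-byte 'WAVE' form type
    plus, for every chunk, its 8-byte header, its payload, and a pad
    byte if the payload size is odd - INCLUDING any 'fact' chunk
    added during processing (the bug was omitting it)."""
    total = 4                               # the 'WAVE' form type
    for size in chunk_payload_sizes:
        total += 8 + size + (size & 1)      # header + payload + pad
    return total

# fmt (16 bytes) + added odd-sized fact chunk (13 -> padded) + data (1000):
assert riff_header_size([16, 13, 1000]) == 4 + 24 + 22 + 1008
```

If the fact chunk's contribution is left out of this sum, the stated RIFF size disagrees with the file size - exactly the warning flac printed above.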

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #171
Please find attached (superseded) a new beta release of lossyWAV.

lossyWAV beta 1.4.3e, 23/05/2024 (expires 31/12/2024)
  • Bug identified (in Beta 1.4.3d) relating to merging of lossy and correction files.


edit: Attachment deleted by request of poster

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #172
A little bit of testing, although the algorithm itself isn't changing. But over the years, some codecs have been upgraded.

What I did:
Took 219 minutes of audio (the first ten and a half minutes of each of 7+7+7 albums from my signature), processed it with lossyWAV -C, and compressed the original, the lossy file and the correction file at various settings. In particular I tested block size 1024, just to see whether the penalty is severe.
Re-did FLAC with 1.4.3 beta "d".
Included WavPack, even though it has its own hybrid mode, which is easier to manage.


CDDA, all numbers in kbit/s or differences thereof.
* "lossless, no block size set": because if you don't want to use lossyWAV, you don't want to fiddle with that switch.
* "corr., def." is the correction file, compressed without imposing a block size. That gives slightly better results; it is commented on in the table.
* "cost of two files": the difference between the sum of the lossy (with block size set) and correction (without) - and the lossless (without).
                         lossless,     cost of
                         no b.s. set   two files   lossy   corr., def.
wav                         1411          1411      1411      1411
WMAL                         778           196       487       487
WavPack, 512 - with --merge-blocks. Virtually no penalty for double block size; 4 to 7 kbit/s penalty for storing the correction file with the same setting.
  fast                       779           128       473       434
  default                    760           122       458       424
  -hx                        748           123       455       416
  -hhx4                      743           115       447       411    (multi-threaded -hhx6 shaves 1 off the lossy)
FLAC, 512
  -0b512                     841            84       486       439    (double block size penalty: 12)
  -2eb512                    799            97       459       437    (double block size penalty: 11)
  -5b512                     756            98       432       423    (double block size penalty: 5; 8 kbit/s penalty for the correction file at the same setting)
  -8pb512                    751            99       429       421    (double block size penalty: 6; 8 kbit/s penalty for the correction file at the same setting)
TAK, 512
  -p0 -fsl512                756            77       427       406    (double block size penalty: 8; 12 kbit/s penalty for the correction file at the same setting)
  -p2 -fsl512                734            86       418       402    (double block size penalty: 4; 16 kbit/s penalty for the correction file at the same setting)
  -p4m -fsl512               721            93       414       400    (double block size penalty: 4; 17 kbit/s penalty for the correction file at the same setting)
ALS
  -l -n512                   751            75       411       415    (double block size penalty: 13; 4 kbit/s penalty for the correction file at the same setting)
  -7 -p -l -n512             719            87       407       399    (better use double block size! 8 kbit/s penalty for the correction file at the same setting)
  -7 -p -l -n1024 (!)        719             -       399         -
Going out on a limb: given what one already knew about TAK, it should be no surprise that it could be the codec of choice for lossyWAV:
* It compresses impressively.
* TAK's downsides are less applicable to lossyWAV users: they know what they are doing, so they could likely handle TAK too - and find a player that supports it.
* TAK natively supports RIFF chunks, so there is no need to remember the foreign-metadata switch as in FLAC.
* ... so does WavPack, but WavPack users would likely rather use the format's own hybrid mode.

A couple of remarks on other codecs:
* Interesting, but hardly anything to recommend given the speed: the ALS -7 -p setting does better with block size 1024. I guess it does some optimization under the hood, so you do get something out of waiting that ridiculous amount of time.
* FLAC round-offs could be affected by inconsistent padding, since I ran the lossy+correction parts afterwards, having upgraded to beta 1.4.3d.
* From this one could doubt that WMAL is "lossyWAV compatible" - it is; it is just a bad performer overall. The next table shows that its two-file penalty is less than OptimFROG's.

For reference, the following uses lossyWAV as one should absolutely not use it:
* Included codecs that don't use wasted bits;
* No block size options applied - except that I did apply -z3 -7 -l -p -n512 for the ALS RLSMS mode, which apparently does not honour it (and shouldn't be used for anything anyway).
                            lossless   cost of two files   lossy
wav                            1411          1411           1411
cuetools ALAC                   761           430            772
Monkey's extra high             721           446            739
ALS RLSMS with everything       713           438            733
OptimFROG default               724           264            600
WMAL                            778           196            487
WavPack default                 760           199            535
FLAC default                    756           143            476
TAK default                     734           139            472
ALS default                     751           108            444
ALS -7 -p -l                    719            86            406
Here all three columns use the same setting, so "cost of two files" is also an apples-to-apples difference.
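For clarity, the "cost of two files" figure used in both tables is simply the following (illustrative helper, values taken from the tables above):

```python
def two_file_cost(lossy: int, correction: int, lossless: int) -> int:
    """kbit/s overhead of keeping a lossy file plus its correction
    file instead of a single lossless file."""
    return lossy + correction - lossless

# e.g. the WMAL row of the first table: 487 + 487 - 778 = 196
assert two_file_cost(487, 487, 778) == 196
```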

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #173
Thanks very much for taking the time to perform such a comprehensive test of how lossless codecs handle lossyWAV-processed audio. Much appreciated.

For me another consideration is the "cost", in terms of CPU load / battery life of portable devices, of playing the audio. In that respect FLAC seems to do well - not sure about the others.

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #174
I tested a couple of Android phones: https://hydrogenaud.io/index.php/topic,124857.0.html
FLAC then has the advantage of being natively supported by the OS, and that apparently did matter.

Edit: I see you remarked that LossyFLAC did well on Rockbox.