
Topic: lossyWAV 1.4.2 Development (was 1.5.0)

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #200
The main reason I still use lossyWAV is to process my lossless FLAC master music library for use on a portable device - firstly it reduces file size, and secondly FLAC is an "easy" CODEC to decode, incurring very little device CPU load while listening. As an example of this decode efficiency, here's a link to a rather old Rockbox analysis for a number of devices: https://www.rockbox.org/wiki/CodecPerformanceComparison . While device CPUs will have increased dramatically in performance since then, I'm doubtful that the relative CPU load of different CODECs will have changed so much that FLAC will use significantly more battery than other CODECs.

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #201
The main reason I still use lossyWAV is to process my lossless FLAC master music library for use on a portable device
However I must say lossyWAV extreme could easily be used for music collection storage. I think it is 100% transparent at something like half the original bitrate.
Nick.C, please correct me if I am wrong in my assumption. Thank you for this codec, man!

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #202
FLAC is an "easy" CODEC to decode, incurring very little device CPU load while listening. As an example of this decode efficiency, here's a link to a rather old Rockbox analysis for a number of devices: https://www.rockbox.org/wiki/CodecPerformanceComparison .
I'm doubtful that the relative CPU load of different CODECs will have changed so much that FLAC will use significantly more battery than other CODECs.
We alluded to that old test back in '18, when I kinda raised the question in the Musepack forum as to whether decoding speeds still meant anything to modern processors.
Saratoga then even gave more details on how he'd carried out his testing.
• The older, the lossier
• Listen to the music, not the media it's on.

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #203
However I must say lossyWAV extreme could easily be used for music collection storage. I think it is 100% transparent at something like half the original bitrate.
Nick.C, please correct me if I am wrong in my assumption. Thank you for this codec, man!
It could probably be, at one of the higher quality settings.

Thanks should really go to @2Bdecided, who posted here about an idea to make use of the wasted bits feature of the FLAC CODEC: dynamically reducing bits-per-sample through rounding. This adds some noise to the output, but the number of bits removed is chosen so that the expected additional noise will not be noticeable. His proof-of-concept code was in Matlab, which I rewrote (with the help of @halb27) in Delphi (Pascal).
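To illustrate the core of the idea, here is a minimal sketch (my illustration only - real lossyWAV derives the per-block bit count from a psychoacoustic analysis of the noise floor, whereas the fixed bits_to_remove parameter here is just a placeholder):

Code: [Select]
#include <cstdint>
#include <vector>

// Round each sample to a multiple of 2^bits_to_remove so its low bits
// become zero. FLAC detects those constant zero LSBs as "wasted bits"
// and codes the block at the reduced effective bit depth.
std::vector<int32_t> round_block(const std::vector<int32_t>& in, int bits_to_remove)
{
    std::vector<int32_t> out;
    out.reserve(in.size());
    const int32_t step = 1 << bits_to_remove;
    for (int32_t s : in) {
        // Round to the nearest multiple of step; the rounding error is the
        // added noise, bounded by +/- step/2 per sample. A real encoder
        // would also clip the result to the sample range.
        int32_t q = ((s >= 0 ? s + step / 2 : s - step / 2) / step) * step;
        out.push_back(q);
    }
    return out;
}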

@tycho kicked off the translation from Delphi to C++.

Later development included an adaptive noise shaping method proposed by @SebastianG.
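A hedged sketch of the feedback principle underlying noise shaping, using a plain fixed first-order shaper - lossyWAV's adaptive method derives its filter per block from the signal, which this deliberately does not attempt:

Code: [Select]
#include <cmath>
#include <cstdint>
#include <vector>

// First-order error-feedback quantizer: each sample's rounding error is
// fed into the next sample before quantizing, pushing the added noise
// toward high frequencies (noise transfer function 1 - z^-1).
std::vector<int32_t> shape_and_round(const std::vector<int32_t>& in, int bits_to_remove)
{
    std::vector<int32_t> out;
    out.reserve(in.size());
    const int32_t step = 1 << bits_to_remove;
    double err = 0.0;  // quantization error carried from the previous sample
    for (int32_t s : in) {
        double d = double(s) + err;                               // inject feedback
        int32_t q = static_cast<int32_t>(std::llround(d / step)) * step;
        err = d - q;                                              // store new error
        out.push_back(q);
    }
    return out;
}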

Thanks therefore should go to all of those involved in the project, not just me.

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #204
@lovecraft (nice nickname BTW  :) )
Last but not least: you can encode your high bitrate Opus, MPC, AAC, or LossyWAV to another high bitrate future zUHD-AACv8, Ogg-Spaceship, TRALALAC. Okay, there will be another layer of loss, but it won't be audible at all (and you'll probably be half-deaf by then, which won't help).

:) You are absolutely right on the money. Unless some miracle happens, I probably won't even be able to distinguish 128 kbps MP3 in the very near future :) So I have decided to go with Opus at 192. I cranked the complexity to 10 and disabled the stereo phase inversion thingy. The resulting files are much smaller than the lossyWAV ones, and than my MP3 collection, but they sound crisp. I guess to get that crispness from an MP3, I'd have to go with 320.
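For reference, with opusenc from opus-tools those settings would look something like the line below - I'm going from memory on the flag names, so check opusenc --help; --no-phase-inv needs opus-tools 0.2 / libopus 1.3 or newer, and complexity 10 is already the default:

Code: [Select]
opusenc --bitrate 192 --comp 10 --no-phase-inv input.wav output.opus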

I accepted the fact that this is the cutting edge of audio compression. And it has been gaining support from all around; Google phones now have Opus as a Bluetooth codec, if I am not mistaken. So yeah, this thing will probably bury me, so it is future-proof enough :)

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #205
Not to belittle the ideas, the efforts and the results, but lossyWAV isn't set for world domination. It isn't bundling in any "will walk your dog while you listen to music" features; it does what it should - and that is a niche use.

If we aren't already at "niche" use when we confine attention to lossless audio in files on a user's hard drive (as opposed to on silver), we are quite close. And within that niche or near-niche:
Lossily rounding LPCM further is itself a niche use, even if someone found a clever way to utilize space savings along the bit-depth dimension. Most who encode to lossy choose a common lossy codec; those do their job and are well supported.
Hybrid lossless/lossy with a correction-file option is a niche use too - otherwise WavPack would have been a big thing in the lossless market, and it isn't.


Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #206
I think very high bitrate (320+) lossyWAV+FLAC would be a better option than 320 MP3/AAC, due to the fact that it transcodes lossless audio into another *lossless* codec (only the rounding step is lossy). The other advantage is the compatibility of FLAC, second only to MP3/AAC. Of course it would exceed MP3 quality at some level too.
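For anyone wanting to try it, the usual two-step pipeline looks roughly like the lines below - preset names and flags vary between lossyWAV versions, so treat this as an assumption and check lossyWAV --help. The -b 512 blocksize is what lets FLAC pick up the zeroed bits block by block:

Code: [Select]
lossyWAV input.wav --quality extreme
flac -8 -b 512 input.lossy.wav -o output.lossy.flac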

The other one to look into is FSLAC - even easier to use, as it is a single binary that produces FLAC files directly from lossless input.

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #207
If there was a single binary like FSLAC that did lossyWAV+FLAC, it may have taken off more than it did.

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #208
And some HALAC 0.3.8 results... The difference here becomes even more evident. Default lossyWAV settings were used in the conversion.

Code: [Select]
Intel i7 3770k, 16 GB, 240 GB
(sizes in bytes; the two numbers after each size are times in seconds)
Gubbology (671,670,372)
HALAC 0.3.8 Normal 239,329,295 3.422 4.390
HALAC 0.3.8 Fast   246,522,130 2.734 3.953
HALAC 0.3.7 Normal 261,615,892 3.406 4.531
HALAC 0.3.8 UFast  282,920,505 2.453 2.750

Globular (802,063,984)
HALAC 0.3.8 Normal 271,098,020 4.125 5.234
HALAC 0.3.8 Fast   278,214,738 3.359 4.750
HALAC 0.3.7 Normal 282,472,800 4.219 5.172
HALAC 0.3.8 UFast  312,643,849 2.953 3.234

SQUEEZE CHART (606,679,108)
HALAC 0.3.8 Normal 200,481,958 3.375 4.140
HALAC 0.3.8 Fast   204,047,554 2.781 3.812
HALAC 0.3.7 Normal 209,863,558 3.359 4.125
HALAC 0.3.8 UFast  223,975,665 2.437 2.672

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #209
If there was a single binary like FSLAC that did lossyWAV+FLAC, it may have taken off more than it did.
AFAIR (and sorry if this has been mentioned in this thread before - I haven't read most of it), the original idea was precisely to have such a FLAC-specific single encoder binary. But others asked for decoupling from FLAC, to use the idea with other wasted-bits-capable lossless codecs.
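As a side note on what "wasted-bits capable" means: a minimal sketch (my illustration, not code from any actual encoder) of the detection a FLAC-style encoder performs - it counts the trailing zero bits shared by every sample in a block, shifts them out before prediction, and stores the count once in the subframe header:

Code: [Select]
#include <cstdint>
#include <vector>

// Number of low-order zero bits common to all samples in the block.
// lossyWAV's rounding guarantees this is at least its chosen bit count.
int count_wasted_bits(const std::vector<int32_t>& block)
{
    int32_t acc = 0;
    for (int32_t s : block) acc |= s;   // OR merges every set bit
    if (acc == 0) return 0;             // digital silence is handled specially
    int wasted = 0;
    while ((acc & 1) == 0) { acc >>= 1; ++wasted; }
    return wasted;
}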

Chris
If I don't reply to your reply, it means I agree with you.

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #210
And some HALAC 0.3.8 results... The difference here becomes even more evident. Default lossyWAV settings were used in the conversion.
Most lossless codecs cannot compress highly compressible audio data, such as lossyWAV output, as fast and as well as HALAC does.
I expect a very fast, high-quality lossy audio codec from you again this time. Lossy HALAC!

 

Re: lossyWAV 1.4.2 Development (was 1.5.0)

Reply #211
Most lossless codecs cannot compress highly compressible audio data, such as lossyWAV output, as fast and as well as HALAC does.
I expect a very fast, high-quality lossy audio codec from you again this time. Lossy HALAC!
I don't know if a new lossy codec is needed. (I have even started to think that there is no need for a new lossless codec.) The subject of loss is open to too many interpretations.
However, a high-quality or near-lossless fast version could of course be developed. But I don't know if it's worth it.