Topic: Lossless Compression

Lossless Compression

Reply #50
Quote
Originally posted by Jon Ingram
Then I can pay someone to write a decoder.


I'm poor

In this respect, I'm very practical. It's music we are archiving, not the plans for the atomic bomb you stole from the Russians. (Though I didn't know?)

So if, someday, somehow, the format I am using dies, there's no way for me to decode the tracks, and I have lost the original CD, I will shrug and say, "Well, I lost my tracks. Will my life be ruined from now on because of this? Hell no, it's just music!" Then I'll forget about it. (You know, no use crying over spilled milk...)

@Speek: I liked your comparison a lot, but there are two things I would like to suggest. (You know how I like to make suggestions, don't you?)

1. Add the sums of output sizes and compression times (like a "conclusion" table).

2. I would really like to see RKAU in your comparison, to know whether it's still better than APE. (Whittle's page compares an outdated version of Monkey's Audio.)

Regards;

Roberto.

Note: just for the record, I think StarOffice reads WordStar formats.

Lossless Compression

Reply #51
Quote
Originally posted by Rant
So my question is: why the different compression settings that were mentioned? If it's lossless, it's all going to sound the same no matter whether you use regular, high, or very high compression, right? So why would it be recommended that I not encode to Monkey's Audio using very high compression to get the smallest file possible?
You are right, all lossless compression sounds the same. The trade-off is that more compression requires more processing power and is therefore slower. Also, some compression formats require more CPU power to play the more highly compressed files, which could bog down your computer for other things.

For example, a lot of people use WavPack just to compress audio files on their hard drive temporarily (like a few hours or days) just to save some space until they need them again. For this, WavPack's 2x-3x speed advantage over everyone else is handy and the couple percent of compression you lose is irrelevant (as is the closed-format issue!)
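
To make that trade-off concrete, here is a small Python sketch. It uses zlib rather than an audio codec (purely as an illustration of the general principle, not of how WavPack or Monkey's Audio work internally), and the input file name is a placeholder, but the pattern is the same: a higher level buys a smaller file at the cost of more CPU time, while a round trip always gives back identical data.

Code:
import time
import zlib

# Illustration only: zlib is a generic compressor, not an audio codec,
# but its levels show the same pattern as lossless audio modes:
# higher level -> smaller output, more CPU time, identical data back.
data = open("some_track.wav", "rb").read()  # hypothetical input file

for level in (1, 6, 9):
    start = time.perf_counter()
    packed = zlib.compress(data, level)
    elapsed = time.perf_counter() - start
    assert zlib.decompress(packed) == data  # lossless: bit-exact round trip
    print(f"level {level}: {len(packed):,} bytes in {elapsed:.2f} s")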

Quote
Originally posted by Speek
Another comparison between MAC, LPAC, FLAC and WavPack: http://home.wanadoo.nl/~w.speek/comparison.htm
Thanks for your comparison, Speek! It's nice to have an unbiased, up-to-date one out there. Perhaps, if you happen to have absolutely nothing else to do, you could include the new high mode from the latest WavPack beta.

Also, I would be happy to send you a classical CD, although I guess the test wouldn't be unbiased then!

Lossless Compression

Reply #52
Quote
Originally posted by bryant
You are right, all lossless compression sounds the same. The trade-off is that more compression requires more processing power and is therefore slower. Also, some compression formats require more CPU power to play the more highly compressed files, which could bog down your computer for other things.

For example, a lot of people use WavPack just to compress audio files on their hard drive temporarily (like a few hours or days) just to save some space until they need them again. For this, WavPack's 2x-3x speed advantage over everyone else is handy and the couple percent of compression you lose is irrelevant (as is the closed-format issue!)


Ahhhh! Well that explains that quite nicely thanks. 
To me, clowns aren't funny. In fact, they're kinda scary. I've wondered where this started and I think it goes back to the time I went to the circus and a clown killed my dad. -- Jack Handy

Lossless Compression

Reply #53
I updated the comparison with WavPack 3.94beta and Monkey's Audio 3.95a1. I also put the totals at the end. Roberto, I'm not going to do RKAU for two reasons: it's very slow and it uses a lot of CPU power when playing the files. I don't think it can compete with modern compressors. (I did compress the first CD - Dido - with RKAU, so you can get an idea of its compression.)

Lossless Compression

Reply #54
Thanks Speek, that's great!

Lossless Compression

Reply #55
Great Speek, thanks.

Quote
Originally posted by Speek
I'm not going to do RKAU for two reasons: it's very slow and it uses a lot of CPU power when playing the files.


That's OK. When you're an AAC freak, you can't really complain about compressor slowness.
And I have an AthlonXP 1600+ (just got it), so I do all my compressions overnight.

Regards;

Roberto.

Lossless Compression

Reply #56
Why no Shorten? It seems to be very popular.

Lossless Compression

Reply #57
Quote
Originally posted by Radboy
Why no Shorten? It seems to be very popular.


It's defunct. Compresses worse than Monkey and WavPack. Not worth the hassle.

Lossless Compression

Reply #58
Quote
Originally posted by rjamorim


It's defunct. Compresses worse than Monkey and WavPack. Not worth the hassle.


It would be interesting to see how .shn measures up; there are so many of them out there, and the taping community sure seems attached to it (look at etree.org). I guess there aren't a lot of people into that kind of music around here, but the fact that it is so widely used makes it relevant (to me, anyway). I can tell, by looking at the file sizes, that .ape and .flac compress more...

Isn't the .shn source available?

Lossless Compression

Reply #59
Quote
Originally posted by macdaddy

It would be interesting to see how .shn measures up; there are so many of them out there, and the taping community sure seems attached to it (look at etree.org). I guess there aren't a lot of people into that kind of music around here, but the fact that it is so widely used makes it relevant (to me, anyway).


You can get an idea of its performance on Robin Whittle's site. It seems to be superior only to WaveZip. (Not surprisingly, GadgetLabs, the vendors of WaveZip, are now out of business.)

Quote
Isn't the .shn source available?


It was available in the past, which means the specs are available, but not the sources for the latest versions.

Regards;

Roberto.

Lossless Compression

Reply #60
Quote
Originally posted by rjamorim


You can get an idea of its performance on Robin Whittle's site. It seems to be superior only to WaveZip. (Not surprisingly, GadgetLabs, the vendors of WaveZip, are now out of business.)

It was available in the past, which means the specs are available, but not the sources for the latest versions.

Regards;

Roberto.


Source is still available. The current version is 3.4, which has some bugfixes for stdin/stdout operation. Works fine.

Link can be found on my webpage.
--  Frank Klemm

Lossless Compression

Reply #61
Sorry about that. I didn't look for the source as well as I should have. I just took a look at Shorten's official page and, not finding it there, guessed it wasn't available anymore.

Regards,

Roberto.

Lossless Compression

Reply #62
Quote
Originally posted by Frank Klemm


This doesn't help you much if you can't compile the source on your operating system and can't find out why, or if it uses special tools to prepare for compiling. For a lot of software packages it is not unusual to have to spend several days until the project compiles and the programs work.


http://cvs.sourceforge.net/cgi-bin/viewcvs....viewcvs-markup

That covers how to build in a GNU environment using autoconf, with MSVC on Windows, with Project Builder on Mac OS X, or with the 'lite' make system when all of those fail. As open source projects go, I think that is pretty good.

Quote
Originally posted by Frank Klemm


You can have exactly the same problems. Maybe a guru will care about your program and maintain it after the original author has "died", but that may or may not happen.

Yeah, but your chances of that happening are at least > 0.

Quote
Originally posted by Frank Klemm


Another problem: if you do exactly what lines 9...15 of the online help (--help) say, you get corrupted files on Windows. This means the online help describes how to ruin files. Please correct this problem as soon as possible. This bug is fixed in LPAC (1.36), Shorten (3.4) and some other programs.


CVS has a fix that works for command.com but breaks in Cygwin (at least with my old b20.1 version). As soon as I can get a fix that works in both, or determine that it is a problem with old Cygwin that has since been fixed, I will do another release.

Quote
Originally posted by fewtch

Jeesh, if you're gonna make an argument in favor of open source codecs and against Windows-only stuff, at least make it a good one...


Sure, it's extreme.  But the worst doesn't have to happen:

http://www.monkeysaudio.com/cgi-bin/YaBB/Y...&num=1014986574

Look, I too have said before that Monkey has hands-down the best engine out there. WavPack has come a long way also. But you should be comfortable with the risk you are taking by archiving to a proprietary format. Maybe someone has to be burned once to take that seriously.

Lossless Compression

Reply #63
Quote
Sure, it's extreme. But the worst doesn't have to happen:
http://www.monkeysaudio.com/cgi-bin/YaBB/Y...&num=1014986574
Now that's just a bit unfair. The link to the Monkey's Audio forum that you posted is for a discussion about crashes occurring in version 3.95 of the software. The important thing to know is that version 3.95 is at the alpha stage. Proper functioning should not be expected of alpha software, open or closed source. In contrast, the latest released version works just fine.

I don't think Monkey's will stay completely closed and Windows-only. I think that if enough interest is generated in the format, and subsequently enough calls for multiple platform support, he'll open things up more eventually.

If you're an open-source purist I don't suppose that's any consolation, because there are no guarantees here. But I think that if Monkey's works well for your application, then go for it.
Michael


Lossless Compression

Reply #65
Quote
Originally posted by Speek
Updated my comparison of lossless compressors. Added more albums and also tested the highest compression levels. Link: http://home.wanadoo.nl/~w.speek/comparison.htm

> Another conclusion is that LPAC "Medium" level is as good as "Extra High".

... did you try any tests with random access disabled... and with different kinds of music? That's an unusual conclusion, given that "Extra High" uses a lot of extra predictive stuff over "Medium".

Lossless Compression

Reply #66
Quote
Originally posted by Speek
Updated my comparison of lossless compressors. Added more albums and also tested the highest compression levels. Link: http://home.wanadoo.nl/~w.speek/comparison.htm


On the 2500 files on my hard disk, extra high always generates smaller files than normal compression. The exceptions are files containing only silence, which come out the same size.

This is completely at odds with your results.
--  Frank Klemm

Lossless Compression

Reply #67
Quote
Now that's just a bit unfair. The link to the Monkey's Audio forum that you posted is for a discussion about crashes occurring in version 3.95 of the software. The important thing to know is that version 3.95 is at the alpha stage. Proper functioning should not be expected of alpha software, open or closed source. In contrast, the latest released version works just fine.
I think what jcoalson means is that software can break at any time, and if the author happens to abandon the program after that, while keeping the source closed up, your files are lost (unless you can find an older, working copy of the program). And if you're merely using the program for archiving, you're not likely to keep downloading the newest version of the software for years on end just to see if it still works as it should with your new OS or processor architecture. Who knows what happens with MA ten years from now (64-bit processors, Lindows/Winux, or simply changes to the file format), but regardless, you'll still want to be able to decode your files.

I do think that for a lossless format to become widely accepted among programmers and integrators, at least the file format specifications should be made available. Aside from that, what I consider important for a lossless format is (ordered by relative priority):

1) Error detection and recovery (see the sketch after this list)
2) Compression ratio
3) Handling of many kinds of input formats (24-/32-bit)
4) Streaming ability/seeking (and error resilience for streaming)
5) Small format overhead
6) CPU usage for playback
7) CPU usage for compressing
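
On point 1, one common approach is to store a checksum of the decoded samples alongside the compressed data, so corruption can be detected afterwards (FLAC, for example, keeps an MD5 of the raw PCM in its stream header). Below is a minimal Python sketch of the idea; the file name and stored digest are placeholders, and this is only an illustration of the technique, not any particular codec's implementation.

Code:
import hashlib
import wave

def pcm_md5(path):
    # Digest of the raw PCM samples rather than the container bytes, so any
    # lossless re-encode of the same audio yields the same value.
    with wave.open(path, "rb") as w:
        return hashlib.md5(w.readframes(w.getnframes())).hexdigest()

stored_digest = "..."  # digest recorded when the track was archived (placeholder)
if pcm_md5("restored_track.wav") != stored_digest:  # hypothetical file name
    print("decoded audio does not match the archived checksum")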

Anyway, as a request for Speek: can you please include those other lossless compressors as well? If someone is trying to convert someone else to a different format, it's easier when there's a comparison showing the format they're using to be inferior. You're not gonna get any avid Shorten users to convert if there isn't at least a comparison between it and the other lossless compressors.

Lossless Compression

Reply #68
Quote
I think what jcoalson means is that closed-source software can break at any time, and if the author happened to abandon the program after that, your files are lost (unless you can find an older, working copy of the program).
If you're using a stable version of the software, and it already does everything you want it to do without incident, then it's not all that likely to just "break at any time." And if it does, it won't have the disastrous consequences you're predicting (losing all my files). I can certainly see a scenario where a program I've been using for a while breaks when I try to compress a new album. Drat. But do you really expect me to believe that means I will lose all my files? Of course not.

The one exception I can envision is when undertaking an OS upgrade. That can certainly break software sometimes, and if the decoder is broken I might be screwed. But if I don't verify the operation of my important software before erasing my old OS, that's my own darn fault.

This is really not about open source versus closed source; this is about whether a project is being actively maintained or not. If you encounter a bug in software that is actively maintained, all you do is report it to the maintainers and wait for the results. That's the same process with open or closed source.

Of course, if the project maintainers have gone AWOL, the theoretical advantage of open-source software is that you could fix it yourself. In practice, for most of us, that theory just isn't reality. Would you fix it yourself? I'd say that for most people, the answer is no. We want to download the binaries and have them run, and we have neither the time, the inclination, nor in most cases the skill to fix problems ourselves.

That leaves one potential advantage to open source: if the original developers lose interest or ability to continue, then other developers can take over. I think you have to evaluate this likelihood on a case-by-case basis. I've seen plenty of open-source software projects that have lost all momentum, and nobody else has wanted to take on the responsibility either.

Being able to download the source code, then, is no substitute for a thorough examination of the product and its developers: to see if the program is of high quality, is well-maintained and well-supported, and will not break when you least expect it.

Having said all that, I definitely agree that at the very least the file format ought to be standardized and published. Encoders are valuable designs and I can understand the temptation to keep them proprietary, but there is real value and safety in keeping the decoder open. Heck, doing that might even create new markets for the encoder, as people create decoders for new platforms that weren't initially considered.
Michael

Lossless Compression

Reply #69
Frank and fewtch,

I asked Tilman Liebchen, LPAC author, about it and he answered: "I think this problem comes from the fact that LPAC isn't yet fully optimized for using the "random access" feature. If you don't use it, the overall compression ratios will increase as well - so perhaps you might include LPAC _without_ random access in your test results."

So I compressed the albums again with "random access" disabled. Now "Extra High" is better than "Medium", but only by 8 MB on more than 3.4 GB of compressed data (roughly 0.2%).

YouriP, I will add Shorten first and, if I still like this crazy work, maybe others later.

Lossless Compression

Reply #70
Quote
Originally posted by YouriP

Anyway, as a request for Speek: can you please include those other lossless compressors as well? If someone is trying to convert someone else to a different format, it's easier when there's a comparison showing the format they're using to be inferior. You're not gonna get any avid Shorten users to convert if there isn't at least a comparison between it and the other lossless compressors.


Another comparison if you're interested in how Shorten stacks up:

http://flac.sf.net/comparison.html

I had the same experience Speek did with LPAC, i.e. normal mode generating smaller files (with random access). Maybe the results are different with that turned off, but then you aren't comparing apples to apples anymore, as most codecs incur the time/storage overhead of a seek table by default.

Josh

Lossless Compression

Reply #71
Quote
Originally posted by jcoalson
I had the same experience Speek did with LPAC, i.e. normal mode generating smaller files (with random access). Maybe the results are different with that turned off, but then you aren't comparing apples to apples anymore, as most codecs incur the time/storage overhead of a seek table by default.

Only if the files compressed (with said codec) were meant to be 'played' by something like Winamp -- and afaik, such a function was added as an afterthought in LPAC. 

I don't see how comparing LPAC's normal compression with its high compression, *both* with random access turned off, would be an 'apples to oranges' comparison.  It's quite possible that the seek table for 'extra high' could be significantly larger than the one used for 'normal', for whatever reason.

Personally, I don't notice much difference between LPAC files that use 'random access' and those that don't... seeking is slow as hell with both.
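
For what it's worth, a seek table is normally just a sorted list of (sample position, byte offset) pairs that the decoder can binary-search, which is why a denser table buys finer seeking at the cost of some extra bytes in the file. A toy Python sketch of the idea follows; the table values are made up and this is not LPAC's actual on-disk layout.

Code:
import bisect

# Toy seek table: sorted (first sample of block, byte offset of block) pairs.
# A denser table means finer seek granularity but more overhead in the file.
seek_table = [(0, 4096), (44100, 23150), (88200, 41872), (132300, 60031)]

def seek_point(target_sample):
    # Last entry whose first sample is <= the requested sample; the decoder
    # jumps to that byte offset and decodes forward to the exact position.
    samples = [s for s, _ in seek_table]
    i = bisect.bisect_right(samples, target_sample) - 1
    return seek_table[i]

print(seek_point(100000))  # -> (88200, 41872)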


Lossless Compression

Reply #73
Thanks Speek...

The file sizes I was looking at on my own do not lie: .shn is not that good as far as compression goes. However, there is no getting around the fact that the format is the established standard of the taping community; I guess the flip side of that coin is that .shn will continue to be supported...

Unfortunately, I don't think lossless is viable for my archival purposes; I have WAY too much material to encode and store. The only way I think I could swing it would be to use a RAID, but the price of eight 120 GB drives is prohibitive (for now)...

Lossless Compression

Reply #74
hello,

Firstly, I would like to say that so far I have only used Monkey's Audio. To me it seems to do an excellent job: it is simple to use, it is fast according to some comparisons (Speek's), and it uses random access (which is quite important for me)...

The thing I would like to ask is whether other formats support tagging... I like MA's a lot and wouldn't switch to another lossless encoder before I find out more about that.

Secondly, I listen to rock most of the time, and with MA I usually get compression larger than 65%. Is that normal? Does verifying have anything to do with it?
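
Assuming that 65% figure is the compressed size expressed as a percentage of the original WAV (the usual convention for lossless codecs, and roughly what loud, dense rock material tends to yield), it can be checked directly from the file sizes; the file names in this sketch are placeholders.

Code:
import os

# Compressed size as a percentage of the original, uncompressed WAV.
wav_size = os.path.getsize("track.wav")  # hypothetical file names
ape_size = os.path.getsize("track.ape")
print(f"compression: {100 * ape_size / wav_size:.1f}% of the original size")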

Regards,
Matej