Topic: LA - what happened to it?

LA - what happened to it?

When I first got interested in lossless audio compression, the encoder that first caught my eye was LA - I couldn't believe how easily it beat all the others at the time (this was about five years ago). I was amazed by it and experimented with it, but I still ended up sticking with MP3 because of the limited hard-drive space I had back then.

Then, when I finally decided to use lossless compression for all my audio backing-up needs (which was about two years ago), I found that the developer had apparently abandoned the project, leaving behind extremely buggy decoder plugins and a lossless encoder which had looked so promising and which achieved compression ratios that some encoders, even today, can't get even remotely close to.

What happened?! Why did the developer abandon the project after such a great start and after achieving such awesome compression ratios? Did he say anything - does anyone know? And why the hell didn't he make the source code available?!



LA - what happened to it?

Reply #3
Just looking at the comparison, it seems to be a lot more CPU intensive for both encoding and decoding than FLAC:
http://www.lossless-audio.com/comparison.htm

Yes, it is incredibly CPU intensive for both encoding and decoding, but I doubt that it was something that couldn't have been improved with code optimisation. However, this almost directly brings us to your next question...

Also, is it open source? That's an obvious advantage to FLAC, and FLAC has been able to achieve some hardware support.

No, it's not open-source. That's the worst problem of all, and it's probably the main reason it died. I mean, if it were open-source, then even after the initial developer lost interest (for whatever reason), I highly doubt that everyone else would have lost interest in it too, especially when you look at the compression it managed to achieve back in 2004, and at how an even older version of it compares to the latest versions of the lossless encoders which are popular today.

Also, if it were open-source, more programmers would be able to help with the development, there would be much less trouble creating plugins for it, and the code would most probably be optimised so that it wouldn't take so long to encode/decode and wouldn't be so CPU intensive.

But what confuses me the most is why the developer would decide to abandon the project after such awesome results... :-/


edit:
Quote
Sorry, no idea. But if you're looking for another lossless codec, look here: http://www.synthetic-soul.co.uk/comparison...Size&Desc=0
TAK is similar to LA regarding compression ratio, but a lot faster.

Thanks for the info.

LA - what happened to it?

Reply #4
Hi doccolinni. Yeah, I decided to take a look at LA just a few weeks ago. Knowing that it was extremely CPU intensive, I was interested in just how challenging (or not) it would be for playback on a modern dual-core CPU (you know how things change over time with regard to what is really hard on a CPU).

I found the same as you: the project is just so dead that I couldn't find a single plugin usable with any recent players, so I gave up. BTW, it also reminded me of one other slightly annoying thing about the project - the ridiculously short, cryptic naming. I mean, try googling for "la" - it's such a pain. I blame it on programmers brought up on a diet of Linux and C; they just don't know any better.

LA - what happened to it?

Reply #5
...especially when you look at the compression it managed to achieve back in 2004, and also how an even older version of it compares to the latest version of lossless encoders which are popular today.

The other developers of lossless codecs did not make compression ratio their ultimate goal, and that is why LA retained its top position (not sure if OptimFrog is better in some scenarios). They wanted fast decodability and encodability, both of which have to be weighed in comparison with compression ratio to cater to different niches. Many people use FLAC for speed, its open-source licence and relative ubiquity. Some use TAK because they like the very finely balanced tradeoff between compression ratio and decoding speed. Others use ALAC and WMA Lossless because these work best with their portable music players or home streaming servers.

LA - what happened to it?

Reply #6
Hi doccolinni. Yeah, I decided to take a look at LA just a few weeks ago. Knowing that it was extremely CPU intensive, I was interested in just how challenging (or not) it would be for playback on a modern dual-core CPU (you know how things change over time with regard to what is really hard on a CPU).

I found the same as you: the project is just so dead that I couldn't find a single plugin usable with any recent players, so I gave up. BTW, it also reminded me of one other slightly annoying thing about the project - the ridiculously short, cryptic naming. I mean, try googling for "la" - it's such a pain. I blame it on programmers brought up on a diet of Linux and C; they just don't know any better.

Thank you for answering and for the information.

But seriously, I don't see what bad naming has to do with Linux and C... :-/

The reason LA died is that it had only one developer and it was closed-source. These sorts of programs are practically doomed to failure: development goes incredibly slowly and is incredibly difficult when there's only one developer, so she/he will almost certainly abandon the project after a while out of boredom, and since it's closed-source, that's obviously when the program itself dies.

The other developers of lossless codecs did not make compression ratio their ultimate goal, and that is why LA retained its top position (not sure if OptimFrog is better in some scenarios). They wanted fast decodability and encodability, both of which have to be weighed in comparison with compression ratio to cater to different niches. Many people use FLAC for speed, its open-source licence and relative ubiquity. Some use TAK because they like the very finely balanced tradeoff between compression ratio and decoding speed. Others use ALAC and WMA Lossless because these work best with their portable music players or home streaming servers.

As a matter of fact, I'm pretty sure I saw a comparison chart in which OptimFROG, with its command-line options set for the greatest possible compression regardless of speed, managed to compress better than LA, although it then took even longer than LA to compress.

LA - what happened to it?

Reply #7
Don't be fooled, it's not better with open-source projects: thousands are dead on SourceForge, not to mention forks. Even if it's open, there has to be enough interest for someone else to pick up the pieces.

LA - what happened to it?

Reply #8
Even if it's open, there has to be enough interest for someone else to pick up the pieces.

Precisely, but like I've already said, I doubt there wouldn't be enough interest when you notice how incredibly well LA managed/manages to compress - back in 2004 no other lossless encoder was even remotely close to its level of compression, and even today, five years later, it's still practically at the top as far as compression level goes (also note that the version used in that chart is not the latest version of LA). Yes, it's incredibly CPU intensive and encoding/decoding is really slow, but I doubt that that couldn't be fixed with proper optimisation.

But it's all just guesswork now that it's dead anyway, isn't it?

LA - what happened to it?

Reply #9
Actually, with current compression techniques, such (relatively) high compression ratios require a great deal of encoding/decoding power. The curve is asymptotic.

It would be interesting to know whether there have been any technical advancements since the last time LA was seen alive (3-4 years ago?).

LA - what happened to it?

Reply #10
Don't be fooled, it's not better with open-source projects:


For audio formats at least, it is.  That way people can implement decoders even after the project dies.  Otherwise you end up stuck with files that can't be played in modern applications . . .

LA - what happened to it?

Reply #11
Yes, it's incredibly CPU intensive and encoding/decoding is really slow, but I doubt that that couldn't be fixed with proper optimisation.

I doubt it.  While you might be able to make it significantly faster, it is extremely unlikely that it can be transformed into a codec that many could consider "fast" when compared to something like WavPack, FLAC, or TAK.  Decoding speed is of particular interest.

There's a reason that asymmetric codecs are preferred over symmetric ones.  Few care about how long it takes to encode something (within reason), but fast decompression is absolutely crucial.  I shouldn't have to dedicate a high-powered PC simply to playing music, or take a 20% hit to my in-game performance, because my chosen codec requires so much CPU power to decode in real time.  For integration into other applications (like music for games), dedicating such great resources to audio decoding means that there's less CPU power available for things like display output, AI, physics, etc.  Imagine requiring a Core i7 just to mix more than 6 tracks in an audio editor.

For things like hardware support, the amount of CPU power required to decode is a make-or-break proposition for a codec.  For portable devices, it affects things like the CPU clock speed required for the device, battery life (or the battery size required to achieve a given lifetime for targeted usage profiles), heat output, and required cooling.  These in turn can affect things like the device's size, noise profile, etc.

(Granted, much of this post is not specific to lossless audio in particular, but applies to compression of any kind.)

LA - what happened to it?

Reply #12
doccolinni, very few people consider a fractional percentage compression improvement over state of the art as "awesome" or "not even remotely close".

LA - what happened to it?

Reply #13
Could it be that the developer abandoned it after other faster, better alternatives for lossless compression came into being?

The chart made by Synthetic Soul shows that FLAC, TAK, ALAC, and others are all dozens of times faster at encoding/decoding, and that all the lossless compressors produce file sizes that are relatively close to each other.

For example, compare LA and TAK.  TAK's file sizes are only about 1% different from LA's, yet TAK encodes about 300% faster than LA and decodes about 1400% faster.

So, why continue to use or even bother with LA when you can get virtually the same results with TAK, but much faster?


Take FLAC versus LA.  FLAC's files are only about 3% larger than LA's, but  FLAC encodes at least 400% faster, and decodes at least 1600% faster, than LA.

In addition, there is a great deal of FLAC support, as well as it being open-source.
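
As a rough illustration of how percentage figures like these fall out of a chart such as Synthetic Soul's, here is a minimal sketch; the sizes and timings below are made-up placeholders chosen only to be roughly consistent with the percentages quoted above, not values taken from the actual comparison.

Code
# Hypothetical totals for one and the same test set: compressed size (MB) and
# encode/decode times (seconds). Placeholder numbers, not the chart's data.
results = {
    "LA":   {"size_mb": 580.0, "encode_s": 620.0, "decode_s": 540.0},
    "TAK":  {"size_mb": 586.0, "encode_s": 150.0, "decode_s": 36.0},
    "FLAC": {"size_mb": 598.0, "encode_s": 120.0, "decode_s": 31.0},
}

la = results["LA"]
for name in ("TAK", "FLAC"):
    r = results[name]
    size_penalty = (r["size_mb"] / la["size_mb"] - 1.0) * 100    # % larger files than LA
    enc_speedup  = (la["encode_s"] / r["encode_s"] - 1.0) * 100  # % faster encoding
    dec_speedup  = (la["decode_s"] / r["decode_s"] - 1.0) * 100  # % faster decoding
    print(f"{name}: {size_penalty:+.1f}% file size vs LA, "
          f"encodes {enc_speedup:.0f}% faster, decodes {dec_speedup:.0f}% faster")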

LA - what happened to it?

Reply #14
doccolinni, very few people consider a fractional percentage compression improvement over state of the art as "awesome" or "not even remotely close".

Today, the difference is a fractional percentage. Back in 2004 (which I was talking about), on the other hand, it was different, and then it was both awesome and not even remotely close.

You might as well read my post properly first if you've decided to respond to it.


edit:
Quote
Take FLAC versus LA. FLAC's files are only about 3% larger than LA's, but FLAC encodes at least 400% faster, and decodes at least 1600% faster, than LA.

In addition, there is a great deal of FLAC support, as well as it being open-source.

I wasn't planning on using LA instead of FLAC; I was just curious about LA because I have a feeling that it could have been optimised, and that it was only so slow because it was still practically in its earliest alpha stages.

LA - what happened to it?

Reply #15
I think there's a point of diminishing returns with all lossless compression - that includes general compressors like ZIP, RAR, ACE, and 7-Zip as well as the ones used for audio, like TAK, FLAC, and ALAC.

To eke out one or two more percent compression often requires a much larger amount of computation time in both encoding and decoding.  It also sometimes requires more memory.

I'm using 7-Zip and RAR as examples here, which aren't related to audio; however, the point being made is the same.

7-Zip often produces better compression results than RAR.  Not significantly better in most instances, but just 2 or 3 percent better.  However, to achieve this, it requires a huge amount of memory and computation time.

To compress 500 MB with 7-Zip, I use the "normal" mode.  That requires 186 MB of RAM to encode and 18 MB to decode and takes a long time.

Whereas I can compress the same 500 MB of data using RAR, and it requires only 64 MB to encode and 8 MB to decode, encodes twice as fast, and decodes four times faster, even using the "Best" compression mode.

When you are dealing with hundreds of MB of data, that means taking 15 minutes to compress with RAR or 30 minutes with 7-Zip.

The difference between the resulting 7-Zip archive and the RAR archive?  About 3 percent.

Is it worth needing almost 3x the RAM and twice the compression time, just to achieve an extra 3 percent of compression?  Not to me in most cases.

I could achieve another 1 or 2 percent of extra compression if I used 7-Zip's "Ultra" mode.  However, that requires 700 MB of RAM for encoding and 64 MB for decoding!  Not to mention doubling the compression and decoding time yet again, meaning it takes at least 4x longer than RAR to compress and decode.  But, hey, I get an overall 5% improvement over RAR!  Is it worth it though?
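
Just to put those figures side by side, here is a minimal sketch (Python, purely illustrative) tabulating the trade-off described above, using the numbers quoted in this post:

Code
# Figures quoted above, for compressing the same 500 MB of data.
rar    = {"ram_encode_mb": 64,  "ram_decode_mb": 8,  "time_factor": 1}  # RAR "Best"
normal = {"ram_encode_mb": 186, "ram_decode_mb": 18, "time_factor": 2}  # 7-Zip "normal", ~3% smaller
ultra  = {"ram_encode_mb": 700, "ram_decode_mb": 64, "time_factor": 4}  # 7-Zip "Ultra",  ~5% smaller

for label, mode in (("7-Zip normal", normal), ("7-Zip Ultra", ultra)):
    ram_factor = mode["ram_encode_mb"] / rar["ram_encode_mb"]
    print(f"{label}: about {ram_factor:.1f}x the RAM to encode "
          f"and {mode['time_factor']}x the time, compared to RAR")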

Same with LA.  It might still be the "best" compressor in terms of absolute final file size, but any attempt to make it faster while keeping that compression will likely fail.  Making it faster - comparable to TAK or FLAC - will most likely require trimming back the compression strength.

LA - what happened to it?

Reply #16
7-Zip often produces better compression results than RAR.  Not significantly better in most instances, but just 2 or 3 percent better.

(...)

Is it worth needing almost 3x the RAM and twice the compression time, just to achieve an extra 3 percent of compression?  Not to me in most cases.

(...)

But, hey, I get an overall 5% improvement over RAR!  Is it worth it though?

You're measuring percentages wrong.

Depending on the kind of data, RAR and 7-Zip can compress it down to 20% - 40% of the original size, about 30% on average. If RAR compresses something to 32% of its original size and 7-Zip compresses it to 29%, that is not the same kind of 3% difference as when something is compressed to 60% versus 57% (as is the case with lossless audio compressors). 57% is approximately a 5% better compression ratio than 60%, while 29% is approximately a 10% better compression ratio than 32%.
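
Put another way, the same three percentage points of the original size buy a different relative saving depending on where the baseline sits. A minimal sketch (Python, purely for illustration) using the ratios from this post:

Code
# Compression ratio = compressed size as a fraction of the original size.
def relative_improvement(ratio_a, ratio_b):
    """How much smaller B's output is, relative to A's output."""
    return (ratio_a - ratio_b) / ratio_a

# 60% vs 57% of the original - typical lossless-audio territory
print(f"{relative_improvement(0.60, 0.57):.1%}")   # 5.0%

# 32% vs 29% of the original - typical general-purpose-archiver territory
print(f"{relative_improvement(0.32, 0.29):.1%}")   # 9.4%, i.e. roughly 10%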

LA - what happened to it?

Reply #17
I'm using 7-Zip and RAR as examples here, which aren't related to audio; however, the point being made is the same.


Actually, comparing 7zip and RAR is a bad example.  Yes, in many cases 7zip is only a few % better than RAR.  However, on some data sets, such as many gigs of mixed office documents, I have seen 7zip beat RAR by over 100% - i.e., the 7zip archive is less than half the size of the equivalent RAR.  I used to manage a very space-constrained office file server and 7zip was a lifesaver.  In that case the extra time and memory were well worth it.



LA - what happened to it?

Reply #18
7-Zip often produces better compression results than RAR.  Not significantly better in most instances, but just 2 or 3 percent better.

(...)

Is it worth needing almost 3x the RAM and twice the compression time, just to achieve an extra 3 percent of compression?  Not to me in most cases.

(...)

But, hey, I get an overall 5% improvement over RAR!  Is it worth it though?

You're measuring percentages wrong.

Depending on the kind of data, RAR and 7-Zip can compress it down to 20% - 40% of the original size, about 30% on average. If RAR compresses something to 32% of its original size and 7-Zip compresses it to 29%, that is not the same kind of 3% difference as when something is compressed to 60% versus 57% (as is the case with lossless audio compressors). 57% is approximately a 5% better compression ratio than 60%, while 29% is approximately a 10% better compression ratio than 32%.



I'm going by the resulting file sizes of the respective archives.  There is often just a few percent difference between them.

If RAR takes 15 minutes to compress 500 MB to 250 MB, and 7-Zip takes 30 minutes to compress the same 500 MB to 242 MB, is the 8 MB difference in file size worth the extra 15 minutes?  8 MB is about 3 percent of 250 MB.  Also, it takes longer for 7-Zip to extract back to the original 500 MB, and it needs a great deal of temporary space.
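
For what it's worth, the two ways of measuring that difference can be put side by side. A minimal sketch using the figures from this example (500 MB compressed to 250 MB by RAR and to 242 MB by 7-Zip):

Code
original_mb, rar_mb, sevenzip_mb = 500, 250, 242   # figures from the example above

# Compression ratio expressed as a fraction of the original size
rar_ratio      = rar_mb / original_mb        # 50.0% of the original
sevenzip_ratio = sevenzip_mb / original_mb   # 48.4% of the original

points_of_original = (rar_ratio - sevenzip_ratio) * 100   # 1.6 percentage points
relative_saving    = (rar_mb - sevenzip_mb) / rar_mb      # 8 / 250 = 3.2%

print(f"Difference as points of the original size: {points_of_original:.1f}")
print(f"Relative saving over the RAR archive: {relative_saving:.1%}")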

LA - what happened to it?

Reply #19
doccolinni, very few people consider a fractional percentage compression improvement over state of the art as "awesome" or "not even remotely close".

Today, the difference is a fractional percentage. Back in 2004 (which I was talking about), on the other hand, it was different, and then it was both awesome and not even remotely close.

You might as well read my post properly first if you've decided to respond to it.

I did, and back in 2004, very few people would consider the fractional percentage compression improvement over state of the art (optimfrog) as "awesome" or "not even remotely close".
(your reference) (optimfrog changelog) 124328206 / 124634428 = 0.997543

Quote
Take FLAC versus LA. FLAC's files are only about 3% larger than LA's, but FLAC encodes at least 400% faster, and decodes at least 1600% faster, than LA.

In addition, there is a great deal of FLAC support, as well as it being open-source.

I wasn't planning on using LA instead of FLAC; I was just curious about LA because I have a feeling that it could have been optimised, and that it was only so slow because it was still practically in its earliest alpha stages.


People who implement lossless audio codecs disagree with your feeling.  It's closed source, but at one time the author talked about the architecture here.



 

LA - what happened to it?

Reply #22
Ahh, my favorite quote:
If I ever reach the stage where I have no more time / desire to work on and maintain the program, I will make it open source. I don't know if that allays anybody's fears.


Let this be a lesson about best intentions. 

Creature of habit.