
TAK 1.0 - Beta testing

Content

This thread should be used to discuss the first public beta release of my lossless audio compressor TAK (formerly known as YALAC):

1) Bug reports

That's the most important purpose of this beta testing!

Please try to provide as many details as possible about the conditions which created the error. And please keep the affected files. I may ask you for them.

2) Ideas for improvements and feature requests

I am aware that the first release is missing many important features, especially tagging and plugin support for media players. No need to tell me about this...

I will collect your ideas and requests and try to implement them into later releases.

The ReadMe in the beta archive contains a list of features I am planning to implement in future versions.

3) Compression results

are always welcome.

Download

The beta can be downloaded from the Upload section:
TAK 1.0 Beta 2


Reply #1
First of all... Congratulations on the first public build!

Also, I'd like to be the first to thank you for all the work you've done on your labor of love up to this point, Thomas. We all know how much time you've spent on this "little" project of yours over the past years, and you can be sure we all appreciate it greatly.


Reply #2
Well said.

Congratulations Thomas.  The results posted for TAK from the alpha testing seem to have caused a real buzz on this board.  It will be interesting to see what happens in 2007.

Edit:  I have updated Tag to recognise TAK files (so you don't have to specify --ape2 --nocheck).  Download 2.0.49 if you want to tag using Case's Tag.  NB: 2.0.48 was a "silent" release that updated to libFLAC 1.1.3.

You can then easily tag using:

Code: [Select]
TAG.EXE --artist "My Artist" --album "My Album" --genre Rock --year 2006 --track 1 --title "My Title" myfile.tak

If you want to rip to TAK using EAC I suggest that you use Wapet.  Take a look at the EAC and Monkey's Audio guide in the wiki for the general idea.

Remember to swap ".ape" with ".tak", and use a command line like:

Code: [Select]
%d -t "Artist=%a" -t "Title=%t" -t "Album=%g" -t "Year=%y" -t "Track=%n" -t "Genre=%m" "C:\Program Files\TAK\TAKC.EXE" -e -pN -v %s %d
I'm on a horse.


Reply #3
Congratulations! 

I just played around a bit.

Not a big thing, but I found a file with an invalid RIFF header that TAK refuses to process:

http://www-mmsp.ece.mcgill.ca/Documents/Au...verse/GLASS.WAV

The file is 8000 Hz, 16-bit, 1 channel, 40200 samples.

edit:

Sadly, WAVE_FORMAT_EXTENSIBLE files are not supported. Or are they?
It would be nice if you could add that feature.


Reply #4
TAK Decoder Stub for foobar2000

I've made a simple component for foobar2000 that will allow you to use foobar2000 to tag your TAK files. It doesn't support playback for obvious reasons.  Perhaps some people will find it useful to manage their TAK files.

Download
Source code

If the above links don't work, please check my components page or this thread.


Reply #5
Sorry, I may seem new to TAK, but what's all the fuss about? I just ran a test with -e -p4m and the files were still a bit larger than APE -c4000 (Extra High) files. The encoding time was about the same.


Reply #6
TAK's advantage is more with encoding and decoding speed than compression.  It compresses very well, while still being fast.  Monkey's Audio, OptimFROG and LA will compress better, but tend to be slower.

If you compare TAK Normal with Monkey's Audio Normal, they compress about the same and encode at the same rate, but TAK decodes twice as fast. If you compare TAK Normal with FLAC -5, they encode and decode at around the same rate, but TAK shaves around 2% off the file size.

If you look at the poll, and the type of codec being used by most people, you have the answer to your question.


Reply #7
Sorry, I may seem new to TAK, but what's all the fuss about? I just ran a test with -e -p4m and the files were still a bit larger than APE -c4000 (Extra High) files. The encoding time was about the same.

The About section in TAK's Readme.html answers this:
Quote
My goal was to develop a compressor which combines strong compression with highest decompression speeds. On average the current implementation should match the compression efficiency of Monkey's Audio High and achieve decompression speeds similar to FLAC.


There are several threads about the development of this new codec in the Lossless / Other Codecs forum. Just run a search for YALAC or TAK.
You'll also find lots of benchmark results and comparisons to other codecs there.


Reply #8
My testing results:

02.01.2007  19:51      107.467.628 original.wav
02.01.2007  19:51        64.151.240 flac8.flac
04.01.2007  14:05        60.495.029 p2m.tak
04.01.2007  14:06        59.833.424 p3m.tak
04.01.2007  14:04        59.743.460 p4m.tak

Decoding time for p4m.tak:  8.67 sec 

Monkey's Audio Extra High:
04.01.2007  14:11        58.977.820 extrahigh.ape

Decoding time for extrahigh.ape:  44.6 sec


Reply #9
Decoding time for p4m.tak:  8.67 sec 
...
Decoding time for extrahigh.ape:  44.6 sec
I assume that you've just answered your own question?

There are  several threads about the development of this new codec in the Lossless / Other Codecs forum. Just run a search for YALAC or TAK.
You'll also find lots of benchmark results and comparisons to other codecs there.
Here is my comparison, which includes TAK alpha 0.14, FLAC 1.1.3, WavPack 4.40, and Monkey's Audio 3.99.


Reply #10
Congratulations! I was watching the development, and finally I can test it myself. It is really fast! I can't wait for a final version released under a GPL-compatible license, so I can play it on my Linux box too.

Not exactly a bug report, but Tak_Enco_Proto.txt has the German "nein" and "ja" where "no" and "yes" should be. I think it would be better to translate them to English. Maybe leave open the possibility of multilanguage versions... but that would be much more complex.


Reply #11
Seeing as everyone else is doing speed/compression tests, and I'm on a slow PC at the moment, I think I'll do some damage testing and see if I can choke the decoder.
I can run my own tests easily enough, but does anyone have the damage tool that was created with this in mind?


Reply #12
I do, but I'm not sure about the ethics of passing out an application that was given to me as an alpha tester.  I think Thomas needs to authorise this.

I'm probably being anal, but I'd rather not break his confidence.  Sorry.


Reply #13
I do, but I'm not sure about the ethics of passing out an application that was given to me as an alpha tester.  I think Thomas needs to authorise this.

I'm probably being anal, but I'd rather not break his confidence.  Sorry.

Don't worry, I can wait.
I seem to be able to produce an undecodable file by cutting off the first 8 lines in Notepad++. It might just be that I'm saving it in a different format (UTF-8, Unicode, etc.) by mistake, though, so I shall investigate further.


Reply #14
...
I can run my own tests easily enough but does anyone have the damage tool that was created with this in mind available/

I do, but I'm not sure about the ethics of passing out an application that was given to me as an alpha tester.  I think Thomas needs to authorise this.

I'm probably being anal, but I'd rather not break his confidence.  Sorry.

Again, thanks for being anal! I really appreciate this.

I could put the Damage tool (V1.02) into the upload section. It's only about 85 K, and I doubt that many people are interested, so I suppose it is OK to use the upload section for this without asking the admins (I hope so).

It could take an hour.

BTW: This thread is real fun for me! I will respond to some other posts later.

Edit: Here it is: Damage 1.02


Reply #15
Chopping a big chunk from the head and tail (~120 KiB in total from a 6.54 MiB file) makes the file undecodable. Is that normal? Have you posted anywhere about which kinds of damage are known to be undecodable?

EDIT: Even a not-so-big chunk (10 K in total) makes the file undecodable.

EDIT2: I've found no problem cutting off the entire first half or the entire last half of the file. The remaining data decodes perfectly!


Reply #16
Chopping a big chunk from the head and tail (~120 KiB in total from a 6.54 MiB file) makes the file undecodable. Is that normal? Have you posted anywhere about which kinds of damage are known to be undecodable?

It's normal for the current decoder implementation. The decoder has to read a stream info structure which contains all the general parameters needed for decoding (audio format, frame size...). This can be found in the metadata structure at the beginning of the file, and it is also inserted every 2 seconds into the audio data stream itself. Therefore the file format can survive even very heavy damage. Theoretically...

While the stream info is available in many places in the file, the current decoder implementation checks only 3 positions:

1) The metadata.
2) If the metadata is damaged, it searches for the first frame header (this always contains another stream info).
3) If both of the above are damaged, the decoder searches for the last frame header (with a flag set, indicating that this is the last frame) within the last 1 MByte of the file.

If all this fails, the decoder stops. Obviously this can be improved: the decoder could go through the whole stream.

But all this is performed by the open function of the decoder. I don't want to let it scan through the whole file (imagine a 4 GB file), at least not without asking the user.

I suppose that damage to the head and tail is a rare case, therefore this limitation should not hurt too much.

To make it clear: another decoder implementation can solve this.
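The three fallback positions described above can be sketched as a toy illustration (Python here for brevity; the two-byte MD/FH/LF markers are invented stand-ins, not TAK's real metadata block or frame sync codes, which are not documented in this thread):

```python
# Toy sketch of the decoder's three stream-info lookups.
# The markers below are hypothetical stand-ins for illustration only.
LAST_MB = 1024 * 1024
META, FRAME, LAST_FRAME = b"MD", b"FH", b"LF"

def open_stream_info(data: bytes):
    """Return (where, offset) for the first usable stream info, or None."""
    # 1) The metadata block at the beginning of the file.
    if data.startswith(META):
        return ("meta", 0)
    # 2) Metadata damaged: search forward for the first frame header,
    #    which always carries another copy of the stream info.
    pos = data.find(FRAME)
    if pos != -1:
        return ("first frame", pos)
    # 3) Both damaged: search the last 1 MByte for the last frame
    #    header (flagged as the final frame of the stream).
    tail_start = max(0, len(data) - LAST_MB)
    pos = data.rfind(LAST_FRAME, tail_start)
    if pos != -1:
        return ("last frame", pos)
    return None  # all three positions damaged: the decoder stops
```

A repair tool, as opposed to the decoder's open function, could extend step 3 into a scan of the whole stream when all three positions fail.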


EDIT: Even a not-so-big chunk (10 K in total) makes the file undecodable.


If this is not the head-and-tail issue: it is sometimes necessary to deactivate the restore wave file meta data option to decode damaged files. This is always the case if the damage changes the file size, because then the decoder has to modify the audio data size entries in the wave file header. But if you are using the restore wave file meta data option, the decoder treats the previously stored header as a black box which it will not touch.
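As an aside, the size patching described here is simple when the decoder writes a fresh, canonical 44-byte WAV header itself. A rough sketch, assuming the plain PCM layout with the data chunk size at offset 40 (a restored black-box header need not follow this layout, which is exactly the problem):

```python
import struct

def patch_wav_sizes(header: bytearray, data_bytes: int) -> bytearray:
    """Rewrite the two size fields of a canonical 44-byte WAV header so
    they match a new audio data length (e.g. after truncation damage).
    Assumes plain PCM: RIFF size at offset 4, data size at offset 40."""
    struct.pack_into("<I", header, 4, 36 + data_bytes)  # RIFF chunk size
    struct.pack_into("<I", header, 40, data_bytes)      # data chunk size
    return header
```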



EDIT2: I've found no problem cutting the entire first half or the entire last half of the file. The remaining data decodes perfectly!

Well, this supports my explanation above.

Nice testing!


Reply #17
Quote
If all this fails, the decoder stops. Obviously this can be improved: the decoder could go through the whole stream.

But all this is performed by the open function of the decoder. I don't want to let it scan through the whole file (imagine a 4 GB file), at least not without asking the user.

I suppose that damage to the head and tail is a rare case, therefore this limitation should not hurt too much.

To make it clear: another decoder implementation can solve this.

OK. Glad to hear that. I suppose it would be possible to add this "scan the whole file" as a switch in the decoder, and in the case of an undecodable file, ask the user whether he wants to make this in-depth scan.
Quote
If this is not the head-and-tail issue: it is sometimes necessary to deactivate the restore wave file meta data option to decode damaged files.

I don't know where the head or tail starts or ends; I was just making some dumb cuts with a hex editor. If the head and tail of a 5.64 MB file together amount to more than ~10 KB, then it may be a problem. I have had the restore meta data option deactivated from the beginning, so that is not the problem.

As a side note, every time I change a preset, the "verify" option is disabled and I have to re-check it. As I want to leave it enabled to catch any possible encoder/decoder errors in this beta, this is annoying. Just my opinion, as the default may be better for other people...

PS: How do I enable SSE optimizations? The log mentions that they are disabled, but I've only found the enable/disable MMX option. I have a P4-based Celeron, which should support up to SSE2.


Reply #18
Quote
If all this fails, the decoder stops. Obviously this can be improved: the decoder could go through the whole stream.

But all this is performed by the open function of the decoder. I don't want to let it scan through the whole file (imagine a 4 GB file), at least not without asking the user.

I suppose that damage to the head and tail is a rare case, therefore this limitation should not hurt too much.

To make it clear: another decoder implementation can solve this.

OK. Glad to hear that. I suppose it would be possible to add this "scan the whole file" as a switch in the decoder, and in the case of an undecodable file, ask the user whether he wants to make this in-depth scan.

Good idea!

Probably I will build a separate repair function with such an option.

The decoder itself should be simple, without too many options and without user interaction. Otherwise it would be more difficult to build media player plugins (I suppose; I am not yet an expert in this).

Quote
If this is not the head-and-tail issue: it is sometimes necessary to deactivate the restore wave file meta data option to decode damaged files.

I don't know where the head or tail starts or ends; I was just making some dumb cuts with a hex editor. If the head and tail of a 5.64 MB file together amount to more than ~10 KB, then it may be a problem. I have had the restore meta data option deactivated from the beginning, so that is not the problem.

I don't know either. It depends on frame size, audio format, file size and compressibility.

But with, for instance, 5 to 10 KB cut from the beginning and the end, there is a fair chance of having damaged all 3 possible stream info positions.

As a side note, every time I change a preset, the "verify" option is disabled and I have to re-check it. As I want to leave it enabled to catch any possible encoder/decoder errors in this beta, this is annoying. Just my opinion, as the default may be better for other people...

This shouldn't happen! Thanks! I will change it.

PS: How do I enable SSE optimizations? The log mentions that they are disabled, but I've only found the enable/disable MMX option. I have a P4-based Celeron, which should support up to SSE2.

Well, I have removed the SSE optimizations because they brought absolutely nothing (as evaluated by different testers). Obviously I forgot to remove the corresponding protocol entry.

Thanks for your thorough testing!


Reply #19
Some news (I think a new post is better than editing):

If I cut only 30 bytes from the start and 30 bytes from the end, the decoder still says that the file is undecodable. If I cut only the first and the last byte, the decoder tries to decode the file, but at 98% (as the GUI shows, though it may be more) it stops with an error:

"Assertion failure (D:\VocComp\Win\yaaFileDecomp.pas, line 484)"

I click OK and the program closes...


Reply #20
If I cut only 30 bytes from the start and 30 bytes from the end, the decoder still says that the file is undecodable.

Even with only 30 bytes it is possible to damage the headers of the first and the last frame, if the file begins and ends with silence, which can be compressed into very small frames.

If I cut only the first and the last byte, the decoder tries to decode the file, but at 98% (as the GUI shows, though it may be more) it stops with an error:

"Assertion failure (D:\VocComp\Win\yaaFileDecomp.pas, line 484)"

I click OK and the program closes...

I have been sitting here since this morning, waiting for something like this to happen...

But it's very nice to know the code line.

Would it be possible to send me the damaged file which generates this error?

I assume that the last frame is very small and that this causes the error.


Reply #21
E-mail sent. Probably the silence problem.


Reply #22
E-mail sent. Probably the silence problem.

Thank you! I've got it.

Now I will look for the error...

Well, if you don't try to make your codec error tolerant and simply stop decoding (or skip the data) instead of trying to restore it, it's simpler. Then you don't have to deal with quite complicated error-recovery code, which can itself contain errors...

Now TAK looks less stable because of its attempts to restore as much data as possible...


Reply #23
If I cut only 30 bytes from the start and 30 bytes from the end, the decoder still says that the file is undecodable. If I cut only the first and the last byte, the decoder tries to decode the file, but at 98% (as the GUI shows, though it may be more) it stops with an error:

"Assertion failure (D:\VocComp\Win\yaaFileDecomp.pas, line 484)"

I click OK and the program closes...

Fixed!

Sweating...

The compressed audio data of a frame is followed by a CRC. I forgot to check whether there were enough bytes left in my buffer for the CRC. After removal of the last byte (the same is true for 2 or 3 bytes) there weren't enough. (This could only happen with the last frame.)

The fix was so simple that it alone is no reason for a second beta. But let's see what comes next.
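The missing check boils down to one bounds test before reading the trailing CRC. A hedged sketch (zlib's CRC-32 and a 4-byte trailer are stand-ins here; TAK's actual checksum width and layout are not documented in this thread):

```python
import struct
import zlib

CRC_BYTES = 4  # stand-in: assume a 32-bit CRC trails each frame

def read_frame(buf: bytes, offset: int, payload_len: int) -> bytes:
    """Read one frame's compressed payload and verify its trailing CRC.
    The bounds check below is the one the first beta was missing: a
    truncated last frame must fail cleanly, not read past the buffer."""
    if offset + payload_len + CRC_BYTES > len(buf):
        raise ValueError("truncated frame: no room for payload + CRC")
    payload = buf[offset:offset + payload_len]
    (stored,) = struct.unpack_from("<I", buf, offset + payload_len)
    if zlib.crc32(payload) != stored:
        raise ValueError("CRC mismatch")
    return payload
```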


Reply #24
Thank you! I've got it.

Now I will look for the error...

Well, if you don't try to make your codec error tolerant and simply stop decoding (or skip the data) instead of trying to restore it, it's simpler. Then you don't have to deal with quite complicated error-recovery code, which can itself contain errors...

Now TAK looks less stable because of its attempts to restore as much data as possible...

Don't worry about this so much! It is a beta, and it isn't supposed to be totally stable and "production ready"! No one in the future will judge your codec based on an early beta release.

And on top of all this, the error occurs with a corrupted file, which as you said could be ignored. It isn't even an error that will make you lose data, as the file will be recoverable with the next version or, in the worst case, with a specialized tool in the future.

Let's find all the possible hidden bugs and release a clean, stable version!

Edit: "No one in the future will judge your codec based on an early beta release"... but people will look kindly on a developer who is so concerned with the problems in his codec and so quick with bug fixes! That was fast! Thank you.