
TAK 2.1.0 - Beta release 3

Reply #25
Dream Theater - Octavarium
8 tracks encoded with tak 2.1.0 beta 3
Intel Core i5 (quad, no ht) 750 @ 2.67GHz

-pMax

wav size     %tak     -tn1    -tn2    -tn3    -tn4
89183180     66.48%   35*     59*     85*     94*
58760060     59.97%   35*     61*     83*     90*
80443148     66.60%   34*     59*     86*     86*
47510444     70.03%   33*     58*     82*     88*
86969948     70.66%   35*     62*     86*     95*
71707820     71.01%   34*     59*     86*     91*
113418188    67.09%   34*     60*     84*     90*
254018348    63.70%   35*     60*     87*     92*

average      66.29%   34.52*  59.90*  85.37*  91.22*

(* = times realtime)

Amazing speed improvements with the multi-threading!
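For a rough sense of the scaling these numbers imply, the average speeds work out to about a 2.6x gain going from one to four threads on this quad-core; a quick sketch using only the averages quoted above:

```python
# Average encoding speeds (x realtime) from the table above, -pMax.
avg_speed = {"-tn1": 34.52, "-tn2": 59.90, "-tn3": 85.37, "-tn4": 91.22}

base = avg_speed["-tn1"]
for opt, speed in avg_speed.items():
    print(f"{opt}: {speed:6.2f}x realtime, scaling {speed / base:.2f}x vs -tn1")
```

Note the diminishing return from -tn3 to -tn4, visible in every row of the table.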

TAK 2.1.0 - Beta release 3

Reply #26
Well, I'd prefer MP3 over other lossy codecs, because my portable devices don't support any lossless codecs or other lossy codecs besides LossyWav. I don't think it's necessary to have a lossy TAK, because no handheld devices support it. This is only my personal opinion.

I was aware of those restrictions, but thought, there would be some users using LossyWav only for archiving purposes, where the lack of hardware support wouldn't matter.

I think I will remove the dedicated LossyWav codec from TAK 2.1. It can easily be added again if there is demand. But currently I don't want to add such a complex feature that has not been tested by anyone but me. Nevertheless, the development of the codec was quite an interesting task for me.

Maybe you can release an individual build purposed only for LossyWav archiving. If users want a lossy TAK compressor, they can join the beta test to help you with debugging. Since terabyte hard disks are really cheap now, most TAK users want a lossless TAK compressor. I think this would be better than completely removing the LossyWav codec and adding it back when required. And sorry that I can't help testing TAK 2.1.0 Beta 3b, because I'm not at home this week.

Good luck; I can't wait to try the TAK 2.1.0 final release.

TAK 2.1.0 - Beta release 3

Reply #27
Using the same file set as mentioned in my earlier post in the beta 2 thread (10.3 GB in WAV format, encoded by foobar2000, reading from an SSD, writing to a modern hard drive, CPU: Intel Core 2 Duo E7600), I got these results:
   
Code: [Select]
    
TAKv210b3    TAKv210b3    TAKv210b3  TAKv210b3    TAKv210b3
-p4m         -p4m         -p4m       -p4m         -p4m
1 process    2 processes  1 process  2 processes  4 processes
-tn1         -tn1         -tn2       -tn2         -tn1
34:06.467    18:31.429    23:51.730  22:41.264    18:16.874
30.66x       56.46x       43.83x     46.10x       57.21x

Number of processes is controlled by foobar2000 Preferences>Tools>Converter>Thread count
Any available instruction set is used.
Published data is a result of one run (but seems reasonable).

I have done several other smaller tests with -p4m and always come to the same conclusion: encoding with 2 processes + a single thread per process is much faster than one process + 2 threads per process.

I suspect that with faster encoding settings the conclusion will be different. If I find time during the next days, I might try -p0.
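The mm:ss.fff timings above convert directly into a speed ratio; a small sketch (timings copied from the table, the parsing helper is my own):

```python
def to_seconds(t: str) -> float:
    """Parse an 'mm:ss.fff' timing string into seconds."""
    minutes, seconds = t.split(":")
    return int(minutes) * 60 + float(seconds)

# -p4m -tn1: one process vs. two foobar2000 processes (table above)
one_proc = to_seconds("34:06.467")
two_proc = to_seconds("18:31.429")
print(f"2 processes finish {one_proc / two_proc:.2f}x sooner")  # ~1.84x
```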

TAK 2.1.0 - Beta release 3

Reply #28
Quick test of -p0:

Code: [Select]
TAKv210b3
1 process  1 process  2 process
-p0 -tn1   -p0 -tn2   -p0 -tn1
3:10.072   3:13.691   3:39.758
330.20x    324.03x    285.59x

As opposed to -p4m, the -p0 compression setting doesn't use the full CPU power; the load fluctuates up and down. The bottleneck for -p4m is the CPU, and for -p0 it's the drive speed. This is why one process with 2 threads runs faster than 2 processes with one thread each (the opposite of the -p4m result).
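The drive-speed bottleneck is plausible from the numbers alone: at ~330x realtime, CD-quality PCM must be read at roughly 55 MiB/s, near what a single consumer HDD of that era sustains. A back-of-the-envelope sketch (the 330.20 figure is from the table above; the assumed format is 44.1 kHz / 16-bit / stereo):

```python
# CD-quality PCM: 44100 samples/s * 2 channels * 2 bytes per sample
pcm_rate = 44100 * 2 * 2          # 176400 bytes per second of audio

speed = 330.20                    # -p0 -tn1 encoding speed, x realtime
read_rate = speed * pcm_rate      # bytes/s the encoder must ingest
print(f"required read throughput: {read_rate / 2**20:.1f} MiB/s")  # ~55.5
```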

TAK 2.1.0 - Beta release 3

Reply #29
I tried to run an unusual test for you. I'll do it soon, but first I want to report a problem that stopped me from doing it.
Your encoder doesn't seem to be able to handle some paths (I've experienced this before, but forgot to report it).
The ones I'm trying to convert now, in a folder tree downloaded to my pen drive (on my brother's computer), are single-file TTA albums with paths like these (try them):

Quote
L:/Touhou lossless music collection/[dBu music]/2005.05.04 [DBCD-0001] 弾奏結界 紅魔狂詩曲 Scarlet Rapsodia [例大祭2]/CDImage.cue
L:/Touhou lossless music collection/[dBu music]/2005.05.04 [DBCD-0002] 弾奏結界 幻葬旋律曲 Necromanza [例大祭2]/CDImage.cue
L:/Touhou lossless music collection/[dBu music]/2005.05.04 [DBCD-0004] 弾奏結界 追憶鎮魂曲 Nostalgic Requiem [例大祭2]/CDImage.cue
L:/Touhou lossless music collection/[dBu music]/2005.12.30 [DBCD-0005] 深弾奏結界 散華嬉遊曲 Flower Divertimento [C69]/CDImage.cue
L:/Touhou lossless music collection/[dBu music]/2007.05.20 [DBCD-0009] 絶弾奏結界 兎角宴舞曲 courante impromptu [例大祭4]/dbu Music - 絶弾奏結界 兎角宴舞曲 courante impromptu.cue
L:/Touhou lossless music collection/[dBu music]/2007.12.31 [DBCD-0010] 風弾奏結界 神交風雅曲 Oratario del Vento [C73]/Audio CD.cue


And their .tta counterparts, respectively.
(These will be my test subjects in the next test; they weigh almost 2 GB, definitely too much to cache entirely on my rig with 2 GB of RAM.)

Edit: it seems the problem is in the file-creation part, because I renamed the folders to remove the kanji, yet when I tried to convert them into subfolders named by album title, the TAK encoder exited again.

TAK 2.1.0 - Beta release 3

Reply #30
OK, so here's the test itself. I hoped to incorporate I/O times in the test by exceeding the volume of a single album. I even copied the files to the HDD after I got path errors converting from the pen drive. I was really satisfied and hoped to have created a favorable track for the built-in threading... but it all proved to be an own goal...
The reason: I left the source files as they are (single-file-per-album TTAs), and this format poses another bottleneck. It made 2-thread reading from a (quite fresh, formatted a few weeks ago) hard drive a non-issue. What it brought up instead was the slow TTA decoding (~168x realtime on this machine), which runs on a single thread when I use one instance of the TAK encoder... so the results totally contradict my expectations

2 instances, 1 thread each - Total encoding time: 4:19.445, 58.66x realtime
1 instance, 2 threads - Total encoding time: 5:55.745, 42.78x realtime

Anyway, doing 2 reads on a single HDD when the target drive is different hardware doesn't limit anything compared to TTA decoding. It seems the only scenario where I could make the built-in threading win would be converting uncompressed data within one single HDD (I seldom convert from a drive to itself, exactly because it imposes a serious I/O limit on converting, and especially on muxing video), and preferably with more threads than the core count of my current Core 2 Duo. And I still say that built-in threading is a good thing. I may be hopeless  Or I may be thinking of future, massively multicore CPUs (like 8-core Bulldozers).

I still owe you an I/O-independent test between TAK 2.0 and 2.1 on an AMD-based computer (2-core, low-power 2.5 GHz Brisbane K8) - it's right here in the next room  But if there is no difference apart from the SSSE3 optimization, there's no point doing it.

TAK 2.1.0 - Beta release 3

Reply #31
wow, damn brutal speed optimizations, Thomas   

Intel Core i3 350M  - cores:2 threads:4 HT SSSE3

using a ramdisk and -p2 (sufficient for me; the space benefit of -p3 and -p4 is too small considering the higher compression times)

Code: [Select]
thread:1 ssse3
Duration:        23.43 sec
Speed:          169.29 * real time

Code: [Select]
thread:4 ssse3
Duration:        10.08 sec
Speed:          393.27 * real time



Code: [Select]
thread:4 none
Duration:        25.09 sec
Speed:          158.08 * real time
Code: [Select]
thread:4 mmx
Duration:        11.99 sec
Speed:          330.86 * real time


Bugs are possible! If you want to help, please make sure to first compress, then decompress and finally compare the decompressed files with the original files. It may not be sufficient to use -v (Verify) and -md5 (MD5 creation and validation) to reveal multi-core encoder errors!

I don't get the point - MD5 creation using external applications (like nirsoft.net's HashMyFiles) should be sufficient, shouldn't it?
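For what it's worth, the compress-then-decompress-then-compare roundtrip can be automated with nothing but standard-library hashing; a minimal sketch (the file names in the comment are hypothetical, and MD5 here stands in for a full byte-wise compare):

```python
import hashlib

def md5_of(path: str) -> str:
    """MD5 of a file, read in 1 MiB chunks so large WAVs don't fill RAM."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def roundtrip_ok(original: str, decoded: str) -> bool:
    """True if the decoded file is bit-identical (by MD5) to the original."""
    return md5_of(original) == md5_of(decoded)

# e.g. roundtrip_ok("track01.wav", "track01.decoded.wav")
```

This is essentially what external tools like HashMyFiles do; the point of the warning above is presumably that -v/-md5 run inside the same (possibly faulty) encoder process, while an external compare of the decoded output does not.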

greetings to the good ol wueste 

TAK 2.1.0 - Beta release 3

Reply #32
Thanks TBeck, for this brilliant codec.
Just want to express my desire to see this released for Linux based systems!
Remember us!

TAK 2.1.0 - Beta release 3

Reply #33
Hello!
Just finished testing TAK 2.0.3 b. Here are my results.

PC configuration:
Core i3 530 2.93 GHz
2x2Gb DDR3-1333
SATAII HDD 500 Gb Hitachi (source), SATAII HDD 1TB Hitachi (destination)
Windows 7 32-bit

Audio:
File size : 804MB (843 702 428 bytes)
Duration : 1:19:42.893 (210925596 samples)
Sample rate : 44100 Hz
Channels : 2
Bits per sample : 16
Bitrate : 1411 kbps
Codec : PCM
Encoding : lossless

Encoding: foobar2000 1.1.1, tak -e -<x> -tn1 -ihs - %d
Decoding: foobar2000 1.1.1 with foo_benchmark (high priority, buffer entire file into memory, 5 passes)
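As a quick consistency check on the file info above, the reported bitrate follows from the size and sample count (all figures are from the post; only the arithmetic is added):

```python
size_bytes = 843_702_428     # reported file size
samples = 210_925_596        # reported sample count
sample_rate = 44100          # Hz

duration_s = samples / sample_rate             # ~4782.9 s = 1:19:42.9
bitrate_kbps = size_bytes * 8 / duration_s / 1000
print(f"{bitrate_kbps:.0f} kbps")              # matches the reported 1411 kbps
```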


TAK 2.1.0 - Beta release 3

Reply #34
Before TAK compression.

Hash my file==> CYNTHIA HARRELL - I AM THE WIND.wav   MD5:1c27a7dab07a51a276acf5dc644bc0ad   SHA1:c17030bbae306f974c9fe7c4c74048c33e70db22   CRC32:ce752a4a

--------------------------------------------------------------------------------------------

After TAK compression (4 threads + SSSE3 + MD5 + Verify + -p4m)

File Info==>CYNTHIA HARRELL - I AM THE WIND.tak    MD5:641729292fb0ffb5b0cd95b3f243c9b0

Hash my file==>CYNTHIA HARRELL - I AM THE WIND.tak   MD5:e4df01999fb3b67a67c9bb850c480a34   SHA1:545e11cd6991c3fcb70cb460edd4302739f890b1   CRC32:6595eb49

--------------------------------------------------------------------------------------------

After TAK decompression

Hash my file==>CYNTHIA HARRELL - I AM THE WIND.wav   MD5:1c27a7dab07a51a276acf5dc644bc0ad   SHA1:c17030bbae306f974c9fe7c4c74048c33e70db22   CRC32:ce752a4a

--------------------------------------------------------------------------------------------

After TAK compression, 2nd time (1 thread + no SIMD + -p2)


Hash my file==>CYNTHIA HARRELL - I AM THE WIND.tak   MD5:9248dfa820c5c800c4a4531fd1627a7c   SHA1:253a2f613fc0272e87cc57edfa70ce55383bb089   CRC32:22ea9bca

--------------------------------------------------------------------------------------------

After TAK decompression, 2nd time

Hash my file==>CYNTHIA HARRELL - I AM THE WIND.wav   MD5:1c27a7dab07a51a276acf5dc644bc0ad   SHA1:c17030bbae306f974c9fe7c4c74048c33e70db22   CRC32:ce752a4a

--------------------------------------------------------------------------------------------

It looks like there is no error compressing files with SSSE3 + multi-threading. Hope this helps!

EDIT: Added another test.

TAK 2.1.0 - Beta release 3

Reply #35
Thank you all for testing!

...
Amazing speed improvements with the multi-threading!

Cool!


I think I will remove the dedicated LossyWav codec from TAK 2.1. It can easily be added again if there is demand. But currently I don't want to add such a complex feature that has not been tested by anyone but me. Nevertheless, the development of the codec was quite an interesting task for me.

Maybe you can release an individual build purposed only for LossyWav archiving. If users want a lossy TAK compressor, they can join the beta test to help you with debugging. Since terabyte hard disks are really cheap now, most TAK users want a lossless TAK compressor. I think this would be better than completely removing the LossyWav codec and adding it back when required.

Well, i have removed the new dedicated LossyWav codec for now.

I don't intend to release a separate build including this codec, for several reasons (only the most important ones listed):

- Although I trust my comprehensive automatic test scripts, I know that external tests are required to feel really safe. If nobody is testing the new codec, my personal quality standards have not been met.
- Maybe it was generally a bad idea to add such a codec to TAK - especially if it had been widely used! Maybe TAK would then get the reputation that it cannot be trusted, because too much lossy content is being published with it. Since you can never be sure that the source of a lossless encode isn't lossy, this is logically and practically bullshit, but it is easy to get a bad reputation.

I have done several other smaller tests with -p4m and always come to the same conclusion: encoding with 2 processes + a single thread per process is much faster than one process + 2 threads per process.

Maybe I have to tune the I/O part of the multi-threaded encoder to make it competitive with foobar2000's simultaneous multiple-file encoding...

I tried to run an unusual test for you. I'll do it soon, but first I want to report a problem that stopped me from doing it.
Your encoder doesn't seem to be able to handle some paths (I've experienced this before, but forgot to report it).
....
Edit: it seems the problem is in the file-creation part, because I renamed the folders to remove the kanji, yet when I tried to convert them into subfolders named by album title, the TAK encoder exited again.

Maybe you just noticed that TAK currently doesn't support Unicode character sets? Sorry...

Anyway, doing 2 reads on a single HDD when the target drive is different hardware doesn't limit anything compared to TTA decoding. It seems the only scenario where I could make the built-in threading win would be converting uncompressed data within one single HDD (I seldom convert from a drive to itself, exactly because it imposes a serious I/O limit on converting, and especially on muxing video), and preferably with more threads than the core count of my current Core 2 Duo. And I still say that built-in threading is a good thing. I may be hopeless  Or I may be thinking of future, massively multicore CPUs (like 8-core Bulldozers).

More evidence that TAK's multi-core encoder might need some tuning... Thank you!

wow, damn brutal speed optimizations, Thomas 

Thank you!

Thanks TBeck, for this brilliant codec.
Just want to express my desire to see this released for Linux based systems!
Remember us!

No promises, but requests can always affect the priorities of items on my todo list.

Hello!
Just finished testing TAK 2.0.3 b. Here are my results.

Great! The diagram illustrates exactly how TAK's preset system and general design are intended to work:

- Always fast decoding, affected only slightly by the preset choice.
- Decoding is about equally fast across a preset's evaluation levels.
- A really good default preset, -p2, regarding compression ratio, encoding speed and decoding speed. Again, thanks to the users who helped to create it!
- Maybe it's another hint that it's now time to make -p4m a bit stronger...

It looks like there is no error compressing files with SSSE3 + multi-threading. Hope this helps!

Definitely! Thank you very much!

I will now prepare the final release. It will still be called 2.1, although I usually only change the second digit of the version number if something has been added that cannot be decoded by prior versions. Now that the LossyWav codec has been removed, that is no longer the case. But I think it would be irritating to release a 2.0.1 now and later a 2.1 with totally different functionality than the current betas (named 2.1).

BTW: I wasn't totally lazy in the meantime. The next release (2.1.1) will, among other things, again be a bit faster.

  Thomas

TAK 2.1.0 - Beta release 3

Reply #36
Thank you, Thomas, for sharing your work. Always a pleasure and always exciting to participate. Also, I forgot to thank you for conjuring up multi-core abilities as a reaction to my question about TAK's potential parallelism. Could it be that multi-core was already in the works? :shrug:

- Maybe it was generally a bad idea to add such a codec to TAK - especially if it had been widely used! Maybe TAK would then get the reputation that it cannot be trusted, because too much lossy content is being published with it.
I've said before that I, too, don't believe in tying lossy techniques to a lossless program. I think it's my hard-nosed conviction about the fundamental philosophy behind lossless audio encoding. Perhaps it's because I work primarily with original waveforms in studio projects and take the lossless thing quite seriously, arguably more than a person who can simply re-rip their CDs. All I'm trying to say is that I agree: as long as the person who packaged the source material is trusted, TAK will deliver lossless as specified.

In regards to Steve Forte Rio's diagram:
Quote
Maybe it's another hint, that it's now time to make -p4m a bit stronger...
I haven't run my own tests yet, but the diagram shows only minor compression gains from migrating from -p0e to -p0m. Perhaps it was the material being tested that gave this result, but it raises the question of whether the evaluation modes -pXe and -pXm share qualities independent of the numerical preset. If so, can the evaluation modes be strengthened?

Happy new year!
"Something bothering you, Mister Spock?"

TAK 2.1.0 - Beta release 3

Reply #37
Quote
- Maybe it's another hint, that it's now time to make -p4m a bit stronger...


looking forward to it 

 

TAK 2.1.0 - Beta release 3

Reply #38
Great! The diagram illustrates exactly how TAK's preset system and general design are intended to work:

- Always fast decoding, affected only slightly by the preset choice.
- Decoding is about equally fast across a preset's evaluation levels.
- A really good default preset, -p2, regarding compression ratio, encoding speed and decoding speed. Again, thanks to the users who helped to create it!
- Maybe it's another hint that it's now time to make -p4m a bit stronger...

  Thomas


Well, I think with the speedup provided by SSSE3 and multi-threaded compression, you could simplify the presets by rescaling P0-P5:

- P0 (new) -> P1 (current)
- P1 (new) -> P2 (current)
- P2 (new) -> P3 (current)
- P3 (new) -> P4 (current)
- P4 (new) -> P4M (current)
- P5 (new) -> stronger than the current P4M

with the default set to P2 (new).

There are too many presets listed in detail, and some of them are seldom used or even useless.

 