Topic: Multithreaded lame version? (Read 7431 times)

Multithreaded lame version?

I tested a multithreaded LAME version which claims to be up to 50% faster on multiprocessor systems.
My question for the LAME developers is simple:
Are they going to reach a final state sometime, or is this just a "bad hack"?

Multi Threaded LAME


Reply #1
Quote
I tested a multithreaded LAME version which claims to be up to 50% faster on multiprocessor systems.
My question for the LAME developers is simple:
Are they going to reach a final state sometime, or is this just a "bad hack"?

http://softlab.technion.ac.il/project/LAME/html/lame.html


I would guess that if you simply run two instances of a normal LAME compile simultaneously, you should get at least the same 50% speedup on such a system, since MP3 encoding is mostly CPU/memory bound rather than disc-I/O bound. Working from two different hard discs should boost it even more, though.
As I said, only guessing.


Reply #2
I do this a lot when encoding. I'll rip a bunch of images, and I wrote a C++ program to find all images in a directory and encode them to MP3. I've got 2 CPUs, so if I'm not doing anything that would require the other processor, I run two instances of LAME and get the equivalent of a two-thread encode.

However, I stopped doing this once I realized how hard it is on the disc. It has to constantly leap back and forth between the two files being encoded to do write operations... this can go on for hours on end if you're doing a big batch encode. A true multithreaded version of LAME would put your drive under far less strain.
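A rough sketch of that kind of batch encoder, here in Python rather than the poster's C++; the `lame` binary on the PATH, the `-V2` preset, and the `.wav` extension are all assumptions:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def build_lame_cmd(wav: Path) -> list[str]:
    # Hypothetical command line; assumes a `lame` binary on the PATH.
    return ["lame", "-V2", str(wav), str(wav.with_suffix(".mp3"))]

def encode_all(directory: str, workers: int = 2) -> None:
    """Run one LAME process per worker, like two hand-started instances."""
    wavs = sorted(Path(directory).glob("*.wav"))
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Each thread just waits on its child process, so two workers
        # keep both CPUs busy exactly as described above.
        list(pool.map(lambda w: subprocess.run(build_lame_cmd(w), check=True), wavs))
```

Calling `encode_all("rips", workers=2)` mirrors running two instances by hand.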


Reply #3
Quote
I do this a lot when encoding. I'll rip a bunch of images, and I wrote a C++ program to find all images in a directory and encode them to MP3. I've got 2 CPUs, so if I'm not doing anything that would require the other processor, I run two instances of LAME and get the equivalent of a two-thread encode.

However, I stopped doing this once I realized how hard it is on the disc. It has to constantly leap back and forth between the two files being encoded to do write operations... this can go on for hours on end if you're doing a big batch encode. A true multithreaded version of LAME would put your drive under far less strain.


How is it hard on the disc to read from it?

You're still reading and writing the same amount of data. I also doubt you're even approaching the disc's read speed. Even if your multithreaded version were encoding at 40x, you'd still only be reading under 7 MB/s, just a fraction of a modern disc's sequential read speed.
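As a quick sanity check of that figure, assuming CD-quality input (44.1 kHz, stereo, 16-bit samples):

```python
# CD audio: 44100 samples/s x 2 channels x 2 bytes per sample.
BYTES_PER_SEC = 44100 * 2 * 2             # 176,400 B/s at 1x playback speed
read_rate_40x = 40 * BYTES_PER_SEC / 1e6  # MB/s read by a 40x encode

print(f"{read_rate_40x:.2f} MB/s")        # prints "7.06 MB/s"
```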


Reply #4
Quote
Quote
I do this a lot when encoding. I'll rip a bunch of images, and I wrote a C++ program to find all images in a directory and encode them to MP3. I've got 2 CPUs, so if I'm not doing anything that would require the other processor, I run two instances of LAME and get the equivalent of a two-thread encode.

However, I stopped doing this once I realized how hard it is on the disc. It has to constantly leap back and forth between the two files being encoded to do write operations... this can go on for hours on end if you're doing a big batch encode. A true multithreaded version of LAME would put your drive under far less strain.


How is it hard on the disc to read from it?

You're still reading and writing the same amount of data. I also doubt you're even approaching the disc's read speed. Even if your multithreaded version were encoding at 40x, you'd still only be reading under 7 MB/s, just a fraction of a modern disc's sequential read speed.


It's not the reading, it's the writing, and it's not a matter of rates; it's the fact that you're encoding two things at once rather than one thing twice as fast. The write head has to skip back and forth continuously between two locations as both instances output encoded data.

When I was encoding with one instance of LAME, my disc was nice and quiet, just writing away. When I ran two, the disc sounded like it was chewing away as the write head jumped all over the platter. I don't like keeping the disc under that kind of strain.


Reply #5
Why not just have each instance write to a byte array, RAM disc, or some other kind of memory buffer, then page it out to the disc once finished? It should be simple enough. Your hard drive is almost guaranteed to be doing that already with its own write buffer, so I wouldn't worry about wrecking your disc unless you run this 24/7; if you do, then have it write to a buffer first.
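A minimal sketch of that buffer-then-flush idea, assuming a hypothetical encoder that hands back compressed chunks: accumulate them in RAM and touch the disc only once per file.

```python
import io

def buffered_write(chunks, out_path):
    """Collect encoder output in RAM, then write it in one sequential pass."""
    buf = io.BytesIO()
    for chunk in chunks:            # chunks: iterable of bytes from the encoder
        buf.write(chunk)
    with open(out_path, "wb") as f:
        f.write(buf.getvalue())     # a single large write per finished file
```

With two encoder instances feeding two such buffers, the drive sees one big sequential write per finished track instead of a stream of interleaved small ones.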


Reply #6
Quote
It's not the reading, it's the writing, and it's not a matter of rates; it's the fact that you're encoding two things at once rather than one thing twice as fast. The write head has to skip back and forth continuously between two locations as both instances output encoded data.

When I was encoding with one instance of LAME, my disc was nice and quiet, just writing away. When I ran two, the disc sounded like it was chewing away as the write head jumped all over the platter. I don't like keeping the disc under that kind of strain.

The read and write heads are on the same arm, so it's both.


Reply #7
Perhaps it would work better if you ran the two LAME binaries from separate drives.
flac > schiit modi > schiit magni > hd650


Reply #8
Quote
I tested a multithreaded LAME version which claims to be up to 50% faster on multiprocessor systems.
My question for the LAME developers is simple:
Are they going to reach a final state sometime, or is this just a "bad hack"?

http://softlab.technion.ac.il/project/LAME/html/lame.html

This thread discusses this version of LAME: Multi-threading Lame Encoder (http://www.hydrogenaudio.org/forums/index.php?showtopic=29811&st=0&p=257819&#entry257819)

Also threads worth noting:
Lame and SMP
Lame compiles for hyperthreading?

Also, if you are still curious about multi-threaded LAME, search the internet for gogo-no-coda and look at the discussions surrounding it.

Hope this helps, tec


Reply #9
Quote
Perhaps it would work better if you ran the two LAME binaries from separate drives.


If I'm right about two instances working the hard drive back and forth too fast, that would solve the problem.


Reply #10
Already stated in the second post:
Quote
I would guess that if you simply run two instances of a normal LAME compile simultaneously, you should get at least the same 50% speedup on such a system, since MP3 encoding is mostly CPU/memory bound rather than disc-I/O bound. Working from two different hard discs should boost it even more, though.

And, yes, this thread should discuss a multithreaded LAME version, but I think a short discussion about whether one is needed is not too off-topic...

Cheers.


Reply #11
PFS:  I don't think you understand how HDs work.

Quote
It's not the reading, it's the writing, and it's not a matter of rates; it's the fact that you're encoding two things at once rather than one thing twice as fast.


This is effectively the same thing. The two files are both being written across various sectors of the hard disc, so it makes no difference how many files are being written, only the transfer rate.

Quote
The write head has to skip back and forth continuously between two locations as both instances output encoded data.


Actually, it does that anyway. The data is written to whatever free sectors the file system chooses. There's no reason to think your two files aren't being write-cached and then written interleaved to the same place.

Quote
When I was encoding with one instance of LAME, my disc was nice and quiet, just writing away. When I ran two, the disc sounded like it was chewing away as the write head jumped all over the platter.


That's nonsense. The write rate of LAME's output is tiny compared to the write speed of your disc, probably much less than a tenth of it unless you have an extremely slow hard disc. Furthermore, writes are cached by Windows and committed in bulk when the write buffer fills. Whatever you were hearing was most likely your imagination, or simply the OS swapping out virtual memory.

Quote
I don't like keeping the disk under that kind of strain.


Your disc serves the same number of writes either way, so there's no additional strain. Anyway, if you're worried about 50 MB of writes from LAME, you must be terrified of the gigabytes of writes the OS does for virtual memory alone every time you boot...


Reply #12
Quote
Quote
I tested a multithreaded LAME version which claims to be up to 50% faster on multiprocessor systems.
My question for the LAME developers is simple:
Are they going to reach a final state sometime, or is this just a "bad hack"?

http://softlab.technion.ac.il/project/LAME/html/lame.html

This thread discusses this version of LAME: Multi-threading Lame Encoder (http://www.hydrogenaudio.org/forums/index.php?showtopic=29811&st=0&p=257819&#entry257819)

Also threads worth noting:
Lame and SMP
Lame compiles for hyperthreading?

Also, if you are still curious about multi-threaded LAME, search the internet for gogo-no-coda and look at the discussions surrounding it.

Hope this helps, tec


Looking at it in PrcView, I noticed that Gogo uses two threads.


Reply #13
You'll get a better speedup if you just run two instances of standard LAME at once.


Reply #14
Write a batch file or script (I assume Windows has tools for making RAM disks available) that does the following:
1) Creates a RAM disk
2) Copies the source file to the RAM disk
3) Instructs LAME to encode, outputting to the RAM disk
4) Copies the .mp3 off the RAM disk onto your hard drive
Wash, rinse, repeat for each file to encode. Run one script per CPU and you will have neatly parallel LAME processes without too much hard-drive thrashing. The obvious downside is that you can't encode files bigger than about a quarter of your RAM, but that shouldn't be a problem with single tracks on a machine with a decent amount of RAM. Linux users need to include the appropriate kernel support.

This process is a bit of a pain in the ass, but it should give you the fastest encodes on a multiprocessor box short of modifying the LAME source or adding more drives.
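Steps 2 through 4 can be sketched as a small Python helper; the RAM disk mount point, the `lame` invocation, and the injectable `encode` hook are all assumptions (step 1, creating the RAM disk, is OS-specific):

```python
import shutil
import subprocess
from pathlib import Path

def _lame(src: Path, dst: Path) -> None:
    # Assumes a `lame` binary on the PATH; -V2 is an arbitrary preset.
    subprocess.run(["lame", "-V2", str(src), str(dst)], check=True)

def encode_via_ramdisk(src: Path, ramdisk: Path, dest_dir: Path, encode=_lame) -> Path:
    """Stage the source on the RAM disk, encode there, copy the result back."""
    staged = ramdisk / src.name
    shutil.copy2(src, staged)                           # 2) source onto the RAM disk
    mp3 = staged.with_suffix(".mp3")
    encode(staged, mp3)                                 # 3) encode entirely in RAM
    out = Path(shutil.copy2(mp3, dest_dir / mp3.name))  # 4) one sequential copy to disk
    staged.unlink()
    mp3.unlink()                                        # free RAM-disk space for the next file
    return out
```

Running one such loop per CPU gives the parallelism described above, while the hard drive only ever sees the final sequential copies.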