
number of processor cores settings in fb?

I have a Q6600 and I noticed that foobar2000 knows to use 4 independent instances of flac.exe when converting a whole list of files. Is there a setting in fb that dictates how many instances of flac.exe foobar2000 uses when doing a mass convert? I'd like to limit it to 2 or 3, for example.

Thanks!

number of processor cores settings in fb?

Reply #1
no

number of processor cores settings in fb?

Reply #2
Ctrl + Alt + Delete
Task Manager
Processes tab
Find the foobar2000.exe process
Right-click on it
Click Set Affinity
Check or uncheck the processors you want it to use

Enjoy or don't.
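
If you'd rather script this than click through Task Manager every time, here is a minimal sketch using the third-party psutil package (my assumption; foobar2000 itself exposes no such API). It pins every running foobar2000.exe process to the first two logical CPUs:

    # pin_foobar.py - minimal sketch, assumes psutil is installed (pip install psutil) on Windows
    import psutil

    ALLOWED_CPUS = [0, 1]  # logical CPUs foobar2000 may use; adjust to taste

    for proc in psutil.process_iter(["name"]):
        try:
            if proc.info["name"] and proc.info["name"].lower() == "foobar2000.exe":
                proc.cpu_affinity(ALLOWED_CPUS)  # same effect as Task Manager's "Set Affinity"
                print(f"Pinned PID {proc.pid} to CPUs {ALLOWED_CPUS}")
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            pass  # process vanished or we lack the rights to change it

On Windows, child processes inherit the parent's affinity mask by default, so pinning foobar2000 before starting a conversion should also constrain the flac.exe instances it spawns.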
Zune 80, Tak -p4 audio library, Lossless=Choice

number of processor cores settings in fb?

Reply #3
@Gow - thanks, just what I wanted

number of processor cores settings in fb?

Reply #4
By messing with threads and CPU affinity, you are shooting yourself in the foot. An app can decide much better about multicore usage at any given moment than you can. An app can also make use of multiple cores in situations where you wouldn't expect it, for example to speed up certain UI work (thus increasing responsiveness) or to improve the speed of library searches. All of this is lost if you mess with that; basically, you are just reducing the app's efficiency.

If you don't want conversion tasks to interfere with your foreground tasks... then there is something called thread priority. Conversion tasks are set to very low priority by default, precisely so that they don't interfere with foreground work. So there isn't even a point in changing the defaults.
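
If you'd rather verify that than take it on faith, here is a small sketch (assuming the third-party psutil package on Windows; none of this is a foobar2000 API) that reports the priority class of any running flac.exe encoders while a conversion is in progress:

    # check_encoder_priority.py - hedged sketch, assumes psutil on Windows
    import psutil

    for proc in psutil.process_iter(["name"]):
        try:
            if proc.info["name"] and proc.info["name"].lower() == "flac.exe":
                prio = proc.nice()  # with no argument, nice() reads the current priority class
                low = prio in (psutil.IDLE_PRIORITY_CLASS, psutil.BELOW_NORMAL_PRIORITY_CLASS)
                print(f"flac.exe (PID {proc.pid}): priority class {prio} "
                      f"({'already low' if low else 'normal or higher'})")
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            pass  # the encoder may have exited mid-scan, or we lack rights

Note that priority class only affects CPU scheduling, not disk I/O, which comes up further down the thread.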
I am arrogant and I can afford it because I deliver.

number of processor cores settings in fb?

Reply #5
By messing with threads and CPU affinity, you are shooting yourself in the foot. An app can decide much better about multicore usage at any given moment than you can.


I wouldn't say this about software with an open component architecture. One simple example where you can determine multicore usage better than the app is the use of a non-thread-safe component. This isn't a problem that foobar2000 has, except with one component that I can think of, but I disagree with the use of such an absolute statement.

number of processor cores settings in fb?

Reply #6
reil: We regard components that are not thread-safe as broken, so this is insufficient motivation to add such a global setting to foobar2000. The components in question should be fixed by their authors instead.

number of processor cores settings in fb?

Reply #7
Nice discussion, all. I just want to limit the disk usage by dropping it down to two flac.exe instances running at once. Having 4 will not only hammer the hard drive, it will also require me to run a long defrag job afterwards.

number of processor cores settings in fb?

Reply #8
Having 4 will not only hammer the hard drive, it will also require me to run a long defrag job afterwards.
Why? Are your hard drives that slow??? I haven't used defrag applications since W2K.

number of processor cores settings in fb?

Reply #9
Having 4 will not only hammer the hard drive, it will also require me to run a long defrag job afterwards.
Why? Are your hard drives that slow??? I haven't used defrag applications since W2K.


I think you'd be surprised by the disk churning caused by four large WAV -> FLAC conversions going on even if the process priority is set low*.  I wouldn't be surprised if part of the problem is due to Windows stupendously brain-dead paging (& memory-mapped file) behaviors.

-brendan

* I really wish I could prioritize disk-IO or paging by process sometimes.
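
For what it's worth, Windows Vista and later do expose per-process I/O priority, and the third-party psutil package can set it. A hedged sketch, assuming a reasonably recent psutil on Windows (IOPRIO_LOW is psutil's constant, not anything foobar2000 provides):

    # lower_flac_io.py - hedged sketch: drop the encoders' disk-I/O priority via psutil
    # Assumes Windows Vista or later and a psutil build that exposes the IOPRIO_* constants there.
    import psutil

    for proc in psutil.process_iter(["name"]):
        try:
            if proc.info["name"] and proc.info["name"].lower() == "flac.exe":
                proc.ionice(psutil.IOPRIO_LOW)  # lower the encoder's I/O priority class
                print(f"flac.exe (PID {proc.pid}) set to low I/O priority")
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            pass  # the encoder may have exited, or we lack rights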

number of processor cores settings in fb?

Reply #10
I wouldn't be surprised if part of the problem is due to Windows stupendously brain-dead paging (& memory-mapped file) behaviors.

Yes, stupid Microsoft for hiring garden gnomes to develop their memory management code. Tsk tsk. Garden gnomes make pretty ornaments, but very poor developers. It's a shame they rejected your job application. You obviously know what you're talking about.

number of processor cores settings in fb?

Reply #11
I wouldn't be surprised if part of the problem is due to Windows stupendously brain-dead paging (& memory-mapped file) behaviors.

Yes, stupid Microsoft for hiring garden gnomes to develop their memory management code. Tsk tsk. Garden gnomes make pretty ornaments, but very poor developers. It's a shame they rejected your job application. You obviously know what you're talking about.


Ok, point taken - my florid language was more than a bit over the top.  I just get annoyed at the file cache putting pressure on applications to page large data sets at inconvenient (to me) times.

-brendan

number of processor cores settings in fb?

Reply #12
Having 4 will not only hammer the hard drive, it will also require me to run a long defrag job afterwards.
Why? Are your hard drives that slow??? I haven't used defrag applications since W2K.


Wow dude, I'd challenge you to scan your drive with the m$ util that's built into your o/s. I think you'll be amazed at just how fragmented your partitions are; Windows is exceptionally bad in this regard.

What I suspected about fragmentation while using 4 threads is true; I have over 500 fragments per track that I converted! I guess the real experiment is to try it with 1 thread and see if it's a function of "convert to same folder" (i.e., reading from and writing to the same disk) that's causing the problems.
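
For anyone who wants to reproduce that fragment count, one way (my suggestion, not something mentioned above) is Sysinternals Contig: its -a switch only analyzes a file's fragmentation without moving anything. A minimal wrapper, assuming contig.exe is on your PATH and the converted tracks live in a hypothetical D:\converted folder:

    # count_fragments.py - sketch: report fragmentation of converted FLACs via Sysinternals Contig
    # Assumptions: contig.exe is on PATH; its "-a" switch analyzes without defragmenting.
    import subprocess
    from pathlib import Path

    MUSIC_DIR = Path(r"D:\converted")  # hypothetical folder holding the converted tracks

    for flac_file in MUSIC_DIR.rglob("*.flac"):
        # Contig reports how many fragments the file occupies; we simply relay its output.
        result = subprocess.run(["contig", "-a", str(flac_file)],
                                capture_output=True, text=True, check=False)
        print(result.stdout.strip())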

number of processor cores settings in fb?

Reply #13
Wow dude, I'd challenge you to scan your drive with the m$ util that's built into your o/s. I think you'll be amazed at just how fragmented your partitions are
So what? No, I won't be amazed or surprised. In my personal experience, defragging is a waste of time. Two of my old P3-era computers, with newer IDE hard disks, regular use, and lots of file I/O, run way better than lots of the "out of the box" consumer Windows boxes. Defragging has never been necessary here, never ever. No problem if they run 150 instead of 170 mph. (It's the same with TrueCrypt volumes: I have no interest in measuring the disk I/O and CPU difference; everything works fine, and that's what I want.)

number of processor cores settings in fb?

Reply #14
Quick update: the number of threads has no effect on the number of file fragments when converting to the same directory via a transcode. You will get literally hundreds of fragments when using this function with 4 threads or with 1.

Too bad: I can convert APE -> FLAC with 4 threads at over 80x realtime, but I have to run defrag for such a LONG time afterwards to clean up the fallout.

number of processor cores settings in fb?

Reply #15
If you really need to defrag, there is a free defragmenter that is much faster than the built-in one: http://www.kessels.com/JkDefrag/


number of processor cores settings in fb?

Reply #17
NTFS was built to fragment, just like ext3.

Unless you're running Windows 98 (on FAT32), fragmentation doesn't matter anymore on a home desktop.

I defrag once a year, but never have performance issues.
elevatorladylevitateme

number of processor cores settings in fb?

Reply #18
This thread needs ABX for disk-fragmentation slowdown placebo.
I am arrogant and I can afford it because I deliver.

number of processor cores settings in fb?

Reply #19
Yeah, it's kinda gone a bit off topic.

number of processor cores settings in fb?

Reply #20
Makes me want to go back in time and un-post the solution. Calm down and remember that personal computers are personal, so run yours how you want to run it and the other person will run theirs how they want. We don't all have to be the same.

JkDefrag has a GUI front end that got made, and another good one is WinContig: http://wincontig.mdtzone.it/en/index.htm

All file systems fragment over time and use, though I have had more luck with NTFS than with other ones, including Apple's touted superior file system. NTFS is pretty robust considering it is NT. FAT was just a waste of time, but it is still necessary if you want to read and write to a Windows partition from Linux... until they finally get the writing end of NTFS working; they have the reading end down. Ext3 is good, and so is ReiserFS, though even Linux is prone to fragmentation... and that is why they made defragmenting tools.

"To Defragment or not to defragment that is the question,
Whether tis nobler in the Computer to suffer
the files and fragments of outrageous fortune
Or to take arms against a sea of fragmentation
And by defragmenting end the fragmentation"

- Hamlet.app 2.0 
Zune 80, Tak -p4 audio library, Lossless=Choice

number of processor cores settings in fb?

Reply #21
Logically, "much faster" defragmentor means it finds less fragments than others or skips lots of fragmented files, right? .

You know that Windows built in defragmenter is a version of a commercial program Diskeeper? That almost all defragmenting apps use Windows built in api for defragmenting?