

Is metadb data size a concern in 2.0?

My foo_enhanced_playcount component probably stores one of the larger chunks of metadb data out there (depending on how many plays/scrobbles you have), and it seems to be adding significant bloat to metadb.sqlite.

With 202k scrobbles plus probably another 150k plays in foobar (so around 352k timestamp_t values, I'd guess), my old index-data file took up 5.1 MB. Now in metadb.sqlite that data appears to take up a whopping 56 MB (I saved a copy, deleted the tables, and compacted the database to get that number, so it may not be 100% accurate). Since beta 18 it's my understanding that this data is loaded into memory just like in 1.0. Is it actually using all that extra space in memory?
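For anyone wanting to reproduce that measurement, here is a minimal sketch of the copy/drop/VACUUM approach against a throwaway copy of metadb.sqlite. The table names in the loop are purely hypothetical placeholders; substitute whatever tables the component actually creates (the sqlite_master listing will show them).

```python
import os
import shutil
import sqlite3

SRC = "metadb.sqlite"
COPY = "metadb_copy.sqlite"

# Work on a copy so the live database is never touched.
shutil.copyfile(SRC, COPY)

# isolation_level=None keeps the connection in autocommit mode,
# which is required for VACUUM to run.
con = sqlite3.connect(COPY, isolation_level=None)

# List all tables so the component's tables can be identified by name.
tables = [row[0] for row in con.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]
print("tables:", tables)

# Hypothetical table names -- replace with the ones the component really uses.
for table in ("EnhancedPlaycount", "EnhancedPlaycountData"):
    if table in tables:
        con.execute(f'DROP TABLE "{table}"')

# Rewrite the file so the freed pages are actually reclaimed.
con.execute("VACUUM")
con.close()

before = os.path.getsize(SRC)
after = os.path.getsize(COPY)
print(f"component data accounts for roughly {(before - after) / 1024 / 1024:.1f} MB")
```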

Should I be taking steps to compress the data being saved to my metadb?
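In case it helps frame the question: by "compress" I mean something along these lines, delta-encoding the sorted timestamps and deflating the result before writing the blob. This is just an illustrative sketch, not the format the component currently uses, and the function names are made up.

```python
import struct
import zlib

def pack_timestamps(timestamps: list[int]) -> bytes:
    """Delta-encode sorted 64-bit timestamps, then deflate the result.

    Purely illustrative -- not foo_enhanced_playcount's actual on-disk format.
    """
    ts = sorted(timestamps)
    deltas = [ts[0]] + [b - a for a, b in zip(ts, ts[1:])]
    raw = struct.pack(f"<{len(deltas)}q", *deltas)
    return zlib.compress(raw, 9)

def unpack_timestamps(blob: bytes) -> list[int]:
    """Inverse of pack_timestamps: inflate, then undo the delta encoding."""
    raw = zlib.decompress(blob)
    deltas = struct.unpack(f"<{len(raw) // 8}q", raw)
    out, total = [], 0
    for d in deltas:
        total += d
        out.append(total)
    return out
```

Since play timestamps per track are close together, the deltas are small and compress well, but it only pays off if the extra space really does end up in memory.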