Can someone tell me how to speed up calibre's add-to-library operation? It shouldn't take 10 minutes to add 20 books totaling 12 MB. Modern computers can move that amount of data in under a second, so why is calibre dragging ass?

Calibre doesn't just move the data. Depending on your settings, calibre also has to unzip each book and examine its contents, and perhaps convert it and zip it up again. And it has to do all of this in a way that ensures the calibre library doesn't lose data.

Calibre sometimes automatically does a lot of very heavy lifting, depending on the preferences set.

It is without any doubt the most demanding program I have installed on my computer, perhaps with the exception of HandBrake for video conversion. But HandBrake mainly loads the CPU; it doesn't hit the I/O the way calibre does.
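To illustrate why import is heavier than a raw byte copy: an EPUB is a zip archive, so examining it means opening the archive and reading files out of it. This is only a minimal sketch of that idea, not calibre's actual code, and the single-file "EPUB" built here (just a `content.opf` with a title) is a hypothetical stand-in for a real book.

```python
# Sketch (not calibre's code): adding a book is not a raw byte copy --
# the archive has to be opened and its contents examined.
import io
import zipfile

def read_epub_title(epub_bytes: bytes) -> str:
    """Open the EPUB (a zip archive) and pull a title out of its OPF file."""
    with zipfile.ZipFile(io.BytesIO(epub_bytes)) as zf:
        opf = zf.read("content.opf").decode("utf-8")
        # Crude string extraction for the sketch; real code parses the XML.
        start = opf.index("<dc:title>") + len("<dc:title>")
        end = opf.index("</dc:title>")
        return opf[start:end]

# Build a minimal stand-in "EPUB" in memory to show the round trip.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("content.opf",
                "<package><dc:title>Example Book</dc:title></package>")

print(read_epub_title(buf.getvalue()))  # Example Book
```

Multiply that unzip-and-parse work (plus any conversion) by every book in a batch and the gap between "moving 12 MB" and "importing 20 books" starts to make sense.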

I don't find book import all that slow. But what I have wondered about is why batch metadata updates are so much slower than single-book metadata updates.

Say, correcting a title or author sort for a few books. Do it one book at a time and each takes less than a second. Do even two at a time (thus using the bulk metadata update screen) and it takes far longer than N × <1s.

Try minimizing the tag browser on the left. I have it minimized all the time, except when I actually need it.

Usually when you update metadata, only some records in the database are changed. But when you update the author or title, the books have to be moved to a new folder in the calibre library.

Also, when the books are moved they are really first copied to the new location and then deleted at the old one, not just renamed. The reason is that this reduces the chance of data corruption if the process is interrupted for some reason. If only one book is updated, the disk caches can most likely absorb the extra work; that's less likely when several books are updated at once.
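The copy-then-delete move described above can be sketched in a few lines. This is an assumption about the general technique, not calibre's actual implementation (a production version would also fsync and use temporary names):

```python
# Sketch of a copy-then-delete "safe move" (illustrative, not calibre's code).
import os
import shutil

def safe_move(src: str, dst: str) -> None:
    """Copy the file to its new location first, then delete the original.

    If the process is interrupted mid-copy, the original is still intact.
    A plain rename across filesystems is really a copy under the hood and
    can fail in less recoverable ways.
    """
    shutil.copy2(src, dst)  # full copy, metadata preserved
    os.remove(src)          # original removed only after the copy succeeded
```

The design trade-off is exactly the one described: every "move" costs a full read and a full write of the file, which is why renaming an author on a batch of books generates so much I/O.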

I can't really explain why the batch update is so much slower than N single updates, though.

Wish my individual metadata downloads took less than a second. More like 20 to 50 seconds for me, and I'm pretty sure it is not connection speed or hardware. Probably I have too many sources configured.

Look a little closer: he was referring to metadata updates, not metadata downloads.

Ah, I understand. Turning off the tag browser does indeed make a big difference here, in the GUI at least, although I love the tag browser.

Kovid Goyal explained to me that the tag browser took a lot of time to update every time a field was changed. Even with the tag browser off, though, it takes a long time to do a bulk metadata change of even one field in two records, so something else is in play here.
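The per-change cost Kovid describes can be shown with a toy model. This is purely a hypothetical illustration, not calibre's code: if the tag browser re-scans every book after each field change, N changes cost N full scans, while deferring the rebuild until the end costs only one.

```python
# Toy model of per-change vs deferred UI rebuilds (not calibre's code).
from collections import Counter

def rebuild_tag_counts(books):
    """Full scan: count how many books carry each tag."""
    counts = Counter()
    for tags in books.values():
        counts.update(tags)
    return counts

books = {1: {"sf"}, 2: {"sf", "fantasy"}, 3: {"fantasy"}}

# Naive: rebuild after every single change -> 3 changes, 3 full scans.
scans = 0
for book_id in books:
    books[book_id] = books[book_id] | {"read"}
    rebuild_tag_counts(books)
    scans += 1

# Batched: apply all changes first, rebuild once.
counts = rebuild_tag_counts(books)
print(scans, counts["read"])
```

With the tag browser hidden that rebuild is skipped, which matches the speedup people see; but as noted above, it doesn't account for all of the bulk-update slowness.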

More RAM will most likely help.

I'm already running with the tag browser minimized. And, yeah, I know what calibre does when I update an author or a title, so I expect that to be slower than, say, adding a tag, which only needs to update the calibre database. But that is not it: even actions like that are a lot slower when done in batch than one at a time.

It does seem related to database size, because the effect is much smaller on my small library, which is about a sixth the size of my main one.

Yup, that's exactly what I'm seeing. Pick two books, do a metadata update through the individual update screen and it's poof, poof, done. Do it with the bulk update screen and it's wait...wait....wait....done. Far more than twice as long as a single update.

It's database size related, too; the effect is a lot less if your library is smaller.