id summary reporter owner description type status priority milestone component version resolution keywords cc
2000 Batch API for annotation handling Nicklas Nordborg Nicklas Nordborg
"When updating a large number of annotations on a large number of items it is easy to run into problems:

 * The first-level cache in Hibernate can easily use up all available memory
 * Dirty-checking and SQL execution by Hibernate take a long time
 * If change history logging is enabled, this also takes a long time

The current annotation importer plug-in has been used for testing. It was used to import values for 140+ different annotation types to 4900+ items (samples). The data file is 4MB. The work done by the annotation importer can be divided into the following steps. JConsole was used to check the memory usage and debug output to check the time.

|| '''Action''' || '''Time''' || '''Memory''' ||
|| Parse the file and find the items to update (loaded by ID) || 7 sec || ~500MB ||
|| Update annotations || 5 min || ~500MB -> 1.5GB ||
|| Commit - Hibernate || 12 min || ~1.5GB ||
|| Commit - Change log || 13 min || ~1.5GB -> 1.9GB ||

CPU usage may also be interesting. It is usually below 10% (less than a full single core). The CPU usage for Postgres is in the same range.

The main problems are that the memory usage grows during the second step and that the last two steps take a long time. In theory it should be possible to improve the second step a lot, since at this stage the annotation importer is only working with a single item at a time. We do not need Hibernate to keep things in the first-level cache. If we can manage this, the Hibernate commit step may be solved automatically as well. The change log step may be harder, since we are already using the stateless session there. However, it may be possible to replace this with our own batch SQL implementation, as we have already done for reporters and raw data.
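To keep the first-level cache from growing with the item count, the usual Hibernate remedy is to flush and clear the session at fixed intervals. The sketch below is a self-contained simulation of that pattern (it is not BASE's actual code, and the `Map` merely stands in for the Hibernate session); the real calls would be `session.flush()` followed by `session.clear()`:

```java
import java.util.HashMap;
import java.util.Map;

/**
 * Illustrative simulation of the flush/clear batching pattern that keeps
 * Hibernate's first-level cache bounded. Every BATCH_SIZE items the
 * "session" is flushed and cleared, so peak memory is proportional to the
 * batch size instead of the total item count.
 */
public class BatchUpdateSketch {
    static final int BATCH_SIZE = 100;

    /** Returns the peak cache size observed while "updating" n items. */
    public static int updateAll(int n) {
        Map<Integer, String> firstLevelCache = new HashMap<>(); // stands in for the Hibernate session
        int peak = 0;
        for (int id = 0; id < n; id++) {
            firstLevelCache.put(id, "updated annotations for item " + id);
            peak = Math.max(peak, firstLevelCache.size());
            if ((id + 1) % BATCH_SIZE == 0) {
                // In Hibernate: session.flush(); session.clear();
                firstLevelCache.clear();
            }
        }
        firstLevelCache.clear(); // flush the last partial batch
        return peak;
    }

    public static void main(String[] args) {
        // With 4900 items the cache never holds more than one batch.
        System.out.println(updateAll(4900));
    }
}
```

With this pattern the peak stays at 100 entries regardless of whether 4900 or 4.9 million items are processed, which is why the second step should not need the ~1GB of growth seen in the table.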
It turned out that using the annotation importer to delete existing annotations was much worse. The initial parsing and updating of items used about the same time and amount of memory as when creating annotations. When committing, BASE needs to go through relations that may point to the deleted items and either delete them as well or nullify the reference (for example, any-to-any links and inherited/cloned annotations). This consumed more and more memory and reached a point where most of the time was spent doing GC. After 1.5 hours (60 minutes of GC) I gave up and killed Tomcat. I'll see what happens if Tomcat gets more memory...

Giving Tomcat 4GB instead of 2GB of memory helped. The maximum low level (memory remaining after GC) was near 3GB. The steps outlined in the table above took more or less the same time as when inserting annotations. An additional hour was spent checking/removing references to the deleted annotations. The total time was over 1 hour 20 minutes.

'''Final note'''

After all changes in this ticket and in #2002 have been made, the annotation importer has improved a lot. Using the same test data as in the table above, the time for importing new annotations is typically 4-5 minutes and for deleting 6-7 minutes. Memory usage is well below 1GB most of the time, and garbage collection seems to be able to clean up so that no more than 0.5GB remains." enhancement closed critical BASE 3.8 core fixed
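The reference cleanup described above can be replaced by a few bulk statements instead of per-entity traversal. The sketch below is hypothetical (the `"AnyToAny"` table and `to_id` column are assumed names, not BASE's actual schema): it chunks the deleted annotation ids and builds one bulk `DELETE` per chunk, which is the general shape of the batch SQL approach the ticket mentions for reporters and raw data:

```java
import java.util.ArrayList;
import java.util.List;

/**
 * Hypothetical sketch of batch SQL cleanup: instead of letting Hibernate
 * walk every relation that may point to a deleted annotation, build a few
 * bulk DELETE statements that remove the references chunk by chunk.
 * Table and column names are illustrative assumptions.
 */
public class ReferenceCleanupSketch {
    static final int CHUNK_SIZE = 500;

    /** Builds one bulk DELETE per chunk of deleted annotation ids. */
    public static List<String> buildDeleteStatements(List<Integer> deletedIds) {
        List<String> sql = new ArrayList<>();
        for (int i = 0; i < deletedIds.size(); i += CHUNK_SIZE) {
            List<Integer> chunk =
                deletedIds.subList(i, Math.min(i + CHUNK_SIZE, deletedIds.size()));
            StringBuilder in = new StringBuilder();
            for (int j = 0; j < chunk.size(); j++) {
                if (j > 0) in.append(',');
                in.append(chunk.get(j));
            }
            // One statement cleans up a whole chunk of any-to-any links.
            sql.add("DELETE FROM \"AnyToAny\" WHERE to_id IN (" + in + ")");
        }
        return sql;
    }

    public static void main(String[] args) {
        List<Integer> ids = new ArrayList<>();
        for (int i = 1; i <= 1200; i++) ids.add(i);
        // 1200 ids with a chunk size of 500 -> 3 statements.
        System.out.println(buildDeleteStatements(ids).size());
    }
}
```

Turning an hour of per-entity reference checking into a handful of set-based statements is what makes the 6-7 minute delete time in the final note plausible.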