diff --git a/reference/en/master.xml b/reference/en/master.xml
index 49bf57a7dc..bc0c6d148f 100644
--- a/reference/en/master.xml
+++ b/reference/en/master.xml
@@ -14,6 +14,7 @@
+<!ENTITY batch SYSTEM "modules/batch.xml">
@@ -155,6 +156,7 @@
 &session-api;
 &transactions;
 &events;
+&batch;
 &query-hql;
 &query-criteria;
diff --git a/reference/en/modules/batch.xml b/reference/en/modules/batch.xml
new file mode 100755
index 0000000000..eefef1c1e5
--- /dev/null
+++ b/reference/en/modules/batch.xml
@@ -0,0 +1,98 @@
+<chapter id="batch">
+    <title>Batch processing using Hibernate</title>
+
+    <para>
+        A naive approach to inserting 100 000 rows in the database using Hibernate
+        might look like this:
+    </para>
+
+    <programlisting><![CDATA[Session session = sessionFactory.openSession();
+Transaction tx = session.beginTransaction();
+for ( int i=0; i<100000; i++ ) {
+    Customer customer = new Customer(.....);
+    session.save(customer);
+}
+tx.commit();
+session.close();]]></programlisting>
+
+    <para>
+        This would fall over with an <literal>OutOfMemoryError</literal> somewhere
+        around the 50 000th row. That's because Hibernate caches all the newly
+        inserted <literal>Customer</literal> instances in the session-level cache.
+    </para>
+
+    <para>
+        In this chapter we'll show you how to avoid this problem. First, however,
+        if you are doing batch processing, it is absolutely critical that you
+        enable JDBC batching if you intend to achieve reasonable performance.
+        Set the JDBC batch size to a reasonable number (say, 10-50):
+    </para>
+
+    <programlisting><![CDATA[hibernate.jdbc.batch_size 20]]></programlisting>
+
+    <para>
+        You might also like to do this kind of work in a process where interaction
+        with the second-level cache is completely disabled:
+    </para>
+
+    <programlisting><![CDATA[hibernate.cache.use_second_level_cache false]]></programlisting>
+
+    <sect1 id="batch-inserts">
+        <title>Batch inserts</title>
+
+        <para>
+            When making new objects persistent, you must <literal>flush()</literal>
+            and then <literal>clear()</literal> the session regularly, to control
+            the size of the first-level cache.
+        </para>
+
+        <programlisting><![CDATA[Session session = sessionFactory.openSession();
+Transaction tx = session.beginTransaction();
+
+for ( int i=0; i<100000; i++ ) {
+    Customer customer = new Customer(.....);
+    session.save(customer);
+    if ( i % 20 == 0 ) { //20, same as the JDBC batch size
+        //flush a batch of inserts and release memory:
+        session.flush();
+        session.clear();
+    }
+}
+
+tx.commit();
+session.close();]]></programlisting>
+
+    </sect1>
+
+    <sect1 id="batch-update">
+        <title>Batch updates</title>
+
+        <para>
+            For retrieving and updating data the same ideas apply. In addition,
+            you need to use <literal>scroll()</literal> to take advantage of
+            server-side cursors for queries that return many rows of data.
+        </para>
+
+        <programlisting><![CDATA[Session session = sessionFactory.openSession();
+Transaction tx = session.beginTransaction();
+
+ScrollableResults customers = session.getNamedQuery("GetCustomers")
+    .scroll(ScrollMode.FORWARD_ONLY);
+int count = 0;
+while ( customers.next() ) {
+    Customer customer = (Customer) customers.get(0);
+    customer.updateStuff(...);
+    if ( ++count % 20 == 0 ) {
+        //flush a batch of updates and release memory:
+        session.flush();
+        session.clear();
+    }
+}
+
+tx.commit();
+session.close();]]></programlisting>
+
+    </sect1>
+
+</chapter>
\ No newline at end of file
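
For anyone who wants to exercise the chapter's two patterns outside the DocBook
source, here is a minimal self-contained sketch. It is illustrative only and not
part of the commit above: it assumes Hibernate 3-era APIs, a hibernate.cfg.xml on
the classpath, and a hypothetical Customer entity with a mapped name property.
Setting hibernate.jdbc.batch_size and hibernate.cache.use_second_level_cache
through Configuration.setProperty() is one alternative to the properties-file
lines shown in the chapter.

    import org.hibernate.ScrollMode;
    import org.hibernate.ScrollableResults;
    import org.hibernate.Session;
    import org.hibernate.SessionFactory;
    import org.hibernate.Transaction;
    import org.hibernate.cfg.Configuration;

    public class BatchDemo {

        public static void main(String[] args) {
            // Enable JDBC batching and disable the second-level cache, as the
            // chapter recommends; equivalent to the hibernate.properties lines.
            SessionFactory factory = new Configuration()
                    .configure() // reads hibernate.cfg.xml from the classpath
                    .setProperty("hibernate.jdbc.batch_size", "20")
                    .setProperty("hibernate.cache.use_second_level_cache", "false")
                    .buildSessionFactory();
            batchInsert(factory);
            batchUpdate(factory);
            factory.close();
        }

        static void batchInsert(SessionFactory factory) {
            Session session = factory.openSession();
            Transaction tx = session.beginTransaction();
            for (int i = 0; i < 100000; i++) {
                // Customer is a hypothetical mapped entity standing in for the
                // chapter's Customer(.....)
                Customer customer = new Customer("customer-" + i);
                session.save(customer);
                if (i % 20 == 0) { // 20: same as the JDBC batch size
                    // flush a batch of inserts and clear the session-level cache
                    session.flush();
                    session.clear();
                }
            }
            tx.commit();
            session.close();
        }

        static void batchUpdate(SessionFactory factory) {
            Session session = factory.openSession();
            Transaction tx = session.beginTransaction();
            // scroll() keeps a server-side cursor open instead of materializing
            // every matching row in memory at once
            ScrollableResults customers = session
                    .createQuery("from Customer")
                    .scroll(ScrollMode.FORWARD_ONLY);
            int count = 0;
            while (customers.next()) {
                Customer customer = (Customer) customers.get(0);
                // stand-in for the chapter's customer.updateStuff(...)
                customer.setName(customer.getName().toUpperCase());
                if (++count % 20 == 0) {
                    // flush a batch of updates and release memory
                    session.flush();
                    session.clear();
                }
            }
            tx.commit();
            session.close();
        }
    }

Note that the flush()/clear() interval matches the JDBC batch size, so each flush
ships exactly one full batch to the database before the session cache is emptied.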