Batch processing
A naive approach to inserting 100 000 rows in the database using Hibernate might
look like this:
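(A sketch only: Customer stands for any mapped entity, sessionFactory is an already-built SessionFactory, and the constructor arguments are placeholders.)

    import org.hibernate.Session;
    import org.hibernate.Transaction;

    Session session = sessionFactory.openSession();
    Transaction tx = session.beginTransaction();
    for ( int i = 0; i < 100000; i++ ) {
        Customer customer = new Customer( /* ... */ );  // placeholder constructor
        session.save(customer);  // each instance stays in the session-level cache
    }
    tx.commit();
    session.close();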
This would fall over with an OutOfMemoryError somewhere
around the 50 000th row. That's because Hibernate caches all the newly inserted
Customer instances in the session-level (first-level) cache.
In this chapter we'll show you how to avoid this problem. First, however: if you
are doing batch processing and want reasonable performance, it is absolutely
critical that you enable JDBC batching. Set the JDBC batch size to a reasonable
number (say, 10-50):
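For example, in hibernate.properties (the same property can be set in
hibernate.cfg.xml; 20 here is just an illustrative value):

    hibernate.jdbc.batch_size=20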
You also might like to do this kind of work in a process where interaction with
the second-level cache is completely disabled:
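One way to do that is in the configuration, for example in hibernate.properties:

    hibernate.cache.use_second_level_cache=false

Alternatively, second-level cache interaction can be disabled for a single
session or query by setting the CacheMode, as the batch-update sketch below
does with setCacheMode(CacheMode.IGNORE).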
Batch inserts
When making new objects persistent, you must flush() and
then clear() the session regularly, to control the size of
the first-level cache.
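A sketch of such a loop, reusing the Customer placeholder from above and
flushing every 20 rows to match the JDBC batch size:

    import org.hibernate.Session;
    import org.hibernate.Transaction;

    Session session = sessionFactory.openSession();
    Transaction tx = session.beginTransaction();

    for ( int i = 0; i < 100000; i++ ) {
        Customer customer = new Customer( /* ... */ );  // placeholder constructor
        session.save(customer);
        if ( i % 20 == 0 ) {
            // 20: same as the JDBC batch size.
            // Flush a batch of inserts and release first-level cache memory.
            session.flush();
            session.clear();
        }
    }

    tx.commit();
    session.close();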
Batch updates
For retrieving and updating data, the same ideas apply. In addition, you need to
use scroll() to take advantage of server-side cursors for
queries that return many rows of data.
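A sketch, assuming a hypothetical named query "GetCustomers" that returns
Customer rows; updateStuff() is a placeholder for whatever change the batch
applies:

    import org.hibernate.CacheMode;
    import org.hibernate.ScrollMode;
    import org.hibernate.ScrollableResults;
    import org.hibernate.Session;
    import org.hibernate.Transaction;

    Session session = sessionFactory.openSession();
    Transaction tx = session.beginTransaction();

    ScrollableResults customers = session.getNamedQuery("GetCustomers")
            .setCacheMode(CacheMode.IGNORE)    // don't populate the second-level cache
            .scroll(ScrollMode.FORWARD_ONLY);  // server-side, forward-only cursor
    int count = 0;
    while ( customers.next() ) {
        Customer customer = (Customer) customers.get(0);
        customer.updateStuff( /* ... */ );     // placeholder update
        if ( ++count % 20 == 0 ) {
            // Flush a batch of updates and release first-level cache memory.
            session.flush();
            session.clear();
        }
    }

    tx.commit();
    session.close();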