
Databases

Performing many queries in a single Hibernate session

Is it dangerous to perform hundreds to thousands of queries within a single Hibernate session?

I am reading an Excel file that can have tens of thousands of rows, and I need to insert its data into a database. Right now I read the rows of the Excel file into Java objects that I store in a List. I then open a Hibernate session, start a transaction, and iterate through the list, inserting each object one by one.

I have other methods that split the list into chunks on the order of hundreds and then create separate sessions/transactions for each of those chunks.

Should I be splitting the list up or does it not matter?
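The chunking described above can be sketched as a small helper that splits the full list into consecutive sublists. This is a minimal, illustrative version; the class and method names (`ChunkUtil`, `partition`) are my own, not from the question:

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkUtil {
    // Split a list into consecutive sublists of at most chunkSize elements.
    public static <T> List<List<T>> partition(List<T> list, int chunkSize) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < list.size(); i += chunkSize) {
            // Copy the view returned by subList so each chunk is independent.
            chunks.add(new ArrayList<>(
                    list.subList(i, Math.min(i + chunkSize, list.size()))));
        }
        return chunks;
    }

    public static void main(String[] args) {
        List<Integer> rows = new ArrayList<>();
        for (int i = 0; i < 10; i++) rows.add(i);

        List<List<Integer>> chunks = partition(rows, 3);
        System.out.println(chunks.size());   // prints 4
        System.out.println(chunks.get(3));   // prints [9]
    }
}
```

Each chunk can then be handed to its own session/transaction, as the question describes.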

1 Answer

Steven Parker
231,128 Points

The "danger" might be loss of time on error.

I like doing time-intensive operations in smaller chunks because if an error occurs, you don't have to restart the entire thing from scratch.

I understand splitting the operation into multiple transactions, but should I also split it into multiple sessions (with the extra overhead that implies)? Is there a common practice for handling potentially long-lived database connections?

Update: I read about batch processing (https://docs.jboss.org/hibernate/orm/4.0/devguide/en-US/html/ch04.html), and it seems like a possible solution for saving large sets of data.
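The linked docs boil down to flushing and clearing the session every N inserts so the first-level cache doesn't hold every object at once. A hedged sketch of that pattern, assuming a `SessionFactory` named `sessionFactory` and a `rows` list of mapped entities already exist:

```java
// Batch-insert pattern from the Hibernate batch-processing docs.
// Assumes sessionFactory and rows (a List of mapped entities) exist elsewhere.
Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();

int batchSize = 50;  // should match the hibernate.jdbc.batch_size property
for (int i = 0; i < rows.size(); i++) {
    session.save(rows.get(i));
    if (i > 0 && i % batchSize == 0) {
        // Push the queued inserts to the database and evict the managed
        // objects so the session cache does not grow without bound.
        session.flush();
        session.clear();
    }
}

tx.commit();
session.close();
```

Note this still uses one transaction, so an error late in the loop rolls back everything; it addresses memory, not restart cost.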

Steven Parker
231,128 Points

Batch processing is great for reducing reliance on maintaining a connection for an extended time, but I'd still recommend processing in chunks for the same reason (to reduce recovery time after an error).
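Combining both suggestions could look like the sketch below: a single session, but a separate transaction per chunk, so a failure only rolls back the chunk in progress. `partitioned` (the pre-split list) and the `Row` entity are assumed, not from the thread:

```java
// One session, one transaction per chunk: a failure only loses the
// current chunk, and only that chunk needs to be retried.
// Assumes sessionFactory, a Row entity, and partitioned (List<List<Row>>).
Session session = sessionFactory.openSession();
try {
    for (List<Row> chunk : partitioned) {
        Transaction tx = session.beginTransaction();
        try {
            for (Row row : chunk) {
                session.save(row);
            }
            session.flush();
            session.clear();
            tx.commit();
        } catch (RuntimeException e) {
            tx.rollback();
            // Log the failed chunk and decide whether to retry or abort.
            throw e;
        }
    }
} finally {
    session.close();
}
```

Whether to retry or abort on a failed chunk depends on whether partial inserts are acceptable for your data.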