javax.persistence.TransactionRequiredException while performing inserts in a loop

I have an application which uses Spring, Hibernate and JTA.

We received a request with a larger-than-expected payload (10,000 elements); typical requests contain 10-100 elements.

While processing this request, we insert a record into the DB for each of these elements, and that is done in a for loop.

pseudo code:

processRecords(List<Element> list) {
    for (Element element : list) {
        dao.findBy(...);     // this takes around 100 ms
        // some other checks and logic
        dao.add(element);    // this takes around 150 ms
        dao.flush();
    }
}

This block takes a lot of time to process the records, and then I get:

"javax.persistence.TransactionRequiredException: no transaction is in progress"

I tried moving the flush out of the for loop, which did not help. I researched batch inserts for Hibernate, but this is a huge application with a lot of customization, so I do not see that as an option; it would impact the whole application. I also tried to find where the transaction duration is configured; the only place I could find was the JTA setting on WebLogic, which was set to 480 seconds.

Any pointers on how to solve this scenario would be much appreciated.

Edit: Increasing the JTA timeout in WebLogic has solved the problem temporarily, but I had to set it to a very large value (5,000 seconds). Is there any way to improve the performance, given that I am only inserting around 8K records? (I know batch processing is one option, but there are some constraints in the custom "framework" around that.)
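For reference, my understanding of the standard Hibernate/JPA batch-insert idiom is roughly the sketch below: flush and clear the persistence context every N entities, with hibernate.jdbc.batch_size set to a matching value. The class name and batch size here are only illustrative, and the framework constraints mean I cannot simply drop this in:

import java.util.List;

import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class BatchInsertSketch {  // illustrative name only

    @PersistenceContext
    private EntityManager entityManager;

    @Transactional
    public void insertAll(List<Element> list) {
        final int batchSize = 50;  // should match hibernate.jdbc.batch_size
        for (int i = 0; i < list.size(); i++) {
            entityManager.persist(list.get(i));
            if (i > 0 && i % batchSize == 0) {
                entityManager.flush();  // push the pending INSERTs to the database
                entityManager.clear();  // detach flushed entities to keep the persistence context small
            }
        }
    }
}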

Garlaand answered 13/12, 2016 at 22:51 Comment(5)
Do you use JPA or native Hibernate? Please provide the code snippet that persists the data. In order to persist data you need a transaction at the JDBC/DB level; if you use Spring-managed Hibernate/JPA, you typically get one by adding @Transactional to your DAO method.Moll
We receive the request on a JMS queue, and @Transactional is added where the message is read (inbound JMS) from the queue; the whole transaction begins there.Garlaand
What's your database? It appears that the amount of data change is too much for your provider, as if it can't keep its before-image data in whatever temp storage it has, and then the transaction automatically gets rolled back. It might also be helpful to turn on Hibernate debugging and see if there's anything happening there.Punishable
Did you set the timeout property while configuring your Transaction manager?Boz
There are a number of possible solutions depending on the exact situation. Fetching your data in bulk should already help your case. Not flushing after each element could also help. Maybe you could go with a multi-threaded solution?Arva

Use @Transactional to control the transactions.

  1. Make sure the whole loop does not automatically run inside one transaction, for example by marking the outer method with PROPAGATION_NEVER.
  2. Start a new transaction for each item in the list (or for each subset of items) by calling a service method annotated with REQUIRES_NEW.

For the second step, you may need to call a method on another bean so that the transaction annotation is actually picked up; with Spring's default proxy-based AOP, @Transactional is not applied to calls between methods on the same object.
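A minimal sketch of that structure, assuming Spring-managed beans and a configured transaction manager; the class names, the ElementDao type, and the chunk size are made up for illustration:

import java.util.ArrayList;
import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Propagation;
import org.springframework.transaction.annotation.Transactional;

@Service
public class RecordBatchService {

    @Autowired
    private RecordWriter recordWriter;  // separate bean, so the Spring proxy intercepts the calls

    // Step 1: refuse to run the loop inside one long transaction
    // (Propagation.NEVER throws if an active transaction already exists)
    @Transactional(propagation = Propagation.NEVER)
    public void processRecords(List<Element> list) {
        int chunkSize = 500;  // illustrative; tune to your data
        for (int i = 0; i < list.size(); i += chunkSize) {
            List<Element> chunk =
                new ArrayList<>(list.subList(i, Math.min(i + chunkSize, list.size())));
            recordWriter.writeChunk(chunk);
        }
    }
}

@Service
class RecordWriter {  // would normally live in its own source file

    @Autowired
    private ElementDao dao;  // stands in for the dao used in the question

    // Step 2: each chunk gets its own short-lived transaction
    @Transactional(propagation = Propagation.REQUIRES_NEW)
    public void writeChunk(List<Element> chunk) {
        for (Element element : chunk) {
            dao.add(element);
        }
        dao.flush();
    }
}

Note that with REQUIRES_NEW a failure in one chunk only rolls back that chunk, so you also need to decide how to handle partially processed lists.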

Sportscast answered 15/3, 2017 at 16:34 Comment(3)
"Start new transactions for each item in list" This might take a while...Shabby
Yes, multiple items in list should be grouped together to improve performance, if necessary.Sportscast
Since you have a fairly straightforward situation, let Spring handle the transaction. Using @Transactional should help.Etymology

Based on your inline comments, your transaction duration is 2,500 seconds, of which 1,000 seconds is spent reading from the database and 1,500 seconds adding to it.

You may reduce the database transaction duration by restructuring your logic a little: with most database drivers, the database doesn't actually start a transaction until the first "write". You can leverage this by doing all the reads first, then all the writes. I've had success with this approach with non-JPA data access, but have not tried it with JPA. The restructured pseudocode is something like this:

processRecords(List<Element> list) {

    // do all the reads first
    List<Element> adds = new ArrayList<>();
    for (Element element : list) {
        dao.findBy(...);     // this takes around 100 ms
        // some other checks and logic
        adds.add(element);
    }

    // then do all the writes
    for (Element element : adds) {
        dao.add(element);    // this takes around 150 ms
        dao.flush();
    }
}

Use a database trace to validate this is working as expected. In addition to reducing the transaction duration, this approach reduces the risk of blocking due to a read on another thread.

Felt answered 21/3, 2017 at 17:12 Comment(0)
