
problem with duplicates inserted when using getSimpleJdbcTemplate().batchUpdate()

Mar 15th, 2010, 02:46 PM

Hi all,

I'm running into a problem where my records get inserted twice when I use getSimpleJdbcTemplate().batchUpdate() inside my ItemWriter and one of the records in the commit causes an error.

For example, if I have 10 records in my batch update and 1 of them is going to fail, the successful 9 will be inserted twice.

It seems that even though one item in the batch insert fails, the other 9 are still committed. However, batchUpdate() returns with an exception. Spring Batch (the BatchRetryTemplate, I think) catches that and retries by inserting each record with a chunk size of 1 (in an effort to find the one that caused the error), hence producing the duplicates. Is there a way to stop that from happening?
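To make the failure mode concrete, here is a minimal plain-Java simulation of the behavior described above (no Spring involved; the names and the "BAD" record are illustrative, and the fake batchUpdate mimics a JDBC driver that keeps executing the remaining statements and throws at the end):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class DuplicateDemo {
    // Simulated table: successful rows stay committed even when one
    // statement in the batch fails (i.e. no rollback of the good rows).
    static List<String> table = new ArrayList<>();

    // Mimics batchUpdate(): inserts every good row, then throws if any
    // statement in the batch failed (like a JDBC BatchUpdateException).
    static void batchUpdate(List<String> chunk) {
        boolean failed = false;
        for (String row : chunk) {
            if (row.equals("BAD")) {
                failed = true;          // this statement fails
                continue;
            }
            table.add(row);             // this statement is committed
        }
        if (failed) {
            throw new RuntimeException("one statement in the batch failed");
        }
    }

    public static void main(String[] args) {
        List<String> chunk = Arrays.asList("r1", "r2", "BAD", "r4");
        try {
            batchUpdate(chunk);         // first attempt: r1, r2, r4 committed, then the exception
        } catch (RuntimeException e) {
            // Scan-on-failure: reprocess each item with a chunk size of 1
            // to locate the offending record.
            for (String row : chunk) {
                try {
                    batchUpdate(Arrays.asList(row));
                } catch (RuntimeException skip) {
                    // the bad record is identified and skipped
                }
            }
        }
        System.out.println(table);      // prints [r1, r2, r4, r1, r2, r4]
    }
}
```

Every row that was committed on the first attempt is written again during the one-by-one scan, which is exactly the duplication being reported.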

So I think where this causes an issue for me is that the first rollback is not happening: my records are still committed. So when it goes in 1 by 1, I get duplicates on all of them except the one that caused the exception.

So 1) do you guys use batchUpdate() as well, and do you get the rollback? I don't think it is an issue if I use insert(), but that is much slower.

And 2) is there a way to stop the 1-by-1 reprocessing while still throwing an exception?

If I catch the exception instead of letting it propagate, the metadata in the Spring Batch tables gets off, because it thinks I have written everything.
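One common workaround for this kind of situation (an assumption on my part, not something from this thread) is to make the writer idempotent, so that re-running an item during the one-by-one scan is a no-op rather than a duplicate insert. A plain-Java sketch of the idea, with an in-memory map standing in for the table and its primary key:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class IdempotentWriter {
    // Simulated table keyed by primary key; the key check makes
    // re-writing the same item harmless.
    static Map<String, String> table = new LinkedHashMap<>();

    // Insert only when the key is absent, so a retried item is ignored
    // instead of producing a duplicate row.
    static void writeIfAbsent(String key, String row) {
        table.putIfAbsent(key, row);
    }

    public static void main(String[] args) {
        writeIfAbsent("k1", "r1");
        writeIfAbsent("k1", "r1");          // retried during the scan: no-op
        System.out.println(table.size());   // prints 1
    }
}
```

In real SQL this would correspond to a conditional insert (e.g. an `INSERT ... SELECT ... WHERE NOT EXISTS` or a `MERGE`, depending on the database), so the one-by-one reprocessing can run without creating duplicates even when the first batch was already committed.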