2019-01-14 11:48:25 UTC
I'm trying to improve the performance of our ETL process.
We have an LDIF-to-DB2 flow.
The iterator in the AL goes through the objects in the LDIF file and then updates or inserts the entries in a DB2 table.
We've found some points to improve:
1. commit in portions instead of per row
2. replace the update with a custom MERGE statement
3. try processing the LDIF files in parallel
4. replace the link criteria with custom ones
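A minimal sketch of points 1 and 2 (batched commits plus a single upsert statement). This is an illustration only, not our actual AL code: sqlite3 stands in for DB2, the `people` table and `BATCH_SIZE` are hypothetical, and the SQLite `INSERT ... ON CONFLICT` upsert plays the role that a `MERGE INTO ... WHEN MATCHED THEN UPDATE ... WHEN NOT MATCHED THEN INSERT` statement would play in DB2:

```python
import sqlite3

BATCH_SIZE = 500  # hypothetical portion size; tune against the real workload

# SQLite upsert standing in for a DB2 MERGE statement.
UPSERT_SQL = """
INSERT INTO people (uid, cn) VALUES (?, ?)
ON CONFLICT(uid) DO UPDATE SET cn = excluded.cn
"""

def load_entries(conn, entries, batch_size=BATCH_SIZE):
    """Apply LDIF-derived entries in portions, committing once per
    batch instead of once per row."""
    cur = conn.cursor()
    for start in range(0, len(entries), batch_size):
        batch = entries[start:start + batch_size]
        cur.executemany(UPSERT_SQL, [(e["uid"], e["cn"]) for e in batch])
        conn.commit()  # one commit per portion
```

The same shape applies with JDBC against DB2: accumulate statements with addBatch, run executeBatch, and commit once per portion.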
Related to this, I have a question about the commit statement: where is the set of uncommitted changes stored before the commit, on the database side or in some connector pool?
I would greatly appreciate any other advice regarding performance.
Thanks in advance.