Using the SSIS Lookup transformation with millions of rows
If you are using an SSIS Lookup transformation to determine whether to update, delete, or insert records, the reference dataset is large (millions of rows), and the transformation is set to “Full cache” mode (the default), you might run into a problem. In this mode all of the reference rows are loaded into memory, and with that many rows you can run out of memory on the machine and the data flow will fail, because the Lookup transformation will not spool its memory overflow to disk.
There are two solutions. The first is to use one of the other two cache modes: “Partial cache” or “No cache”. These prevent most or all of the reference rows from being loaded into memory, but the trade-off is many more hits to the database, and if your reference dataset is tens of millions of rows, the task could take hours or even days. Read more about the different cache modes: SSIS – Lookup Cache Modes – Full, Partial, None and Lookup cache modes.
A better solution is to replace the Lookup transformation and the corresponding OLE DB Command transformations with a MERGE statement (requires SQL Server 2008 or later) in an Execute SQL task. Read about it at Using MERGE in Integration Services Packages. Basically, you create a Data Flow task that loads, transforms, and saves the source data to a staging table, create an Execute SQL task that contains the MERGE statement, connect the Data Flow task to the Execute SQL task, and use the data in the staging table as the input for the MERGE statement.
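To make the pattern concrete, here is a minimal sketch of the kind of MERGE statement the Execute SQL task might contain. The table and column names (dbo.Customer, dbo.CustomerStaging, CustomerID, Name, Email) are hypothetical placeholders, not from the article; substitute your own target table, staging table, and key columns.

-- Minimal MERGE sketch, assuming a hypothetical dbo.CustomerStaging table
-- populated by the Data Flow task and a hypothetical dbo.Customer target
-- table keyed on CustomerID.
MERGE dbo.Customer AS target
USING dbo.CustomerStaging AS source
    ON target.CustomerID = source.CustomerID
WHEN MATCHED THEN
    -- Row exists in both tables: update the non-key columns.
    UPDATE SET target.Name  = source.Name,
               target.Email = source.Email
WHEN NOT MATCHED BY TARGET THEN
    -- Row exists only in staging: insert it.
    INSERT (CustomerID, Name, Email)
    VALUES (source.CustomerID, source.Name, source.Email)
WHEN NOT MATCHED BY SOURCE THEN
    -- Row exists only in the target: delete it. This clause assumes the
    -- staging table holds the complete reference set; drop it if you only
    -- stage incremental changes.
    DELETE;

Truncate the staging table at the start of each run (for example, with another Execute SQL task ahead of the Data Flow task) so the MERGE only sees the current batch.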
One SSIS package I was working on used “No cache” with a reference dataset of 5 million rows, and it was taking 6 hours to run. After replacing the lookup with the MERGE statement, it completed in only 15 minutes.
One side note: “Full cache” mode adds startup time to your data flow, because all of the caching takes place before any rows are read from the data flow source(s). So if you execute a package and it takes a long time before you see the first task start, it could be because millions of rows are being loaded into memory.
Hi James,
Did you try the Cache Transform task and then a Lookup against that cache?
Regards,
Pedro
MVP SQL BI
Hi Pedro,
I have not tried the Cache Transform task. I will give it a spin, but I’m thinking I will have the same issue with it taking up all the memory when I try to load millions of records. Thanks for the question.
Thanks! Great resource.
Hello James,
Right now I am using a Lookup transformation and we have more than 2 million rows; tempdb is not able to handle this. Can you give me an example of how to use the MERGE statement, please?
The source is DB2 and the target is SQL Server.
Thanks, James