Overlooked Properties in SSIS: Lookup Component Cache Mode

SSIS has so many properties that some are overlooked or ignored. In this article we will take a closer look at one of them: the “cache mode” setting of the Lookup Component.

As the name indicates, the cache mode lets you choose which form of caching the Lookup component will use. The available modes are Full, Partial and None. By default, “Full Cache” mode is selected. That is a sensible default, but there are of course situations in which you might want to change it; otherwise there would be no need for the other modes.

Let’s have a look at how the different modes affect the behavior of the parent component. Once we know the impact, it will be easier to decide which mode to use in a given situation.


Full Cache

The entire lookup dataset is loaded into the SSIS cache, and your dataflow waits until the cache is fully loaded before it starts doing lookups. This is the default setting, and rightly so: for most situations it is the best choice. The only two situations in which it shouldn’t be used are when the lookup dataset is very large, or when the number of input records is very small compared to the lookup dataset.
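Conceptually, Full Cache behaves like reading the whole reference table into an in-memory hash table before the first input row arrives. The Python sketch below only illustrates that idea; the helper names and sample data are made up for the example and are not SSIS internals.

```python
# Illustrative sketch of Full Cache behaviour (not actual SSIS internals).

def fetch_all_reference_rows():
    # Stands in for the full lookup query, e.g. "SELECT key, value FROM RefTable".
    return [(1, "A"), (2, "B"), (3, "C")]      # hypothetical reference data

def run_full_cache(input_rows):
    # 1. Load the ENTIRE reference set into memory up front.
    cache = dict(fetch_all_reference_rows())
    # 2. Only then does the data flow start; every lookup is a cheap hash probe.
    return [(key, cache.get(key)) for key in input_rows]

print(run_full_cache([2, 3, 2]))               # [(2, 'B'), (3, 'C'), (2, 'B')]
```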

Let’s combine these two cases in an example. Consider a situation in which your input dataset contains only 2 records, while the lookup dataset contains over 200 million. Using Full Cache mode, SSIS loads all 200 million records into memory and then performs the lookups. All this, just to look up 2 values. A bit of overkill, isn’t it?

In this specific case, it would be much better to use the “No Cache” mode.

 

No Cache

The easiest of the bunch. SSIS doesn’t keep any cache and sends a query to the reference DB for every record it needs to look up.
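In other words, No Cache boils down to one round trip to the reference DB per input row. A minimal sketch of that behaviour, again with made-up helper names rather than SSIS internals:

```python
# Illustrative sketch of No Cache behaviour (not actual SSIS internals).

def query_reference_db(key):
    # Stands in for a parameterized query such as
    # "SELECT value FROM RefTable WHERE key = ?".
    reference = {1: "A", 2: "B", 3: "C"}       # pretend this lives in the DB
    return reference.get(key)

def run_no_cache(input_rows):
    # One database round trip per input row; nothing is remembered in between.
    return [(key, query_reference_db(key)) for key in input_rows]

print(run_no_cache([2, 3]))                    # only two "queries" for two input rows
```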

When to use?

  1. The number of records in the source data set is very small (and will remain so)
  2. AND the reference data set is very large (otherwise just use full cache mode)

Returning to the extreme example above, the “No Cache” setting solves all our problems. Instead of loading 200 million records into memory and doing the lookups there, SSIS queries the DB only twice, once for each row in the input data source. That is a huge gain in performance and a big reduction in server load.

 

Partial cache

Of course, there has to be a third situation for which neither of these modes offers a good solution. Why else would we have three modes to choose from? The third mode is a hybrid of the other two.

Let’s return to the example above, but increase the size of our input dataset to 10 000 rows. Investigating our input data, we notice that although we now have 10 000 records, the number of unique values that need to be looked up is really small (let’s say 10). So we have thousands of input records and millions of records in our lookup dataset, of which we only really need 10. If we used “No Cache” mode, we would query the DB for each record in the input dataset and thus perform 9 990 needless lookups. Not an ideal solution by any means.

Luckily we have the Partial Cache mode to help us out in this situation.

With Partial Cache enabled, SSIS builds up a cache as it goes along. For each record, SSIS checks whether the value to look up is already in its cache. If it is not, SSIS queries the reference DB and stores the result in its cache for future lookups.

This way SSIS only retrieves the lookup records it really needs.
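That behaviour is essentially memoization: query the reference DB on a cache miss, reuse the stored answer on every later hit. A sketch of the idea, using the same made-up helper as above:

```python
# Illustrative sketch of Partial Cache behaviour (not actual SSIS internals).

def query_reference_db(key):
    # Stands in for a parameterized query against the reference DB.
    reference = {1: "A", 2: "B", 3: "C"}
    return reference.get(key)

def run_partial_cache(input_rows):
    cache = {}                               # starts empty, grows as rows arrive
    results = []
    for key in input_rows:
        if key not in cache:                 # cache miss: one round trip to the DB
            cache[key] = query_reference_db(key)
        results.append((key, cache[key]))    # cache hit: no round trip at all
    return results

# 6 input rows but only 3 distinct keys, so at most 3 "queries" are issued.
print(run_partial_cache([1, 2, 1, 3, 2, 1]))
```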

When to use?

  1. The number of unique lookup combinations in the source data set is small
  2. AND the reference data set is very large (otherwise just use full cache mode)

As you can see, this is a perfect fit for our previous example. Though we have 10 000 input records, we only query the reference DB ten times at most. Once we have retrieved the lookup for a specific value, all further records with that value will use the cache.

 

Conclusion

Using the correct cache mode for the job can have a huge impact on performance and server load. While the default setting is correct most of the time, always keep in mind what it does, and change it if necessary.
