All good things in life consist of three parts, and so does this series of blog posts. In the previous blog posts we created the Linked Services in Azure and CTAS'd files from Azure Data Lake into tables on Azure DWH. I finished my previous post by advising you to use Azure Data Factory V2 (ADF) as an orchestration tool for your ELT load. That is exactly what we'll be doing in this post.
To keep the ADF pipelines small, we'll read a parameter table from an Azure SQL DB containing the names of the Stored Procedures we want to trigger in ADF. ADF will then iterate over this parameter list and execute the Stored Procedures on our Azure Data Warehouse. Depending on the settings, the execution can happen sequentially or in parallel.
Let’s get started.
What we need as prerequisites
In our example the parameter table is called dbo.ParameterTable. How original 😊
In the Master pipeline you then add a Lookup activity (you'll find it under General) and name it Lkp_StoredProcedureList.
Under the Settings tab, configure the settings as follows.
The Source Dataset is the one you just created. As my parameter table is very simple, I just read the entire table into my array, but you could also load a reduced set by using a query with a filter on a MustRun field, for example. The table can contain several columns, for instance if you want to run several stored procedures or U-SQL statements (Azure Data Lake Analytics) in the same iteration of one worker pipeline.
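To make this concrete, here is a minimal sketch of what the Lookup activity could look like in the pipeline's JSON definition. Lkp_StoredProcedureList and dbo.ParameterTable come from our example; the dataset name DS_ParameterTable, the source type and the MustRun filter query are only assumptions for illustration.

{
    "name": "Lkp_StoredProcedureList",
    "type": "Lookup",
    "description": "Reads the list of stored procedures to run from dbo.ParameterTable",
    "typeProperties": {
        "source": {
            "type": "AzureSqlSource",
            "sqlReaderQuery": "SELECT SP_Name FROM dbo.ParameterTable WHERE MustRun = 1"
        },
        "dataset": {
            "referenceName": "DS_ParameterTable",
            "type": "DatasetReference"
        },
        "firstRowOnly": false
    }
}

Note that firstRowOnly must be false, otherwise the Lookup returns only a single row instead of the array we want to iterate over.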
Second step
Now that we've created our lookup, it's time to create that Worker pipeline we just discussed. In our example it's called Workerline_SP.
Under Iteration & Conditionals you'll find the ForEach activity. Drag it into the Workerline, then (with the ForEach deselected) create a Parameter called PipelineList and give it the type Object. I called the ForEach loop Iterate_SPList.
Now select the ForEach activity Iterate_SPList and apply the following settings:
By ticking the Sequential option I force the ForEach loop to run iteratively. If you untick it, you have to set a batch count to control the number of parallel executions.
The really important setting here is Items.
We populate it with @pipeline().parameters.PipelineList.
This means we will iterate over the looked-up array and use its content as parameters for our load. Remember that in the Workerline's parameter settings we created the Object parameter called PipelineList.
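Put together, the Workerline pipeline with its PipelineList parameter and the ForEach settings could look roughly like the sketch below; it is only an outline, with the activities inside the loop left out for now.

{
    "name": "Workerline_SP",
    "properties": {
        "parameters": {
            "PipelineList": { "type": "Object" }
        },
        "activities": [
            {
                "name": "Iterate_SPList",
                "type": "ForEach",
                "description": "Iterates over the array passed in through the PipelineList parameter",
                "typeProperties": {
                    "isSequential": true,
                    "items": {
                        "value": "@pipeline().parameters.PipelineList",
                        "type": "Expression"
                    },
                    "activities": []
                }
            }
        ]
    }
}

If you prefer parallel execution, set isSequential to false and add a batchCount property with the maximum number of parallel iterations.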
Next, click the pencil on our ForEach loop. This brings you to the payload. In this example we will trigger our Stored Procedure on the Azure DWH.
So in the payload we drag in the Stored Procedure activity from under General.
I have called it SP Workerline LoadSP. When you click it, go to the SQL Account section and choose the Linked Service referring to your Azure DWH.
Next you go to Stored Procedure, this is the one we will trigger.
Under Stored procedure name you type in @{item().SP_Name}.
The item() is the reference to the array we set in the settings of our ForEach loop, and SP_Name is the column name in our parameter table. So if you want to run several activities in the same iteration, this is the place where you define which column from the parameter table you assign to each activity.
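Inside the ForEach, the Stored Procedure activity could then look roughly like this; the linked service name LS_AzureDWH is a placeholder for whatever you called the Linked Service to your Azure DWH.

{
    "name": "SP Workerline LoadSP",
    "type": "SqlServerStoredProcedure",
    "description": "Runs the stored procedure named in the SP_Name column of the current item",
    "linkedServiceName": {
        "referenceName": "LS_AzureDWH",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "storedProcedureName": {
            "value": "@{item().SP_Name}",
            "type": "Expression"
        }
    }
}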
If someone at Microsoft is reading this: I found it hard to discover useful documentation about all the parameter options in the Microsoft Documentation, so here's a warm call to the Microsoft staff to provide an overview of all the options and possibilities we have when using dynamic content in ADF 😊.
Third step
Almost there. We have set up the entire Worker, but to get everything working as it should, there is one step left: we still need to trigger the Worker from within the Master.
Go back to the Master and drag the Execute Pipeline activity in from under General.
Link the two together via the success output. If the lookup succeeds, the pipeline will start; if not, nothing will happen and the workflow will fail. ETL tools anyone?
Click on the Execute Pipeline activity and under Settings set the following.
The Invoked pipeline is the Worker pipeline you just created. By setting the Wait on completion option I force the trigger to report completion only when the underlying pipeline finishes.
Under Parameters we of course need to pass our parameter array to the Workerline pipeline. So I create a parameter PipelineList and give it the following value: @activity('Lkp_StoredProcedureList').output.value
This will send the content of the output of our lookup into the Workerline pipeline.
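As a rough sketch, the Execute Pipeline activity in the Master could then look like this; the activity name Exec_Workerline_SP is just an example, the rest reuses the names from above.

{
    "name": "Exec_Workerline_SP",
    "type": "ExecutePipeline",
    "description": "Starts Workerline_SP and hands over the lookup output as the PipelineList parameter",
    "dependsOn": [
        {
            "activity": "Lkp_StoredProcedureList",
            "dependencyConditions": [ "Succeeded" ]
        }
    ],
    "typeProperties": {
        "pipeline": {
            "referenceName": "Workerline_SP",
            "type": "PipelineReference"
        },
        "waitOnCompletion": true,
        "parameters": {
            "PipelineList": {
                "value": "@activity('Lkp_StoredProcedureList').output.value",
                "type": "Expression"
            }
        }
    }
}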
The proof of the pudding is in the eating
Looking at the Monitor we see that the runs were successful. I have 2 different stored procedures inside my loop: one for CTAS'ing real estate data and one for insurance data. They both ran fine.
So we see that both my Master and Workerline started, and when I drill into the Workerline details, I see that my ForEach loop did 2 iterations and ran both stored procedures in my parameter table.
Of course you are not obliged to store your parameters in an Azure SQL database. You can also maintain them in an Azure table or a JSON file.
Wish you all the best in your future Azure use.