The DatabaseMail feature in SQL Server and Azure SQL Managed Instance is widely used by many professionals. But what if you want similar functionality in Azure SQL Database? There are options, but they require a bit of tinkering. Tinkering that we gladly explain in this blog.
So you want to create something in Azure SQL Database that replicates the DatabaseMail feature. You could build an alternative solution by using a mail table, a mail service and logic apps. However, the much-missed SQL Server Agent is not available in Azure SQL Database. You could substitute it with Elastic Database Jobs. So what if you want to combine Elastic Database Jobs with the DatabaseMail alternative and let your jobs trigger emails?
One approach would be to create a mail table in every database, let the job insert data into it and use a logic app to query each database individually. However, suppose you have hundreds of databases. That would be a hell of a job, and for each newly created database you would need to modify your logic app. Luckily, we can use the output option of Elastic Database Jobs to insert emails into a central mail table in the JobDatabase. That means you only need a single logic app that fetches messages from one central mail table. Let’s get this sorted!
The first thing to do before we can get started is to create an Elastic Job agent and an Azure SQL Database to hold the system objects, the JobDatabase in my case. This setup is out of scope for this blog, but you can find plenty of good articles online. One thing I’d like to recommend is to use target groups that refer to Azure SQL servers or elastic pools instead of individual databases. Whenever new databases are added to the server or pool, they are automatically included as targets.
The official documentation can be found here: https://docs.microsoft.com/en-us/azure/azure-sql/database/elastic-jobs-overview .
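To illustrate the target-group recommendation, here is a sketch of how a server-level target group could be created in the JobDatabase using the Elastic Jobs stored procedures. The group, credential and server names below are placeholders, and the job agent and credentials must already exist:

```sql
-- Sketch: create a target group that covers a whole logical server,
-- so newly created databases are picked up automatically.
EXEC jobs.sp_add_target_group
    @target_group_name = N'DatabaseMaintenanceGroup';

EXEC jobs.sp_add_target_group_member
    @target_group_name = N'DatabaseMaintenanceGroup',
    @target_type = N'SqlServer',
    -- credential the agent uses to enumerate databases on the server
    @refresh_credential_name = N'RefreshCredential',
    @server_name = N'yourserver.database.windows.net';  -- placeholder
```

With a server-level (or elastic-pool-level) member like this, you never have to touch the target group again when databases are added or removed.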
After the JobDatabase and necessary elastic jobs have been set up, we need to create a table in our JobDatabase to store emails. Records in this table will be retrieved by a logic app and their fields will be used to construct an email. To find newly inserted records, the primary key of this table is used. Therefore you shouldn’t reseed its identity value by, for example, truncating the table.
Here you can find the Data Definition Language (DDL) I used for the table. Most columns are self-explanatory, except for internal_execution_id, which stores the execution id of our job.
CREATE TABLE [dbo].[DatabaseMail](
[MailID] [int] IDENTITY(1,1) NOT NULL,
[MailText] [nvarchar](max) NULL,
[MailSubject] [nvarchar](250) NULL,
[MailTo] [nvarchar](250) NULL,
[MailFrom] [nvarchar](250) NULL,
[InsertDate] [datetime] NULL,
[internal_execution_id] [uniqueidentifier] NULL,
PRIMARY KEY CLUSTERED ([MailID] ASC)
WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY];
ALTER TABLE [dbo].[DatabaseMail]
ADD CONSTRAINT [DF_DatabaseMail_MailFrom]
DEFAULT (N'firstname.lastname@example.org') FOR [MailFrom];
ALTER TABLE [dbo].[DatabaseMail]
ADD DEFAULT (getdate()) FOR [InsertDate];
To send out the emails we need a mailing service. In my case I chose SendGrid, because it has a free tier that gives you 100 free emails per day. But almost any mail service will work for this use case, so just go for your preferred solution.
The next thing we need is an API key that will be used by our logic app to connect to the mail service. In the case of SendGrid, you need to go to the newly created resource and click on “Open SaaS Account on Publisher’s site”.
This will bring you to a dashboard where you get an overview of, for example, the number of emails that have been sent. For now, we only need to create an API key, which can be done via Settings > API Keys > Create API Key.
In the pop-up, you can enter a name for your key and specify the permissions associated with it. Mail Send permissions should suffice.
Make sure you copy the API key; it won’t be displayed again afterwards!
Next, you need to verify your sender identity to protect your reputation. In my case – this was only for testing purposes – I chose to use the single sender method. The downside of this method is that mails can only be sent from a single (verified) email address.
For more information: https://docs.sendgrid.com/for-developers/sending-email/sender-identity
For anything beyond testing, it’s recommended to set up domain authentication instead. This allows you to send from any address within your domain, such as email@example.com.
To fetch the emails from our [dbo].[DatabaseMail] table and send them to SendGrid, we need to set up a logic app. I chose the consumption-based pricing tier: as the logic app only runs every X minutes and uses few actions and connectors, this is the cheaper option.
Afterwards, go to your newly created app and open the designer, where we will define the order of actions. The first step checks for newly created items in our [dbo].[DatabaseMail] table. As said before, this is done based on the primary key. Therefore, never interfere with the autonumbering, or messages might not be picked up.
Create a trigger of the type “When an item is created (V2)”. Here you will need to specify the server, database and table to connect to by creating a connection. You can also specify the interval of the check. As I use the consumption-based model, this interval also determines the cost of the logic app. With a reasonable interval, the app will only cost you a couple of euros per month.
In the next step – the “Send email (V4)” action – we will compose our email and hand it to SendGrid. Just as for the first step, you will need to create a connection first; we’ll use the API key we created earlier in the SendGrid portal to do so. Afterwards, use the output of the previous step to build your message.
Mails can be sent by simply inserting data into [dbo].[DatabaseMail], for example:
INSERT INTO [dbo].[DatabaseMail] ([MailText], [MailSubject], [MailTo])
VALUES ('A tricycle with two wheels is a bicycle',
'Test subject',
'firstname.lastname@example.org');
This is actually almost the same method used by the elastic database jobs; we just need the jobs to insert the records for us. When creating a job step in the JobDatabase, you can specify output parameters. When the step returns any data, these parameters determine the destination where that data is stored. So in your job step you can simply select the data you want to return. These records will be inserted into the mail table in the JobDatabase and can then be used as building blocks for your email. Just make sure the names of the returned columns match the column names of your destination table.
In this example, the logging table of the Ola Hallengren maintenance jobs is checked for errors and an email will be sent if it finds any.
DECLARE @JobName NVARCHAR(128) = N'Index Maintenance';
DECLARE @JobStepName NVARCHAR(128) = N'Deploy Maintenance Schema.';
DECLARE @Command NVARCHAR(MAX) = N'
IF (SELECT COUNT(*)
    FROM [dbo].[CommandLog]
    WHERE [StartTime] > DATEADD(hour, -24, GETUTCDATE())
    AND [ErrorNumber] <> 0
    AND [CommandType] = ''ALTER_INDEX''
    ) > 0
BEGIN
    DECLARE @recipients NVARCHAR(250);
    DECLARE @subject NVARCHAR(250);
    DECLARE @text NVARCHAR(MAX);
    SET @text = ''Index maintenance has failed on database ''+DB_NAME()+''. Please check the dbo.CommandLog table.'';
    SET @recipients = ''firstname.lastname@example.org'';
    SET @subject = DB_NAME()+'': Azure index maintenance failed'';
    SELECT @text AS MailText,
    @subject AS MailSubject,
    @recipients AS MailTo;
END';
DECLARE @CredentialName NVARCHAR(128) = N'JobCredential';
DECLARE @TargetGroupName NVARCHAR(128) = N'DatabaseMaintenanceGroup';
DECLARE @Parallelism INT = 1;
EXEC jobs.sp_add_job
@job_name = @JobName,
@description = 'Verify index maintenance';
EXEC jobs.sp_add_jobstep @job_name = @JobName,
@step_name = @JobStepName,
@command = @Command,
@credential_name = @CredentialName,
@target_group_name = @TargetGroupName,
@max_parallelism = @Parallelism,
@output_type = 'SqlDatabase',
@output_credential_name = @CredentialName,
@output_server_name = 'yourserver.database.windows.net', -- placeholder: your job agent's logical server
@output_database_name = 'JobDatabase',
@output_table_name = 'DatabaseMail';
And that’s actually all there is to it! If the query returns any result, a record is inserted into the mail table. The MailID, InsertDate and internal_execution_id columns are filled in automatically, so we can safely ignore those. A couple of minutes later, depending on the interval you configured, the mail will be in your inbox!
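One final housekeeping tip: since the logic app trigger tracks new rows through the primary key, clean up processed emails with DELETE rather than TRUNCATE, so the identity seed is preserved. A minimal sketch, assuming a 30-day retention period:

```sql
-- Purge emails older than 30 days; unlike TRUNCATE TABLE,
-- DELETE does not reseed the MailID identity column,
-- so the logic app keeps detecting new rows correctly.
DELETE FROM [dbo].[DatabaseMail]
WHERE [InsertDate] < DATEADD(day, -30, GETDATE());
```

You could even schedule this cleanup as another elastic job running against the JobDatabase itself.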