Perform local backups and then save to Azure blob storage

Backing up your databases directly to Azure can be dangerous!

If you experience a router or network connectivity issue, then you run the risk of your transaction log backups consuming all disk space and halting your business operations.

If you start experiencing high network latency, then your backups could run into your business day and start impacting performance.

One solution is to perform a local backup and then load the backups into Azure. This way backups are not impacted by network outages/latency.

Let’s go through how this can be achieved.

The approach I’m using can also be used with SQL Server Express Edition.

Big caveat: these scripts were written in one day! Testing and error capture are minimal! Feel free to test them, and any feedback is welcome.

1. Prepare Your Azure Storage

Of course you can skip this step if you’ve got it set up already.

  • Create a Storage account. You’ll be needing your Storage account name for later.
  • Enable Blob soft delete for a get-out-of-jail-free card.
  • Create, and you’re nearly done.
  • Find your Storage account and go to Blobs.
  • Create a Container (folder) to hold your SQL backups. Give it a name and OK. You’ll be needing that name later.
  • Right, time to get your access keys! Save your Key; you’ll be needing it.
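As an aside, if you ever want to sanity-check the account name and Key outside these scripts, they combine into a standard Azure Storage connection string. A minimal sketch in Python (the account name matches the example used later; the key is a placeholder, not a real credential):

```python
def make_connection_string(account_name: str, account_key: str) -> str:
    """Build a standard Azure Storage connection string from the
    account name and access key gathered above."""
    return (
        "DefaultEndpointsProtocol=https;"
        f"AccountName={account_name};"
        f"AccountKey={account_key};"
        "EndpointSuffix=core.windows.net"
    )

# Placeholder values, not real credentials:
conn = make_connection_string("sqlbackupsnz", "notARealKey==")
```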

2. Get Your Favourite Ola Hallengren Backup Scripts

You’ll be wanting to save the two scripts shown below and deploy them to the SQL Server instance that you want to back up. Ref: https://ola.hallengren.com/downloads.html

Of course you can skip this step if you’ve got them already.

Deploy these to your SQL Server instance.
Yeah, I created them in the master database, very naughty.

3. Create My sql_backups Table in the Same Database

This table will keep track of your backups.

USE [master]
GO
CREATE TABLE [dbo].[sql_backups](
	[id] [bigint] IDENTITY(1,1) NOT NULL,
	[database_name] [nvarchar](128) NULL,
	[backup_type] [char](4) NULL,
	[backup_start_date] [datetime] NULL,
	[backup_end_date] [datetime] NULL,
	[local_backup_path] [nvarchar](4000) NULL,
	[keep_local_backup_hrs] [int] NULL,
	[local_backup_removed] [char](1) NULL,
	[local_backup_removed_date] [datetime] NULL,
	[saved_to_azure] [char](1) NULL,
	[azure_copy_start_date] [datetime] NULL,
	[azure_copy_end_date] [datetime] NULL,
	[azure_storage_account] [varchar](24) NULL,
	[azure_container] [varchar](64) NULL,
	[azure_backup_file] [nvarchar](4000) NULL,
	[keep_blob_backup_hrs] [int] NULL,
	[blob_backup_removed] [char](1) NULL,
	[blob_backup_removed_date] [datetime] NULL,
	[error_desc] [nvarchar](4000) NULL
) ON [PRIMARY]
GO
You should now see all 3 components.
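For reference, the local_backup_path column will end up holding file paths in Ola Hallengren's naming layout. Here is a sketch of what that composition looks like, assuming DatabaseBackup's default settings (the exact format can differ by version and options; shown in Python purely for illustration):

```python
from datetime import datetime

def ola_backup_path(root: str, host: str, database: str,
                    backup_type: str, ts: datetime) -> str:
    """Compose a backup path in the default Ola Hallengren layout:
    {root}\\{host}\\{db}\\{type}\\{host}_{db}_{type}_{timestamp}.{ext}
    For a named instance, host is typically SERVER$INSTANCE.
    LOG backups use a .trn extension; FULL and DIFF use .bak."""
    ext = "trn" if backup_type == "LOG" else "bak"
    stamp = ts.strftime("%Y%m%d_%H%M%S")
    return (f"{root}\\{host}\\{database}\\{backup_type}\\"
            f"{host}_{database}_{backup_type}_{stamp}.{ext}")

# e.g. D:\SQLBackups\KN01$SQL2K19CTP3_0\MyDB\FULL\KN01$SQL2K19CTP3_0_MyDB_FULL_20190701_010000.bak
```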

4. Save PowerShell Backup and Copy Scripts

This solution makes use of 2 PowerShell scripts which can be scheduled using Task Scheduler or a SQL Server Agent job.

I’ve renamed the extension to txt in case you have issues on download.

SQL_Backups

SQL_Copy_Backups_To_Blob

These are SQL_Backups.ps1 and SQL_Copy_Backups_To_Blob.ps1. Simply save them to a folder of your choice, renaming the extension back to .ps1 if needed.

5. Schedule the SQL_Backups.ps1 Script

You would schedule SQL_Backups.ps1 for your Full and Log backups.

SQL_Backups.ps1 takes the following parameters and all are required:

  • instance – This is the instance which you’ll be backing up, e.g. ‘KN01\SQL2K19CTP3_0’
  • type – This is the standard Ola Hallengren backup type, e.g. FULL, DIFF, or LOG
  • local_backup_folder – This is where you want your local backups to go, e.g. D:\SQLBackups
  • azure_storage_account – This is the Azure Storage account you created earlier, e.g. sqlbackupsnz
  • azure_container – This is the Azure Storage container (folder) you created earlier, e.g. sqlbackups
  • keep_local_hrs – This is how many hours you would like to keep the local backups for, e.g. 24

An example of this running would be:

powershell -ExecutionPolicy Bypass -File C:\Scripts\SQL_Backups.ps1 -instance 'KN01\SQL2K19CTP3_0' -type 'FULL' -local_backup_folder 'D:\SQLBackups' -azure_storage_account 'sqlbackupsnz' -azure_container 'sqlbackups' -keep_local_hrs 24 -keep_blob_hrs 168

It just creates your standard local backups and populates the sql_backups table.


Great, so now your backups are safe from network issues/latency.

6. Schedule the SQL_Copy_Backups_To_Blob.ps1 Script

SQL_Copy_Backups_To_Blob.ps1 only needs to be scheduled once and typically it should run every 15 minutes.

This script checks and uploads local backups listed in the sql_backups table to Azure and will also check and remove local & blob backups which have passed your specified retention time.

Local backups that haven’t been uploaded to Azure won’t be deleted, even if their retention period has passed. Cool!
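That safety rule reduces to a single condition per row. Sketched in Python for clarity (the real check lives inside SQL_Copy_Backups_To_Blob.ps1; the function name here is my own):

```python
def local_backup_removable(saved_to_azure: bool,
                           hours_since_backup: float,
                           keep_local_hrs: int) -> bool:
    """A local backup may be deleted only when it is BOTH past its
    retention window AND has already been uploaded to Azure."""
    return saved_to_azure and hours_since_backup >= keep_local_hrs

# Past retention but never uploaded: kept.
assert local_backup_removable(False, 48, 24) is False
# Uploaded and past retention: removable.
assert local_backup_removable(True, 48, 24) is True
```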

SQL_Copy_Backups_To_Blob.ps1 only has 1 parameter and it is required:

  • instance – This is the instance which you’ll be backing up, e.g. ‘KN01\SQL2K19CTP3_0’

You will need to enter into the script the Storage account and Key information that you created/viewed earlier. This allows the backups to be uploaded and blobs to be removed.

NOTE: this appears in two places in the file. Yeah, lazy scripting.

Set your Storage connection information. Yikes, it’s not encrypted! Oh well, at least your server is locked down!
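Since the key sits in the file in plain text, one simple mitigation (my suggestion, not part of the original scripts) is to keep it in an environment variable set once on the server and read it at run time. Shown in Python for brevity; the variable name is made up:

```python
import os

# Set once on the server (e.g. as a machine-level environment variable);
# hard-coded here only so the example runs on its own.
os.environ.setdefault("SQL_BACKUP_STORAGE_KEY", "placeholder-key==")

# The script would then read the key instead of embedding it twice:
storage_key = os.environ["SQL_BACKUP_STORAGE_KEY"]
```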

An example of this running would be:

powershell -ExecutionPolicy Bypass -File C:\Scripts\SQL_Copy_Backups_To_Blob.ps1 -instance 'KN01\SQL2K19CTP3_0'

Once it runs, the sql_backups table is updated and, back in Azure, you’ll see your backups.

That is it! When backups/blobs exceed your hours-to-keep threshold, they will be removed.

Enjoy!
