Beware the impact of the ANSI_WARNINGS setting

Disabling the ANSI_WARNINGS setting in your code can compromise your data integrity.

As an example, say you are inserting an invoice number into a table and ANSI_WARNINGS is disabled.
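Here’s a minimal sketch of the scenario (the table and invoice number are made up for illustration):

SET ANSI_WARNINGS OFF;

CREATE TABLE dbo.invoices (invoice_no varchar(10) NOT NULL);

INSERT INTO dbo.invoices (invoice_no)
VALUES ('INV-20190001');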

Yay!

You pat yourself on the back as the insert succeeded.

Can you spot the issue?
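Query the table and you’ll see it – the 12-character value didn’t fit the 10-character column, and with ANSI_WARNINGS OFF it was silently truncated:

SELECT invoice_no FROM dbo.invoices;
--returns 'INV-201900' - no error, no warning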

Boo!

OK, you can take that pat back.

You should have ANSI_WARNINGS ON to ensure the truncation raises an error and the operation doesn’t succeed.
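Continuing the sketch from above, with the setting enabled the same insert now fails outright:

SET ANSI_WARNINGS ON;

--fails with a string truncation error (Msg 8152 on most versions) and nothing is inserted
INSERT INTO dbo.invoices (invoice_no)
VALUES ('INV-20190001');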

Ahh!

One way to spot if an active connection has this setting disabled is to query the DMV sys.dm_exec_sessions. An example is shown below:

SELECT 
 session_id
,CASE WHEN [ansi_warnings] = 1 THEN 'yes' ELSE 'no' END AS is_ansi_warnings_enabled
FROM sys.dm_exec_sessions
WHERE session_id > 50; --roughly filters out system sessions

Time to investigate session 54!

Check your SQL errorlog file sizes

At some stage you’ll likely want to check your SQL Server errorlog files using xp_readerrorlog or sp_readerrorlog.

If you haven’t worked on the server before and/or don’t have access to view the file sizes before running the command, the query below will help you obtain this information.

Why would you be interested in knowing the file sizes before running the command? Well, if the instance hasn’t been restarted in a while and sp_cycle_errorlog hasn’t been run, the log files could be really large and cause potential memory and IO performance issues when you try to read them.

DECLARE @enumerrorlogs TABLE ([Log Type] varchar(5), [Archive #] int, [Date] datetime, [Log File Size (Byte)] bigint);

--get sql log file sizes
INSERT INTO @enumerrorlogs ([Archive #], [Date], [Log File Size (Byte)])
EXEC master.sys.sp_enumerrorlogs 1;

UPDATE @enumerrorlogs
SET [Log Type] = 'SQL'
WHERE [Log Type] IS NULL;

--get sql agent log file sizes
INSERT INTO @enumerrorlogs ([Archive #], [Date], [Log File Size (Byte)])
EXEC master.sys.sp_enumerrorlogs 2;

UPDATE @enumerrorlogs
SET [Log Type] = 'Agent'
WHERE [Log Type] IS NULL;

SELECT 
 a.[Log Type] AS log_type
,a.[Archive #] AS archive
,a.[Date] AS date_last_updated
,CAST(a.[Log File Size (Byte)]/1024/1024.0 AS decimal(20,3)) AS log_size_mb
FROM @enumerrorlogs a
ORDER BY a.[Log Type] DESC, a.[Archive #];
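If one of the logs does turn out to be huge, you can still read it in a targeted way rather than pulling back the whole file. A sketch using xp_readerrorlog (parameters: log number, log type where 1 = SQL and 2 = Agent, then up to two search strings, which must be passed as Unicode):

--read only the lines containing 'error' from the current SQL error log
EXEC master.sys.xp_readerrorlog 0, 1, N'error', NULL;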

SQL Server 2014 Mainstream Support Gone

Yes, SQL Server 2014 mainstream support ended this month. I didn’t notice much of a fuss. Possibly everyone is scrambling with SQL Server 2008/R2 exiting extended support, or they are already sorted.

That currently leaves SQL Server 2016 and SQL Server 2017 as the only products in mainstream support. The SQL Server 2019 release is imminent, but uptake will likely take some time.

Warning: Clustered columnstore indexes on Azure SQL databases

If you are using clustered columnstore indexes on Azure SQL databases, you run the risk of not being able to access your data when you scale down your pricing tier.

To demonstrate, I’ll create a basic table and add a clustered columnstore index.
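Something along these lines (the table and index names are made up):

CREATE TABLE dbo.cci_demo
(
 id int NOT NULL
,some_data varchar(50) NOT NULL
);

CREATE CLUSTERED COLUMNSTORE INDEX cci_cci_demo ON dbo.cci_demo;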

Simple table with CCI.

We’ll insert 2 rows and view the data.
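Sketch again:

INSERT INTO dbo.cci_demo (id, some_data)
VALUES (1, 'row one'), (2, 'row two');

SELECT id, some_data FROM dbo.cci_demo;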

2 rows added.

Let’s verify the current pricing tier.
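DATABASEPROPERTYEX will tell us:

SELECT DATABASEPROPERTYEX(DB_NAME(), 'ServiceObjective') AS service_objective;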

We’re working with an S3.

Cool, we’ll change the database to an S2 to save some money.
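This can be done from the portal, or with T-SQL like the below (the database name is a placeholder, and the scaling operation is asynchronous so it can take a little while to complete):

ALTER DATABASE [your_database] MODIFY (SERVICE_OBJECTIVE = 'S2');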

Yay, it’s now an S2.

The boss wants to view some data from that table we created earlier.
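On an S2 though (at the time of writing, columnstore is only supported on S3 and above in the Standard tier), the query now fails:

SELECT id, some_data FROM dbo.cci_demo;
--fails with an error along the lines of:
--'COLUMNSTORE' is not supported in this service tier of the database.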

Bugger!

Now you know something to watch out for with CCIs.

Perform local backups and then save to Azure blob storage

Backing up your databases direct to Azure can be dangerous!

If you experience a router or network connectivity issue, then you run the risk of your transaction log backups consuming all disk space and halting your business operations.

If you start experiencing high network latency, then your backups could run into your business day and start impacting performance.

One solution is to perform a local backup and then load the backups into Azure. This way backups are not impacted by network outages/latency.

Let’s go through how this can be achieved.

The approach I’m using can also be used with SQL Server Express Edition.

Big caveat – these scripts were written in 1 day! Testing and error capture are minimal! Feel free to test, and any feedback is welcome.

1. Prepare Your Azure Storage

Of course you can skip this step if you’ve got it set up already.

  • Create a Storage account. You’ll be needing your Storage account name for later.
  • Enable Blob soft delete for a get-out-of-jail-free card, then hit Create and you’re nearly done.
  • Find your Storage account and go to Blobs.
  • Create a Container (folder) to hold your SQL backups. Give it a name and click OK. You’ll be needing that name later too.
  • Right, time to get your access keys! Save your Key; you’ll be needing it.

2. Get Your Favourite Ola Hallengren Backup Scripts

You’ll be wanting to save and deploy the 2 scripts (CommandExecute and DatabaseBackup) on the SQL Server instance that you want to back up (ref: https://ola.hallengren.com/downloads.html).

Of course you can skip this step if you’ve got them already.

Deploy these to your SQL Server instance.
Yeah I created them in the master database, very naughty.

3. Create my sql_backups table in the same database

This table will keep track of your backups.

USE [master]
GO
CREATE TABLE [dbo].[sql_backups](
	[id] [bigint] IDENTITY(1,1) NOT NULL,
	[database_name] [nvarchar](128) NULL,
	[backup_type] [char](4) NULL,
	[backup_start_date] [datetime] NULL,
	[backup_end_date] [datetime] NULL,
	[local_backup_path] [nvarchar](4000) NULL,
	[keep_local_backup_hrs] [int] NULL,
	[local_backup_removed] [char](1) NULL,
	[local_backup_removed_date] [datetime] NULL,
	[saved_to_azure] [char](1) NULL,
	[azure_copy_start_date] [datetime] NULL,
	[azure_copy_end_date] [datetime] NULL,
	[azure_storage_account] [varchar](24) NULL,
	[azure_container] [varchar](64) NULL,
	[azure_backup_file] [nvarchar](4000) NULL,
	[keep_blob_backup_hrs] [int] NULL,
	[blob_backup_removed] [char](1) NULL,
	[blob_backup_removed_date] [datetime] NULL,
	[error_desc] [nvarchar](4000) NULL
) ON [PRIMARY]
GO
You should now see all 3 components.

4. Save PowerShell Backup and Copy Scripts

This solution makes use of 2 PowerShell scripts which can be scheduled using Task Scheduler or a SQL Server Agent job.

I’ve renamed the extension to txt in case you have issues on download.

SQL_Backups

SQL_Copy_Backups_To_Blob

The first script is SQL_Backups.ps1 and the second is SQL_Copy_Backups_To_Blob.ps1.

Simply save these to a folder of your choice.

PowerShell files you’ll be needing.

5. Schedule the SQL_Backups.ps1 Script

You would schedule SQL_Backups.ps1 for both Full and Log backups.

SQL_Backups.ps1 takes the following parameters and all are required:

  • instance – This is the instance which you’ll be backing up e.g. ‘KN01\SQL2K19CTP3_0’
  • type – This is the standard Ola Hallengren backup type e.g. FULL, DIFF or LOG
  • local_backup_folder – This is where you want your local backups to go e.g. D:\SQLBackups
  • azure_storage_account – This is the Azure Storage account you created earlier e.g. sqlbackupsnz
  • azure_container – This is the Azure storage container (folder) you created earlier e.g. sqlbackups
  • keep_local_hrs – This is how many hours you would like to keep the local backups for e.g. 24
  • keep_blob_hrs – This is how many hours you would like to keep the blob backups for e.g. 168

An example of this running would be:

powershell -ExecutionPolicy Bypass -File C:\Scripts\SQL_Backups.ps1 -instance 'KN01\SQL2K19CTP3_0' -type 'FULL' -local_backup_folder 'D:\SQLBackups' -azure_storage_account 'sqlbackupsnz' -azure_container 'sqlbackups' -keep_local_hrs 24 -keep_blob_hrs 168

It just creates your standard local backups and populates the sql_backups table.

The sql_backups table saves the backup details along with the other info stored for each backup.
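To eyeball the most recent entries yourself, a quick query against the table from step 3:

SELECT TOP (10)
 database_name
,backup_type
,backup_start_date
,backup_end_date
,local_backup_path
FROM master.dbo.sql_backups
ORDER BY id DESC;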

Great, so now your backups are no longer at the mercy of network issues/latency.

6. Schedule the SQL_Copy_Backups_To_Blob.ps1 Script

SQL_Copy_Backups_To_Blob.ps1 only needs to be scheduled once and typically it should run every 15 minutes.

This script checks and uploads local backups listed in the sql_backups table to Azure and will also check and remove local & blob backups which have passed your specified retention time.

Local backups that haven’t been uploaded to Azure won’t be deleted, even if their retention period has passed. Cool!
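If you want to check what is still waiting to go to Azure, you can query the tracking table directly. A rough sketch – the ‘Y’ flag value is an assumption about how the script marks completed uploads:

SELECT id, database_name, local_backup_path, backup_end_date
FROM master.dbo.sql_backups
WHERE saved_to_azure IS NULL OR saved_to_azure <> 'Y';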

SQL_Copy_Backups_To_Blob.ps1 only has 1 parameter and it is required:

  • instance – This is the instance which you’ll be backing up e.g. ‘KN01\SQL2K19CTP3_0’

What you will need to do is enter the Storage account and Key information that you created / viewed earlier. This allows the backups to be uploaded and blobs to be removed.

NOTE: This is in 2 places in the file, yeah lazy scripting.

Set your Storage connection information. Yikes, it’s not encrypted! Oh well, at least your server is locked down!

An example of this running would be:

powershell -ExecutionPolicy Bypass -File C:\Scripts\SQL_Copy_Backups_To_Blob.ps1 -instance 'KN01\SQL2K19CTP3_0'

The sql_backups table is updated, and back in Azure you’ll see your backups.

That is it. When backups/blobs exceed your hours-to-keep threshold, they will be removed.

Enjoy!

SQL Server Supported Upgrade Paths

In-place upgrade paths of SQL Server are shown below.

Ahh, the memories.

Note: You’ll need to take the minimum service pack requirements (footnoted in the table below) into consideration.

Y = supported in-place upgrade path (the footnote number gives the minimum service pack required); - = not supported.

Source \ Destination   2008    2008 R2   2012     2014     2016     2017     2019*
2005                   Y (1)   Y (2)     Y (2)    -        -        -        -
2008                   -       Y         Y (3)    Y (4)    Y (5)    Y (5)    -
2008 R2                -       -         Y (6)    Y (7)    Y (8)    Y (8)    -
2012                   -       -         -        Y (9)    Y (10)   Y (10)   Y
2014                   -       -         -        -        Y        Y        Y
2016                   -       -         -        -        -        Y        Y
2017                   -       -         -        -        -        -        Y

1 Minimum SQL Server 2005 SP2

2 Minimum SQL Server 2005 SP4

3 Minimum SQL Server 2008 SP2

4 Minimum SQL Server 2008 SP3

5 Minimum SQL Server 2008 SP4

6 Minimum SQL Server 2008 R2 SP1

7 Minimum SQL Server 2008 R2 SP2

8 Minimum SQL Server 2008 R2 SP3

9 Minimum SQL Server 2012 SP1

10 Minimum SQL Server 2012 SP2

*  Unknown at this stage but SQL Server 2008 and SQL Server 2008 R2 are not blocked
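Before planning an in-place upgrade, you can confirm the version and service pack level of the source instance against the footnotes above:

--check the current version, service pack level and edition of the instance
SELECT 
 SERVERPROPERTY('ProductVersion') AS product_version
,SERVERPROPERTY('ProductLevel') AS product_level --e.g. SP2
,SERVERPROPERTY('Edition') AS edition;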

Backup restores to a different SQL Server version

We know that upgrading a SQL Server instance to another version has potential limitations, e.g. you cannot perform an in-place upgrade of SQL Server 2005 to SQL Server 2016.

Well, what about restoring an older database backup to a newer version?

Good news: no such limitation currently exists.

Y = restore supported; - = not supported (you can never restore a backup to an older version).

Source \ Destination   2005   2008/R2   2012   2014    2016    2017    2019
2005                   Y      Y         Y      Y (1)   Y (1)   Y (1)   Y (1)
2008/R2                -      Y         Y      Y       Y       Y       Y
2012                   -      -         Y      Y       Y       Y       Y
2014                   -      -         -      Y       Y       Y       Y
2016                   -      -         -      -       Y       Y       Y
2017                   -      -         -      -       -       Y       Y
2019                   -      -         -      -       -       -       Y

1 Database compatibility level automatically changes to SQL Server 2008 (100)
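As a worked example, restoring an older backup to a newer instance is just a normal restore. The paths and logical file names below are made up; afterwards you can check the compatibility level change that footnote 1 describes:

--restore a backup taken on an older version to a newer instance
RESTORE DATABASE [OldDb]
FROM DISK = N'D:\SQLBackups\OldDb.bak'
WITH MOVE N'OldDb' TO N'D:\Data\OldDb.mdf'
    ,MOVE N'OldDb_log' TO N'D:\Data\OldDb_log.ldf';

--check the compatibility level after the restore
SELECT name, compatibility_level
FROM sys.databases
WHERE name = N'OldDb';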