Warning: Clustered columnstore indexes on Azure SQL databases

If you are using clustered columnstore indexes (CCIs) on Azure SQL databases, you run the risk of not being able to access your data when you change your pricing tier.

To demonstrate, I’ll create a basic table and add a clustered columnstore index.
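A minimal sketch of what that looks like (the table and column names here are just examples):

-- Connected to the Azure SQL database itself
CREATE TABLE dbo.cci_test
(
    id        INT           NOT NULL,
    some_data NVARCHAR(100) NULL
);
GO
-- Add the clustered columnstore index
CREATE CLUSTERED COLUMNSTORE INDEX cci_cci_test ON dbo.cci_test;
GO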

Simple table with CCI.

We’ll insert 2 rows and view the data.
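Something along these lines:

INSERT INTO dbo.cci_test (id, some_data) VALUES (1, N'Row one');
INSERT INTO dbo.cci_test (id, some_data) VALUES (2, N'Row two');
GO
SELECT * FROM dbo.cci_test;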

2 rows added.

Let’s verify the current pricing tier.
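One way to check it from T-SQL:

SELECT DATABASEPROPERTYEX(DB_NAME(), 'ServiceObjective') AS service_objective;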

We’re working with an S3.

Cool, we’ll change the database to an S2 to save some money.
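You can do this in the portal, or with T-SQL along these lines (database name is an example):

ALTER DATABASE [cci_demo] MODIFY (SERVICE_OBJECTIVE = 'S2');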

Yay, it’s now an S2.

The boss wants to view some data from that table we created earlier.
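So we run a simple select against it:

SELECT * FROM dbo.cci_test;
-- On anything below S3 this now fails, because columnstore isn't supported
-- in the lower Standard tiers; the error is along the lines of
-- 'COLUMNSTORE is not supported in this service tier of the database'.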

Bugger!

Now you know something to watch out for with CCIs.

Perform local backups and then save to Azure blob storage

Backing up your databases direct to Azure can be dangerous!

If you experience a router or network connectivity issue, then you run the risk of your transaction log backups consuming all disk space and halting your business operations.

If you start experiencing high network latency, then your backups could run into your business day and start impacting performance.

One solution is to perform a local backup and then load the backups into Azure. This way backups are not impacted by network outages/latency.

Let’s go through how this can be achieved.

The approach I’m using can also be used with SQL Server Express Edition.

Big caveat – these scripts were written in 1 day! Testing and error capture are minimal! Feel free to test, and any feedback is welcome.

1. Prepare Your Azure Storage

Of course you can skip this step if you’ve got it set up already.

  • Create a Storage account. You’ll be needing your Storage account name for later.
  • Enable Blob soft delete for a get-out-of-jail-free card.
  • Create, and you’re nearly done.
  • Find your Storage account and go to Blobs.
  • Create a Container (folder) to hold your SQL backups. Give it a name and OK. You’ll be needing that name later.
  • Right, time to get your access keys! Save your Key, you’ll be needing it.
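If you’d rather script it, the equivalent with the Az PowerShell module looks roughly like this (the resource group, region, account and container names are just examples):

# Requires the Az module: Install-Module Az
Connect-AzAccount

$rg = 'sql-backups-rg' # example resource group
New-AzResourceGroup -Name $rg -Location 'australiaeast'

# Create the Storage account
$sa = New-AzStorageAccount -ResourceGroupName $rg -Name 'sqlbackupsnz' -Location 'australiaeast' -SkuName Standard_LRS

# Enable Blob soft delete (the get-out-of-jail-free card)
Enable-AzStorageBlobDeleteRetentionPolicy -ResourceGroupName $rg -StorageAccountName 'sqlbackupsnz' -RetentionDays 7

# Create the container that will hold the SQL backups
New-AzStorageContainer -Name 'sqlbackups' -Context $sa.Context

# Grab an access key for later
(Get-AzStorageAccountKey -ResourceGroupName $rg -Name 'sqlbackupsnz')[0].Value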

2. Get Your Favourite Ola Hallengren Backup Scripts

You’ll want to save and deploy the two scripts shown on the SQL Server instance that you want to back up (ref: https://ola.hallengren.com/downloads.html).

Of course you can skip this step if you’ve got them already.

Deploy these to your SQL Server instance.
Yeah, I created them in the master database, very naughty.

3. Create my sql_backups table in the same database

This table will keep track of your backups.

USE [master]
GO
CREATE TABLE [dbo].[sql_backups](
	[id] [bigint] IDENTITY(1,1) NOT NULL,
	[database_name] [nvarchar](128) NULL,
	[backup_type] [char](4) NULL,
	[backup_start_date] [datetime] NULL,
	[backup_end_date] [datetime] NULL,
	[local_backup_path] [nvarchar](4000) NULL,
	[keep_local_backup_hrs] [int] NULL,
	[local_backup_removed] [char](1) NULL,
	[local_backup_removed_date] [datetime] NULL,
	[saved_to_azure] [char](1) NULL,
	[azure_copy_start_date] [datetime] NULL,
	[azure_copy_end_date] [datetime] NULL,
	[azure_storage_account] [varchar](24) NULL,
	[azure_container] [varchar](64) NULL,
	[azure_backup_file] [nvarchar](4000) NULL,
	[keep_blob_backup_hrs] [int] NULL,
	[blob_backup_removed] [char](1) NULL,
	[blob_backup_removed_date] [datetime] NULL,
	[error_desc] [nvarchar](4000) NULL
) ON [PRIMARY]
GO
You should now see all 3 components.

4. Save PowerShell Backup and Copy Scripts

This solution makes use of 2 PowerShell scripts which can be scheduled using Task Scheduler or a SQL Server Agent job.

I’ve renamed the extension to txt in case you have issues on download.

SQL_Backups

SQL_Copy_Backups_To_Blob

The first script is SQL_Backups.ps1 and the second is SQL_Copy_Backups_To_Blob.ps1.

Simply save these to a folder of your choice.

PowerShell files you’ll be needing.

5. Schedule the SQL_Backups.ps1 Script

You would schedule SQL_Backups for Full and Log backups.

SQL_Backups.ps1 takes the following parameters and all are required:

  • instance – This is the instance which you’ll be backing up e.g. ‘KN01\SQL2K19CTP3_0’
  • type – This is the standard Ola Hallengren backup types e.g. FULL or DIFF or LOG
  • local_backup_folder – This is where you want your local backups to go e.g. D:\SQLBackups
  • azure_storage_account – This is the Azure Storage account you created earlier e.g. sqlbackupsnz
  • azure_container – This is the Azure storage container (folder) you created earlier e.g. sqlbackups
  • keep_local_hrs – This is how many hours you would like to keep the local backups for e.g. 24
  • keep_blob_hrs – This is how many hours you would like to keep the blob backups for e.g. 168

An example of this running would be:

powershell -ExecutionPolicy Bypass -File C:\Scripts\SQL_Backups.ps1 -instance 'KN01\SQL2K19CTP3_0' -type 'FULL' -local_backup_folder 'D:\SQLBackups' -azure_storage_account 'sqlbackupsnz' -azure_container 'sqlbackups' -keep_local_hrs 24 -keep_blob_hrs 168

It just creates your standard local backups and populates the sql_backups table.

Saves backup details.
Showing other info stored.

Great, so now your backups are protected from network issues and latency.

6. Schedule the SQL_Copy_Backups_To_Blob.ps1 Script

SQL_Copy_Backups_To_Blob.ps1 only needs to be scheduled once and typically it should run every 15 minutes.

This script checks and uploads local backups listed in the sql_backups table to Azure and will also check and remove local & blob backups which have passed your specified retention time.

Local backups that haven’t been uploaded to Azure won’t be deleted, even if their retention period has passed. Cool!
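As a sketch, scheduling it every 15 minutes with Task Scheduler from the command line could look like this (the task name and paths are examples, and the quoting may need adjusting for your environment):

schtasks /Create /TN "SQL_Copy_Backups_To_Blob" /SC MINUTE /MO 15 /RU SYSTEM /TR "powershell -ExecutionPolicy Bypass -File C:\Scripts\SQL_Copy_Backups_To_Blob.ps1 -instance 'KN01\SQL2K19CTP3_0'"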

SQL_Copy_Backups_To_Blob.ps1 only has 1 parameter and it is required:

  • instance – This is the instance which you’ll be backing up e.g. ‘KN01\SQL2K19CTP3_0’

What you will need to do is enter the Storage account and Key information that you created / viewed earlier. This allows the backups to be uploaded and blobs to be removed.

NOTE: This is in 2 places in the file, yeah lazy scripting.

Set your Storage connection information. Yikes, it’s not encrypted! Oh well, at least your server is locked down!
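For reference, the kind of thing those two places need (shown here with the Az module; the variable names in the downloaded scripts may differ, and the account name and key are the values you saved earlier):

$storage_account = 'sqlbackupsnz'      # your Storage account name
$storage_key     = '<your access key>' # the Key you saved earlier
$context = New-AzStorageContext -StorageAccountName $storage_account -StorageAccountKey $storage_key

# Uploads then use this context, e.g.
Set-AzStorageBlobContent -File 'D:\SQLBackups\mydb_FULL.bak' -Container 'sqlbackups' -Context $context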

An example of this running would be:

powershell -ExecutionPolicy Bypass -File C:\Scripts\SQL_Copy_Backups_To_Blob.ps1 -instance 'KN01\SQL2K19CTP3_0'
sql_backups table updated.
Back in Azure, you’ll see your backups.

That’s it. When backups/blobs exceed your hours-to-keep threshold, they will be removed.

Enjoy!

SQL Server Supported Upgrade Paths

In-place upgrade paths of SQL Server are shown below.

Ahh the memories

Note: You’ll need to take into consideration:

| Source \ Destination | 2008 | 2008 R2 | 2012 | 2014 | 2016 | 2017 | 2019* |
|---|---|---|---|---|---|---|---|
| 2005 | ✔ (1) | ✔ (1) | ✔ (2) | ✔ (2) | | | |
| 2008 | | ✔ | ✔ (3) | ✔ (4) | ✔ (5) | ✔ (5) | |
| 2008 R2 | | | ✔ (6) | ✔ (7) | ✔ (8) | ✔ (8) | |
| 2012 | | | | ✔ (9) | ✔ (10) | ✔ (10) | |
| 2014 | | | | | ✔ | ✔ | |
| 2016 | | | | | | ✔ | |
| 2017 | | | | | | | |

1 Minimum SQL Server 2005 SP2

2 Minimum SQL Server 2005 SP4

3 Minimum SQL Server 2008 SP2

4 Minimum SQL Server 2008 SP3

5 Minimum SQL Server 2008 SP4

6 Minimum SQL Server 2008 R2 SP1

7 Minimum SQL Server 2008 R2 SP2

8 Minimum SQL Server 2008 R2 SP3

9 Minimum SQL Server 2012 SP1

10 Minimum SQL Server 2012 SP2

*  Unknown at this stage but SQL Server 2008 and SQL Server 2008 R2 are not blocked

Backup restores to a different SQL Server version

We know that upgrading a SQL Server instance to another version has potential limitations, e.g. you cannot perform an in-place upgrade of SQL Server 2005 to SQL Server 2016.

Well what about restoring an older database backup to a newer version?

Good news, no such limitation currently exists.
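If you want to confirm which version a backup came from before restoring it, RESTORE HEADERONLY will tell you (the path is an example):

RESTORE HEADERONLY FROM DISK = N'C:\Backups\olddb.bak';
-- The SoftwareVersionMajor and DatabaseVersion columns identify the source build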

| Source \ Destination | 2005 | 2008/R2 | 2012 | 2014 | 2016 | 2017 | 2019 |
|---|---|---|---|---|---|---|---|
| 2005 | ✔ | ✔ | ✔ | ✔ (1) | ✔ (1) | ✔ (1) | ✔ (1) |
| 2008/R2 | | ✔ | ✔ | ✔ | ✔ | ✔ | ✔ |
| 2012 | | | ✔ | ✔ | ✔ | ✔ | ✔ |
| 2014 | | | | ✔ | ✔ | ✔ | ✔ |
| 2016 | | | | | ✔ | ✔ | ✔ |
| 2017 | | | | | | ✔ | ✔ |
| 2019 | | | | | | | ✔ |

1 Database compatibility level automatically changes to SQL Server 2008 (100)
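You can check (and, once you’ve tested, raise) the compatibility level after the restore, e.g.:

SELECT name, compatibility_level FROM sys.databases WHERE name = N'olddb'; -- example database name

ALTER DATABASE [olddb] SET COMPATIBILITY_LEVEL = 140; -- raise once tested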

Microsoft loves PaaS Yeah Nah

Microsoft sure does push the cloud, and why not, it’s a big money earner for them.

These days lots of organizations are looking for cloud-first solutions, as it’s the next big thing and theoretically it will save them money, it’s robust, etc.

Ok cool, so let’s look at Microsoft’s all products page.

All Products page 2019

Can you spot how many of Microsoft’s latest key business products ‘support’ use of Azure SQL Database or Azure SQL Database managed instance for their back-end?

  • Microsoft Skype for Business Server 2019 – no
  • Microsoft Dynamics CRM 2016 – no
  • Microsoft Dynamics 365 (on-premises / IaaS) – no
  • Microsoft Dynamics GP 2018 – no
  • Microsoft SharePoint Server 2019 – no
  • Microsoft System Center 2019 – no
  • ….

Wow, so to me this means either Microsoft’s product teams:

  • Don’t have the expertise to make use of PaaS, or
  • Can’t make it work, or
  • Didn’t get the memo re using the cloud, or
  • Don’t trust it, or
  • Just don’t know?

So when your boss or client says, let’s lift to the cloud and hey, what about using Azure SQL Database or Azure SQL Database managed instance as a back-end for our latest Microsoft products, you can respond Yeah Nah.

Don’t pay for SQL Server Licenses if you don’t have to

Let’s face it, SQL Server licensing is expensive and companies will jump at any opportunity to reduce costs where they can.

Every dollar counts

SQL Server licenses are actually bundled with some products!

What, you say.

Yes true, for example, System Center Configuration Manager (SCCM) comes with a free SQL Server Standard Edition license. The catch is that if a database for any additional Microsoft or third-party product shares the SQL Server, you must have a separate license for that SQL Server instance.

Wow, so why would you ever host your SCCM databases on a shared instance? It just doesn’t make sense, unless you really need some enterprise features like Transparent Data Encryption (TDE) or you have a super complex environment.

Developer Edition became free with SQL Server 2014, so it’s a no-brainer to use that for non-Production environments. Even pre-SQL Server 2014 you could have looked at using the much cheaper Developer Edition to help reduce costs.

So I’ll start a list and as it grows so can your potential cost savings:

Importing Data Migration Assistant JSON Findings

When using the Data Migration Assistant you’ll have an option to export the findings as CSV or JSON.

The following provides a guide to importing the JSON data into a database table for you to review.

Firstly, create a table to hold the information using:

CREATE TABLE DMA_Findings
(
 [Project_Status] nvarchar(128)
,[Project_Name] nvarchar(128)
,[Project_SourcePlatform] nvarchar(128)
,[Project_TargetPlatform] nvarchar(128)
,[ServerInstances_ServerName] nvarchar(128)
,[ServerInstances_Version] nvarchar(128)
,[ServerInstances_Status] nvarchar(128)
,[Databases_ServerName] nvarchar(128)
,[Databases_Name] nvarchar(128)
,[Databases_CompatibilityLevel] nvarchar(128)
,[Databases_SizeMB] decimal (20,2)
,[Databases_Status] nvarchar(128)
,[Databases_ServerVersion] nvarchar(128)
,[Databases_ServerEdition] nvarchar(128)
,[AssessmentRecommendations_CompatibilityLevel] nvarchar(128)
,[AssessmentRecommendations_Category] nvarchar(128)
,[AssessmentRecommendations_Severity] nvarchar(128)
,[AssessmentRecommendations_ChangeCategory] nvarchar(128)
,[AssessmentRecommendations_RuleId] nvarchar(128)
,[AssessmentRecommendations_Title] nvarchar(160)
,[AssessmentRecommendations_Impact] nvarchar(4000)
,[AssessmentRecommendations_Recommendation] nvarchar(4000)
,[AssessmentRecommendations_MoreInfo] nvarchar(4000)
,[ImpactedObjects_Name] nvarchar(128)
,[ImpactedObjects_ObjectType] nvarchar(128)
,[ImpactedObjects_ImpactDetail] nvarchar(4000)
,[ImpactedObjects_SuggestedFixes] nvarchar(4000)
);

Next, import a single DMA json file using:

INSERT INTO DMA_Findings( 
 [Project_Status]
,[Project_Name]
,[Project_SourcePlatform]
,[Project_TargetPlatform]
,[ServerInstances_ServerName]
,[ServerInstances_Version]
,[ServerInstances_Status]
,[Databases_ServerName]
,[Databases_Name]
,[Databases_CompatibilityLevel] 
,[Databases_SizeMB]
,[Databases_Status]
,[Databases_ServerVersion]
,[Databases_ServerEdition]
,[AssessmentRecommendations_CompatibilityLevel]
,[AssessmentRecommendations_Category]
,[AssessmentRecommendations_Severity]
,[AssessmentRecommendations_ChangeCategory]
,[AssessmentRecommendations_RuleId]
,[AssessmentRecommendations_Title]
,[AssessmentRecommendations_Impact]
,[AssessmentRecommendations_Recommendation]
,[AssessmentRecommendations_MoreInfo]
,[ImpactedObjects_Name]
,[ImpactedObjects_ObjectType]
,[ImpactedObjects_ImpactDetail]
,[ImpactedObjects_SuggestedFixes]
)
SELECT 
 Project.[Status] AS [Project_Status]
,Project.[Name] AS [Project_Name]
,Project.[SourcePlatform] AS [Project_SourcePlatform] 
,Project.[TargetPlatform] AS [Project_TargetPlatform]
,[ServerInstances].[ServerName] AS [ServerInstances_ServerName] 
,[ServerInstances].[Version] AS [ServerInstances_Version]
,[ServerInstances].[Status] AS [ServerInstances_Status]
--,[ServerInstances].[AssessmentRecommendations] AS [ServerInstances_AssessmentRecommendations]
,[Databases].[ServerName] AS [Databases_ServerName]
,[Databases].[Name] AS [Databases_Name]
,[Databases].[CompatibilityLevel] AS [Databases_CompatibilityLevel] 
,[Databases].[SizeMB] AS [Databases_SizeMB]
,[Databases].[Status] AS [Databases_Status]
,[Databases].[ServerVersion] AS [Databases_ServerVersion] 
,[Databases].[ServerEdition] AS [Databases_ServerEdition]
,[AssessmentRecommendations].[CompatibilityLevel] AS [AssessmentRecommendations_CompatibilityLevel]
,[AssessmentRecommendations].[Category] AS [AssessmentRecommendations_Category]
,[AssessmentRecommendations].[Severity] AS [AssessmentRecommendations_Severity]
,[AssessmentRecommendations].[ChangeCategory] AS [AssessmentRecommendations_ChangeCategory]
,[AssessmentRecommendations].[RuleId] AS [AssessmentRecommendations_RuleId]
,[AssessmentRecommendations].[Title] AS [AssessmentRecommendations_Title]
,[AssessmentRecommendations].[Impact] AS [AssessmentRecommendations_Impact]
,[AssessmentRecommendations].[Recommendation] AS [AssessmentRecommendations_Recommendation]
,[AssessmentRecommendations].[MoreInfo] AS [AssessmentRecommendations_MoreInfo]
,[ImpactedObjects].[Name] AS [ImpactedObjects_Name]
,[ImpactedObjects].[ObjectType] AS [ImpactedObjects_ObjectType]
,[ImpactedObjects].[ImpactDetail] AS [ImpactedObjects_ImpactDetail]
,[ImpactedObjects].[SuggestedFixes] AS [ImpactedObjects_SuggestedFixes]
FROM
OPENROWSET(BULK N'C:\pathtoyourJSONFile\yourfile.json', SINGLE_CLOB) AS json
OUTER APPLY OPENJSON(BulkColumn)
WITH (
 [Status] nvarchar(128)
,[Name] nvarchar(128)
,[SourcePlatform] nvarchar(128)
,[TargetPlatform] nvarchar(128)
,[Databases] nvarchar(MAX) AS JSON
,[ServerInstances] nvarchar(MAX) AS JSON
) AS [Project]
OUTER APPLY  OPENJSON([ServerInstances])
WITH (
 [ServerName] nvarchar(128)
,[Version] nvarchar(128)
,[Status] nvarchar(128)
--,[AssessmentRecommendations] nvarchar(4000)
) AS [ServerInstances]
OUTER APPLY  OPENJSON([Databases])
WITH (
 [ServerName] nvarchar(128)
,[Name] nvarchar(128)
,[CompatibilityLevel] nvarchar(128)
,[SizeMB] decimal (20,2)
,[Status] nvarchar(128)
,[ServerVersion] nvarchar(128)
,[ServerEdition] nvarchar(128)
,[AssessmentRecommendations] nvarchar(MAX) AS JSON
) AS [Databases]
OUTER APPLY OPENJSON([AssessmentRecommendations])
WITH (
 [CompatibilityLevel] nvarchar(128)
,[Category] nvarchar(128)
,[Severity] nvarchar(128)
,[ChangeCategory] nvarchar(128)
,[RuleId] nvarchar(128)
,[Title] nvarchar(160)
,[Impact] nvarchar(4000)
,[Recommendation] nvarchar(4000)
,[MoreInfo] nvarchar(4000)
,[ImpactedObjects] nvarchar(MAX) AS JSON
) AS [AssessmentRecommendations]
OUTER APPLY OPENJSON([ImpactedObjects])
WITH (
 [Name] nvarchar(128)
,[ObjectType] nvarchar(128)
,[ImpactDetail] nvarchar(4000)
,[SuggestedFixes] nvarchar(4000)
) AS [ImpactedObjects];

Or, if you have lots of DMA json files, you can quickly load them all using the following PowerShell:

$dma_json_folder = 'C:\....\' # Location of the DMA json files
$instance_name = '...' # SQL instance which will contain the findings
$database_destination = '...' # Database on the SQL instance

# Get a list of the DMA json files
$dma_json_files = @(Get-ChildItem $dma_json_folder *.json)

if ($dma_json_files.Length -ne 0) # Proceed as there are files to process
{

try 
{
Foreach ($dma_json_file in $dma_json_files) # Cycle through each json file
    {
    [string]$json_file = $dma_json_file.BaseName + '.json' # Set the json file name

$query = @"
INSERT INTO DMA_Findings( 
 [Project_Status]
,[Project_Name]
,[Project_SourcePlatform]
,[Project_TargetPlatform]
,[ServerInstances_ServerName]
,[ServerInstances_Version]
,[ServerInstances_Status]
,[Databases_ServerName]
,[Databases_Name]
,[Databases_CompatibilityLevel] 
,[Databases_SizeMB]
,[Databases_Status]
,[Databases_ServerVersion]
,[Databases_ServerEdition]
,[AssessmentRecommendations_CompatibilityLevel]
,[AssessmentRecommendations_Category]
,[AssessmentRecommendations_Severity]
,[AssessmentRecommendations_ChangeCategory]
,[AssessmentRecommendations_RuleId]
,[AssessmentRecommendations_Title]
,[AssessmentRecommendations_Impact]
,[AssessmentRecommendations_Recommendation]
,[AssessmentRecommendations_MoreInfo]
,[ImpactedObjects_Name]
,[ImpactedObjects_ObjectType]
,[ImpactedObjects_ImpactDetail]
,[ImpactedObjects_SuggestedFixes]
)
SELECT 
 Project.[Status] AS [Project_Status]
,Project.[Name] AS [Project_Name]
,Project.[SourcePlatform] AS [Project_SourcePlatform] 
,Project.[TargetPlatform] AS [Project_TargetPlatform]
,[ServerInstances].[ServerName] AS [ServerInstances_ServerName] 
,[ServerInstances].[Version] AS [ServerInstances_Version]
,[ServerInstances].[Status] AS [ServerInstances_Status]
--,[ServerInstances].[AssessmentRecommendations] AS [ServerInstances_AssessmentRecommendations]
,[Databases].[ServerName] AS [Databases_ServerName]
,[Databases].[Name] AS [Databases_Name]
,[Databases].[CompatibilityLevel] AS [Databases_CompatibilityLevel] 
,[Databases].[SizeMB] AS [Databases_SizeMB]
,[Databases].[Status] AS [Databases_Status]
,[Databases].[ServerVersion] AS [Databases_ServerVersion] 
,[Databases].[ServerEdition] AS [Databases_ServerEdition]
,[AssessmentRecommendations].[CompatibilityLevel] AS [AssessmentRecommendations_CompatibilityLevel]
,[AssessmentRecommendations].[Category] AS [AssessmentRecommendations_Category]
,[AssessmentRecommendations].[Severity] AS [AssessmentRecommendations_Severity]
,[AssessmentRecommendations].[ChangeCategory] AS [AssessmentRecommendations_ChangeCategory]
,[AssessmentRecommendations].[RuleId] AS [AssessmentRecommendations_RuleId]
,[AssessmentRecommendations].[Title] AS [AssessmentRecommendations_Title]
,[AssessmentRecommendations].[Impact] AS [AssessmentRecommendations_Impact]
,[AssessmentRecommendations].[Recommendation] AS [AssessmentRecommendations_Recommendation]
,[AssessmentRecommendations].[MoreInfo] AS [AssessmentRecommendations_MoreInfo]
,[ImpactedObjects].[Name] AS [ImpactedObjects_Name]
,[ImpactedObjects].[ObjectType] AS [ImpactedObjects_ObjectType]
,[ImpactedObjects].[ImpactDetail] AS [ImpactedObjects_ImpactDetail]
,[ImpactedObjects].[SuggestedFixes] AS [ImpactedObjects_SuggestedFixes]
FROM
OPENROWSET(BULK N'$dma_json_folder$json_file', SINGLE_CLOB) AS json
OUTER APPLY OPENJSON(BulkColumn)
WITH (
 [Status] nvarchar(128)
,[Name] nvarchar(128)
,[SourcePlatform] nvarchar(128)
,[TargetPlatform] nvarchar(128)
,[Databases] nvarchar(MAX) AS JSON
,[ServerInstances] nvarchar(MAX) AS JSON
) AS [Project]
OUTER APPLY  OPENJSON([ServerInstances])
WITH (
 [ServerName] nvarchar(128)
,[Version] nvarchar(128)
,[Status] nvarchar(128)
--,[AssessmentRecommendations] nvarchar(4000)
) AS [ServerInstances]
OUTER APPLY  OPENJSON([Databases])
WITH (
 [ServerName] nvarchar(128)
,[Name] nvarchar(128)
,[CompatibilityLevel] nvarchar(128)
,[SizeMB] decimal (20,2)
,[Status] nvarchar(128)
,[ServerVersion] nvarchar(128)
,[ServerEdition] nvarchar(128)
,[AssessmentRecommendations] nvarchar(MAX) AS JSON
) AS [Databases]
OUTER APPLY OPENJSON([AssessmentRecommendations])
WITH (
 [CompatibilityLevel] nvarchar(128)
,[Category] nvarchar(128)
,[Severity] nvarchar(128)
,[ChangeCategory] nvarchar(128)
,[RuleId] nvarchar(128)
,[Title] nvarchar(160)
,[Impact] nvarchar(4000)
,[Recommendation] nvarchar(4000)
,[MoreInfo] nvarchar(4000)
,[ImpactedObjects] nvarchar(MAX) AS JSON
) AS [AssessmentRecommendations]
OUTER APPLY OPENJSON([ImpactedObjects])
WITH (
 [Name] nvarchar(128)
,[ObjectType] nvarchar(128)
,[ImpactDetail] nvarchar(4000)
,[SuggestedFixes] nvarchar(4000)
) AS [ImpactedObjects];
"@
   
    Write-Host "Processing" $dma_json_folder$json_file
    # Use Invoke-Sqlcmd to run the import
    Invoke-Sqlcmd -Query $query -ServerInstance $instance_name -Database $database_destination -QueryTimeout 600 -ErrorAction Stop 
    }
}
catch 
{
Write-Host 'SQL query error: ' $_.Exception.Message -ForegroundColor Red
#Exit; # Exit and go no further
}
}

Now you’ll be able to query the findings:

SELECT * FROM [DMA_Findings];

As an example, to find issues for a migration to SQL Server 2017 you could use a query like:

SELECT 
 [Databases_ServerName] AS Instance
,[Databases_ServerEdition] AS InstanceEdition
,[ServerInstances_Version] AS InstanceBuild
,[Project_SourcePlatform] AS SourcePlatform
,[Project_TargetPlatform] AS TargetPlatform
,[Databases_Name] AS DatabaseName
,[Databases_CompatibilityLevel] AS DatabaseCompatibilityLevel
,[AssessmentRecommendations_CompatibilityLevel] AS TargetCompatibilityLevel
,[Databases_SizeMB] AS Database_SizeMB
,[AssessmentRecommendations_Category]
,[AssessmentRecommendations_Severity]
,[AssessmentRecommendations_ChangeCategory]
,[AssessmentRecommendations_RuleId]
,[AssessmentRecommendations_Title]
,[AssessmentRecommendations_Impact]
,[AssessmentRecommendations_Recommendation]
,[AssessmentRecommendations_MoreInfo]
,[ImpactedObjects_Name]
,[ImpactedObjects_ObjectType]
,[ImpactedObjects_ImpactDetail]
FROM [DMA_Findings]
WHERE 1=1
AND [AssessmentRecommendations_CompatibilityLevel] = 'CompatLevel140'
ORDER BY [Databases_ServerName],[Databases_Name],[AssessmentRecommendations_Severity],[AssessmentRecommendations_ChangeCategory];