Azure Data Migration Service Review

The Data Migration Service (DMS) is a new Azure service, currently in preview. DMS will help you migrate existing on-premises SQL Server, Oracle, and MySQL databases to Azure SQL Database, Azure SQL Database Managed Instance, or SQL Server on an Azure virtual machine.

Since it is in public preview, it is, for the time being, free to use.

DMS has several advantages for migrating data: once the data is migrated to the target, you can run a continuous synchronization, or synchronize on demand, to keep the target DB up to date. Best of all, only changes (deltas) are synchronized from source to target on subsequent synchronizations.

The easiest way to test DMS is to stand up SQL Server running on an Azure VM. If the source DB will be on-premises, you will need a VNET in place to Azure.

There is a useful reference as well, the Database Migration Guide <>

Here are the high-level steps involved:

1. Upgrade (or install latest) Database Migration Assistant (DMA) to v3.3

2. Create blank Azure SQL DB

3. Open firewall rules on server for Azure SQL DB

4. Run DMA assessment (New Project)

5. Run DMA Migration (New Project)

a. Generate Schema only

b. Deploy Schema

6. Register DMS Service in Azure

7. Using DMS:

a. Create a new DMS Service

b. Schedule a data synchronization task

c. Run task

From the Azure Portal, create an empty Azure SQL DB and open the firewall rules (as needed) for the SQL Server container. I named mine DMA2; this will be your TARGET DB.

To prepare the SOURCE server:

In Azure, I created a SQL Server 2016 Standard instance on Windows Server 2016 (using a standard template).

On the source server, I restored the AdventureWorks2012 sample database.

Download SQL Data Migration Assistant v3.3

DMA – Start a new project, Migrate Schema only (using source DB from SQL 2016, on Server 2016), click Next


Choose the TARGET DB, click Next


Allow default settings to be selected, click Generate SQL Script


If desired, click Save to capture the script. Further, review any potential issues. Click Deploy Schema.


You should see the schema successfully deploy


Register Microsoft.DataMigration Provider

From the Azure portal, browse to Subscriptions > Resource Providers > search on ‘data’ > locate Microsoft.DataMigration and click ‘Register’


Once Microsoft.DataMigration shows as Registered, you can proceed:


Add the Database Migration Service

From the Azure portal Marketplace, search on ‘data migration service’


Click Create


Enter a unique Service Name (for this example, I used DMSMigration2), choose a pricing tier, then click Create.

From the new Data Migration Service, choose New Migration Service


Enter the Source Server name (FQDN) or IP address


Select the source DB name


Enter the TARGET Azure SQL DB and SQL Server container name, click Save


Click Configure Required Settings, then choose Do not validate my database(s), click Save


Review the summary dialog, then click Run Migration


Migration in progress


Migration complete


Posted in Azure SQL Server, Migration, SQL Server 2016 | Leave a comment

TSQL2sday #94–SQL Server and PowerShell

T-SQL Tuesday (#tsql2sday) is here again. I've been absent for a few rounds, so I figured it was time to jump back in!

SQL and PowerShell: what a great combination. First, let me thank the host; you can find his site via the following link.


One of the first scripting languages I mastered for automation was VBScript. I know. Stop laughing.

However, that background gave me the ability to learn PowerShell and use it in ways to automate the more mundane aspects of SQL Server administration. Two examples of how PowerShell has proved to be an awesome “force multiplier” follow.

Using PowerShell to Automate a SQL Server database restore

The challenge was taking a backup from one SQL Server, written to a server UNC path, and restoring it to another SQL Server on a daily basis. This process is documented here. There were two challenging parts to this process: identifying the latest backup file, since each backup was written out with a unique name, and restoring it to a different location on the target server. Review the code to see these solutions.
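For context, the T-SQL that such a script ultimately executes looks roughly like the following (the database, UNC path, and logical file names here are hypothetical); the WITH MOVE clauses are what handle restoring to a different location on the target server:

```sql
-- Restore the most recent backup file found on the UNC share,
-- relocating the data and log files to the target server's paths
RESTORE DATABASE [MyDB]
FROM DISK = N'\\BackupServer\SQLBackups\MyDB_20170905_023000.bak'
WITH MOVE N'MyDB_Data' TO N'E:\SQLData\MyDB.mdf',
     MOVE N'MyDB_Log'  TO N'F:\SQLLogs\MyDB_log.ldf',
     REPLACE,       -- overwrite the existing copy on the target
     STATS = 10;    -- report progress every 10 percent
```

The PowerShell layer's job is mainly to find the newest .bak file on the share and feed its name into this statement.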

The bonus was not only automation, but a huge savings in time to complete the restore! Using the UI to restore the ~650 GB database took over 2.5 hours; using PowerShell and the ADO methods, less than 1 hour!

Using PowerShell to Audit SQL Server Instances

Tasked with reviewing many different SQL Server instances for configuration and possible performance issues, and with documenting their settings, I used PowerShell to gather this information via queries and then place the results in an Excel workbook. This provides guidance for evaluating and recommending improvements to settings such as configuration, and you'll also have a baseline of current instance and database settings. This process is documented here.

This PowerShell script is extensible; I've rewritten it once to take advantage of a reusable connection object for enhanced performance, and it will likely need to be enhanced again in the near future.


If you've been thinking of automating your SQL Server instance or database administration, check out PowerShell. The time invested in learning and creating solutions can save a LOT of time and free you up to work on other things. Do share your creative solutions with the community.

Posted in PowerShell, SQL Server | Leave a comment

Installing Power BI Report Server

Thought I’d post my initial impressions of installing Power BI Report Server. Unless you’ve been disconnected from the web for any length of time, you’ve heard of Power BI. In May 2017, Microsoft released the Power BI Premium service tier, which includes the Power BI Report Server.

The home page will remind you of SSRS:


You have the option to install PBI Report Server as a standalone service, or on the same server as SQL Server and/or SQL Server Reporting Services. In any event, PBI Report Server needs access to a SQL Server Database Engine instance to host its catalog databases.

If you are familiar with the Power BI Desktop software, note there is a separate PBI Desktop release you'll need to install to work with PBI Report Server.

Here are the details and link:

Microsoft Power BI Report Server – June 2017 GA


Power BI Report Server, available as part of Power BI Premium, enables on-premises web and mobile viewing of Power BI reports, plus the enterprise reporting capabilities of SQL Server Reporting Services.

In this version of Power BI Report Server, you can:

  • Connect “live” to Analysis Services models – both Tabular and Multidimensional (cubes)
  • Visually explore data and create an interactive report
  • Use custom visuals in your Power BI Report
  • Save that report to your report server running the preview
  • View and interact with the report in your web browser
  • Add and view comments about the report in the web browser
  • View and interact with the report in the Power BI Mobile App

From <>

Let’s walk through an install of Power BI Report Server

Run PowerBIReportServer.exe, select install PBI Report Server


Choose evaluation version, unless you have the product key, click Next


Agree to license terms, click Next


You’re installing on a server with the database engine installed, click Next


Click Install


Monitor progress…


Installation complete.


Next up, configure report Server.

Connect to your local SQL Server instance


Note that once connected, your instance will be:



Next, click the Web Service URL on left and change name of VirtualDirectory to: ReportServerPBI (this allows SSRS to co-exist). Click Apply.


Click database, on left

Click Next


Click Test Connection to validate. If everything is fine, connection succeeded


Add a PBI suffix to the Database name (so it doesn’t over-write SSRS)

Click Next


Use default service credentials, click Next


Review summary, click Next


Completion dialog

Click Finish


Completed successfully


Click Web Portal URL on left, then add PBI suffix to Virtual Directory name, click Apply


Click the hyperlink shown in the last dialog and it should take you to the Power BI Report Server.


Watch this blog for future updates on using the PBI Server and the new PBI Desktop software.

Posted in Power BI Report Server, PowerBI, SQL, SQL Server | Leave a comment

Properly configure SQL Server ConfigMgr database

Ideally, this will take place before you install ConfigMgr!

First, estimate size(s) of the databases:

How? Use the  SCCM DB Sizing estimator!

Shown is an example of the Database sizing worksheet, based on estimated client counts:


Anthony has documented this approach as follows:

Get the sizing estimator here.

Use the estimator to calculate the number and size of the data files, then pre-create the SQL Server database for ConfigMgr. The key takeaway will be the total estimated sizes.

Multiple data files may be appropriate for a couple of reasons. SQL Server is multi-threaded, which means you may get performance gains by having multiple data files. Further, these data files could be located on different logical drives to further distribute load and increase IOPS. As a general rule of thumb, use the following as a guideline:

Total Estimated Client Count | Number of CM data files
< 10K |
10K to 25K |
>25K to 50K |

Other factors to consider:

  • Drives dedicated for SQL data and logs should be formatted with NTFS in 64K block size.
  • Always place TempDB data and logs on a dedicated logical drive
  • DB files should be equal size (hint: divide total estimated size / # data files)
  • Create data files to be an even GB
  • Always pre-create the db files prior to installing CM – auto-growth is bad.
  • If you enable auto-growth:
    • Do not use percent (%) growth
    • Set auto-growth to be evenly divisible into a whole GB (e.g., 512 MB or 1024 MB)
  • Monitor available free space
  • Always use a single LOG file
  • If possible, place LOG file on separate logical drive
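Putting the guidance above together, the ConfigMgr database could be pre-created with T-SQL along these lines (the database name, drive letters, and 25 GB per-file size are hypothetical; substitute the totals from the sizing estimator):

```sql
-- Pre-create the CM database: 4 equally sized data files across two
-- logical drives, auto-growth disabled, and a single log file
CREATE DATABASE [CM_P01]
ON PRIMARY
    (NAME = N'CM_P01_1', FILENAME = N'E:\SQLData\CM_P01_1.mdf', SIZE = 25GB, FILEGROWTH = 0),
    (NAME = N'CM_P01_2', FILENAME = N'E:\SQLData\CM_P01_2.ndf', SIZE = 25GB, FILEGROWTH = 0),
    (NAME = N'CM_P01_3', FILENAME = N'F:\SQLData\CM_P01_3.ndf', SIZE = 25GB, FILEGROWTH = 0),
    (NAME = N'CM_P01_4', FILENAME = N'F:\SQLData\CM_P01_4.ndf', SIZE = 25GB, FILEGROWTH = 0)
LOG ON
    -- If you do allow log growth, use a fixed MB value, never percent
    (NAME = N'CM_P01_log', FILENAME = N'G:\SQLLogs\CM_P01_log.ldf', SIZE = 20GB, FILEGROWTH = 1024MB);
```

Note how each data file is the total estimated size divided by the number of files, rounded to an even GB.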

It is possible to migrate a single data file to multiple data files. Use this reference to get started. It is an older article and the screenshots are missing; however, the technique is still valid.

Note: Be cautious if you attempt to implement in a multi-site hierarchy! Table partitioning may present an issue.

Final thoughts: As always, there is never one right or wrong way to do this; this is intended to be used as a guideline only. Test in a lab environment first. No cute furry animals were harmed with this process, and never stick your fingers inside the cage.

Posted in ConfigMgr, SQL, SQL Server | 15 Comments

Supported SQL Server versions for SCCM CB upgrades

Had a question recently on what version of SQL Server is supported for an upgrade to SCCM CB (current branch). The scenario was this: the client was running SQL Server 2012 SP2 for their SCCM 2012 R2 site. However, in order to upgrade to SCCM CB, they needed to be at a minimum of SQL Server 2012 SP3.

Then, the question became, can I just upgrade to SQL Server 2016 and skip an upgrade step?

Not with the current SCCM version in place; this would probably break SCCM 2012 R2. In this case, I'd recommend installing SQL Server 2012 SP3, upgrading SCCM to CB 1702 (or later), and then upgrading to SQL Server 2016.
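Before planning either upgrade, it helps to confirm the instance's exact version and service pack level. A quick check:

```sql
-- Report build number, service pack level, and edition of the instance
SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion, -- e.g. 11.0.6020.0 = SQL 2012 SP3
       SERVERPROPERTY('ProductLevel')   AS ProductLevel,   -- e.g. SP2, SP3
       SERVERPROPERTY('Edition')        AS Edition;
```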

Reference the SCCM / SQL Server compatibility version matrix for the details.

Posted in ConfigMgr, SQL Server | 4 Comments

Properly size SQL Server TempDB for ConfigMgr

At MMS 2017, Benjamin Reynolds and I covered properly sizing TempDB as one facet in our Optimizing and Tuning SQL Server for ConfigMgr session. Because of a few conversations that occurred that week at MMS, I think it is appropriate to cover this here in a bit more detail.

As background, SQL Server TempDB is used for temporary objects created by queries, sort operations, and more. From the Microsoft docs:

The tempdb system database is a global resource that is available to all users connected to the instance of SQL Server and is used to hold the following:

  • Temporary user objects that are explicitly created, such as: global or local temporary tables, temporary stored procedures, table variables, or cursors.

  • Internal objects that are created by the SQL Server Database Engine, for example, work tables to store intermediate results for spools or sorting.

First, the primary talking points:

  • Total TempDB should approximate 25-30% of the SCCM total size.
  • TempDB data files should be equally sized
  • Place TempDB on a dedicated drive, with log file
  • Create 4 (or more) equally sized data files. This helps you avoid the GAM/SGAM/PFS page contention issues described in Microsoft KB 2154845.
  • Turn off auto-growth
  • Create no more than 8 data files

Now, more detail on each of these points:

Total TempDB should approximate 25-30% of the SCCM total size.

As a starting point, calculate the total estimated size of the SCCM database. If you are not sure what that size will be, use Anthony Clendenen’s sizing calculator. Once you know the SCCM size, the total TempDB size can be calculated; for example, a 100 GB SCCM database suggests a total TempDB of roughly 25-30 GB.

TempDB data files should be equally sized

This is often overlooked. Once you have determined the total TempDB size, divide it by the number of data files (start with 4) and create EQUALLY sized data files. This will allow SQL Server to automatically choose the next available data file for operations.

Place TempDB on a dedicated drive, with log file

TempDB can be very IO intensive; if you have the ability to choose SSD storage for TempDB, do it. There is no benefit to separating the TempDB log file onto another drive; place all TempDB data and log file(s) on a single logical drive.

Create 4 (or more) equally sized data files. This helps you avoid the GAM/SGAM/PFS page contention issues described in Microsoft KB 2154845.

Page contention can occur on TempDB. By creating multiple data files, you reduce contention, as SQL Server will automatically roll to another TempDB data file if one is in use, and performance increases accordingly.

Turn off auto-growth

If you have a logical drive dedicated for TempDB data and log file(s), consider filling the drive with the TempDB data files and turning off file Autogrowth.

Create no more than 8 data files

In general, never create more than 8 data files for TempDB. For SCCM, 4 equally sized data files seems to work quite well.
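As a sketch, the points above could be applied with T-SQL similar to the following (the drive letter, the extra logical file names, and the 8 GB per-file sizing are hypothetical; substitute the values you calculated from the SCCM estimate):

```sql
-- Resize the default data file to its share of the total and disable auto-growth
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, SIZE = 8192MB, FILEGROWTH = 0);

-- Add three more equally sized data files on the dedicated TempDB drive
ALTER DATABASE tempdb ADD FILE (NAME = tempdev2, FILENAME = 'T:\TempDB\tempdev2.ndf', SIZE = 8192MB, FILEGROWTH = 0);
ALTER DATABASE tempdb ADD FILE (NAME = tempdev3, FILENAME = 'T:\TempDB\tempdev3.ndf', SIZE = 8192MB, FILEGROWTH = 0);
ALTER DATABASE tempdb ADD FILE (NAME = tempdev4, FILENAME = 'T:\TempDB\tempdev4.ndf', SIZE = 8192MB, FILEGROWTH = 0);

-- Keep a single log file, also pre-sized with auto-growth off
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, SIZE = 4096MB, FILEGROWTH = 0);
```

Size changes take effect immediately; a restart of the SQL Server service is only required if you also relocate existing TempDB files to the new drive.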

Posted in Uncategorized | 5 Comments

SQL Backup to URL (Azure Storage account)

How to back up a copy of an on-premises database, or an Azure-hosted SQL Server database, to an Azure Storage account.

Notes: Any names that need to be supplied at creation time are indicated with <>. Names such as <sqlstorageaccountname> used in the examples are suggestions only, and should be entered without the <> symbols! You are encouraged to develop your own naming conventions. Screen captures shown are representative examples of each task.

Backup Steps

  1. Create an Azure Storage account
  2. Create a Credential
  3. Perform a database backup

Create an Azure Storage account

In order to back up a SQL Server database to Azure, you will first need to create an Azure Storage account.

The steps to back up a database to a URL are as follows:

  1. Log in to the Azure Portal
  2. Locate Storage, click Add
  3. Create the Storage account – enter the following information:
    1. Name – <sqlstorageaccountname> (Note: this must be globally unique!)
    2. Account Kind – General Purpose
    3. Performance – Standard
    4. Storage service encryption – Enabled
    5. Resource group:
      1. Create new
      2. <sqlbootcamp>

  1. Open the storage account created in the last step.
  2. On the account properties page, under Blob Service, click Containers.
  3. To create a container for the SQL backups, click (+ Container):
    1. Name: <sqlbackup>
    2. Access type – Blob
    3. Click Create

Select the newly created container, then Properties. Locate the URL line on the right and copy this information to the clipboard. Note: Pasting this information into Notepad is a convenient way to store it for a subsequent lab (hint!).

It should appear similar to this (substituting your storage account and container names): https://<sqlstorageaccountname>.blob.core.windows.net/<sqlbackup>

Next, obtain the access key for this storage account. Select the (root) storage account name, then under SETTINGS, select Access keys. To the right of key1, select the copy-to-clipboard icon. Save this key to Notepad (paste); it will be used in a future step.

Creating a Credential

You will now create a Credential in SQL Server, which will allow access to the Azure storage account from SSMS. This Credential is stored locally within SQL Server.

Take a moment to review the following example, in which a SQL Server credential is created for authentication to the Microsoft Azure Blob storage service.

Using storage account identity and access key

Notes: T-SQL Sample, <mycredentialname> = credential name used internally, <mystorageaccountname> = name of the storage account, <mystorageaccountaccesskey> = key 1 captured in previous task.

IF NOT EXISTS (SELECT * FROM sys.credentials WHERE name = '<mycredentialname>')
CREATE CREDENTIAL [<mycredentialname>]
WITH IDENTITY = '<mystorageaccountname>',
SECRET = '<mystorageaccountaccesskey>';

Now, you will translate this sample T-SQL into a credential for your Backup!

Use Notepad to build your T-SQL statement, substituting the bracketed areas with your account information:

CREATE CREDENTIAL [backup]
WITH IDENTITY = '<YourAccountName>',
SECRET = '<YourStorageAccessKey>';

IDENTITY = ‘YourAccountName‘ (Note: from Create an Azure Storage account – just the account name, not the complete FQDN)

SECRET = ‘YourStorageAccessKey‘ (Note: from Create an Azure Storage account – the storage access key obtained in the last step)

Then, copy the entire statement into the clipboard.

Open SQL Server Management Studio (SSMS) and connect to your local copy of SQL Server.

Click New Query.

Copy and paste the above T-SQL into the new query window.

Then click Execute.

The command should show Completed Successfully.
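You can also confirm the credential exists by querying sys.credentials (this uses the [backup] credential name from the example above):

```sql
-- List the credential just created; credential_identity should show
-- your storage account name (the secret is never returned)
SELECT name, credential_identity, create_date
FROM sys.credentials
WHERE name = 'backup';
```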

Perform a full database backup to URL

The following example performs a full database backup of the AdventureWorks2016 database to the Microsoft Azure Blob storage service. Take a moment to review the syntax.

Backup to URL using storage account identity and access key:

BACKUP DATABASE AdventureWorks2016
TO URL = 'https://<YourAccountName>.blob.core.windows.net/<sqlbackup>/AdventureWorks2016.bak'
WITH CREDENTIAL = 'backup',
COMPRESSION, STATS = 5;

Note: You will need to substitute <YourAccountName> and <sqlbackup> with your storage account name and container name, and substitute the credential name if necessary.

Open SQL Server Management Studio (SSMS) and connect to your local copy of SQL Server.

Click New Query.

Copy and paste the following T-SQL into the new query window.

Then click Execute.

The command should show Completed Successfully.
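Optionally, once the backup completes you can confirm the backup blob is readable from SQL Server, using the same credential (substitute the storage account, container, and credential names from the earlier steps):

```sql
-- Validate the backup in Azure Blob storage without restoring it
RESTORE VERIFYONLY
FROM URL = 'https://<YourAccountName>.blob.core.windows.net/<sqlbackup>/AdventureWorks2016.bak'
WITH CREDENTIAL = 'backup';
```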

Posted in Azure SQL Server, SQL Server | 4 Comments