SCCM High Availability Role

At MMS 2018, one of the more interesting items was the introduction of the new SCCM Site Server High Availability role (hint: this is in the 1805 Technical Preview).

Now, it could be that I’m only slightly partial, given that it was one of the sessions I jointly presented with Kerim Hanif and Benjamin Reynolds.

After we covered the existing SCCM high availability capabilities (good background here), Kerim did a great job describing and demonstrating this role.

If we examine the following architecture:


You’ll note that the content library has been moved to an SMB share (new functionality, and one additional step toward making the site highly available). Also note that SQL Server Always On Availability Groups are added for SQL HA.

Another server (Primary2) is added to the SCCM hierarchy with the new Site Server HA role. Initially, it will operate in Passive mode.

Tip: this approach can be used to rename a site server, or upgrade the site server OS!

Once the new role is running on the new server, the original SCCM site server will continue to operate in Active mode.


The process to promote (fail over) from Primary1 to Primary2 is presently manual, and the net result is that the second server will then operate in Active mode.


This framework opens a lot of possibilities for future enhancements in the realm of High Availability!

When time permits, I’ll cover the steps involved in taking advantage of this functionality.

Posted in High Availability, SCCM | 4 Comments

Giving Back – T-SQL Tuesday

This month’s topic is giving back to the community, or if you routinely give back, how did you get started?



I’ve been fortunate in many ways to be acknowledged as an MVP, now for about 23 consecutive years! How did that start?

Years ago, in a distant galaxy, we used CompuServe to gain access to online services. In a company role as Product Manager for Microsoft Access, I started closely monitoring all threads on Access to learn more about the product. After a while, I started noticing answers that were less than ideal… in fact, some would have caused additional issues. In turn, I would answer the original poster with the best information I had. In time, those replies were noticed by Microsoft.

While at a conference in Phoenix, AZ, sometime in mid-1995, I was approached by a Microsoft PM, who read my name badge and said, “You are going to become an MVP.” Knowing very little about the MVP program (this was only the second year of the program), I asked, “What is that?!” He explained the program to me, and later that year (December 1995) I received my first MVP welcome kit and a letter (I still have it, and may have to scan it at some point).

Fast forward: I moved from MS Access into the SQL Server MVP camp when it appeared that no further development was going to occur on Jet (the Access DB engine). Trivia time – how many know that Jet Blue (the related ESE engine) was used in DNS, WINS, and other infrastructure services?

While a SQL Server MVP, I was given an opportunity to design and deploy a large SMS 1.2 (Systems Management Server) environment for an insurance company. Throughout this time, I focused on participating in the community forums to share knowledge, and through sharing, I found I often learned more! In time, this involvement led to a move from the SQL Server group to the SCCM MVP community, generally with a focus on SQL.

Getting involved as a conference speaker has been a tremendous benefit, not only for my career advancement, but for the connections and friendships that have been forged. I’ve been fortunate to have spoken at many conferences through the years: Microsoft Management Summit, Microsoft TechEd, and the Midwest Management Summit (I will be there next week).

My advice to those who would like to participate in the community: start small. We all have something we can share, and someone who can benefit from what we know. If you are not already involved, start blogging, create a Twitter account, and get involved. Most of all, have fun.

Posted in MVP, SQL, T-SQL Tuesday | Leave a comment

Enhancing WSUS database cleanup performance SQL script

There have been many reports of slow and sluggish cleanup and removal of expired updates. An old WSUS database can include thousands of updates that need to be removed.

It turns out that adding two non-clustered indexes to SUSDB can DRAMATICALLY improve removal of expired updates.

This is a fine example of a community solution.

Some background reading:

Wsus Delete Obsolete Updates – where the key is the addition of the indexes to SUSDB.
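For reference, and assuming the default SUSDB schema, the two non-clustered indexes in question can be created on their own (the full script further down also creates them if they are missing):

```sql
USE SUSDB;
GO
-- Speeds up deletion of superseded-update references during spDeleteUpdate:
CREATE NONCLUSTERED INDEX IX_tbRevisionSupersedesUpdate
    ON dbo.tbRevisionSupersedesUpdate (SupersededUpdateID);
-- Speeds up deletion of localized property rows during spDeleteUpdate:
CREATE NONCLUSTERED INDEX IX_tbLocalizedPropertyForRevision
    ON dbo.tbLocalizedPropertyForRevision (LocalizedPropertyID);
```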

Fixing WSUS – When the Best Defense is a Good Offense – Johan does an excellent job of documenting the many things folks have tried, with varying degrees of success.

Finally, an awesome solution that creates the needed indexes if they do not exist. Heavily commented. Copy and paste the following into SQL Server Management Studio.

Authored by Benjamin Reynolds (Microsoft) – be sure to follow him on Twitter.


DECLARE @BatchSize int = NULL; -- NULL = do all;  <<-- Control batch size here
DECLARE @IndexRetry bit = 1;
DECLARE  @UpdateId int
        ,@CurUpdate int
        ,@TotalToDelete int
        ,@Msg varchar(2000);

-- Check to see if the delete indexes exist; if not don't try to run the cleanup:
IndexRetry:
IF INDEXPROPERTY(OBJECT_ID(N'tbRevisionSupersedesUpdate'),N'IX_tbRevisionSupersedesUpdate',N'IndexID') IS NULL
   OR INDEXPROPERTY(OBJECT_ID(N'tbLocalizedPropertyForRevision'),N'IX_tbLocalizedPropertyForRevision',N'IndexID') IS NULL
GOTO NeedIndexes;

-- Create the worktable and fill it with the list of obsolete updates:
IF OBJECT_ID(N'tempdb..#Results') IS NOT NULL
    DROP TABLE #Results; -- "DROP TABLE IF EXISTS" is only available on SQL Server 2016+
CREATE TABLE #Results (RowNum int IDENTITY(1,1) NOT NULL PRIMARY KEY, UpdateId int NOT NULL);
INSERT INTO #Results (UpdateId)
EXECUTE dbo.spGetObsoleteUpdatesToCleanup;

-- If a batch size was provided, trim the table so we only take care of that many items
-- during this run (when @BatchSize is NULL the comparison is unknown and nothing is deleted):
DELETE #Results
 WHERE RowNum > @BatchSize;

-- Assign working variables:
SELECT @TotalToDelete = MAX(RowNum)
  FROM #Results;
SELECT @Msg = 'Total Updates to Delete: '+CONVERT(varchar(10),@TotalToDelete);
RAISERROR(@Msg,0,1) WITH NOWAIT;

-- Loop to delete the updates one at a time:
WHILE EXISTS (SELECT * FROM #Results)
BEGIN
    -- Grab the "current" item:
    SELECT  TOP 1 @CurUpdate = RowNum
           ,@UpdateId = UpdateId
      FROM #Results
     ORDER BY RowNum;
    -- Provide some info during the script runtime:
    SELECT @Msg = CONVERT(varchar(30),GETDATE(),20) + ': Deleting ' + CONVERT(varchar(5),@CurUpdate) + '/' + CONVERT(varchar(5),@TotalToDelete) + ' = ' + CONVERT(varchar(10), @UpdateId);
    RAISERROR(@Msg,0,1) WITH NOWAIT;
    -- Delete the current update from the DB:
    EXECUTE dbo.spDeleteUpdate @localUpdateID = @UpdateId;
    -- Delete the current update from the worktable so we can get the next item:
    DELETE #Results
     WHERE RowNum = @CurUpdate;
END;
GOTO EndScript;

-- If the indexes don't exist we'll try to create them and start over, or end if we already tried once:
NeedIndexes:
IF @IndexRetry = 0
BEGIN
    PRINT N'Indexes required to run this script do not exist! Create them and re-run for optimal performance!';
    GOTO EndScript;
END;
IF INDEXPROPERTY(OBJECT_ID(N'tbRevisionSupersedesUpdate'),N'IX_tbRevisionSupersedesUpdate',N'IndexID') IS NULL
BEGIN
    SELECT @Msg = CONVERT(varchar(30),GETDATE(),20) + ': Index "IX_tbRevisionSupersedesUpdate" does not exist; Creating it...';
    RAISERROR(@Msg,0,1) WITH NOWAIT;
    EXECUTE (N'USE [SUSDB]; CREATE NONCLUSTERED INDEX IX_tbRevisionSupersedesUpdate ON dbo.tbRevisionSupersedesUpdate(SupersededUpdateID);');
    SELECT @Msg = CONVERT(varchar(30),GETDATE(),20) + ': ..."IX_tbRevisionSupersedesUpdate" created.';
    RAISERROR(@Msg,0,1) WITH NOWAIT;
END;
IF INDEXPROPERTY(OBJECT_ID(N'tbLocalizedPropertyForRevision'),N'IX_tbLocalizedPropertyForRevision',N'IndexID') IS NULL
BEGIN
    SELECT @Msg = CONVERT(varchar(30),GETDATE(),20) + ': Index "IX_tbLocalizedPropertyForRevision" does not exist; Creating it...';
    RAISERROR(@Msg,0,1) WITH NOWAIT;
    EXECUTE (N'USE [SUSDB]; CREATE NONCLUSTERED INDEX IX_tbLocalizedPropertyForRevision ON dbo.tbLocalizedPropertyForRevision(LocalizedPropertyID);');
    SELECT @Msg = CONVERT(varchar(30),GETDATE(),20) + ': ..."IX_tbLocalizedPropertyForRevision" created.';
    RAISERROR(@Msg,0,1) WITH NOWAIT;
END;
SET @IndexRetry = 0;
GOTO IndexRetry;

EndScript:
PRINT N'Script complete.';


Posted in SCCM, SQL Server, Uncategorized, WSUS | 29 Comments

Backup SQL Server to an Azure Storage Account

One of the benefits of backing up SQL Server to Azure is an immediate “off-site” storage solution. Another is that the Azure storage account can be geo-replicated for additional backup copies.

There are a few ways to make this work; here are the details I worked out for a “hands-on” lab.

For this lab, you will back up a copy of an on-premises SQL Server database to an Azure Storage account.

LAB Notes: Names that must be supplied at creation time are indicated with <>, such as <sqlstorageaccountname> in the next example. These names are suggestions only; do not include the <> symbols when entering them! You are encouraged to develop your own naming conventions. Screen captures shown are representative examples of each task.

Steps involved:

  1. Create an Azure Storage account
  2. Create a Credential
  3. Perform a full database backup to a URL

Create an Azure Storage account

To back up a SQL Server database to Azure, you will first need to create an Azure Storage account.

  1. Log in to the Azure Portal
  2. Locate Storage accounts, click Create
  3. Create Storage Account – enter the following information:
    1. Name – <sqlstorageaccountname> Note: this must be globally unique!
    2. Account Kind – General Purpose
    3. Performance – Standard
    4. Storage service encryption – Enabled
    5. Resource group
      1. Create new
      2. <sqlbootcamp>



  1. Within the storage account (created in the last step), on the properties page, under Blob Service, click Containers.
  2. To create a container for the SQL backups, click (+ Container)
    1. Name: <sqlbackup>
    2. Access type – Private (default)
    3. Click OK


Select the newly created container (last step), then Properties. Locate the URL line on the right and copy this information to the clipboard.

Note: Pasting this information into Notepad is a convenient way to store this information for a subsequent lab (hint!).


It should appear similar to this (substituting your storage account and container names):
https://<sqlstorageaccountname>.blob.core.windows.net/<sqlbackup>

Next, obtain the access key for this storage account. Select the (root) storage account name, then under SETTINGS, select Access keys. To the right of key1, select the copy-to-clipboard icon. Paste this key into Notepad – it will be used in a future step.



Creating a Credential

You will now create a Credential within SQL Server, which will allow access to the Azure storage account from SSMS. This Credential is stored locally within SQL Server.

Using storage account identity and access key

Take a moment and review the following example – a SQL Server credential is created to allow authentication to the Microsoft Azure Blob storage service.

CREATE CREDENTIAL [backup] WITH IDENTITY = '<YourAccountName>'  
,SECRET = '<YourStorageAccessKey>';  

Tip: Use Notepad to build your T-SQL statement – Substitute the following <bracketed> areas next with your account information.

CREDENTIAL [backup]   Note: Whatever credential name you use here, you will need to specify it in the following task, “Perform a full database backup to a URL”

IDENTITY = '<YourAccountName>'   Note: from Create an Azure Storage account, Step 3, item 1 – just the account name, not the complete FQDN

SECRET = '<YourStorageAccessKey>'   Note: from Create an Azure Storage account – the access key obtained in the last step

Then, copy the entire statement into the clipboard.

Open SQL Server Management Studio (SSMS) and connect to your local copy of SQL Server.

Click New Query.

Copy and paste the above T-SQL into the new query window.

Then click Execute.

The command should show Completed Successfully.

Perform a full database backup to URL

The following example performs a full database backup of the AdventureWorks2014 database to the Microsoft Azure Blob storage service. Take a moment and review the syntax.

Backup To URL using storage account identity and access key

BACKUP DATABASE AdventureWorks2014 
TO URL = 'https://<YourAccountName>.blob.core.windows.net/<sqlbackup>/AdventureWorks2014.bak' 
WITH CREDENTIAL = 'backup', COMPRESSION, STATS = 5;

Note: Using Notepad, substitute YourAccountName and sqlbackup with your storage account name and container name, and use the credential name (backup) created in the preceding step. Be sure to copy the entire SQL statement!

Open SQL Server Management Studio (SSMS) and connect to your local copy of SQL Server.

Click New Query.

From NOTEPAD – Copy and paste the T-SQL directly into the new query window.

Then click Execute.

The command should show Completed Successfully.

The following sample OUTPUT is provided:

5 percent processed.
10 percent processed.
15 percent processed.
20 percent processed.
25 percent processed.
30 percent processed.
35 percent processed.
40 percent processed.
45 percent processed.
50 percent processed.
55 percent processed.
60 percent processed.
65 percent processed.
70 percent processed.
75 percent processed.
80 percent processed.
85 percent processed.
90 percent processed.
95 percent processed.
Processed 24288 pages for database 'AdventureWorks2014', file 'AdventureWorks2014_Data' on file 1.
100 percent processed.
Processed 2 pages for database 'AdventureWorks2014', file 'AdventureWorks2014_Log' on file 1.
BACKUP DATABASE successfully processed 24290 pages in 91.076 seconds (2.083 MB/sec).
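As a quick sanity check (a sketch, assuming the same credential name and URL placeholders used above), you can confirm the backup written to Azure is readable without restoring it:

```sql
-- Verify the backup blob in Azure is a valid, readable backup:
RESTORE VERIFYONLY
FROM URL = 'https://<YourAccountName>.blob.core.windows.net/<sqlbackup>/AdventureWorks2014.bak'
WITH CREDENTIAL = 'backup';
```

If all is well, the output will report that the backup set is valid.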
Posted in Azure, Azure SQL Server, SCCM, SQL Server | Leave a comment

MMS 2018 Sessions

I’m honored to be speaking at MMS, and to have some excellent co-presenters!

If you are attending MMS 2018 please take a moment and check out our sessions. Sign up now, and please stop by to say hello!

Configuring SQL Performance for System Center

Optimal performance for SQL Server starts with proper configuration, design and implementation. Learn from industry experts how to properly configure your SQL Server Instance to work best with System Center applications.
What you will learn:

  • Tuning and optimization techniques
  • Important considerations for sizing your database for ConfigMgr, SCOM, VMM, and more
  • How to correct common misconfigurations

Speakers: Benjamin Reynolds and Steve Thompson

SQL Server AlwaysOn Availability Groups

Have you wondered about this SQL Always On Availability Group stuff? What is it? What considerations are involved in the design? How do you install it? Learn how you can use this to extend ConfigMgr high availability capabilities.
What you will learn:

  • SQL Always On Availability Group design considerations
  • SQL Always On Availability Group interactions with ConfigMgr
  • SQL Always On Availability Group important capabilities

Speakers: Benjamin Reynolds and Steve Thompson

High Availability Options for ConfigMgr and SQL Server Always On Availability

Adding to the material delivered in the SQL Server Always On Availability Groups session, this session will go deeper on:
– Site Server HA (High Availability)
– Remote Site Server Content Library
– Report Server and HA
– What Reference Architecture(s) should you consider?
– MSIT – Lessons learned (architecture, deployment, etc.)
What you will learn:

  • Learn more about ConfigMgr Site server HA and futures.
  • What do you need to know about other roles (such as Report Server) and HA
  • Learn how MSIT is handling these challenges at scale!

Speakers: Kerim Hanif, Benjamin Reynolds and Steve Thompson

Posted in MMS, MMS 2018 | 2 Comments

How to apply a SQL Server update for ConfigMgr

    This example illustrates applying the latest SQL Server Cumulative Update to SQL Server 2016; this CM site server happens to be running Current Branch 1710. The same process can be used to apply a SQL Server service pack as well.


    SQL Backup – double-check that you have a recent SQL Server backup for CM; that is, something less than 24 hours old.

    Update Center for Microsoft SQL Server – if you don’t have this site bookmarked, you should. At this time, the latest SQL Server 2016 update is CU7. Download that update from the download page.


    Optionally, you can extract this update by running <updatename>.exe /X

    Note: I tend to extract if I need to perform more than one update

    Plan for a SQL Server restart. Before running the update, stop the SCCM services (this helps avoid contention and speeds up the process).

    On the CM Site server, stop the following services:


    On the SQL Server, from the folder where you downloaded the update, right-click the update and select Run as administrator


    Agree to license terms, click Next


    Default settings, Click Next


    Wait for file in use check…


    Click Next


    Click Update


    Review completed changes, close.


    I prefer to reboot the SQL Server at this point so the updated binaries take effect; services will restart automatically. If CM is collocated, its services will also restart. If CM is not collocated, restart the CM services or the server to complete the update.

    Once SQL Server restarts, open SQL Server Management Studio and run the following as a query

    SELECT @@VERSION;

    to validate latest version.
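If you want more detail than @@VERSION provides, SERVERPROPERTY can report the build and update level directly (ProductUpdateLevel is available on SQL Server 2012 SP1 and later):

```sql
-- Report the exact build, service pack level, and cumulative update level:
SELECT  SERVERPROPERTY('ProductVersion')     AS BuildNumber,       -- e.g. 13.0.xxxx.x
        SERVERPROPERTY('ProductLevel')       AS ServicePackLevel,  -- e.g. RTM, SP1
        SERVERPROPERTY('ProductUpdateLevel') AS CumulativeUpdate;  -- e.g. CU7
```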


Posted in ConfigMgr, SQL Server, SQL Server 2016 | Leave a comment

Azure Data Migration Service Review

The Data Migration Service (DMS) is a new Azure service still in preview. DMS will help you migrate existing on-premises SQL Server, Oracle, and MySQL databases to Azure SQL Database, Azure SQL Database Managed Instance or SQL Server on an Azure virtual machine.

Since it is now in public preview, for the time being, it is “free” to use.

It has several advantages for migrating data: once the data is migrated to the target, it is possible to run continuous synchronization, or synchronize as needed to keep the target DB up to date. Best of all, only changes (deltas) are synchronized from source to target on subsequent synchronizations.

The easiest way to test DMS is to stand up SQL Server running on an Azure VM. If the source DB will be on-premises, you will need a VNET in place to Azure.

There is a useful migration guide as well: the Database Migration Guide.

Here are the high-level steps involved:

1. Upgrade (or install latest) Database Migration Assistant (DMA) to v3.3

2. Create blank Azure SQL DB

3. Open firewall rules on server for Azure SQL DB

4. Run DMA assessment (New Project)

5. Run DMA Migration (New Project)

a. Generate Schema only

b. Deploy Schema

6. Register DMS Service in Azure

7. Using DMS:

a. Create a new DMS Service

b. Schedule a data synchronize task

c. Run task
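For step 3, the server-level firewall rule can also be created with T-SQL against the logical server’s master database rather than through the portal (a sketch; the rule name and IP addresses are placeholders):

```sql
-- Run in the master database of the Azure SQL logical server:
EXECUTE sp_set_firewall_rule
    @name = N'AllowMigrationHost',
    @start_ip_address = '<your public IP>',
    @end_ip_address = '<your public IP>';
```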

From the Azure Portal, create an empty Azure SQL DB and open the firewall rules (as needed) for the SQL server container. I named mine DMA2; this will be your TARGET DB.

To prepare the SOURCE server:

In Azure, I created a SQL Server 2016 Standard Instance on Server 2016 (used a standard template).

On the source server, I restored the AdventureWorks2012 sample database.

Download SQL Data Migration Assistant v3.3

DMA – Start a new project, Migrate Schema only (using source DB from SQL 2016, on Server 2016), click Next


Choose the TARGET DB, click Next


Allow default settings to be selected, click Generate SQL Script


If desired, click Save to capture the script. Further, review any potential issues. Click Deploy Schema.


You should see the schema successfully deploy


Register Microsoft.DataMigration Provider

From the Azure portal, browse to Subscriptions > Resource Providers > search on ‘data’ > locate Microsoft.DataMigration and click ‘Register’


Once Microsoft.DataMigration shows registered you can proceed:


Add the Database Migration Service

From the Azure portal Marketplace, search on ‘data migration service’


Click Create


Enter unique Service Name (for this example, used DMSMigration2), choose pricing tier, then Create.

From the new Data Migration Service, choose New Migration Service


Enter the Source Server name (FQDN) or IP address


Select the source DB name


Enter the TARGET Azure SQL DB, SQL Server container name, click Save


Click Configure Required Settings, then choose Do not validate my database(s), click Save


Review the summary dialog, then click Run Migration


Migration in progress


Migration complete


Posted in Azure SQL Server, Migration, SQL Server 2016 | Leave a comment

TSQL2sday #94–SQL Server and PowerShell

T-SQL Tuesday (#tsql2sday) – I’ve been absent for a few rounds, and figured it was time to jump back in!

SQL and PowerShell, what a great combination. First, let me thank the sponsor, you can find his site from the following link.


One of the first scripting languages I mastered for automation was VBScript. I know. Stop laughing.

However, that background gave me the ability to learn PowerShell and use it in ways to automate the more mundane aspects of SQL Server administration. Two examples of how PowerShell has proved to be an awesome “force multiplier” follow.

Using PowerShell to Automate a SQL Server database restore

The challenge was taking a backup from one SQL Server, written to a server UNC path, and restoring it to another SQL Server on a daily basis. This process is documented here. There were two challenging parts: identifying the latest backup file, since it was written out with a unique name, and restoring it to a different location on the target server. Review the code to see these solutions.
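Under the covers, the restore that the PowerShell issues amounts to T-SQL along these lines (a hedged sketch; the share path, database name, and logical file names are hypothetical):

```sql
-- Restore the most recent backup from the UNC share, relocating the
-- data and log files to different paths on the target server:
RESTORE DATABASE AdventureWorks2014
FROM DISK = N'\\BackupServer\SQLBackups\AdventureWorks2014_20180101.bak'
WITH MOVE N'AdventureWorks2014_Data' TO N'E:\SQLData\AdventureWorks2014.mdf',
     MOVE N'AdventureWorks2014_Log'  TO N'L:\SQLLogs\AdventureWorks2014_log.ldf',
     REPLACE, STATS = 10;
```

The WITH MOVE clauses are what allow the restore to land in a different location than the source server used.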

The bonus was not only automation, but a huge time savings to complete the restore! Using the UI to restore the ~650 GB database took over 2.5 hours; using PowerShell and the ADO methods, less than 1 hour!

Using PowerShell to Audit SQL Server Instances

Tasked with investigating many different SQL Server instances for configuration and possible performance issues, and with documenting settings, I used PowerShell to gather this information via queries and then place the results in an Excel workbook. This provides guidance in evaluating and recommending configuration improvements. Plus, you’ll have a baseline of current instance and database settings. This process is documented here.
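As a small example of the kind of query the audit gathers (a sketch; pick whichever settings matter in your environment):

```sql
-- Instance-level settings commonly reviewed during a configuration audit:
SELECT name, value_in_use
  FROM sys.configurations
 WHERE name IN ( 'max server memory (MB)'
                ,'min server memory (MB)'
                ,'max degree of parallelism'
                ,'cost threshold for parallelism' )
 ORDER BY name;
```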

This PowerShell script is extensible; I’ve re-written it once to take advantage of a reusable connection object for better performance. It will likely be enhanced again in the near future.


If you’ve been thinking of automating your SQL Server instance or database administration, check out PowerShell. The time invested in learning and creating solutions can save a LOT of time, and it will free you up to work on other things. Do share your creative solutions with the community.

Posted in PowerShell, SQL Server | Leave a comment

Installing Power BI Report Server

Thought I’d post my initial impressions of installing Power BI Report Server. Unless you’ve been disconnected from the web for any length of time, you’ve heard of Power BI. In May 2017, Microsoft released the Power BI Premium service tier, which includes the Power BI Report Server.

The home page will remind you of SSRS:


You have the option to install PBI Report Server as standalone service, or on the same server with SQL Server and/or SQL Server Reporting Services. In any event, you need to install PBI Report Server on a server running a SQL Server instance.

If you are familiar with the Power BI Desktop software, note that there is a new PBI Desktop build you’ll need to install to work with PBI Report Server.

Here are the details and link:

Microsoft Power BI Report Server – June 2017 GA


Power BI Report Server, available as part of Power BI Premium, enables on-premises web and mobile viewing of Power BI reports, plus the enterprise reporting capabilities of SQL Server Reporting Services.

In this version of Power BI Report Server, you can:

  • Connect “live” to Analysis Services models – both Tabular and Multidimensional (cubes)
  • Visually explore data and create an interactive report
  • Use custom visuals in your Power BI Report
  • Save that report to your report server running the preview
  • View and interact with the report in your web browser
  • Add and view comments about the report in the web browser
  • View and interact with the report in the Power BI Mobile App


Let’s walk through an install of Power BI Report Server

Run PowerBIReportServer.exe, select install PBI Report Server


Choose evaluation version, unless you have the product key, click Next


Agree to license terms, click Next


You’re installing on a server with the database engine installed, click Next


Click Install


Monitor progress…


Installation complete.


Next up, configure the report server.

Connect to your local SQL Server instance


Note that once connected, your instance will be:



Next, click the Web Service URL on the left and change the Virtual Directory name to ReportServerPBI (this allows SSRS to co-exist). Click Apply.


Click database, on left

Click Next


Click Test Connection to validate. If everything is fine, connection succeeded


Add a PBI suffix to the database name (so it doesn’t overwrite the SSRS database)

Click Next


Use default service credentials, click next


Review summary, click next


Completion dialog

Click Finish


Completed successfully


Click Web Portal URL on left, then add PBI suffix to Virtual Directory name, click Apply


Click the hyperlink shown in the last dialog and it should take you to the Power BI Report Server.


Watch this blog for future updates on using the PBI Server and the new PBI Desktop software.

Posted in Power BI Report Server, PowerBI, SQL, SQL Server | Leave a comment

Properly configure SQL Server ConfigMgr database

Ideally, this will take place before you install ConfigMgr!

First, estimate size(s) of the databases:

How? Use the  SCCM DB Sizing estimator!

Shown is an example of the Database sizing worksheet, based on estimated client counts:


Anthony has documented this approach as follows:

Get the sizing estimator here.

Using the estimator to calculate the number and size of the data files, pre-create the SQL Server database for ConfigMgr. The key takeaway will be the total estimated sizes.

Multiple data files may be appropriate for a couple of reasons. SQL Server is multi-threaded, which means you may see performance gains from having multiple data files. Further, these data files can be located on different logical drives to distribute load and increase IOPS. As a general rule of thumb, use the following as a guideline:

Total Estimated Client Count | Number of CM data files
-----------------------------|------------------------
< 10K                        |
10K to 25K                   |
>25K to 50K                  |

Other factors to consider:

  • Drives dedicated for SQL data and logs should be formatted with NTFS in 64K block size.
  • Always place TempDB data and logs on a dedicated logical drive
  • DB files should be of equal size (hint: divide the total estimated size by the number of data files)
  • Create data files sized to an even GB
  • Always pre-create the DB files prior to installing CM – auto-growth is bad.
  • If you enable auto-growth:
    • Do not use percent (%) growth
    • Set auto-growth to a value evenly divisible into a whole GB (e.g., 512 MB or 1024 MB)
  • Monitor available free space
  • Always use a single LOG file
  • If possible, place LOG file on separate logical drive
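Putting the guidelines above together, a pre-created CM database might look like this (a hypothetical sketch: the database name CM_PS1, file sizes, and drive letters are illustrative only, and the file count/sizes should come from your own estimator results):

```sql
-- Pre-create the ConfigMgr database with four equal, evenly sized data files
-- on separate logical drives, and a single log file on its own drive:
CREATE DATABASE CM_PS1
ON PRIMARY
    (NAME = CM_PS1_1, FILENAME = N'E:\SQLData\CM_PS1_1.mdf', SIZE = 25GB, FILEGROWTH = 1024MB),
    (NAME = CM_PS1_2, FILENAME = N'F:\SQLData\CM_PS1_2.ndf', SIZE = 25GB, FILEGROWTH = 1024MB),
    (NAME = CM_PS1_3, FILENAME = N'G:\SQLData\CM_PS1_3.ndf', SIZE = 25GB, FILEGROWTH = 1024MB),
    (NAME = CM_PS1_4, FILENAME = N'H:\SQLData\CM_PS1_4.ndf', SIZE = 25GB, FILEGROWTH = 1024MB)
LOG ON
    (NAME = CM_PS1_log, FILENAME = N'L:\SQLLogs\CM_PS1_log.ldf', SIZE = 16GB, FILEGROWTH = 1024MB);
```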

It is possible to migrate a single data file to multiple data files. Use this reference to get started. It is an older article and the screen shots are missing; however, the technique is valid.

Note: Be cautious if you attempt to implement in a multi-site hierarchy! Table partitioning may present an issue.

Final thoughts: As always, there is no single right or wrong way to do this; this is intended to be used as a guideline only. Test in a lab environment first. No cute furry animals were harmed in this process, and never stick your fingers inside the cage.

Posted in ConfigMgr, SQL, SQL Server | 18 Comments