Enhancing WSUS database cleanup performance SQL script

There have been many reports of slow, sluggish cleanup and removal of expired updates. An old WSUS database can contain thousands of updates that need to be removed.
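To gauge the size of the backlog before running the full cleanup, you can count the obsolete updates using the same WSUS stored procedure the script below relies on. A quick sketch (read-only, nothing is deleted):

```sql
USE [SUSDB];

-- Capture the list of obsolete updates without deleting anything:
CREATE TABLE #Obsolete (UpdateId int NOT NULL);
INSERT INTO #Obsolete (UpdateId)
EXECUTE dbo.spGetObsoleteUpdatesToCleanup;

-- How big is the backlog?
SELECT COUNT(*) AS ObsoleteUpdateCount FROM #Obsolete;

DROP TABLE #Obsolete;
```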

It turns out that adding two non-clustered indexes to SUSDB can DRAMATICALLY improve removal of expired updates.
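For reference, these are the two indexes in question – the full script below will create them automatically if they are missing:

```sql
USE [SUSDB];

-- Speeds up supersedence lookups during spDeleteUpdate:
CREATE NONCLUSTERED INDEX IX_tbRevisionSupersedesUpdate
    ON dbo.tbRevisionSupersedesUpdate (SupersededUpdateID);

-- Speeds up localized-property cleanup during spDeleteUpdate:
CREATE NONCLUSTERED INDEX IX_tbLocalizedPropertyForRevision
    ON dbo.tbLocalizedPropertyForRevision (LocalizedPropertyID);
```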

This is a fine example of a community solution.

Some background reading:

Wsus Delete Obsolete Updates – where the key is the addition of the indexes to SUSDB.

Fixing WSUS – When the Best Defense is a Good Offense – Johan does an excellent job of documenting the many things folks have tried, with varying degrees of success.

Finally, an awesome solution that creates the needed indexes if they do not exist. Heavily commented. Copy and paste the following into SQL Server Management Studio.

Authored by Benjamin Reynolds (Microsoft) – be sure to follow him on Twitter.

USE [SUSDB];

DECLARE @BatchSize int = NULL; -- NULL = do all;  <<-- Control batches here

SET NOCOUNT ON;

-- Check to see if the delete indexes exist; if not don't try to run the script:
DECLARE @IndexRetry bit = 1;
IndexRetry:
IF INDEXPROPERTY(OBJECT_ID(N'tbRevisionSupersedesUpdate'),N'IX_tbRevisionSupersedesUpdate',N'IndexID') IS NULL
   OR INDEXPROPERTY(OBJECT_ID(N'tbLocalizedPropertyForRevision'),N'IX_tbLocalizedPropertyForRevision',N'IndexID') IS NULL
GOTO NeedIndexes;

-- Create tables/variables:
DROP TABLE IF EXISTS #Results; -- DROP TABLE IF EXISTS requires SQL Server 2016 or newer
DECLARE  @UpdateId int
        ,@CurUpdate int
        ,@TotalToDelete int
        ,@Msg varchar(2000);
CREATE TABLE #Results (RowNum int IDENTITY(1,1) PRIMARY KEY CLUSTERED NOT NULL, UpdateId int NOT NULL);
INSERT INTO #Results (UpdateId)
EXECUTE dbo.spGetObsoleteUpdatesToCleanup;

-- If a batch size was provided update the table so we only take care of that many items during this run:
IF @BatchSize IS NOT NULL
DELETE #Results
 WHERE RowNum > @BatchSize;

-- Assign working variables:
SELECT @TotalToDelete = MAX(RowNum)
  FROM #Results;
-- 
SELECT @Msg = 'Total Updates to Delete: '+CONVERT(varchar(10),@TotalToDelete);
RAISERROR (@Msg,0,1) WITH NOWAIT;

-- Create the loop to delete the updates one at a time:
WHILE EXISTS (SELECT * FROM #Results)
BEGIN
    -- Grab the "current" item:
    SELECT  TOP 1 @CurUpdate = RowNum
           ,@UpdateId = UpdateId
      FROM #Results
     ORDER BY RowNum;
    
    -- Provide some info during the script runtime:
    SELECT @Msg = CONVERT(varchar(30),GETDATE(),20) + ': Deleting ' + CONVERT(varchar(5),@CurUpdate) + '/' + CONVERT(varchar(5),@TotalToDelete) + ' = ' + CONVERT(varchar(10), @UpdateId);
    RAISERROR(@Msg,0,1) WITH NOWAIT;
    
    -- Delete the current update from the DB:
    EXECUTE dbo.spDeleteUpdate @localUpdateID = @UpdateId;
    
    -- Delete the current update from the table so we can get the next item:
    DELETE #Results
     WHERE RowNum = @CurUpdate;
END;
GOTO EndScript;

NeedIndexes:
-- If the indexes don't exist we'll try to create them and start over or end if we already tried once:
IF @IndexRetry = 0
BEGIN
    PRINT N'Indexes Required to run this script do not exist! Create them and re-run for optimal performance!';
    GOTO EndScript;
END;
ELSE
BEGIN
    IF INDEXPROPERTY(OBJECT_ID(N'tbRevisionSupersedesUpdate'),N'IX_tbRevisionSupersedesUpdate',N'IndexID') IS NULL
    BEGIN
        SELECT @Msg = CONVERT(varchar(30),GETDATE(),20) + ': Index "IX_tbRevisionSupersedesUpdate" does not exist; Creating it...';
        RAISERROR(@Msg,0,1) WITH NOWAIT;
        EXECUTE (N'USE [SUSDB]; CREATE NONCLUSTERED INDEX IX_tbRevisionSupersedesUpdate ON dbo.tbRevisionSupersedesUpdate(SupersededUpdateID);');
        SELECT @Msg = CONVERT(varchar(30),GETDATE(),20) + ': ..."IX_tbRevisionSupersedesUpdate" created.';
        RAISERROR(@Msg,0,1) WITH NOWAIT;
    END;
    IF INDEXPROPERTY(OBJECT_ID(N'tbLocalizedPropertyForRevision'),N'IX_tbLocalizedPropertyForRevision',N'IndexID') IS NULL
    BEGIN
        SELECT @Msg = CONVERT(varchar(30),GETDATE(),20) + ': Index "IX_tbLocalizedPropertyForRevision" does not exist; Creating it...';
        RAISERROR(@Msg,0,1) WITH NOWAIT;
        EXECUTE (N'USE [SUSDB]; CREATE NONCLUSTERED INDEX IX_tbLocalizedPropertyForRevision ON dbo.tbLocalizedPropertyForRevision(LocalizedPropertyID);');
        SELECT @Msg = CONVERT(varchar(30),GETDATE(),20) + ': ..."IX_tbLocalizedPropertyForRevision" created.';
        RAISERROR(@Msg,0,1) WITH NOWAIT;
    END;
    
    SET @IndexRetry = 0;
    GOTO IndexRetry;
END;

EndScript:
DROP TABLE IF EXISTS #Results;

Posted in SCCM, SQL Server, Uncategorized, WSUS | 17 Comments

Backup SQL Server to an Azure Storage Account

One of the benefits of backing up SQL Server to Azure is an immediate “off-site” storage solution. Another is that the Azure storage account can be geo-replicated for additional backup copies.

There are a few ways to make this work; here are the details I was able to work out for a “hands-on” lab.

For this lab, you will back up a copy of an on-premises SQL Server database to an Azure Storage account.

LAB Notes: Any names that need to be added at creation time are indicated with <>, such as <sqlstorageaccountname> used in the next example. These are suggestions only and do not include the <> symbols! You are encouraged to develop your own naming conventions. Screen captures shown are representative examples of each task.

Steps involved:

  1. Create an Azure Storage account
  2. Create a Credential
  3. Perform a full database backup to a URL

Create an Azure Storage account

In order to backup a SQL Server database to Azure, you will need to first create an Azure Storage account.

  1. Log in to the Azure Portal https://portal.azure.com
  2. Locate Storage accounts, click Create
  3. Create Storage Account – Enter the following information:
    1. Name – <sqlstorageaccountname>  Note: this must be globally unique!
    2. Account Kind – General Purpose
    3. Performance – Standard
    4. Storage service encryption – Enabled
    5. Resource group
      1. Create new
      2. <sqlbootcamp>

      Lab1-img1

 

  1. Open the storage account created in the last step.
  2. On the storage account properties page, under Blob Service, click Containers.
  3. To create a container for the SQL backups, click (+ Container)
    1. Name: <sqlbackup>
    2. Access type – Private (default)
    3. Click OK

Lab1b-2a

Select the newly created container (from the last step), then Properties. Locate the URL line on the right and copy this information to the clipboard.

Note: Pasting this information into Notepad is a convenient way to store this information for a subsequent lab (hint!).

Lab1-img3

It should appear similar to this (substituting storage container and container name):
https://<sqlstorageaccountname>.blob.core.windows.net/<sqlbackup>

Next, obtain the access key for this storage account. Select the (root) storage account name, then under SETTINGS, select Access keys. To the right of key1, select the copy-to-clipboard icon. Paste this key into Notepad – it will be used in a future step.

SNAGHTML1949f7b1

 

Creating a Credential

You will now create a Credential with SQL Server, which will allow access to the Azure storage account from SSMS. This Credential is stored locally within SQL Server.

Using storage account identity and access key

Take a moment to review the following example – a SQL Server credential is created to allow authentication to the Microsoft Azure Blob storage service.

CREATE CREDENTIAL [backup] WITH IDENTITY = '<YourAccountName>'  
,SECRET = '<YourStorageAccessKey>';  
GO

Tip: Use Notepad to build your T-SQL statement – Substitute the following <bracketed> areas next with your account information.

CREDENTIAL [backup]   Note: Whatever credential name you use here, you will need to specify it in the following task, “Perform a full database backup to a URL”.

IDENTITY = '<YourAccountName>'   Note: from Create an Azure Storage account, Step 3 A – just the account name, not the complete FQDN.

SECRET = '<YourStorageAccessKey>'   Note: from Create an Azure Storage account – the storage access key obtained in the last step.

Then, copy the entire statement into the clipboard.

Open SQL Server Management Studio (SSMS) and connect to your local copy of SQL Server.

Click New Query.

Copy and paste the above T-SQL into the new query window.

Then click Execute.

The command should show Completed Successfully.
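To confirm the credential was stored, you can query sys.credentials – a quick check (the secret itself is never displayed):

```sql
-- List server-level credentials; the new [backup] credential should appear
-- with the storage account name as its identity:
SELECT name, credential_identity, create_date
  FROM sys.credentials
 WHERE name = N'backup';
```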

Perform a full database backup to URL

The following example performs a full database backup of the AdventureWorks2014 database to the Microsoft Azure Blob storage service. Take a moment to review the syntax.

Backup To URL using storage account identity and access key

BACKUP DATABASE AdventureWorks2014 
TO URL = 'https://<YourAccountName>.blob.core.windows.net/<sqlbackup>/AdventureWorks2014.bak' WITH CREDENTIAL = '<backup>', COMPRESSION, STATS = 5;

Note: Using Notepad, substitute YourAccountName and sqlbackup with your storage account name and container name. Substitute the credential name backup used in the preceding step. Be sure to copy the entire SQL statement!

Open SQL Server Management Studio (SSMS) and connect to your local copy of SQL Server.

Click New Query.

From NOTEPAD – Copy and paste the T-SQL directly into the new query window.

Then click Execute.

The command should show Completed Successfully.

The following sample OUTPUT is provided:

Output
5 percent processed.
10 percent processed.
15 percent processed.
20 percent processed.
25 percent processed.
30 percent processed.
35 percent processed.
40 percent processed.
45 percent processed.
50 percent processed.
55 percent processed.
60 percent processed.
65 percent processed.
70 percent processed.
75 percent processed.
80 percent processed.
85 percent processed.
90 percent processed.
95 percent processed.
Processed 24288 pages for database 'AdventureWorks2014', file 'AdventureWorks2014_Data' on file 1.
100 percent processed.
Processed 2 pages for database 'AdventureWorks2014', file 'AdventureWorks2014_Log' on file 1.
BACKUP DATABASE successfully processed 24290 pages in 91.076 seconds (2.083 MB/sec).
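Once the backup completes, it is worth confirming the blob is readable. A sketch, assuming the same placeholder names used above:

```sql
-- Verify the backup in Azure Blob storage without restoring it:
RESTORE VERIFYONLY
FROM URL = 'https://<YourAccountName>.blob.core.windows.net/<sqlbackup>/AdventureWorks2014.bak'
WITH CREDENTIAL = '<backup>';
```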
Posted in Azure, Azure SQL Server, SCCM, SQL Server | Leave a comment

MMS 2018 Sessions

I’m honored to be speaking at MMS, and to have some excellent co-presenters!

If you are attending MMS 2018 please take a moment and check out our sessions. Sign up now, and please stop by to say hello!

Configuring SQL Performance for System Center

Description:
Optimal performance for SQL Server starts with proper configuration, design and implementation. Learn from industry experts how to properly configure your SQL Server Instance to work best with System Center applications.
What you will learn:

  • Tuning and optimization techniques
  • Important considerations for sizing your database for ConfigMgr, SCOM, VMM, and more
  • How to correct common misconfigurations

Speakers: Benjamin Reynolds and Steve Thompson

SQL Server AlwaysOn Availability Groups

Description:
Have you wondered about this SQL Always On Availability Group stuff? What is it? What considerations are involved in the design? How do you install it? Learn how you can use this to extend ConfigMgr high availability capabilities.
What you will learn:

  • SQL Always On Availability Group design considerations
  • SQL Always On Availability Group interactions with ConfigMgr
  • SQL Always On Availability Group important capabilities

Speakers: Benjamin Reynolds and Steve Thompson

High Availability Options for ConfigMgr and SQL Server Always On Availability

Description:
Adding to the material delivered in the SQL Server Always On Availability session, this session will go deeper on:
– Site Server HA (High Availability)
– Remote Site Server Content Library
– Report Server and HA
– What Reference Architecture(s) should you consider?
– MSIT – Lessons learned (architecture, deployment, etc.)
What you will learn:

  • Learn more about ConfigMgr Site server HA and futures.
  • What do you need to know about other roles (such as Report Server) and HA
  • Learn how MSIT is handling these challenges at scale!

Speakers: Kerim Hanif, Benjamin Reynolds and Steve Thompson

Posted in MMS, MMS 2018 | 2 Comments

How to apply a SQL Server update for ConfigMgr

    This example illustrates applying the latest SQL Server Cumulative Update to SQL Server 2016; this CM site server happens to be running current branch 1710. The same process would be used to apply a SQL Server service pack as well.

    Pre-requisites 

    SQL Backup – double-check that you have a recent SQL Server backup for CM. That is, something less than 24 hours old.
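    One way to double-check backup recency is to query the backup history in msdb. A sketch – the site database name CM_PS1 below is a hypothetical example, so substitute your own:

```sql
-- Most recent full backup for the CM site database; confirm it is < 24 hours old
-- (CM_PS1 is a placeholder -- substitute your site database name):
SELECT TOP (1)
       bs.database_name,
       bs.backup_finish_date,
       DATEDIFF(hour, bs.backup_finish_date, GETDATE()) AS HoursOld
  FROM msdb.dbo.backupset AS bs
 WHERE bs.database_name = N'CM_PS1'
   AND bs.type = 'D'  -- full database backups only
 ORDER BY bs.backup_finish_date DESC;
```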

    Update Center for Microsoft SQL Server – if you don’t have this site bookmarked, you should. At the time of writing, the latest SQL Server 2016 update is CU7. Download that update from the download page.

    clip_image008

    Optionally, you can extract this update by running <updatename>.exe /X

    Note: I tend to extract if I need to perform more than one update

    Plan for a SQL Server restart. Before running the update, stop the SCCM services (this helps avoid contention and speeds up the process).

    On the CM Site server, stop the following services:

    clip_image001

    On the SQL Server, in the folder where you downloaded the update, right-click the update and select Run as administrator.

    clip_image009

    Agree to license terms, click Next

    clip_image010

    Default settings, Click Next

    clip_image011

    Wait for file in use check…

    clip_image012

    Click Next

    clip_image013

    Click Update

    clip_image014

    Review completed changes, close.

    clip_image015

    I prefer to reboot the SQL Server at this point so the updated binaries take effect; services will restart. If CM is co-located, these services will also restart. If CM is not co-located, restart the services or the server to complete the process.

    Once SQL Server restarts, open SQL Server Management Studio and run the query

    Select @@VERSION

    to validate the latest version.

    clip_image016
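    SERVERPROPERTY gives a more targeted check than @@VERSION – for example (ProductUpdateLevel reports the CU level on builds that populate it):

```sql
-- Confirm build, servicing level, and (where populated) the CU level:
SELECT SERVERPROPERTY('ProductVersion')     AS ProductVersion,
       SERVERPROPERTY('ProductLevel')       AS ProductLevel,       -- e.g. RTM, SP1
       SERVERPROPERTY('ProductUpdateLevel') AS ProductUpdateLevel; -- e.g. CU7
```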

Posted in ConfigMgr, SQL Server, SQL Server 2016 | Leave a comment

Azure Data Migration Service Review

The Data Migration Service (DMS) is a new Azure service, still in preview. DMS helps you migrate existing on-premises SQL Server, Oracle, and MySQL databases to Azure SQL Database, Azure SQL Database Managed Instance, or SQL Server on an Azure virtual machine.

Since it is now in public preview, for the time being, it is “free” to use.

It has several advantages for migrating data: once the data is migrated to the target, it is possible to run a continuous synchronization, or synchronize as needed, to keep the target DB up to date. Best of all, only changes (deltas) are synchronized from the source to the target on subsequent synchronizations.

The easiest way to test DMS is to stand up SQL Server running on an Azure VM. If the source DB is on-premises, you will need a VNET in place to Azure.

There is a useful migration guide as well, Database Migration Guide <https://datamigration.microsoft.com/>

Here are the high-level steps involved:

  1. Upgrade (or install latest) Database Migration Assistant (DMA) to v3.3
  2. Create blank Azure SQL DB
  3. Open firewall rules on server for Azure SQL DB
  4. Run DMA assessment (New Project)
  5. Run DMA Migration (New Project)
    a. Generate Schema only
    b. Deploy Schema
  6. Register DMS Service in Azure
  7. Using DMS:
    a. Create a new DMS Service
    b. Schedule a data synchronization task
    c. Run task

From the Azure Portal, create an empty Azure SQL DB and open the firewall rules (as needed) for the SQL Server container. I named mine DMA2; this will be your TARGET DB.

To prepare the SOURCE server:

In Azure, I created a SQL Server 2016 Standard instance on Server 2016 (used a standard template).

On the source server, I restored the Adventure Works 2012 sample database:

https://github.com/Microsoft/sql-server-samples/releases/tag/adventureworks2012

Download SQL Data Migration Assistant v3.3

https://www.microsoft.com/en-us/download/details.aspx?id=53595

DMA – Start a new project, Migrate Schema only (using source DB from SQL 2016, on Server 2016), click Next

clip_image002

Choose the TARGET DB, click Next

clip_image004

Allow default settings to be selected, click Generate SQL Script

clip_image006

If desired, click Save to capture the script. Further, review any potential issues. Click Deploy Schema.

clip_image008

You should see the schema successfully deploy

clip_image010

Register Microsoft.DataMigration Provider

https://docs.microsoft.com/en-us/azure/dms/tutorial-sql-server-to-azure-sql#register-the-microsoftdatamigration-resource-provider

From the Azure portal, browse to Subscriptions > Resource Providers > search on ‘data’ > locate Microsoft.DataMigration and click ‘Register’

clip_image012

Once Microsoft.DataMigration shows registered you can proceed:

clip_image014

Add the Database Migration Service

From the Azure portal Marketplace, search on ‘data migration service’

clip_image016

Click Create

clip_image018

Enter a unique Service Name (for this example, I used DMSMigration2), choose a pricing tier, then click Create.

From the new Data Migration Service, choose New Migration Service

clip_image020

Enter the Source Server name (FQDN) or IP address

clip_image022

Select the source DB name

clip_image024

Enter the TARGET Azure SQL DB, SQL Server container name, click Save

clip_image026

Click Configure Required Settings, then choose Do not validate my database(s), click Save

clip_image028

Review the summary dialog, then click Run Migration

clip_image030

Migration in progress

clip_image002[4]

Migration complete

clip_image004[4]

Posted in Azure SQL Server, Migration, SQL Server 2016 | Leave a comment

TSQL2sday #94–SQL Server and PowerShell

T-SQL Tuesday (#tsql2sday) – I’ve been absent for a few, and figured it was time to jump back in!

SQL and PowerShell – what a great combination. First, let me thank the sponsor; you can find his site from the following link.

clip_image002

One of the first scripting languages I mastered for automation was VBScript. I know. Stop laughing.

However, that background gave me the ability to learn PowerShell and use it in ways to automate the more mundane aspects of SQL Server administration. Two examples of how PowerShell has proved to be an awesome “force multiplier” follow.

Using PowerShell to Automate a SQL Server database restore

The challenge was taking a backup from one SQL Server, written to a server UNC path, and restoring it to another SQL Server on a daily basis. This process is documented here. There were two challenging parts of this process: how to identify the latest backup file, since it was written out with a unique name, and how to restore it to a different location on the target server. Review the code to locate these solutions.

The bonus was not only automation, but a huge saving in time to complete the restore! Using the UI to restore the ~650 GB database took over 2.5 hours; using PowerShell and the ADO methods, less than 1 hour!
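The “restore to a different location” part of that process boils down to RESTORE … WITH MOVE. A minimal T-SQL sketch of the idea – the database, logical file names, and paths below are hypothetical examples, not the ones from the original script:

```sql
-- Restore a backup from a UNC path, relocating the data and log files
-- (all names and paths here are illustrative placeholders):
RESTORE DATABASE [MyDatabase]
FROM DISK = N'\\BackupServer\SQLBackups\MyDatabase_Latest.bak'
WITH MOVE N'MyDatabase_Data' TO N'E:\SQLData\MyDatabase.mdf',
     MOVE N'MyDatabase_Log'  TO N'F:\SQLLogs\MyDatabase.ldf',
     REPLACE, STATS = 10;
```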

Using PowerShell to Audit SQL Server Instances

Tasked with investigating many different SQL Server instances for configuration and possible performance issues, and with documenting settings, I investigated using PowerShell to gather this information via queries and, in turn, place the results in an Excel workbook. Done this way, it provides guidance for evaluating and making recommendations on improving settings such as configuration. Plus, you’ll have a baseline of current instance and database settings. This process is documented here.

This PowerShell script is extensible; I’ve re-written it once to take advantage of a re-usable connection object for enhanced performance. It will likely need to be enhanced again in the near future.

Summary

If you’ve been thinking of automating your SQL Server instance or database administration, check out PowerShell. The time invested in learning and creating solutions will not only save a LOT of time, it will free you up to work on other things. Do share your creative solutions with the community.

Posted in PowerShell, SQL Server | Leave a comment

Installing Power BI Report Server

Thought I’d post my initial impressions of installing Power BI Report Server. Unless you’ve been disconnected from the web for any length of time, you’ve heard of Power BI. In May 2017, Microsoft released the Power BI Premium service tier, which includes the Power BI Report Server.

The home page will remind you of SSRS:

image

You have the option to install PBI Report Server as a standalone service, or on the same server as SQL Server and/or SQL Server Reporting Services. In any event, you need to install PBI Report Server on a server running a SQL Server instance.

If you are familiar with the Power BI Desktop software, note there is a new PBI Desktop build you’ll need to install to work with PBI Report Server.

Here are the details and link:

Microsoft Power BI Report Server – June 2017 GA

clip_image001

Power BI Report Server, available as part of Power BI Premium, enables on-premises web and mobile viewing of Power BI reports, plus the enterprise reporting capabilities of SQL Server Reporting Services.

In this version of Power BI Report Server, you can:

  • Connect “live” to Analysis Services models – both Tabular and Multidimensional (cubes)
  • Visually explore data and create an interactive report
  • Use custom visuals in your Power BI Report
  • Save that report to your report server running the preview
  • View and interact with the report in your web browser
  • Add and view comments about the report in the web browser
  • View and interact with the report in the Power BI Mobile App

From <https://www.microsoft.com/en-us/download/details.aspx?id=55329>

Let’s walk through an install of Power BI Report Server

Run PowerBIReportServer.exe, select install PBI Report Server

clip_image001[4]

Choose the evaluation version unless you have a product key, click Next

clip_image002

Agree to license terms, click Next

clip_image003

You’re installing on a server with the database engine installed, click Next

clip_image004

Click Install

clip_image005

Monitor progress…

clip_image006

Installation complete.

clip_image007

Next up, configure the Report Server.

Connect to your local SQL Server instance

SNAGHTML5caa98

Note that once connected, your instance will be:

<hostname>\PBIRS

SNAGHTML5e0f68

Next, click the Web Service URL on the left and change the Virtual Directory name to ReportServerPBI (this allows SSRS to co-exist). Click Apply.

SNAGHTML6a7aaa

Click database, on left

Click Next

image

Click Test Connection to validate. If everything is fine, connection succeeded

SNAGHTML612051

Add a PBI suffix to the database name (so it doesn’t overwrite the SSRS database)

Click Next

image

Use default service credentials, click next

image

Review summary, click next

SNAGHTML630d1d

Completion dialog

Click Finish

image

Completed successfully

SNAGHTML67e9fd

Click Web Portal URL on left, then add PBI suffix to Virtual Directory name, click Apply

SNAGHTML68f969

Click the hyperlink shown in the last dialog and it should take you to the Power BI Report Server.

image

Watch this blog for future updates on using the PBI Server and the new PBI Desktop software.

Posted in Power BI Report Server, PowerBI, SQL, SQL Server | Leave a comment