Supported SQL Server versions for SCCM CB upgrades

Had a question recently on what version of SQL Server is supported for an upgrade to SCCM CB (current branch). The scenario was this: the client was running SQL Server 2012 SP2 for their SCCM 2012 R2. However, in order to upgrade to SCCM CB, they needed to be at a minimum SQL version of SQL 2012 SP3.

Then, the question became, can I just upgrade to SQL Server 2016 and skip an upgrade step?

Not with the SCCM version in place; upgrading SQL first would probably break SCCM 2012 R2. In this case, I’d recommend installing SQL 2012 SP3, upgrading SCCM to CB 1702 (or later), then upgrading to SQL Server 2016.

Reference the SCCM / SQL Server compatibility version matrix for the details.
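Before planning the upgrade path, it helps to confirm the exact build and service pack level you are starting from. A quick check to run against the SCCM instance:

```sql
-- Check the current SQL Server version, service pack level, and edition
SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion,  -- e.g. 11.0.6020.0 = SQL 2012 SP3
       SERVERPROPERTY('ProductLevel')   AS ProductLevel,    -- e.g. SP3
       SERVERPROPERTY('Edition')        AS Edition;
```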

Posted in ConfigMgr, SQL Server | 4 Comments

Properly size SQL Server TempDB for ConfigMgr

At MMS 2017, Benjamin Reynolds and I covered properly sizing TempDB as one facet in our Optimizing and Tuning SQL Server for ConfigMgr session. Because of a few conversations that occurred that week at MMS, I think it is appropriate to cover this here in a bit more detail.

As background, SQL Server TempDB is used for temporary objects created by queries, sort operations and more… from Microsoft docs:

The tempdb system database is a global resource that is available to all users connected to the instance of SQL Server and is used to hold the following:

  • Temporary user objects that are explicitly created, such as: global or local temporary tables, temporary stored procedures, table variables, or cursors.

  • Internal objects that are created by the SQL Server Database Engine, for example, work tables to store intermediate results for spools or sorting.

First, the primary talking points:

  • Total TempDB should approximate 25-30% of the SCCM total size.
  • TempDB data files should be equally sized
  • Place TempDB on a dedicated drive, with log file
  • Create 4 (or more) equally sized data files. This helps you avoid the GAM/SGAM/PFS page contention issues described in Microsoft KB 2154845.
  • Turn off auto-growth
  • Create no more than 8 data files

Now, more detail on each of these points:

Total TempDB should approximate 25-30% of the SCCM total size.

As a starting point, calculate the total estimated size of the SCCM database. If you are not sure what that size will be, use Anthony Clendenen’s sizing calculator. Once you know the SCCM size, the total TempDB size can be calculated.
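As a sketch of the arithmetic in T-SQL (CM_XYZ is a placeholder for your site database name; the 25% figure and 4 data files follow the guidance above):

```sql
-- Estimate TempDB sizing from the current CM database footprint (25% rule)
-- CM_XYZ is a placeholder; substitute your site database name
SELECT CAST(SUM(size) * 8 / 1024.0 AS DECIMAL(10,1))            AS cm_db_size_mb,
       CAST(SUM(size) * 8 / 1024.0 * 0.25 AS DECIMAL(10,1))     AS tempdb_total_mb,
       CAST(SUM(size) * 8 / 1024.0 * 0.25 / 4 AS DECIMAL(10,1)) AS per_data_file_mb  -- 4 equal files
FROM sys.master_files
WHERE database_id = DB_ID(N'CM_XYZ');  -- size is reported in 8 KB pages
```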

TempDB data files should be equally sized

This is often overlooked. Once you have determined the total TempDB size, divide it by the number of data files (start with 4) and create EQUAL size data files. This allows SQL Server to automatically choose the next available data file for operations.

Place TempDB on a dedicated drive, with log file

TempDB can be very IO intensive; if you have the ability to choose SSD storage for TempDB, do it. There is no benefit to separating the TempDB log file onto another drive; place all TempDB data and log file(s) on a single, logical drive.

Create 4 (or more) equally sized data files. This helps you avoid the GAM/SGAM/PFS page contention issues described in Microsoft KB 2154845.

Page contention can occur on TempDB. By creating multiple data files, you can reduce contention as SQL Server will automatically roll to another TempDB data file if one is in use. Hence, performance can be increased by reducing contention.

Turn off auto-growth

If you have a logical drive dedicated for TempDB data and log file(s), consider filling the drive with the TempDB data files and turning off file Autogrowth.

Create no more than 8 data files

In general, never create more than 8 data files for TempDB. For SCCM, 4 equally sized data files seems to work quite well.
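Tying the points above together, here is a sketch of what the configuration could look like in T-SQL. The four-file count, the 8192MB size, and the T: drive paths are placeholder assumptions to adjust per the sizing guidance above:

```sql
-- Resize the existing data file and add three more, all equal, with autogrowth off
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev,  SIZE = 8192MB, FILEGROWTH = 0);
ALTER DATABASE tempdb ADD    FILE (NAME = tempdev2, FILENAME = N'T:\TempDB\tempdev2.ndf', SIZE = 8192MB, FILEGROWTH = 0);
ALTER DATABASE tempdb ADD    FILE (NAME = tempdev3, FILENAME = N'T:\TempDB\tempdev3.ndf', SIZE = 8192MB, FILEGROWTH = 0);
ALTER DATABASE tempdb ADD    FILE (NAME = tempdev4, FILENAME = N'T:\TempDB\tempdev4.ndf', SIZE = 8192MB, FILEGROWTH = 0);
-- Moving a file to the dedicated drive takes effect after the next service restart
ALTER DATABASE tempdb MODIFY FILE (NAME = templog,  FILENAME = N'T:\TempDB\templog.ldf');

-- Verify: all data files should report the same size (in 8 KB pages)
SELECT name, physical_name, size FROM tempdb.sys.database_files;
```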

Posted in Uncategorized | 6 Comments

SQL Backup to URL (Azure Storage account)

How to back up a copy of an on-premises database, or an Azure SQL database, to an Azure Storage account.

Notes: Any names that need to be supplied at creation time are indicated with <>, such as <sqlstorageaccountname> in the next example. These are suggestions only, and should be entered without the <> symbols! You are encouraged to develop your own naming conventions. Screen captures shown are representative examples of each task.

Backup Steps

  1. Create an Azure Storage account
  2. Create a Credential
  3. Perform a database backup

Create an Azure Storage account

In order to backup a SQL Server database to Azure, you will need to first create an Azure Storage account.

The steps to create the storage account are as follows:

  1. Log in to the Azure Portal
  2. Locate Storage, click Add
  3. Create the Storage account – Enter the following information:
    1. Name – <sqlstorageaccountname> Note: this must be globally unique!
    2. Account Kind – General Purpose
    3. Performance – Standard
    4. Storage service encryption – Enabled
    5. Resource group
      1. Create new
      2. <sqlbootcamp>

  1. Within the storage account (created in the last step), on the properties page, under Blob Service, click Containers.
  2. To create a container for the SQL backups, click (+ Container)
    1. Name: <sqlbackup>
    2. Access type – Blob
    3. Click Create

Select the newly created container (last step), then Properties. Locate the URL line on the right and copy this information to the clipboard. Note: Pasting this information into Notepad is a convenient way to store it for a subsequent lab (hint!).

It should appear similar to this (substituting your storage account and container names): https://<sqlstorageaccountname>.blob.core.windows.net/<sqlbackup>

Next, obtain the access key for this storage account. Select the (root) storage account name, then under SETTINGS, select Access keys. To the right of key1, select the copy-to-clipboard icon. Save this key to Notepad (paste); it will be used in a future step.

Creating a Credential

You will now create a Credential within SQL Server, which will allow access to the Azure storage account from SSMS. This Credential is stored locally within SQL Server.

Take a moment and review the following example – a SQL Server credential is created for authentication to the Microsoft Azure Blob storage service.

Using storage account identity and access key

Notes: T-SQL sample; <mycredentialname> = the credential name used internally, <mystorageaccountname> = the name of the storage account, <mystorageaccountaccesskey> = key1 captured in the previous task.

IF NOT EXISTS (SELECT * FROM sys.credentials WHERE name = '<mycredentialname>')
CREATE CREDENTIAL [<mycredentialname>] WITH IDENTITY = '<mystorageaccountname>',
SECRET = '<mystorageaccountaccesskey>';

Now, you will translate this sample T-SQL into a credential for your Backup!

Using Notepad to build your T-SQL statement, substitute the bracketed areas with your account information.

CREATE CREDENTIAL [backup] WITH IDENTITY = '<YourAccountName>',
SECRET = '<YourStorageAccessKey>';

IDENTITY = '<YourAccountName>' (Note: from Create an Azure Storage account – just the account name, not the complete FQDN)

SECRET = '<YourStorageAccessKey>' (Note: from Create an Azure Storage account – obtain storage access key, last step)

Then, copy the entire statement into the clipboard.

Open SQL Server Management Studio (SSMS) and connect to your local copy of SQL Server.

Click New Query.

Copy and paste the above T-SQL into the new query window.

Then click Execute.

The command should show Completed Successfully.
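If you want to double-check, a quick query against sys.credentials will confirm the credential exists (the name [backup] follows the example above):

```sql
-- Confirm the credential was created (the secret itself is never exposed)
SELECT name, credential_identity, create_date
FROM sys.credentials
WHERE name = 'backup';
```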

Perform a full database backup to URL

The following example performs a full database backup of the AdventureWorks2016 database to the Microsoft Azure Blob storage service. Take a moment and review the syntax.

Backup To URL using storage account identity and access key

BACKUP DATABASE AdventureWorks2016
TO URL = 'https://<YourAccountName>.blob.core.windows.net/<sqlbackup>/AdventureWorks2016.bak'
WITH CREDENTIAL = '<mycredentialname>',
COMPRESSION, STATS = 5;  -- COMPRESSION and STATS are optional

Note: Substitute <YourAccountName> with the storage account name, <sqlbackup> with the blob container name, and <mycredentialname> with the credential created in the previous task (the example used [backup]).

Open SQL Server Management Studio (SSMS) and connect to your local copy of SQL Server.

Click New Query.

Copy and paste your completed backup T-SQL into the new query window.

Then click Execute.

The command should show Completed Successfully.
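To confirm the backup actually landed at the URL, the backup history in msdb records the destination as the physical device name:

```sql
-- Show the most recent backups and where they were written
SELECT TOP (5) bs.database_name, bs.backup_finish_date, bmf.physical_device_name
FROM msdb.dbo.backupset AS bs
JOIN msdb.dbo.backupmediafamily AS bmf
  ON bs.media_set_id = bmf.media_set_id
ORDER BY bs.backup_finish_date DESC;
```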

Posted in Azure SQL Server, SQL Server | 4 Comments

Azure SQL Migration Tools

In preparing a presentation on data migration to Azure, I thought I’d share some of my research. There are other methods not covered here, but this should be a good start! Each toolset includes notes about that tool, and where to find it.

SQL Server Management Studio – Migration Wizard

  • Built into core SSMS


Data Migration Assistant

  • Enables assessment of your on-premises SQL Server instance for migration to Azure SQL Database
  • Identifies migration blocking issues
  • Identifies partially supported or unsupported features and functions


SQL Server Migration Assistant

  • Supports migration from Oracle, MySQL, SAP ASE (formerly SAP Sybase ASE), DB2 and Access
  • Lets users convert database schemas to Microsoft SQL Server schemas, upload the schema, and migrate data to the target SQL Server


Azure Data Factory Copy Wizard

  • Data Factory is a cloud-based data integration service that orchestrates and automates the movement and transformation of data
  • Can be incredibly fast


SQL Database Migration wizard

  • Can be useful for validation


Posted in Azure SQL Server, Migration, SQL Server | Tagged , | 1 Comment

SCCM File Backup Considerations

Everyone (hopefully) is aware that the SCCM database needs to be backed up, and that there are a number of other components that need to be backed up in order to have everything on hand in case a full recovery is needed. Note: search my blog for native SQL backup steps.

Obviously, if you have a third-party backup software product in place, you are probably OK.

Here are the key parts:

Backup Databases

  • SCCM

Report Server Databases

  • ReportServer DB
  • ReportServerTempDB DB

Back up the Reporting Services encryption keys using the SQL Server Reporting Services Configuration Manager.

Report Designer files – Report Definition (.RDL), Report Model (.SMDL), Shared Data Sources (.RDS), Data Views (.DV), Data Sources (.DS), Report Server Project (.RPTPROJ), and Report Solutions (.SLN)

Reporting Services files located in [SQL Server Installation Directory]\MSRS14.MSSQLSERVER\Reporting Services\ReportServer

  • Rsreportserver.config
  • Rssvrpolicy.config
  • Rsmgrpolicy.config
  • Reportingservicesservice.exe.config
  • Web.config for both the Report Server and Report Manager ASP.NET applications
  • Machine.config for ASP.NET

WSUS – Back up the folder containing the WSUS data

Backup Content Files

Backup Package Source Files 

The content library in Configuration Manager is the location where all content files are stored for software updates, applications, operating system deployment, and so on. The content library is located on the site server and each distribution point. The Backup Site Server maintenance task does not include a backup for the content library or the package source files. When a site server fails, the information about the content library files is restored to the site database, but you must restore the content library and package source files on the site server.

  • Content library: The content library must be restored before you can redistribute content to distribution points. When you start content redistribution, Configuration Manager copies the files from the content library on the site server to the distribution points. The content library for the site server is in the SCCMContentLib folder, which is typically located on the drive with the most free disk space at the time the site was installed. For more information about the content library, see Introduction to Content Management in Configuration Manager.
  • Package source files: The package source files must be restored before you can update content on distribution points. When you start a content update, Configuration Manager copies new or modified files from the package source to the content library, which in turn copies the files to associated distribution points. You can run the following query in SQL Server to find the package source location for all packages and applications:

SELECT * FROM v_Package

You can identify the package source site by looking at the first three characters of the package ID. For example, if the package ID is CEN00001, the site code for the source site is CEN. When you restore the package source files, they must be restored to the same location in which they were before the failure. For more information about updating content, see the Update Content on Distribution Points section in the Operations and Maintenance for Content Management in Configuration Manager topic.

Important: Verify that you include both the content library and package source locations in your file system backup for the site server.

Back Up Custom Software Updates

On the computer that runs Updates Publisher, browse to the Updates Publisher 2011 database file (Scupdb.sdf) in %USERPROFILE%\AppData\Local\Microsoft\System Center Updates Publisher 2011\5.00.1727.0000\. There is a separate database file for each user that runs Updates Publisher 2011.


Post Recovery Tasks

  • The recovery report is saved to C:\ConfigMgrPostRecoveryActions.html
  • Re-enter user account passwords
  • Re-enter Sideloading Keys
  • Recreate Microsoft Intune Subscription
  • Configure SSL for Site System Roles that use IIS
  • Reinstall hotfixes
  • Recover Custom Reports
  • Recover Content Files
  • Recover Custom Software Updates (SCUP)
  • Recover USMT
  • Update Certificates for Cloud DP’s
  • Reprovision Intel AMT-based computers

SCCM Secondary Site

  • How to backup
    • Step 1: No need!
  • Recover Secondary Site
    • Use the Recover Secondary Site action from the Sites node

Posted in ConfigMgr, SCCM, SQL Server | 1 Comment

Top 5 blog posts for 2016

Time for another top 5 blog posts list… thank you for viewing my work.

SQL Server Backup Recommendations for Configuration Manager

Optimizing ConfigMgr databases

Running Powershell as an SQL Server Agent task

How to recover a ConfigMgr 2012 site using a restored db

SCCM 2012 computer information query

Posted in Uncategorized | Leave a comment

Optimizing SCCM Databases Revisited

Since Optimizing ConfigMgr databases was published in May 2013, I have received some tremendous feedback, all positive.

The basics of that original post have not changed: Ola Hallengren’s index optimization scripts are, hands down, the best solution for 99% of ALL SCCM implementations. This post is about making the solution easier to implement for the non-DBA, and about ensuring that the proper options are implemented for the Index Optimization job.


Download Ola Hallengren’s MaintenanceSolution.sql script from his site (ola.hallengren.com).


The following steps are required to implement this solution. Details on each step follow.

1) Create the CMMonitor database

2) Install MaintenanceSolution.sql

3) Configure and Schedule the Index Optimization SQL Agent job task

4) Verify the Index Optimization SQL Agent job task

Create the CMMonitor database

Open SQL Server Management Studio (SSMS), connect to the SQL instance where the SCCM database is located.

Copy and paste the following into SSMS, change the file paths as appropriate and Execute.



-- Creates the CMMonitor database if it does not already exist.
-- File sizes are examples; adjust paths and sizes for your environment.
IF EXISTS (SELECT name FROM sys.databases
WHERE name = N'CMMonitor')
PRINT N'CMMonitor already exists'
ELSE
CREATE DATABASE [CMMonitor]
ON PRIMARY
( NAME = N'CMMonitor',
FILENAME = N'C:\MSSQL\DATA\CMMonitor.mdf' ,
SIZE = 512MB , FILEGROWTH = 256MB )
LOG ON
( NAME = N'CMMonitor_log',
FILENAME = N'C:\MSSQL\Log\CMMonitor_log.ldf' ,
SIZE = 1024KB , MAXSIZE = 2048GB ,
FILEGROWTH = 10%);

Install MaintenanceSolution.sql

Using SSMS (or your favorite editor), edit the MaintenanceSolution.sql script (downloaded as part of the prerequisites) to point to your newly created CMMonitor database.

Note: SQL Server Agent should be running. If it is not, please start this service and set start mode to Auto.

Near the beginning of the MaintenanceSolution.sql, locate the USE [Master] entry and modify as follows:

-- USE [master] -- Specify the database in which the objects will be created.

USE [CMMonitor]

Then Execute the SQL query.


Configure the Index Optimization SQL Agent job task

To perform this manually using SSMS, expand the SQL Server Agent node, then Jobs.

Note: Otherwise skip to end of this section to copy / run the SQL Script…

Right click Jobs, then New Job…

On the General page:

  • Enter the Name: IndexOptimize – Custom
  • Change the owner to sa


Click Steps, then New…

On the New Job Step – General page

Step Name: Optimize

  • Type: Transact-SQL script (T-SQL) – should be default
  • Database: CMMonitor

Command (copy / paste the following)

EXECUTE dbo.IndexOptimize
@Databases = 'USER_DATABASES',
@FragmentationLow = NULL,
@FragmentationMedium = 'INDEX_REORGANIZE,INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
@FragmentationHigh = 'INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
@FragmentationLevel1 = 10,
@FragmentationLevel2 = 40,
@UpdateStatistics = 'ALL',
@OnlyModifiedStatistics = 'Y',
--@SortInTempdb = 'Y',
@LogToTable = 'Y'

Then click OK


New Job Step

On the New Job – Schedules page, click New…

On the New Job Schedule page:

Name: Run Optimization

Frequency: Weekly, select Wednesday (and Sunday)

Click OK


Click OK to Save New Job.

SQL script to create the IndexOptimize – Custom job

On the instance where you have installed CMMonitor and the MaintenanceSolution, Copy / paste the following SQL syntax into SSMS and execute.

Note: do not run this step if you have manually configured this process!

USE [msdb]
GO

/****** Object: Job [IndexOptimize - Custom] Script Date: 11/28/2016 2:35:32 PM ******/
BEGIN TRANSACTION
DECLARE @ReturnCode INT
SELECT @ReturnCode = 0
/****** Object: JobCategory [[Uncategorized (Local)]] Script Date: 11/28/2016 2:35:32 PM ******/
IF NOT EXISTS (SELECT name FROM msdb.dbo.syscategories WHERE name=N'[Uncategorized (Local)]' AND category_class=1)
BEGIN
EXEC @ReturnCode = msdb.dbo.sp_add_category @class=N'JOB', @type=N'LOCAL', @name=N'[Uncategorized (Local)]'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
END

DECLARE @jobId BINARY(16)
EXEC @ReturnCode = msdb.dbo.sp_add_job @job_name=N'IndexOptimize - Custom',
    @enabled=1,
    @notify_level_eventlog=0,
    @notify_level_email=0,
    @notify_level_netsend=0,
    @notify_level_page=0,
    @delete_level=0,
    @description=N'No description available.',
    @category_name=N'[Uncategorized (Local)]',
    @owner_login_name=N'sa', @job_id = @jobId OUTPUT
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
/****** Object: Step [Optimize] Script Date: 11/28/2016 2:35:32 PM ******/
EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'Optimize',
    @step_id=1,
    @cmdexec_success_code=0,
    @on_success_action=1,
    @on_success_step_id=0,
    @on_fail_action=2,
    @on_fail_step_id=0,
    @retry_attempts=0,
    @retry_interval=0,
    @os_run_priority=0, @subsystem=N'TSQL',
    @command=N'EXECUTE dbo.IndexOptimize
@Databases = ''USER_DATABASES'',
@FragmentationLow = NULL,
@FragmentationMedium = ''INDEX_REORGANIZE,INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE'',
@FragmentationHigh = ''INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE'',
@FragmentationLevel1 = 10,
@FragmentationLevel2 = 40,
@UpdateStatistics = ''ALL'',
@OnlyModifiedStatistics = ''Y'',
--@SortInTempdb = ''Y'',
@LogToTable = ''Y''',
    @database_name=N'CMMonitor',
    @flags=0
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_update_job @job_id = @jobId, @start_step_id = 1
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobschedule @job_id=@jobId, @name=N'Run Optimization',
    @enabled=1,
    @freq_type=8,
    @freq_interval=9,          -- Sunday (1) + Wednesday (8)
    @freq_subday_type=1,
    @freq_subday_interval=0,
    @freq_relative_interval=0,
    @freq_recurrence_factor=1,
    @active_start_date=20161128,
    @active_end_date=99991231,
    @active_start_time=0,
    @active_end_time=235959
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
EXEC @ReturnCode = msdb.dbo.sp_add_jobserver @job_id = @jobId, @server_name = N'(local)'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
COMMIT TRANSACTION
GOTO EndSave
QuitWithRollback:
    IF (@@TRANCOUNT > 0) ROLLBACK TRANSACTION
EndSave:
GO

Verify the Index Optimization SQL Agent job task

You can wait until after the scheduled date/time to observe results (the default schedule is 2x/week). On a smaller site, you may want to test the IndexOptimize – Custom job immediately. To do so, locate the new SQL Agent job, right-click and select Start Job at Step…


When complete, it should indicate that it was a success.


To review the results that were logged to the CommandLog table in the CMMonitor database, use the following SQL query:

SELECT TOP (1000) [ID]
      ,[DatabaseName]
      ,[SchemaName]
      ,[ObjectName]
      ,[ObjectType]
      ,[IndexName]
      ,[IndexType]
      ,[StatisticsName]
      ,[PartitionNumber]
      ,[ExtendedInfo]
      ,[Command]
      ,[CommandType]
      ,[StartTime]
      ,[EndTime]
      ,[ErrorNumber]
      ,[ErrorMessage]
  FROM [CMMonitor].[dbo].[CommandLog]


Summary (other things to think about)

Disable the built-in SCCM maintenance task that rebuilds indexes; you won’t need it anymore.

You may have noticed that a number of new SQL Server Agent jobs were created. These may be useful to you at some point, but are beyond the scope of this post.

I do recommend enabling the CommandLog Cleanup Agent job, as this will flush CommandLog entries older than 30 days.
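For reference, the cleanup job boils down to a delete along these lines (a sketch; in this setup the CommandLog table lives in CMMonitor):

```sql
-- Purge CommandLog entries older than 30 days
USE [CMMonitor];
DELETE FROM dbo.CommandLog
WHERE StartTime <= DATEADD(dd, -30, GETDATE());
```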

You may have noticed an option that was commented out in the IndexOptimize – Custom job:

--@SortInTempdb = 'Y',

Depending on your scenario, you may find that it speeds up the Optimize process. Be aware that a very large database may use significant TempDB space with SortInTempdb; monitor and adjust as needed.

All USER databases on the SQL Server instance where this job runs will be optimized. That includes the SUSDB, ReportServer and CM databases.

Check Ola’s site, as updates are released on a fairly regular basis. Updating the maintenance solution is very straightforward; from Ola’s FAQ:

I’m using an old version of the SQL Server Maintenance Solution. How do I upgrade to the most recent version?

To upgrade to the most recent version of the SQL Server Maintenance Solution, follow these steps.

1. Download MaintenanceSolution.sql.

2. Change this line in the script so as not to create jobs:
SET @CreateJobs = 'N'

3. Execute MaintenanceSolution.sql. The script drops the old versions of the stored procedures and creates new versions.

4. Start the jobs and verify that they are completing successfully.


Posted in ConfigMgr, SQL Server | 32 Comments

Why you should not use the ConfigMgr rebuild index task

Rebuilding the database indexes is a good thing, yes? Absolutely.

Why not use the ConfigMgr rebuild index maintenance task?

Even when the maintenance task works (and I have proof that some of the time it fails to run on ConfigMgr):

  • It REINDEXES all indexes with more than 10% fragmentation
  • It does not update statistics

Let’s break these two items down…

Rebuild Indexes

By updating all indexes, the SQL engine has to plow through EVERY index and REINDEX it, REGARDLESS of whether it really needs the update! This makes the process very inefficient. You would not notice this in a lab, or in a small environment. However, you WILL notice it on a very large site. To review the gory details, there is plenty of history available on this topic.

It is far more efficient to Reorganize indexes that are slightly fragmented and Rebuild indexes that are very fragmented.
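To see where a given database stands, a query along these lines (using the same 10% / 40% thresholds that the IndexOptimize job above uses) can be run in the context of the CM database:

```sql
-- List fragmented indexes and the action the 10% / 40% thresholds would choose
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name                     AS index_name,
       ips.avg_fragmentation_in_percent,
       CASE WHEN ips.avg_fragmentation_in_percent >= 40 THEN 'REBUILD'
            ELSE 'REORGANIZE' END AS suggested_action
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON ips.object_id = i.object_id AND ips.index_id = i.index_id
WHERE ips.avg_fragmentation_in_percent >= 10
  AND ips.page_count > 1000  -- tiny indexes are not worth the effort
ORDER BY ips.avg_fragmentation_in_percent DESC;
```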


Update Statistics

Database statistics are interesting to me… if you have ever studied statistics, you’ll know that the sampling method used on any data can lead to really good results or, perhaps, poor results. SQL Server statistics map the distribution of values in a given column. The SQL Server query optimizer uses statistics to determine which indexes (if any) will be used to generate the query plan, then subsequently executes the query based on that plan.

If the statistics are up-to-date, then, most of the time, the proper indexes will be selected for the query plan, leading to efficient query execution. However, if the statistics are NOT up-to-date, expect that the WRONG query plan will be selected and a poorly performing query will result.
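To gauge how stale statistics are in a database, something like the following can help (sys.dm_db_stats_properties requires SQL Server 2012 SP1 or later):

```sql
-- When were statistics last updated, and how many rows changed since?
SELECT OBJECT_NAME(s.object_id)            AS table_name,
       s.name                              AS stats_name,
       STATS_DATE(s.object_id, s.stats_id) AS last_updated,
       sp.modification_counter             AS rows_modified_since
FROM sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE OBJECTPROPERTY(s.object_id, 'IsUserTable') = 1
ORDER BY sp.modification_counter DESC;
```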

The analogy I like to use here: let’s say you travel to an unfamiliar city. You pick up your rental car, and you are handed a street map for that city (this is pre-GPS). If that street map is 10 years old (or older), you may have trouble finding your destination due to changes in the road system! You need an up-to-date map to determine an efficient route.

Key takeaways

Updating indexes without updating statistics is a disaster waiting to happen.

Do use Ola Hallengren’s solution to INTELLIGENTLY update the indexes and modified statistics. Your database will be happier, and your users will be happier.

Posted in ConfigMgr, SQL Server | 3 Comments

SQL Server MAXDOP and effect on ConfigMgr

What is MAXDOP?

The SQL Server MAX Degree Of Parallelism (MAXDOP) configuration option controls the number of processors that are used for the execution of a query in a parallel plan. This option determines the computing and thread resources that are used for the query plan operators that perform the work in parallel.

Sounds cool, what does that mean?

It means that SQL Server allows you to control the number of processors that are available to process a single statement. Statements include things such as: queries, DDL (data definition language) operations, bulk loads using parallel inserts, and a few other scenarios.

As a consultant, I’ve used and tuned MAXDOP setting for large data warehouse data loading operations. The improvement in tuning the parallelism can be quite dramatic.

By default, MAXDOP is set to 0. This means that all available processors may be used to process a single statement. That is good, right? Not necessarily. It depends.

Here are the current (somewhat generic) selected recommendations:

  • To enable the server to determine the maximum degree of parallelism, set this option to 0, the default value. Setting maximum degree of parallelism to 0 allows SQL Server to use all the available processors up to 64 processors. To suppress parallel plan generation, set max degree of parallelism to 1. Set the value to a number from 1 to 32,767 to specify the maximum number of processor cores that can be used by a single query execution. If a value greater than the number of available processors is specified, the actual number of available processors is used. If the computer has only one processor, the max degree of parallelism value is ignored.

  • Index operations that create or rebuild an index, or that drop a clustered index, can be resource intensive. You can override the max degree of parallelism value for index operations by specifying the MAXDOP index option in the index statement. The MAXDOP value is applied to the statement at execution time and is not stored in the index metadata.
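For example, the per-statement override mentioned above looks like this (the table and index names are placeholders):

```sql
-- Override the instance-wide MAXDOP for a single, resource-intensive rebuild
ALTER INDEX [IX_Example] ON [dbo].[ExampleTable]
REBUILD WITH (MAXDOP = 4);
```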

The effect on ConfigMgr?

For smaller sites, you may not need to change the default setting. When you scale out to much larger environments, particularly with the dreaded CAS in place, it is worth setting MAXDOP and monitoring.

Recently, I asked for feedback from the community on those that have set MAXDOP for ConfigMgr environments. Here is what I received (slightly edited for clarity):

Very large site: With a CAS, MAXDOP with a higher value makes collection queries faster, lower setting makes inbox processing faster, not a lot of room for compromise. We use 2.

Large site: In a CAS scenario with a MAXDOP of 6, we’ve seen replication blocking itself. Set it to 1 and it hasn’t reoccurred. (Note: MAXDOP 1 disables parallelism)


MAXDOP can be useful for certain scenarios. Modify and monitor for performance gains, or possible performance penalties. Adjust as needed.
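For reference, the instance-wide setting is changed via sp_configure; the value 4 below is purely an example to adjust (and monitor) for your hardware:

```sql
-- MAXDOP is an advanced option, so expose those first
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Set the instance-wide maximum degree of parallelism (example value)
EXEC sp_configure 'max degree of parallelism', 4;
RECONFIGURE WITH OVERRIDE;
-- Verify the running value
EXEC sp_configure 'max degree of parallelism';
```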


Configure the max degree of parallelism Server Configuration Option

How It Works: Maximizing Max Degree Of Parallelism (MAXDOP)


TIP: Setting SQL Max Degree of Parallelism Achieve Big System Center SQL Performance Gains

(note: interesting, as always your mileage may vary)


Posted in ConfigMgr, SQL, SQL Server 2016 | 1 Comment

SQL Server 2005 End of Life resources

Received an email today from Microsoft with some useful links and resources.


Extended support for SQL Server 2005 ended on April 12, 2016, which means you’ll no longer get security updates and hotfixes and could be at risk of data vulnerabilities.
Upgrade now to get the speed, productivity, and cost benefits of SQL Server 2014. Using familiar, trusted tools you can:

Achieve mission-critical performance.

Help maintain security and compliance.

Gain data insights faster.

Optimize your data infrastructure.

If you haven’t yet made the move, get started today—our resources will help you identify older versions of SQL Server running in your datacenter, review your upgrade options, and upgrade to a modern database that works for you.

Discover—Review your current IT infrastructure using the Microsoft Assessment and Planning Toolkit.

Target—Learn more about your upgrade options with our decision tool.

Upgrade—Use these resources to begin your upgrade:

SQL Server 2014 Upgrade Advisor

SQL Server 2014 Upgrade Technical Guide

Read more about our commitment to industry leading technology—see what’s coming up in SQL Server 2016.
Contact Microsoft or a partner for help today.

Posted in SQL Server, SQL Server 2014, SQL Server 2016 | Leave a comment