SCCM File Backup Considerations

Everyone (hopefully) is aware that the SCCM database needs to be backed up, and that there are a number of other components that need to be backed up in order to have everything on hand in case a full recovery is ever needed. Note: search my blog for the native SQL backup steps.

Obviously, if you have a third-party backup product in place, you are probably OK.

Here are the key parts:

Backup Databases

  • SCCM
  • SUSDB

Report Server Databases

  • ReportServer DB
  • ReportServerTempDB DB
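
For reference, here is a minimal native SQL backup sketch for the databases listed above (CM_ABC and the D:\SQLBackup path are placeholders; substitute your own site database name and backup target):

-- Hypothetical example: native full backups of the SCCM-related databases to a local path.
-- CM_ABC and D:\SQLBackup are placeholders for your site database name and backup location.
BACKUP DATABASE [CM_ABC]
    TO DISK = N'D:\SQLBackup\CM_ABC.bak'
    WITH COMPRESSION, CHECKSUM, INIT, STATS = 10;

BACKUP DATABASE [SUSDB]
    TO DISK = N'D:\SQLBackup\SUSDB.bak'
    WITH COMPRESSION, CHECKSUM, INIT, STATS = 10;

BACKUP DATABASE [ReportServer]
    TO DISK = N'D:\SQLBackup\ReportServer.bak'
    WITH COMPRESSION, CHECKSUM, INIT, STATS = 10;

BACKUP DATABASE [ReportServerTempDB]
    TO DISK = N'D:\SQLBackup\ReportServerTempDB.bak'
    WITH COMPRESSION, CHECKSUM, INIT, STATS = 10;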

Back up the Reporting Services encryption keys using the SQL Server Reporting Services Configuration Manager.

Report Designer files – Report Definition (.RDL), Report Model (.SMDL), Shared Data Sources (.RDS), Data Views (.DV), Data Sources (.DS), Report Server Project (.RPTPROJ), and Report Solutions (.SLN)

Reporting Services files located in [SQL Server Installation Directory]\MSRS14.MSSQLSERVER\Reporting Services\ReportServer

  • Rsreportserver.config
  • Rssvrpolicy.config
  • Rsmgrpolicy.config
  • Reportingservicesservice.exe.config
  • Web.config for both the Report Server and Report Manager ASP.NET applications
  • Machine.config for ASP.NET

WSUS – Backup folder containing WSUS data

Backup CD.LATEST

  • Current branch

Backup Content Files

Backup Package Source Files 

The content library in Configuration Manager is the location where all content files are stored for software updates, applications, operating system deployment, and so on. The content library is located on the site server and each distribution point. The Backup Site Server maintenance task does not include a backup for the content library or the package source files. When a site server fails, the information about the content library files is restored to the site database, but you must restore the content library and package source files on the site server.

  • Content library: The content library must be restored before you can redistribute content to distribution points. When you start content redistribution, Configuration Manager copies the files from the content library on the site server to the distribution points. The content library for the site server is in the SCCMContentLib folder, which is typically located on the drive that had the most free disk space at the time the site was installed. For more information about the content library, see Introduction to Content Management in Configuration Manager.
  • Package source files: The package source files must be restored before you can update content on distribution points. When you start a content update, Configuration Manager copies new or modified files from the package source to the content library, which in turn copies the files to associated distribution points. You can run the following query in SQL Server to find the package source location for all packages and applications:

SELECT * FROM v_Package

You can identify the package source site by looking at the first three characters of the package ID. For example, if the package ID is CEN00001, the site code for the source site is CEN. When you restore the package source files, they must be restored to the same location in which they were before the failure. For more information about updating content, see the Update Content on Distribution Points section in the Operations and Maintenance for Content Management in Configuration Manager topic.
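
If you prefer a narrower result set, a query along these lines should return just the fields needed for restoring package sources (column names assume the standard v_Package view; verify against your site database):

-- Hedged sketch: package ID, name, source path, and the source site code
-- (first three characters of the package ID) for every package.
SELECT PackageID,
       Name,
       PkgSourcePath,
       LEFT(PackageID, 3) AS SourceSiteCode
FROM   v_Package
ORDER BY PackageID;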

Important: Verify that you include both the content library and package source locations in your file system backup for the site server.

Back Up Custom Software Updates

On the computer that runs Updates Publisher, back up the Updates Publisher 2011 database file (Scupdb.sdf) in %USERPROFILE%\AppData\Local\Microsoft\System Center Updates Publisher 2011\5.00.1727.0000\. There is a separate database file for each user that runs Updates Publisher 2011.

From <https://technet.microsoft.com/en-us/library/gg712697.aspx>

Post Recovery Tasks

  • The recovery report is saved to C:\ConfigMgrPostRecoveryActions.html
  • Re-enter user account passwords
  • Re-enter Sideloading Keys
  • Recreate Microsoft Intune Subscription
  • Configure SSL for Site System Roles that use IIS
  • Reinstall hotfixes
  • Recover Custom Reports
  • Recover Content Files
  • Recover Custom Software Updates (SCUP)
  • Recover USMT
  • Update Certificates for Cloud DPs
  • Reprovision Intel AMT-based computers

SCCM Secondary Site

  • How to backup
    • Step 1: No need!
  • Recover Secondary Site
    • Use the Recover Secondary Site action from the Sites node

https://technet.microsoft.com/en-us/library/gg712697.aspx


Top 5 blog posts for 2016

Here are my top 5 blog posts for 2016… thank you for viewing my work.

SQL Server Backup Recommendations for Configuration Manager

https://stevethompsonmvp.wordpress.com/2013/06/07/sql-server-backup-recommendations-for-configuration-manager/

Optimizing ConfigMgr databases

https://stevethompsonmvp.wordpress.com/2013/05/07/optimizing-configmgr-databases/

Running Powershell as an SQL Server Agent task

https://stevethompsonmvp.wordpress.com/2013/10/07/running-powershell-as-an-sql-server-agent-task/

How to recover a ConfigMgr 2012 site using a restored db

https://stevethompsonmvp.wordpress.com/2014/08/01/how-to-recover-a-configmgr-2012-site-using-a-restored-db/

SCCM 2012 computer information query

https://stevethompsonmvp.wordpress.com/2014/02/05/sccm-2012-computer-information-query/


Optimizing SCCM Databases Revisited

Since Optimizing ConfigMgr databases was published in May 2013, I have received some tremendous feedback, all positive.

The basics of that original post have not changed: Ola Hallengren's index optimization script is, hands down, the best solution for 99% of ALL SCCM implementations. This post is about making it easier to implement for the non-DBA, and ensuring that the proper options are set for the Index Optimization job.

Pre-requisites:

Download Ola Hallengren's MaintenanceSolution.sql script from https://ola.hallengren.com

Steps

The following steps are required to implement this solution. Details on each step, follow.

1) Create the CMMonitor database

2) Install MaintenanceSolution.sql

3) Configure and Schedule the Index Optimization SQL Agent job task

4) Verify the Index Optimization SQL Agent job task

Create the CMMonitor database

Open SQL Server Management Studio (SSMS), connect to the SQL instance where the SCCM database is located.

Copy and paste the following into SSMS, change the file paths as appropriate and Execute.

USE MASTER
GO

IF EXISTS (SELECT name FROM sys.databases WHERE name = N'CMMonitor')
    DROP DATABASE [CMMonitor]
GO

CREATE DATABASE [CMMonitor] ON PRIMARY
( NAME = N'CMMonitor',
  FILENAME = N'C:\MSSQL\DATA\CMMonitor.mdf',
  SIZE = 10240KB, MAXSIZE = UNLIMITED,
  FILEGROWTH = 10240KB )
LOG ON
( NAME = N'CMMonitor_log',
  FILENAME = N'C:\MSSQL\Log\CMMonitor_log.ldf',
  SIZE = 1024KB, MAXSIZE = 2048GB,
  FILEGROWTH = 5120KB )
GO

ALTER DATABASE [CMMonitor] SET RECOVERY SIMPLE
GO


Install MaintenanceSolution.sql

Using SSMS (or your favorite editor), edit MaintenanceSolution.sql (downloaded as part of the pre-requisites) to point to your newly created CMMonitor database.

Note: SQL Server Agent should be running. If it is not, please start this service and set start mode to Auto.

Near the beginning of MaintenanceSolution.sql, locate the USE [master] entry and modify it as follows:

-- USE [master] -- Specify the database in which the objects will be created.

USE [CMMonitor]

Then Execute the SQL query.
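
To confirm that the objects were created in CMMonitor (and not accidentally in master), a quick check along these lines can help; the object names below are those created by the standard MaintenanceSolution.sql:

-- Verify that Ola's objects landed in CMMonitor rather than master.
USE [CMMonitor]
GO
SELECT name, type_desc
FROM   sys.objects
WHERE  name IN (N'CommandLog', N'CommandExecute', N'IndexOptimize',
                N'DatabaseBackup', N'DatabaseIntegrityCheck')
ORDER BY name;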


Configure the Index Optimization SQL Agent job task

To perform this manually using SSMS, expand the SQL Server Agent node, then Jobs.

Note: Otherwise, skip to the end of this section to copy and run the SQL script.

Right click Jobs, then New Job…

On the General page:

  • Enter the Name: IndexOptimize – Custom
  • Change the owner to sa


Click Steps, then New…

On the New Job Step – General page

Step Name: Optimize

  • Type: Transact-SQL script (T-SQL) – should be default
  • Database: CMMonitor

Command (copy / paste the following)

EXECUTE dbo.IndexOptimize
    @Databases = 'USER_DATABASES',
    @FragmentationLow = NULL,
    @FragmentationMedium = 'INDEX_REORGANIZE,INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
    @FragmentationHigh = 'INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
    @FragmentationLevel1 = 10,
    @FragmentationLevel2 = 40,
    @UpdateStatistics = 'ALL',
    @OnlyModifiedStatistics = 'Y',
    --@MaxDOP = 2,
    --@SortInTempdb = 'Y',
    @LogToTable = 'Y'

Then click OK


New Job Step

On the New Job – Schedules page, click New…

On the New Job Schedule page:

Name: Run Optimization

Frequency: Weekly, select Wednesday (and Sunday)

Click OK


Click OK to Save New Job.

SQL Script to create IndexOptimization – Custom job

On the instance where you have installed CMMonitor and the MaintenanceSolution, copy and paste the following SQL syntax into SSMS and execute it.

Note: do not run this step if you have manually configured this process!

USE [msdb]
GO

/****** Object: Job [IndexOptimize - Custom] Script Date: 11/28/2016 2:35:32 PM ******/
BEGIN TRANSACTION
DECLARE @ReturnCode INT
SELECT @ReturnCode = 0

/****** Object: JobCategory [[Uncategorized (Local)]] Script Date: 11/28/2016 2:35:32 PM ******/
IF NOT EXISTS (SELECT name FROM msdb.dbo.syscategories WHERE name=N'[Uncategorized (Local)]' AND category_class=1)
BEGIN
EXEC @ReturnCode = msdb.dbo.sp_add_category @class=N'JOB', @type=N'LOCAL', @name=N'[Uncategorized (Local)]'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback
END

DECLARE @jobId BINARY(16)
EXEC @ReturnCode = msdb.dbo.sp_add_job @job_name=N'IndexOptimize - Custom',
    @enabled=1,
    @notify_level_eventlog=0,
    @notify_level_email=0,
    @notify_level_netsend=0,
    @notify_level_page=0,
    @delete_level=0,
    @description=N'No description available.',
    @category_name=N'[Uncategorized (Local)]',
    @owner_login_name=N'sa', @job_id = @jobId OUTPUT
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback

/****** Object: Step [Optimize] Script Date: 11/28/2016 2:35:32 PM ******/
EXEC @ReturnCode = msdb.dbo.sp_add_jobstep @job_id=@jobId, @step_name=N'Optimize',
    @step_id=1,
    @cmdexec_success_code=0,
    @on_success_action=1,
    @on_success_step_id=0,
    @on_fail_action=2,
    @on_fail_step_id=0,
    @retry_attempts=0,
    @retry_interval=0,
    @os_run_priority=0, @subsystem=N'TSQL',
    @command=N'EXECUTE dbo.IndexOptimize
@Databases = ''USER_DATABASES'',
@FragmentationLow = NULL,
@FragmentationMedium = ''INDEX_REORGANIZE,INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE'',
@FragmentationHigh = ''INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE'',
@FragmentationLevel1 = 10,
@FragmentationLevel2 = 40,
@UpdateStatistics = ''ALL'',
@OnlyModifiedStatistics = ''Y'',
--@MaxDOP=2,
--@SortInTempdb = ''Y'',
@LogToTable = ''Y''
',
    @database_name=N'CMMonitor',
    @flags=0
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback

EXEC @ReturnCode = msdb.dbo.sp_update_job @job_id = @jobId, @start_step_id = 1
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback

EXEC @ReturnCode = msdb.dbo.sp_add_jobschedule @job_id=@jobId, @name=N'Run Optimization',
    @enabled=1,
    @freq_type=8,
    @freq_interval=9,
    @freq_subday_type=1,
    @freq_subday_interval=0,
    @freq_relative_interval=0,
    @freq_recurrence_factor=1,
    @active_start_date=20161128,
    @active_end_date=99991231,
    @active_start_time=0,
    @active_end_time=235959,
    @schedule_uid=N'72893373-3c7b-4073-8b17-54dcaa9634bd'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback

EXEC @ReturnCode = msdb.dbo.sp_add_jobserver @job_id = @jobId, @server_name = N'(local)'
IF (@@ERROR <> 0 OR @ReturnCode <> 0) GOTO QuitWithRollback

COMMIT TRANSACTION
GOTO EndSave

QuitWithRollback:
    IF (@@TRANCOUNT > 0) ROLLBACK TRANSACTION

EndSave:
GO

Verify the Index Optimization SQL Agent job task

You can wait until after the scheduled date/time to observe results (the default schedule is 2x/week). On a smaller site, you may want to test the IndexOptimize – Custom job immediately. To do so, locate the new SQL Agent job, right-click, and select Start Job at Step…
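
If you prefer T-SQL to the SSMS context menu, the same job can be started with sp_start_job (assuming the job name used above):

-- Start the custom index optimization job immediately.
EXEC msdb.dbo.sp_start_job @job_name = N'IndexOptimize - Custom';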


When complete, it should indicate that it was a success.


To review the results that were logged to the CommandLog table in the CMMonitor database, use the following SQL query:

SELECT TOP (1000) [ID]
      ,[DatabaseName]
      ,[SchemaName]
      ,[ObjectName]
      ,[ObjectType]
      ,[IndexName]
      ,[IndexType]
      ,[StatisticsName]
      ,[PartitionNumber]
      ,[ExtendedInfo]
      ,[Command]
      ,[CommandType]
      ,[StartTime]
      ,[EndTime]
      ,[ErrorNumber]
      ,[ErrorMessage]
FROM [CMMonitor].[dbo].[CommandLog]
ORDER BY ID DESC
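
A small variation on the same table shows where the time went; here is a sketch that ranks commands by duration:

-- Show the longest-running commands logged by IndexOptimize.
SELECT TOP (50)
       DatabaseName,
       ObjectName,
       IndexName,
       CommandType,
       DATEDIFF(SECOND, StartTime, EndTime) AS DurationSeconds,
       ErrorNumber
FROM   [CMMonitor].[dbo].[CommandLog]
ORDER BY DurationSeconds DESC;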

Summary (other things to think about)

Disable the built-in SCCM maintenance task that rebuilds indexes; you won't need it anymore.

You may have noticed that a number of new SQL Server Agent jobs were created. These may be useful to you at some point, but are beyond the scope of this post.

I do recommend enabling the CommandLog Cleanup Agent job, as this will purge CommandLog entries older than 30 days.
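
If you ever need to purge the log manually, an equivalent delete with the same 30-day retention looks roughly like this (a sketch; the scheduled cleanup job is the preferred approach):

-- Manually purge CommandLog rows older than 30 days (the same retention the cleanup job applies).
DELETE FROM [CMMonitor].[dbo].[CommandLog]
WHERE StartTime < DATEADD(DAY, -30, GETDATE());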

You may have noticed two options that were commented out in the IndexOptimize – Custom job:

--@MaxDOP = 2,

--@SortInTempdb = 'Y',

Depending on your scenario, you may find that enabling these options speeds up the optimization process. Be aware that a very large database may use significant TempDB space with SortInTempdb enabled; monitor and adjust as needed.
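
If you decide to try them, the job step command simply has those lines uncommented, roughly as follows (a sketch; the MAXDOP value of 2 is only an example, pick one appropriate for your hardware):

-- Same IndexOptimize call with the optional tuning parameters enabled.
EXECUTE dbo.IndexOptimize
    @Databases = 'USER_DATABASES',
    @FragmentationLow = NULL,
    @FragmentationMedium = 'INDEX_REORGANIZE,INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
    @FragmentationHigh = 'INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
    @FragmentationLevel1 = 10,
    @FragmentationLevel2 = 40,
    @UpdateStatistics = 'ALL',
    @OnlyModifiedStatistics = 'Y',
    @MaxDOP = 2,              -- limit parallelism for the rebuilds; adjust for your CPU count
    @SortInTempdb = 'Y',      -- sort in TempDB; watch TempDB space on very large databases
    @LogToTable = 'Y'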

All USER Databases will be optimized on the SQL Server where this job is run. That includes SUSDB, ReportServer and CM databases.

Check Ola's site, as updates are released on a fairly regular basis. Updating the maintenance solution is very straightforward; from Ola's FAQ:

I’m using an old version of the SQL Server Maintenance Solution. How do I upgrade to the most recent version?

To upgrade to the most recent version of the SQL Server Maintenance Solution, follow these steps.

1. Download MaintenanceSolution.sql.

2. Change this line in the script so as not to create jobs:
SET @CreateJobs = 'N'

3. Execute MaintenanceSolution.sql. The script drops the old versions of the stored procedures and creates new versions.

4. Start the jobs and verify that they are completing successfully.

From <https://ola.hallengren.com/frequently-asked-questions.html>


Why you should not use the ConfigMgr rebuild index task

Rebuilding the database indexes is a good thing, yes? Absolutely.

Why not use the ConfigMgr rebuild index maintenance task?

When the maintenance task does work (and I have proof that some of the time it fails to run on ConfigMgr sites):

  • It REINDEXES all indexes with more than 10% fragmentation
  • It does not update statistics

Let’s break these two items down…

Rebuild Indexes

By updating all indexes, the SQL engine has to plow through EVERY index and rebuild it, REGARDLESS of whether it really needs the update! This makes the process very inefficient. You would not notice this in a lab or a small environment; however, you WILL notice it on a very large site. To review the gory details, history time here.

It is far more efficient to Reorganize indexes that are slightly fragmented and Rebuild indexes that are very fragmented.
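
To see which indexes actually warrant work, the index physical stats DMV shows the fragmentation numbers; here is a sketch along these lines (run it in the CM database; the 10% floor and 1,000-page filter are example thresholds):

-- Sketch: list fragmentation per index so you can see which ones need a
-- reorganize (moderate fragmentation) versus a rebuild (heavy fragmentation).
SELECT DB_NAME()                   AS DatabaseName,
       OBJECT_NAME(ips.object_id)  AS TableName,
       i.name                      AS IndexName,
       ips.avg_fragmentation_in_percent,
       ips.page_count
FROM   sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN   sys.indexes AS i
       ON i.object_id = ips.object_id
      AND i.index_id  = ips.index_id
WHERE  ips.avg_fragmentation_in_percent > 10
       AND ips.page_count > 1000          -- small indexes are rarely worth touching
ORDER BY ips.avg_fragmentation_in_percent DESC;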

Statistics

Database statistics are interesting to me… if you have ever studied statistics, you'll know that the sampling method used on any data set can lead to really good results or, perhaps, poor ones. SQL Server statistics map the distribution of values in a given column. The query optimizer uses statistics to determine which indexes (if any) will be used to generate the query plan, and the query is then executed based on that plan.

If the statistics are up to date, then most of the time the proper indexes will be selected for the query plan, leading to efficient execution. However, if the statistics are NOT up to date, expect the WRONG query plan to be selected and a poorly performing query to result.
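
To check how stale your statistics are, sys.dm_db_stats_properties (SQL Server 2008 R2 SP2 and later) reports the last update time and the number of modifications since; here is a sketch:

-- Sketch: show when each statistic was last updated and how many rows have changed since.
SELECT OBJECT_NAME(s.object_id)  AS TableName,
       s.name                    AS StatisticName,
       sp.last_updated,
       sp.rows,
       sp.modification_counter
FROM   sys.stats AS s
CROSS APPLY sys.dm_db_stats_properties(s.object_id, s.stats_id) AS sp
WHERE  OBJECTPROPERTY(s.object_id, 'IsUserTable') = 1
ORDER BY sp.modification_counter DESC;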

The analogy I like to use here: let's say you travel to an unfamiliar city. You pick up your rental car and are handed a street map for that city (this is pre-GPS). If that street map is 10 years old (or older), you may have trouble finding your destination due to changes in the road system! You need an up-to-date map to determine an efficient route.

Key takeaways

Updating indexes without updating statistics is a disaster waiting to happen.

Do use Ola Hallengren's solution to INTELLIGENTLY update the indexes and any modified statistics. Your database will be happier, and your users will be happier.


SQL Server MAXDOP and effect on ConfigMgr

What is MAXDOP?

The SQL Server max degree of parallelism (MAXDOP) configuration option controls the number of processors that are used to execute a query in a parallel plan. This option determines the computing and thread resources that are used for the query plan operators that perform the work in parallel.

Sounds cool, what does that mean?

It means that SQL Server allows you to control the number of processors that are available to process a single statement. Statements include things such as: queries, DDL (data definition language) operations, bulk loads using parallel inserts, and a few other scenarios.

As a consultant, I've used and tuned the MAXDOP setting for large data warehouse loading operations. The improvement from tuning the parallelism can be quite dramatic.

By default, MAXDOP is set to 0. This means that all available processors can be used to process a statement. That is good, right? Not necessarily. It depends.

Here are the current (somewhat generic) selected recommendations:

  • To enable the server to determine the maximum degree of parallelism, set this option to 0, the default value. Setting maximum degree of parallelism to 0 allows SQL Server to use all the available processors up to 64 processors. To suppress parallel plan generation, set max degree of parallelism to 1. Set the value to a number from 1 to 32,767 to specify the maximum number of processor cores that can be used by a single query execution. If a value greater than the number of available processors is specified, the actual number of available processors is used. If the computer has only one processor, the max degree of parallelism value is ignored.

  • Index operations that create or rebuild an index, or that drop a clustered index, can be resource intensive. You can override the max degree of parallelism value for index operations by specifying the MAXDOP index option in the index statement. The MAXDOP value is applied to the statement at execution time and is not stored in the index metadata.
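
As a quick sketch of what those two recommendations look like in practice (the values shown are examples only, and the table and index names in the second statement are placeholders):

-- Server-wide: cap parallelism at 2 processors per statement (example value only).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max degree of parallelism', 2;
RECONFIGURE;
GO

-- Per-statement override: this rebuild ignores the server setting and uses MAXDOP 4.
-- (dbo.SomeLargeTable and IX_SomeIndex are placeholder names.)
ALTER INDEX IX_SomeIndex ON dbo.SomeLargeTable
    REBUILD WITH (MAXDOP = 4, ONLINE = OFF);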

The effect on ConfigMgr?

For smaller sites, you may not need to change the default setting. When you scale out to much larger environments, particularly with the dreaded CAS in place, it is worth setting MAXDOP and monitoring the results.

Recently, I asked for feedback from the community on those that have set MAXDOP for ConfigMgr environments. Here is what I received (slightly edited for clarity):

Very large site: With a CAS, MAXDOP with a higher value makes collection queries faster, lower setting makes inbox processing faster, not a lot of room for compromise. We use 2.

Large site: In a CAS scenario with a MAXDOP of 6, we've seen replication blocking itself. We set it to 1 and it hasn't recurred. (Note: setting MAXDOP to 1 effectively disables parallelism.)

Summary

MAXDOP can be useful for certain scenarios. Modify and monitor for performance gains, or possible performance penalties. Adjust as needed.

References:

Configure the max degree of parallelism Server Configuration Option

https://msdn.microsoft.com/en-us/library/ms189094.aspx

How It Works: Maximizing Max Degree Of Parallelism (MAXDOP)

From <https://blogs.msdn.microsoft.com/psssql/2013/09/27/how-it-works-maximizing-max-degree-of-parallelism-maxdop/>

TIP: Setting SQL Max Degree of Parallelism to Achieve Big System Center SQL Performance Gains

(note: interesting, as always your mileage may vary)

From <http://www.systemcentercentral.com/tip-setting-sql-max-degree-of-parallelism-achieve-big-database-performance-gains/>


SQL Server 2005 End of Life resources

Received an email today from Microsoft with some useful links and resources.

 

Extended support for SQL Server 2005 ended on April 12, 2016, which means you’ll no longer get security updates and hotfixes and could be at risk of data vulnerabilities.
Upgrade now to get the speed, productivity, and cost benefits of SQL Server 2014. Using familiar, trusted tools you can:

Achieve mission-critical performance.

Help maintain security and compliance.

Gain data insights faster.

Optimize your data infrastructure.

If you haven’t yet made the move, get started today—our resources will help you identify older versions of SQL Server running in your datacenter, review your upgrade options, and upgrade to a modern database that works for you.

Discover—Review your current IT infrastructure using the Microsoft Assessment and Planning Toolkit.

Target—Learn more about your upgrade options with our decision tool.

Upgrade—Use these resources to begin your upgrade:

SQL Server 2014 Upgrade Advisor

SQL Server 2014 Upgrade Technical Guide

Read more about our commitment to industry leading technology—see what’s coming up in SQL Server 2016.
Contact Microsoft or a partner for help today.


Upgrade from Oracle to SQL Server – free licenses

Just learned about an attractive offer from Microsoft to migrate Oracle, DB2, and a few other platforms to SQL Server 2016 and get free licenses. You'll need to purchase SA (Software Assurance); however, the cost savings can be fairly substantial!

SQL Server license migration

https://www.microsoft.com/en-us/cloud-platform/sql-license-migration
