TSQL2sday #94–SQL Server and PowerShell

It's T-SQL Tuesday (#tsql2sday) again. I've been absent for a few rounds, so I figured it was time to jump back in!

SQL Server and PowerShell, what a great combination. First, let me thank the sponsor; you can find his site at the following link.

clip_image002

One of the first scripting languages I mastered for automation was VBScript. I know. Stop laughing.

However, that background gave me the ability to learn PowerShell and use it in ways to automate the more mundane aspects of SQL Server administration. Two examples of how PowerShell has proved to be an awesome “force multiplier” follow.

Using PowerShell to Automate a SQL Server database restore

The challenge was taking a backup from one SQL Server, written to a server UNC path, and restoring it to another SQL Server on a daily basis. This process is documented here. There were two challenging parts to this process: identifying the latest backup file, since each backup was written out with a unique name, and restoring it to a different location on the target server. Review the code to see both solutions.

The bonus was not only automation, but a huge time savings. Restoring the ~650 GB database through the UI took over 2.5 hours; using PowerShell and the ADO methods, it took less than 1 hour!
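For the restore side of the process, the heavy lifting is a single RESTORE statement that the PowerShell wrapper builds and executes; a minimal sketch (the database, file, and path names here are hypothetical):

```sql
-- Restore the latest backup (located by the PowerShell wrapper) to
-- different data/log locations on the target server.
-- All names and paths below are illustrative.
RESTORE DATABASE [MyDatabase]
FROM DISK = N'\\BackupServer\SQLBackups\MyDatabase_20170905.bak'
WITH MOVE N'MyDatabase_Data' TO N'E:\SQLData\MyDatabase.mdf',
     MOVE N'MyDatabase_Log'  TO N'F:\SQLLogs\MyDatabase.ldf',
     REPLACE, STATS = 10;
```

PowerShell's job is then simply to find the newest .bak file in the UNC path (for example, by sorting on LastWriteTime) and splice that file name into the statement before executing it.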

Using PowerShell to Audit SQL Server Instances

Tasked with investigating many different SQL Server instances for configuration settings, possible performance issues, and documentation, I turned to PowerShell to gather this information via queries and, in turn, place the results in an Excel workbook. The workbook provides guidance for evaluating and recommending configuration improvements, and you also end up with a baseline of current instance and database settings. This process is documented here.

This PowerShell script is extensible; I've already rewritten it once to take advantage of a reusable connection object for better performance, and it will likely need enhancing again in the near future.
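The audit itself boils down to ordinary catalog and DMV queries whose results are written to the workbook; for example, instance-level settings can be captured with a query along these lines (illustrative only, not the script's exact query):

```sql
-- Snapshot of instance-level configuration for the audit workbook
SELECT name,
       value,         -- configured value
       value_in_use,  -- value currently in effect
       is_dynamic     -- 1 = change takes effect without a restart
FROM sys.configurations
ORDER BY name;
```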

Summary

If you've been thinking of automating your SQL Server instances or databases, check out PowerShell. The time invested in learning it and creating solutions not only saves a LOT of time, it frees you up to work on other things. Do share your creative solutions with the community.

Posted in PowerShell, SQL Server | Leave a comment

Installing Power BI Report Server

Thought I'd post my initial impressions of installing Power BI Report Server. Unless you've been disconnected from the web for a while, you've heard of Power BI. In May 2017, Microsoft released the Power BI Premium service tier, which includes Power BI Report Server.

The home page will remind you of SSRS:

image

You have the option to install PBI Report Server as a standalone service, or on the same server as SQL Server and/or SQL Server Reporting Services. In any event, PBI Report Server requires a SQL Server Database Engine instance to host its catalog databases.

Even if you are familiar with the Power BI Desktop software, note that there is a separate release of PBI Desktop you'll need to install to author reports for PBI Report Server.

Here are the details and link:

Microsoft Power BI Report Server – June 2017 GA

clip_image001

Power BI Report Server, available as part of Power BI Premium, enables on-premises web and mobile viewing of Power BI reports, plus the enterprise reporting capabilities of SQL Server Reporting Services.

In this version of Power BI Report Server, you can:

  • Connect “live” to Analysis Services models – both Tabular and Multidimensional (cubes)
  • Visually explore data and create an interactive report
  • Use custom visuals in your Power BI Report
  • Save that report to your report server running the preview
  • View and interact with the report in your web browser
  • Add and view comments about the report in the web browser
  • View and interact with the report in the Power BI Mobile App

From <https://www.microsoft.com/en-us/download/details.aspx?id=55329>

Let's walk through an install of Power BI Report Server.

Run PowerBIReportServer.exe and select Install Power BI Report Server.

clip_image001[4]

Choose the evaluation version (unless you have a product key), then click Next.

clip_image002

Agree to license terms, click Next

clip_image003

You’re installing on a server with the database engine installed, click Next

clip_image004

Click Install

clip_image005

Monitor progress…

clip_image006

Installation complete.

clip_image007

Next up: configure Report Server.

Connect to your local SQL Server instance

SNAGHTML5caa98

Note that once connected, your instance will be:

<hostname>\PBIRS

SNAGHTML5e0f68

Next, click Web Service URL on the left and change the Virtual Directory name to ReportServerPBI (this allows SSRS to co-exist). Click Apply.

SNAGHTML6a7aaa

Click Database on the left.

Click Next

image

Click Test Connection to validate. If everything is fine, you'll see Connection succeeded.

SNAGHTML612051

Add a PBI suffix to the database name (so it doesn't overwrite the SSRS database).

Click Next

image

Use the default service credentials, click Next.

image

Review the summary, click Next.

SNAGHTML630d1d

Completion dialog

Click Finish

image

Completed successfully

SNAGHTML67e9fd

Click Web Portal URL on left, then add PBI suffix to Virtual Directory name, click Apply

SNAGHTML68f969

Click the hyperlink shown in the last dialog and it should take you to the Power BI Report Server portal.

image

Watch this blog for future updates on using the PBI Server and the new PBI Desktop software.

Posted in Power BI Report Server, PowerBI, SQL, SQL Server | Leave a comment

Properly configure SQL Server ConfigMgr database

Ideally, this will take place before you install ConfigMgr!

First, estimate size(s) of the databases:

How? Use the SCCM DB Sizing estimator!

Shown is an example of the Database sizing worksheet, based on estimated client counts:

image

Anthony has documented this approach as follows:

http://configmgr.com/configmgr-sql-server-sizing-and-optimization/

Get the sizing estimator here.

Use the estimator to calculate the number and size of the data files, then pre-create the SQL Server database for ConfigMgr. The key takeaway is the total estimated sizes.

Multiple data files may be appropriate for a couple of reasons. SQL Server is multi-threaded, which means you may see performance gains from having multiple data files. Further, these data files can be located on different logical drives to distribute the load and increase IOPS. As a general rule of thumb, use the following as a guideline:

Total Estimated Client Count     Number of CM data files
< 10K                            1
10K to 25K                       2
>25K to 50K                      4
>50K                             8

Other factors to consider:

  • Drives dedicated to SQL data and logs should be formatted NTFS with a 64K allocation unit (block) size.
  • Always place TempDB data and logs on a dedicated logical drive
  • DB files should be equal size (hint: divide total estimated size / # data files)
  • Create data files to be an even GB
  • Always pre-create the db files prior to installing CM – auto-growth is bad.
  • If you enable auto-growth:
    • Do not use percent (%) growth
    • Set auto-growth to an amount evenly divisible into a whole GB (e.g., 512 MB or 1024 MB)
  • Monitor available free space
  • Always use a single LOG file
  • If possible, place LOG file on separate logical drive
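Putting the guidelines above together, pre-creating the CM database for a site in the 25K-50K client range might look like the following sketch. The database name, sizes, drive letters, and growth increment are assumptions; substitute the numbers from your own sizing worksheet:

```sql
-- Four equally sized, pre-grown data files spread across logical drives,
-- plus a single log file on its own drive. Auto-growth is left enabled
-- only as a safety net, using a fixed 512 MB increment.
CREATE DATABASE CM_PS1
ON PRIMARY
  (NAME = CM_PS1_1, FILENAME = 'E:\SQLData\CM_PS1_1.mdf', SIZE = 20GB, FILEGROWTH = 512MB),
  (NAME = CM_PS1_2, FILENAME = 'F:\SQLData\CM_PS1_2.ndf', SIZE = 20GB, FILEGROWTH = 512MB),
  (NAME = CM_PS1_3, FILENAME = 'G:\SQLData\CM_PS1_3.ndf', SIZE = 20GB, FILEGROWTH = 512MB),
  (NAME = CM_PS1_4, FILENAME = 'H:\SQLData\CM_PS1_4.ndf', SIZE = 20GB, FILEGROWTH = 512MB)
LOG ON
  (NAME = CM_PS1_log, FILENAME = 'L:\SQLLogs\CM_PS1_log.ldf', SIZE = 16GB, FILEGROWTH = 512MB);
```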

It is possible to migrate a single data file to multiple data files; use this reference to get started. It is an older article and the screen shots are missing, but the technique is still valid.
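In outline, the migration works by adding the new files and then pushing data out of the original file; a sketch of the technique (names and sizes hypothetical):

```sql
-- 1. Add the new, equally sized data files
ALTER DATABASE CM_PS1 ADD FILE
  (NAME = CM_PS1_2, FILENAME = 'F:\SQLData\CM_PS1_2.ndf', SIZE = 20GB);
-- ...repeat for each additional file...

-- 2. Push data out of the original file into the new ones.
-- The primary file can never be fully emptied (it holds system pages),
-- so expect this to move most, but not all, of the data.
USE CM_PS1;
DBCC SHRINKFILE (CM_PS1_1, EMPTYFILE);
```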

Note: Be cautious if you attempt to implement in a multi-site hierarchy! Table partitioning may present an issue.

Final thoughts: as always, there is no single right or wrong way to do this; the above is intended to be used as a guideline only. Test in a lab environment first. No cute furry animals were harmed in this process, and never stick your fingers inside the cage.

Posted in ConfigMgr, SQL, SQL Server | 10 Comments

Supported SQL Server versions for SCCM CB upgrades

I had a question recently on what version of SQL Server is supported for an upgrade to SCCM CB (current branch). The scenario was this: the client was running SQL Server 2012 SP2 for their SCCM 2012 R2. However, in order to upgrade to SCCM CB, they needed to be at a minimum SQL Server version of 2012 SP3.

Then, the question became: can I just upgrade to SQL Server 2016 and skip an upgrade step?

Not with that SCCM version in place; it would probably break SCCM 2012 R2. In this case, I'd recommend installing SQL Server 2012 SP3, upgrading SCCM to CB 1702 (or later), and then upgrading to SQL Server 2016.
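Before planning the upgrade path, it helps to confirm exactly what the instance is running; a quick check:

```sql
-- Confirm the current version, service pack level, and edition
SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion, -- 11.0.6020.0 = SQL Server 2012 SP3
       SERVERPROPERTY('ProductLevel')   AS ServicePack,    -- e.g. SP2, SP3
       SERVERPROPERTY('Edition')        AS Edition;
```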

Reference the SCCM / SQL Server compatibility version matrix for the details.

Posted in ConfigMgr, SQL Server | 4 Comments

Properly size SQL Server TempDB for ConfigMgr

At MMS 2017, Benjamin Reynolds and I covered properly sizing TempDB as one facet in our Optimizing and Tuning SQL Server for ConfigMgr session. Because of a few conversations that occurred that week at MMS, I think it is appropriate to cover this here in a bit more detail.

As background, SQL Server TempDB is used for temporary objects created by queries, sort operations, and more. From the Microsoft docs:

The tempdb system database is a global resource that is available to all users connected to the instance of SQL Server and is used to hold the following:

  • Temporary user objects that are explicitly created, such as: global or local temporary tables, temporary stored procedures, table variables, or cursors.

  • Internal objects that are created by the SQL Server Database Engine, for example, work tables to store intermediate results for spools or sorting.

First, the primary talking points:

  • Total TempDB should approximate 25-30% of the SCCM total size.
  • TempDB data files should be equally sized
  • Place TempDB on a dedicated drive, with log file
  • Create 4 (or more) equally sized data files. This helps you avoid the GAM/SGAM/PFS page contention issues described in Microsoft KB 2154845.
  • Turn off auto-growth
  • Create no more than 8 data files

Now, more detail on each of these points:

Total TempDB should approximate 25-30% of the SCCM total size.

As a starting point, calculate the total estimated size of the SCCM database. If you are not sure what that size will be, use Anthony Clendenen’s sizing calculator. Once you know the SCCM size, the total TempDB size can be calculated.

TempDB data files should be equally sized

This is often overlooked. Once you have determined the total TempDB size, divide it by the number of data files (start with 4) and create EQUAL-sized data files. This allows SQL Server to automatically choose the next available data file for operations.

Place TempDB on a dedicated drive, with log file

TempDB can be very IO intensive; if you have the option of SSD storage for TempDB, take it. There is no benefit to separating the TempDB log file onto another drive; place all TempDB data and log file(s) on a single logical drive.

Create 4 (or more) equally sized data files. This helps you avoid the GAM/SGAM/PFS page contention issues described in Microsoft KB 2154845.

Page contention can occur on TempDB. By creating multiple data files, you can reduce contention as SQL Server will automatically roll to another TempDB data file if one is in use. Hence, performance can be increased by reducing contention.

Turn off auto-growth

If you have a logical drive dedicated to TempDB data and log file(s), consider pre-sizing the TempDB files to fill the drive and turning off file auto-growth.

Create no more than 8 data files

In general, never create more than 8 data files for TempDB. For SCCM, 4 equally sized data files seems to work quite well.
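As a concrete sketch of the points above: for an SCCM database estimated at 100 GB, total TempDB would be roughly 25-30 GB, split across four equally sized data files. The file sizes and drive letter below are assumptions, and note that moving existing TempDB files to a new drive takes effect only after an instance restart:

```sql
-- Resize the existing TempDB file and add three more, all equally sized,
-- on a dedicated drive, with auto-growth disabled (FILEGROWTH = 0).
ALTER DATABASE tempdb MODIFY FILE
  (NAME = tempdev,  FILENAME = 'T:\TempDB\tempdb.mdf',  SIZE = 7GB, FILEGROWTH = 0);
ALTER DATABASE tempdb ADD FILE
  (NAME = tempdev2, FILENAME = 'T:\TempDB\tempdb2.ndf', SIZE = 7GB, FILEGROWTH = 0);
ALTER DATABASE tempdb ADD FILE
  (NAME = tempdev3, FILENAME = 'T:\TempDB\tempdb3.ndf', SIZE = 7GB, FILEGROWTH = 0);
ALTER DATABASE tempdb ADD FILE
  (NAME = tempdev4, FILENAME = 'T:\TempDB\tempdb4.ndf', SIZE = 7GB, FILEGROWTH = 0);
ALTER DATABASE tempdb MODIFY FILE
  (NAME = templog,  FILENAME = 'T:\TempDB\templog.ldf', SIZE = 8GB, FILEGROWTH = 0);
```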

Posted in Uncategorized | 3 Comments

SQL Backup to URL (Azure Storage account)

How to back up a copy of an on-premises database, or an Azure SQL database, to an Azure Storage account.

Notes: Any names you must supply at creation time are indicated with angle brackets, such as <sqlstorageaccountname> in the next example. These are suggestions only; do not include the <> symbols. You are encouraged to develop your own naming conventions. Screen captures shown are representative examples of each task.

Backup Steps

  1. Create an Azure Storage account
  2. Create a Credential
  3. Perform a database backup

Create an Azure Storage account

In order to back up a SQL Server database to Azure, you will first need to create an Azure Storage account.

  1. Log in to the Azure Portal: https://portal.azure.com
  2. Locate Storage, click Add
  3. Create the storage account – enter the following information:
    1. Name – <sqlstorageaccountname> (Note: this must be globally unique!)
    2. Account Kind – General Purpose
    3. Performance – Standard
    4. Storage service encryption – Enabled
    5. Resource group
      1. Create new
      2. <sqlbootcamp>

https://i1.wp.com/dataplatformbootcamp.azurewebsites.net/wp-content/uploads/2017/03/lab1b-1.png

  4. Within the storage account created in the last step, on the properties page, under Blob Service, click Containers.
  5. To create a container for the SQL backups, click (+ Container):
    1. Name: <sqlbackup>
    2. Access type – Blob
    3. Click Create

https://i2.wp.com/dataplatformbootcamp.azurewebsites.net/wp-content/uploads/2017/03/lab1b-2.png

Select the newly created container, then Properties. Locate the URL line on the right and copy this information to the clipboard. Note: pasting it into Notepad is a convenient way to keep it handy for a later step.

https://i2.wp.com/dataplatformbootcamp.azurewebsites.net/wp-content/uploads/2017/03/lab1b-3.png

It should appear similar to this (substituting your storage account and container names):
https://<sqlstorageaccountname>.blob.core.windows.net/<sqlbackup>

  6. Next, obtain the access key for this storage account. Select the (root) storage account name, then under SETTINGS, select Access keys. To the right of key1, select the copy-to-clipboard icon. Save this key to Notepad (paste); it will be used in a later step.

https://i1.wp.com/dataplatformbootcamp.azurewebsites.net/wp-content/uploads/2017/03/lab1b-4.png

Creating a Credential

You will now create a Credential within SQL Server, which will allow access to the Azure Storage account from SSMS. This Credential is stored locally within SQL Server.

Take a moment and review the following example, in which a SQL Server credential is created for authentication to the Microsoft Azure Blob storage service.

Using storage account identity and access key

Notes: T-SQL sample: <mycredentialname> = the credential name used internally, <mystorageaccountname> = the name of the storage account, <mystorageaccountaccesskey> = key1 captured in the previous task.

IF NOT EXISTS (SELECT * FROM sys.credentials WHERE name = '<mycredentialname>') 
CREATE CREDENTIAL [<mycredentialname>] WITH IDENTITY = '<mystorageaccountname>' ,
SECRET = '<mystorageaccountaccesskey>'; 


Now, you will translate this sample T-SQL into a credential for your Backup!

Using Notepad to build your T-SQL statement – Substitute the bracketed areas next with your account information.

CREATE CREDENTIAL [backup] WITH IDENTITY = '<YourAccountName>' ,
SECRET = '<YourStorageAccessKey>'; 
GO 


IDENTITY = '<YourAccountName>' (Note: from Create an Azure Storage account; use just the account name, not the complete FQDN)

SECRET = '<YourStorageAccessKey>' (Note: from Create an Azure Storage account, the access key obtained in the last step)

Then, copy the entire statement into the clipboard.

Open SQL Server Management Studio (SSMS) and connect to your local copy of SQL Server.

Click New Query.

Copy and paste the above T-SQL into the new query window.

Then click Execute.

The command should show Completed Successfully.
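Once executed, you can verify the credential was created by querying the local catalog:

```sql
-- Confirm the credential exists and which storage account it points to
SELECT name, credential_identity, create_date
FROM sys.credentials
WHERE name = 'backup';
```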

Perform a full database backup to URL

The following example performs a full database backup of the AdventureWorks2016 database to the Microsoft Azure Blob storage service. Take a moment and review the syntax.

Backup To URL using storage account identity and access key

BACKUP DATABASE AdventureWorks2016
TO URL = 'https://<YourAccountName>.blob.core.windows.net/<sqlbackup>/AdventureWorks2016.bak'
WITH CREDENTIAL = 'backup',
COMPRESSION, STATS = 5;
GO

Note: You will need to substitute <YourAccountName> and <sqlbackup> with your storage account name and blob container name. Substitute the credential name if you used something other than [backup].

Open SQL Server Management Studio (SSMS) and connect to your local copy of SQL Server.

Click New Query.

Copy and paste the above T-SQL into the new query window.

Then click Execute.

The command should show Completed Successfully.

Posted in Azure SQL Server, SQL Server | Leave a comment

Azure SQL Migration Tools

While preparing a presentation on data migration to Azure, I thought I'd share some of my research. There are other methods not covered here, but this should be a good start! Each toolset includes notes about the tool and where to find it.

SQL Server Management Studio – Migration Wizard

  • Built into core SSMS

Blog: https://docs.microsoft.com/en-us/azure/sql-database/sql-database-cloud-migrate

Data Migration Assistant

  • Enables assessment of your on-premises SQL Server instance for migration to Azure SQL databases
  • Detects:
    • Migration-blocking issues
    • Partially supported or unsupported features and functions

Blog: https://blogs.msdn.microsoft.com/datamigration/dma/

SQL Server Migration Assistant

  • Supports migration from Oracle, MySQL, SAP ASE (formerly SAP Sybase ASE), DB2, and Access
  • Lets users convert database schemas to Microsoft SQL Server schemas, upload the schema, and migrate data to the target SQL Server

Blog: https://blogs.msdn.microsoft.com/datamigration/2016/12/22/released-sql-server-migration-assistant-ssma-v7-2/

Azure Data Factory Copy Wizard

  • Data Factory is a cloud-based data integration service that orchestrates and automates the movement and transformation of data
  • Can be incredibly fast

Blog: https://docs.microsoft.com/en-us/azure/data-factory/data-factory-azure-copy-wizard

SQL Database Migration wizard

  • Can be useful for validation

Source: https://sqlazuremw.codeplex.com/

Posted in Azure SQL Server, Migration, SQL Server | Tagged , | Leave a comment