2011-11-23

Check database using SMO

For a long time I have checked databases with DBCC CHECKDB, but I would like to do the check in PowerShell with SMO, so that it can be included in a more general maintenance script with additional features.

The check can be done with the SMO Database class method CheckTables().
# Load SMO and connect to the database instance
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo') | Out-Null
[string]$ServerName = 'DNDBA01.dn.lan'
$Server = New-Object Microsoft.SqlServer.Management.Smo.Server $ServerName

# Check every database on the instance and return all error messages
$Server.Databases | ForEach-Object {
  "{0:s}  Checking database [$($_.Name)]..." -f $([System.DateTime]::Now)
  $_.CheckTables([Microsoft.SqlServer.Management.Smo.RepairType]::None, [Microsoft.SqlServer.Management.Smo.RepairOptions]::AllErrorMessages)
}

All error messages are included to help the administrator if a check fails. The output can also be used for later analysis.
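If the output is to be used for later analysis, each message can be wrapped in an object and exported, for example to a CSV file. This is a minimal sketch that reuses $Server from the example above; the file name is made up.
$Result = foreach ($Db in $Server.Databases) {
  # CheckTables() returns a collection of message strings
  $Db.CheckTables([Microsoft.SqlServer.Management.Smo.RepairType]::None) |
    ForEach-Object {
      New-Object PSObject -Property @{
        Database = $Db.Name
        Checked  = [System.DateTime]::Now
        Message  = $_
      }
    }
}
$Result | Export-Csv -Path '.\CheckDb_Result.csv' -NoTypeInformation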

Additional features for the script can be considered after reading the discussion by Cindy Gross.

A nice introduction to checking database integrity with SMO is
"Getting Started with SMO in SQL 2005 - Integrity Checks" by Jasper Smith.

2011-10-13

Red-Gate SQL Compare with PowerShell

I would like to compare the structure of two databases as in development vs production.
I have used the tool SQL Compare from Red-Gate on a daily basis, and I want a similar result without the GUI, so that I can automate a comparison.
Besides SQL Compare, the SQL Toolbelt contains a SQL Comparison SDK, but it is a version behind SQL Compare. One thing I would miss is the possibility to compare with a backup or a script file.
By looking into the C# examples of the SDK and browsing the assemblies with the Visual Studio Object Browser, this (very) simple example gives a usable result. Please consider the example a technical sprint, not a final solution.

# Generate path for Red-Gate SQL Compare assemblies
$SqlComparePath = "\Red Gate\SQL Compare 9"
# True if the OS is 64 bit (x64)
if([IntPtr]::Size -eq 8) { $SqlComparePath = ${env:ProgramFiles(x86)} + $SqlComparePath }
else { $SqlComparePath = ${env:ProgramFiles} + $SqlComparePath }

# Add Red-Gate Compare assembly
Add-Type -Path "$SqlComparePath\RedGate.SQLCompare.Engine.dll"

# Register staging database
[string]$stagingServerName = 'SQLDEV42.sqladmin.lan'
[string]$stagingDbName = 'sqladmin_repository'
$stagingConnectionProperties = New-Object RedGate.SQLCompare.Engine.ConnectionProperties $stagingServerName, $stagingDbName
$stagingDb = New-Object RedGate.SQLCompare.Engine.Database
$stagingDb.Register( $stagingConnectionProperties, [RedGate.SQLCompare.Engine.Options]::Default )

# Register Production database
[string]$prodServerName = 'SQLPRD42.sqladmin.lan'
[string]$prodDbName = 'sqladmin_repository'
$prodConnectionProperties = New-Object RedGate.SQLCompare.Engine.ConnectionProperties $prodServerName, $prodDbName
$prodDb = New-Object RedGate.SQLCompare.Engine.Database
$prodDb.Register( $prodConnectionProperties, [RedGate.SQLCompare.Engine.Options]::Default )

# Compare databases. Result in RedGate.SQLCompare.Engine.Difference object
$stagingDb.CompareWith( $prodDb, [RedGate.SQLCompare.Engine.Options]::Default ) |
Format-Table -Property Type,DatabaseObjectType,Name -AutoSize


The result is something like this:
 Type DatabaseObjectType Name
 ---- ------------------ ----
Equal              Table [sqladmin].[collation]
Equal              Table [sqladmin].[computer]
Equal              Table [sqladmin].[database]
Equal              Table [sqladmin].[database_history]
Equal              Table [sqladmin].[department]
Equal              Table [sqladmin].[edition]
Equal              Table [sqladmin].[environment]
Equal              Table [sqladmin].[file]
Equal              Table [sqladmin].[filegroup]
Equal              Table [sqladmin].[security]
Equal              Table [sqladmin].[ssdb]
Equal              Table [sqladmin].[version]
Equal               Role public
Equal               Role sqlanchor_user
Equal               Role db_owner
Equal               Role db_accessadmin
Equal               Role db_securityadmin
Equal               Role db_ddladmin
Equal               Role db_backupoperator
Equal               Role db_datareader
Equal               Role db_datawriter
Equal               Role db_denydatareader
Equal               Role db_denydatawriter
Equal             Schema dbo
Equal             Schema guest
Equal             Schema INFORMATION_SCHEMA
Equal             Schema sys
Equal             Schema sqladmin
Equal             Schema sqlanchor
Equal             Schema db_owner
Equal             Schema db_accessadmin
Equal             Schema db_securityadmin
Equal             Schema db_ddladmin
Equal             Schema db_backupoperator
Equal             Schema db_datareader
Equal             Schema db_datawriter
Equal             Schema db_denydatareader
Equal             Schema db_denydatawriter
Equal               View [sqladmin].[v_computer]
Equal               View [sqladmin].[v_database]
Equal               View [sqladmin].[v_ssdb]
Equal               View [sqladmin].[v_ssdb_full]
Equal    StoredProcedure [sqlanchor].[ssdb_version-get]
Equal    StoredProcedure [sqlanchor].[environment-get]
Equal    StoredProcedure [sqlanchor].[environment_detail-get]
Equal    StoredProcedure [sqlanchor].[computer_summary-get]

...
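If only the actual differences are of interest, the Differences collection can be filtered before formatting. A small sketch, assuming the Type property can be compared with the string 'Equal':
$stagingDb.CompareWith( $prodDb, [RedGate.SQLCompare.Engine.Options]::Default ) |
  Where-Object { $_.Type -ne 'Equal' } |
  Format-Table -Property Type,DatabaseObjectType,Name -AutoSize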

Before you use the assemblies, take a look at the Red-Gate license agreement.

2011-09-27

Script a database definition

Recreating a database in a higher environment is a common task. Quite often it is not that smart just to restore a backup, as you then get database users, file sizes, obsolete schemas etc. into the higher environment.
Scripting out a database can be done from SQL Server Management Studio, but that is a manual operation. When you are serving a small army of developers, they need a definition of the production database on a regular basis, and doing that through SQL Server Management Studio quickly gets boring.
There are third party tools like Red Gate SQL Compare that can generate a schema snapshot or a direct comparison, but such a tool is expensive for a simple task like this. In many other situations I am really happy with such tools, but that is another story.
This PowerShell script generates the T-SQL script to create a database and its structure. The script can be edited and reviewed before execution, and can serve as documentation in a Change procedure.
[string]$SqlServerName = '(local)'
[string]$DbName = 'AdventureWorks'

[string]$SqlBatchSeperator = 'GO'  # SQL Server Management Studio default: GO

"/*`nT-SQL script to define the structure of the database '$($DbName)' on the server '$($SqlServerName)'."
"Script generated $($(Get-Date).ToUniversalTime().ToString('s')) UTC"
"  by the user $([Security.Principal.WindowsIdentity]::GetCurrent().Name)"
"  on the computer $($env:COMPUTERNAME)"
"  with the script file '$($MyInvocation.InvocationName)'"
"*/"

[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo') | Out-Null
$Server = New-Object Microsoft.SqlServer.Management.Smo.Server $SqlServerName
$Options = New-Object Microsoft.SqlServer.Management.Smo.ScriptingOptions
$Options.IncludeHeaders = $true
$Options.Permissions = $true

"`n--*** Logins ***"
$Server.Logins | Where-Object { ($_.Name -ne 'sa') -and ($_.Name -notlike '##MS_Policy*') -and ($_.Name -notlike 'NT *') } | ForEach-Object { $_.Script($Options);$SqlBatchSeperator }
"`n--*** Database definition ***"
$Server.Databases[$DbName].Script();$SqlBatchSeperator
"`n--EXECUTE [$DbName].dbo.sp_changedbowner @loginame = N'sa', @map = true;`n$SqlBatchSeperator"
"`nALTER AUTHORIZATION ON DATABASE::[$DbName] TO [sa];`n$SqlBatchSeperator"
"`nUSE [$DbName];`n$SqlBatchSeperator"
"`n`n--*** Userdefined applicationroles ***"
$Server.Databases[$DbName].ApplicationRoles | Where-Object { $_.IsFixedRole -eq $false } | ForEach-Object { $_.Script($Options);$SqlBatchSeperator }
"`n`n--*** Userdefined databaseroles ***"
$Server.Databases[$DbName].Roles | Where-Object { $_.IsFixedRole -eq $false } | ForEach-Object { $_.Script($Options);$SqlBatchSeperator }
"`n`n--*** Userdefined schemas ***"
$Server.Databases[$DbName].Schemas | Where-Object { $_.IsSystemObject -eq $false } | ForEach-Object { $_.Script($Options);$SqlBatchSeperator }

$Options.IncludeDatabaseRoleMemberships = $true

"`n`n--*** Database users ***"
$Server.Databases[$DbName].Users | Where-Object { $_.IsSystemObject -eq $false } | ForEach-Object { $_.Script($Options);$SqlBatchSeperator }

$Options.DriAll = $true
$Options.Indexes = $true

"`n`n--*** Userdefined assemblies ***"
$Server.Databases[$DbName].Assemblies | Where-Object { $_.IsSystemObject -eq $false } | ForEach-Object { $_.Script($Options);$SqlBatchSeperator }
"`n`n--*** Userdefined certificates ***"
$Server.Databases[$DbName].Certificates | Where-Object { $_.IsSystemObject -eq $false } | ForEach-Object { $_.Script($Options);$SqlBatchSeperator }
"`n`n--*** Userdefined defaults ***"
$Server.Databases[$DbName].Defaults | Where-Object { $_.IsSystemObject -eq $false } | ForEach-Object { $_.Script($Options);$SqlBatchSeperator }
"`n`n--*** Userdefined fulltext catalogs ***"
$Server.Databases[$DbName].FullTextCatalogs | Where-Object { $_.IsSystemObject -eq $false } | ForEach-Object { $_.Script($Options);$SqlBatchSeperator }
"`n`n--*** Userdefined rules ***"
$Server.Databases[$DbName].Rules | Where-Object { $_.IsSystemObject -eq $false } | ForEach-Object { $_.Script($Options);$SqlBatchSeperator }
"`n`n--*** Userdefined stored procedures ***"
$Server.Databases[$DbName].StoredProcedures | Where-Object { $_.IsSystemObject -eq $false } | ForEach-Object { $_.Script($Options);$SqlBatchSeperator }
"`n`n--*** Userdefined tables with indexes ***"
$Server.Databases[$DbName].Tables | Where-Object { $_.IsSystemObject -eq $false } | ForEach-Object { $_.Script($Options);$SqlBatchSeperator }
"`n`n--*** Userdefined Triggers ***"
$Server.Databases[$DbName].Triggers | Where-Object { $_.IsSystemObject -eq $false } | ForEach-Object { $_.Script($Options);$SqlBatchSeperator }
"`n`n--*** Userdefined functions ***"
$Server.Databases[$DbName].UserDefinedFunctions | Where-Object { $_.IsSystemObject -eq $false } | ForEach-Object { $_.Script($Options);$SqlBatchSeperator }
"`n`n--*** Userdefined datatypes ***"
$Server.Databases[$DbName].UserDefinedDataTypes | Where-Object { $_.IsSystemObject -eq $false } | ForEach-Object { $_.Script($Options);$SqlBatchSeperator }
"`n`n--*** Userdefined views ***"
$Server.Databases[$DbName].Views | Where-Object { $_.IsSystemObject -eq $false } | ForEach-Object { $_.Script($Options);$SqlBatchSeperator }
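The output can be captured to a .sql file for review and later deployment. A possible call, assuming the script above is saved as Get-DbDefinition.ps1 (both file names are made up):
.\Get-DbDefinition.ps1 | Out-File -FilePath '.\AdventureWorks_definition.sql' -Encoding UTF8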


The script covers the basics, and I think it is easy to adapt it to a similar task.

The script can be used in a Visual Studio solution to compare with a database in development. Such a comparison can give a deployment script for a higher and isolated environment.

Reference

TechNet Library: SMO "Server Class"

2011-09-21

PowerShell 3 CTP

The preview is officially released as "Windows Management Framework 3.0 Community Technology Preview". The details are described in an entry on the Windows PowerShell Blog:
Windows Management Framework 3.0 Community Technology Preview (CTP) #1 Available for Download

The autumn looks to be most interesting :-)

2011-09-18

Windows Server 8 and SQL Server 2008 R2

Together with the very public release of Windows 8, Microsoft released Windows Server 8.
Earlier this year I attended TechEd North America and asked the Microsoft people about a "Windows Server Next", but nobody could say anything. Actually, most of the answers were like "I don't know anything about a new Windows Server version." On the other hand there was a lot about SQL Server "Denali" and Visual Studio Next. The three products are usually released in new major versions rather close together.
But now the preview of Windows Server 8 is here. I got mine from my MSDN subscription. It is not generally available like the Windows 8 desktop edition.

Installing Windows Server 8 on VMware was quite simple and without challenges. The installation was done on VMware Workstation 8, like the Windows 8 installation the other day.
I installed the full version; Windows Server Core I will take a look at another time, especially when I have figured out how to install SQL Server "Denali" on Windows Server 8.

After the initial installation, when the system started for the first time, I took a quick look at the storage used.
The amount of storage used for the Windows installation is OK. This is nice when you consider consolidation by server virtualization.

Installing SQL Server 2008 R2 (Enterprise Edition x64) requires some .NET and Windows Installer components.
This is handled by the installation; just click OK. The screen shows nothing for a minute or two, but some activity can be spotted in the Task Manager.
In this case I installed Database Services with Full-Text Search, Reporting Services, Integration Services, Business Intelligence Development Studio and all Management Tools.
After the installation, a little more storage is used.
More precisely, 5.9 GB are used by the SQL Server installation.


I am still getting used to the new Start design.
In general, SQL Server 2008 R2 behaves normally.

BTW - When I activated Windows Update, there were 13 critical updates for Windows Server Preview.

SQL Server on Windows 8

The next Windows has just been released as a preview, both as desktop and as server.
Downloading from MSDN took some time. It looked like I was not the only one downloading ;-)

When I tried to install the Windows 8 desktop as a virtual computer in VMware, the initial boot failed with a "HAL_INITIALIZATION_FAILED".
After a little browsing with my favorite web search engine, I found several indications that the problem is a new HAL (Hardware Abstraction Layer) in Windows 8 - a HAL that VMware Workstation 7.x cannot handle...
The next version of VMware Workstation (8.0) was released 2011-09-13, but the licensing is not open until the week of 2011-09-19.
Still, it was no problem to acquire a trial license and install VMware Workstation 8 on my desktop running Windows 7 Ultimate x64.

On the new VMware the Windows 8 installation went without any problems. :-)

The VMware Tools also installed on the virtual Windows 8 without any issues.
One thing I learned the hard way is that installing the VMware Tools locks the GUI on the virtual Windows 8.

I tried to install SQL Server "Denali", but it failed right at the beginning without any clues. No folders were created, and no entries were made in the registry.

The installation of SQL Server 2008 R2 (Developer Edition, x64) went just as on Windows 7, and installing Service Pack 1 went as normal.
Also it looks like everything is running fine. :-)

The new Start menu in Windows 8 holds the SQL Server 2008 icons as expected.
From a SQL Server perspective, I have not found a major difference between Windows 7 and Windows 8.

2011-08-29

Servername in SQLCMD script

A variable can be used in both SQLCMD and T-SQL statements.
This is a very short example:
:Setvar _servername "SANDY.sqladmin.lan"
:CONNECT $(_servername)
SELECT
  N'$(_servername)' AS [_servername]
 ,SERVERPROPERTY('SERVERNAME') AS [servername];
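The script can be saved to a file and run with the sqlcmd utility; a possible call (the file name is made up), where -E uses Windows Authentication and -i points at the script file. Without -S, sqlcmd connects to the local default instance before the :CONNECT in the script takes over.
sqlcmd.exe -E -i .\servername.sql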

2011-07-14

SQL Server 2008 R2 SP1

Service Pack 1 for Microsoft SQL Server 2008 R2 was released 2011-07-11. The Service Pack is described in KB2463333, and the download is found here.
This is a short example on upgrading a test server running Windows Server 2008 R2 x64.

I downloaded the file "SQLServer2008R2SP1-KB2528583-x64-ENU.exe" and executed it as Administrator.
After some unpacking, the first window looks like the one shown when installing SQL Server.
Fig. 1. The window "SQL Server 2008 R2 update".
After accepting the license terms, it is time to select what to update. The info to the right about language etc. is only shown when an item is selected in the field "Instances".
Fig. 2. The window "Select Features".
 Then the installation checks the files in use.
Fig. 3. The window "Check Files In Use".
Like when installing SQL Server, a summary of the effective installation is shown before it is executed. I like this opportunity to check before actually executing.
Fig. 4. The window "Ready to update".
Why this title starts with lowercase letters where all the other titles start with capitals, I do not know...
The completion message also gives the path for the summary log of the update.
Fig. 5. The window "Complete".
Please notice that the update does not require a computer restart, but the database instance is down during part of the update.
This is shown in the SQL Server Error Log, where the start of the update can be seen in the previous SQL Server Error Log file (e.g. "ERRORLOG.1").
2011-07-14 08:24:58.69 spid53      Configuration option 'Agent XPs' changed from 1 to 0. Run the RECONFIGURE statement to install.
2011-07-14 08:24:58.70 spid53      FILESTREAM: effective level = 0, configured level = 0, file system access share name = 'MSSQLSERVER'.
2011-07-14 08:25:07.12 spid13s     Service Broker manager has shut down.
2011-07-14 08:25:07.31 spid7s      SQL Server is terminating in response to a 'stop' request from Service Control Manager. This is an informational message only. No user action is required.
2011-07-14 08:25:07.31 spid7s      SQL Trace was stopped due to server shutdown. Trace ID = '1'. This is an informational message only; no user action is required.
2011-07-14 08:25:07.36 Server      The SQL Server Network Interface library successfully deregistered the Service Principal Name (SPN) [ MSSQLSvc/SANDY.SQLADMIN.LAN ] for the SQL Server service. 
2011-07-14 08:25:07.36 Server      The SQL Server Network Interface library successfully deregistered the Service Principal Name (SPN) [ MSSQLSvc/SANDY.SQLADMIN.LAN:1433 ] for the SQL Server service. 

In this case the database instance was started about three minutes after the shutdown.
2011-07-14 08:28:47.26 Server      Microsoft SQL Server 2008 R2 (SP1) - 10.50.2500.0 (X64) 
Jun 17 2011 00:54:03 
Copyright (c) Microsoft Corporation
Enterprise Edition (64-bit) on Windows NT 6.1 <X64> (Build 7600: ) (Hypervisor)


2011-07-14 08:28:47.28 Server      (c) Microsoft Corporation.
2011-07-14 08:28:47.28 Server      All rights reserved.
2011-07-14 08:28:47.28 Server      Server process ID is 2436.
Notice that the new version number is 10.50.2500. Before upgrade the version number was 10.50.1600.
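The version and patch level can also be checked with SMO; a small sketch reusing the server name from this page:
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo') | Out-Null
$Server = New-Object Microsoft.SqlServer.Management.Smo.Server 'SANDY.sqladmin.lan'
"{0} ({1})" -f $Server.Information.VersionString, $Server.Information.ProductLevel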

When the database instance is started again, the SQL Server Error Log shows that a lot of changes are applied after the database instance is online.
2011-07-14 08:28:50.83 spid7s      Database 'master' is upgrading script 'msdb_upgrade_discovery.sql' from level 171050560 to level 171051460.
2011-07-14 08:28:50.84 spid7s       
2011-07-14 08:28:50.84 spid7s      ----------------------------------------------------------------
2011-07-14 08:28:50.84 spid7s      msdb_upgrade_discovery starting
2011-07-14 08:28:50.88 spid7s      MSDB format is: SQL Server 2008
2011-07-14 08:28:50.92 spid7s      User 'sa' is changing database script level entry 4 to a value of 2.
2011-07-14 08:28:50.93 spid7s      User 'sa' is changing database script level entry 5 to a value of 2.
2011-07-14 08:28:50.93 spid7s      User 'sa' is changing database script level entry 6 to a value of 2.
2011-07-14 08:28:50.93 spid7s      User 'sa' is changing database script level entry 6 to a value of 0.
2011-07-14 08:28:50.94 spid7s      Running SQL Server 2005 SP2 to SQL Server 2008 upgrade script
2011-07-14 08:28:50.94 spid7s      ----------------------------------------------------------------
2011-07-14 08:28:50.94 spid7s       
2011-07-14 08:28:51.77 spid7s      Database 'master' is upgrading script 'sqlagent100_msdb_upgrade.sql' from level 0 to level 2.
2011-07-14 08:28:51.77 spid7s      ----------------------------------------
2011-07-14 08:28:51.77 spid7s      Starting execution of PREINSTMSDB100.SQL
2011-07-14 08:28:51.77 spid7s      ----------------------------------------
2011-07-14 08:28:52.00 spid7s      Setting database option COMPATIBILITY_LEVEL to 100 for database msdb.
... and so on

Most changes look like they are done by executing T-SQL script files.
I tried to find the files afterwards, but I could not. Hopefully the files are deleted after the update; this is OK from an audit point of view.
In this case executing the last changes took about half a minute.
2011-07-14 08:29:18.99 spid7s      Recovery is complete. This is an informational message only. No user action is required.

In the same folder as the summary log (Fig. 5) a detail log is written to the file "Detail.txt". Where the summary log in the file "Summary_SANDY_20110714_081920.txt" is 5 KB, the detail log is 753 KB.
According to the summary log, the update took about ten minutes overall and about six minutes on the database instance.
Overall summary:
  Final result:                  Passed
  Exit code (Decimal):           0
  Exit message:                  Passed
  Start time:                    2011-07-14 08:19:31
  End time:                      2011-07-14 08:29:38
  Requested action:              Patch


Instance MSSQLSERVER overall summary:
  Final result:                  Passed
  Exit code (Decimal):           0
  Exit message:                  Passed
  Start time:                    2011-07-14 08:23:56
  End time:                      2011-07-14 08:29:33
  Requested action:              Patch
This fits with the start and end timestamps in the detail log.
2011-07-14 08:19:32 Slp: Log provider 'Microsoft.SqlServer.Chainer.Infrastructure.LogProviderFile' has been registered
...
2011-07-14 08:29:52 Slp: Setup result: 0

In short, the update took about ten minutes overall, where the update of the database instance took six minutes with about three minutes of downtime.

SQL Server Management Studio (SSMS) is upgraded by the same file. You do not have to do a separate upgrade of SSMS using the file "SQLManagementStudio_x{64 | 86}_ENU.exe".

2011-06-29

Simple and quick

Another great example of a simple type being faster than an object attribute:
PowerTip of the Day > Speed Up Loops.

Another facet of KISS.

2011-06-27

Timestamp difference

I was looking into the Windows Application Log and the SQL Server Error Log to investigate an incident, and needed to calculate the effective duration of the incident.

This is actually quite simple using PowerShell and the CmdLet Get-Date:
(Get-Date "24-06-2011 08:00:24") - (Get-Date "23-06-2011 16:24:49")

Days              : 0
Hours             : 15
Minutes           : 35
Seconds           : 35
Milliseconds      : 0
Ticks             : 561350000000
TotalDays         : 0,649710648148148
TotalHours        : 15,5930555555556
TotalMinutes      : 935,583333333333
TotalSeconds      : 56135
TotalMilliseconds : 56135000

A more compact answer can be acquired by
((Get-Date "24-06-2011 08:00:24") - (Get-Date "23-06-2011 16:24:49")).ToString()
15:35:35

The statement converts the timestamp strings to DateTime (System.DateTime) objects, subtracts the two objects, and returns a TimeSpan (System.TimeSpan) object.
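The string parsing depends on the regional settings of the session. If that is a concern, the timestamps can be parsed with an explicit format string; a small variation of the statement above:
$Format  = 'dd-MM-yyyy HH:mm:ss'
$Culture = [System.Globalization.CultureInfo]::InvariantCulture
$Start = [DateTime]::ParseExact('23-06-2011 16:24:49', $Format, $Culture)
$End   = [DateTime]::ParseExact('24-06-2011 08:00:24', $Format, $Culture)
($End - $Start).ToString()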

2011-05-31

HP OfficeJet 8000

The last couple of weeks we have had increasing problems with the printer for the entire family. We are using an HP OfficeJet Pro 8000 that is connected to the 1 Gb switch together with the workstations and the router.
More and more documents and web pages failed to print but remained in the queue and blocked other jobs.
We are using both Secunia and SlimDrivers, and neither reported a missing update.

To fix the problem I changed both the printer and the workstations:

  • The WSD was configured for discovery only.
  • The existing printer registration was deleted from Devices and Printers.
  • The port was deleted.
  • The driver was deleted with installation image - together with other HP drivers.
The firmware was updated, so I downloaded and installed the complete package for the printer from HP Support.
Also I downloaded and installed the SLP update.
All our computers are running Windows 7 x64, and used the same installation set.

After a restart, all the print jobs that failed before now run with success.
:-)
We are now challenging the printer for the next couple of weeks with various jobs. If we can avoid purchasing a new printer, we consider it a success.

The conclusion for the moment must be something like: do not trust central driver updates.
I know I have said something like this before (ati-driver-on-windows-7), but it is still relevant.

2011-05-06

Failed:(0) Alter failed for Server 'abc\\def'

A maintenance plan created with the wizard failed in a Check Database Integrity task with the message in the log file:
Failed:(0) Alter failed for Server 'abc\\def'.
The SQL Server instance is 2008 SP1 (10.0.2531).

Several write-ups, like the Connect case 126163, talk about Allow Updates, but setting the I/O affinity to all processors solved the issue.
The solution is in a discussion on SQL Developer Center.

Unfortunately the root cause is unknown.

2011-04-24

SQLIO Batch

SQLIO is a tool that must be installed on the computer where you want to test the storage. The installation is rather simple. I did a few changes away from the default installation.
I changed the access to "Everyone" so that my colleagues can use the tool, and the folder was changed to "C:\SQLAdmin\SQLIO".
The tool is 32-bit no matter what Windows build you install it on.
If you want to keep the registry clean, you can install SQLIO on a workstation using the MSI file and copy "sqlio.exe" to the server.

Syntax

sqlio.exe -k{R|W} -t<threads> -s<seconds> -d<drive(-s)> -R<drive(-s)> -f<stripe> -p[I]<affinity> -a[R[I]]<mask> -o<outstanding> -b<size> -i<ios> -m[{C|S}]<blocks> -L[{S|P}][i]<latency> -B[{N|Y|H|S}] -S<blocks> -v1.1.1 -F<parmfile>

Default

sqlio -kR -t1 -s30 -f64 -b2 -i64 -BN testfile.dat

Example

sqlio -kW -t2 -s120 -dM -o1 -frandom -b64 -BH -LS Testfile.dat

Parameters

  • -k{R|W}: Kind of IO (R=reads, W=writes). Do Write before Read, so that there is something to read.
  • -t<threads>: Number of threads. Maximum is 256.
  • -s<seconds>: Number of seconds to run.
  • -d<drive(-s)>: Use same filename on each drive letter given. Multiple drives are possible. Write driveletters without space delimiter. Maximum is 256.
  • -R<drive(-s)>: Raw drive letters/number for I/O. Multiple drives are possible. Maximum is 256.
  • -f<stripe>: Stripe size in blocks, random, or sequential.
  • -p[I]<affinity>: CPU number for affinity. 0 (zero) based (I=ideal).
  • -a[R[I]]<mask>: CPU mask for (R=roundrobin (I=ideal)) affinity.
  • -o<outstanding>: Depth to use for completion routines. Requests are not piling up.
  • -b<size>: I/O block size in KiB.
  • -i<IOs>: Number of I/Os per I/O run.
  • -m[{C|S}]<blocks>: Do multi block IO (C=copy, S=scatter/gather).
  • -L[{S|P}][i]<latency>: Latencies from (S=system, P=processor) timer.
  • -B[{N|Y|H|S}]: Set buffering (N=none, Y=all, H=hardware, S=software).
  • -S<blocks>: Start I/Os number of blocks into file.
  • -v1.1.1: I/O runs use the same blocks, as in version 1.1.1. sqlio.exe is version 1.5.SG. Check the version with the parameter "-?".
  • -F<parmfile>: Read parameters from <parmfile>. The length of the filename has a maximum of 256.

Scenario

All tests use up to 8 processing threads, run for 10 minutes each, are enabled for multi-IO, and allow up to between 16 and 64 outstanding IOs each.
Files used include 2 data files and 1 log file.
  • Aligned or not.
  • 4 or 64 KB NTFS cluster size
  • VRAID5 or -10
The tests include each of the following:
  1. 8 KB random writes to the data files (pattern for checkpoints, tempdb, etc.)
  2. 8 KB random reads to the data files (pattern for random data reads, singleton seeks, etc.)
  3. 64 KB sequential writes to the log file (bulk log writes)
  4. 8 KB random reads to the log file (rollbacks, log reader, etc.)
  5. 1 KB sequential writes to the log file (small log writes)
  6. 64 KB sequential writes to the data files (checkpoints, reindex, bulk inserts)
  7. 64 KB sequential reads to the data files (read-ahead, reindex, checkdb)
  8. 128 KB sequential reads to the data files (read-ahead, reindex, checkdb)
  9. 128 KB sequential writes to the data files (bulk inserts, reindex)
  10. 256 KB sequential reads to the data files (read-ahead, reindex)
  11. 1 MB sequential reads to the data files (backups)

Batch

This PowerShell script executes SQLIO with various parameter values
#requires -version 2.0
$Drive = @('G', 'H', 'I', 'J')

$IO_Kind = @('W', 'R')  # Write before read so that there is something to read.
$Threads = @(2, 4, 8)
#$Threads = @(2, 4, 8, 16, 32, 64)
$Seconds = 10*60  # Ten minutes
$Factor = @('random', 'sequential')
$Outstanding = @(1, 2, 4, 8, 16, 32, 64, 128)
$BlockSize = @(1, 8, 64, 128, 256, 1024)

Clear-Host

$ResultFileName = "SQLIO_Result.{0:yyyy-MM-dd}.txt" -f $([System.DateTime]::Now)
"{0:s}  Results are written to the file `"$ResultFileName`"." -f $([System.DateTime]::Now)
$InvokeNumber = 0
$InvokeCount = $($IO_Kind.Count * $Threads.Count * $Drive.Count * $Factor.Count * $Outstanding.Count * $BlockSize.Count)

"{0:s}  Batch executed by the user `"$([Security.Principal.WindowsIdentity]::GetCurrent().Name)`" on the computer `"$($env:COMPUTERNAME)(.$($env:USERDNSDOMAIN))`"." -f $([System.DateTime]::Now) > $ResultFileName

foreach ($_blocksize in $BlockSize) {
  foreach ($_outstanding in $Outstanding) {
    foreach ($_factor in $Factor) {
      foreach ($_drive in $Drive) {
        foreach ($_threads in $Threads) {
          $IO_Kind | ForEach-Object {
            $InvokeNumber++
            "`n{0:s}  Step $InvokeNumber of $InvokeCount." -f $([System.DateTime]::Now) >> $ResultFileName
            $SqlioCmd = "-k$_ -t$_threads -s$Seconds -d$_drive -f$_factor -o$_outstanding -b$_blocksize -LS -BH sqlio.test.dat"
            Write-Progress -Activity 'Running SQLIO (sqlio.exe).' -Status "Step $InvokeNumber of $($InvokeCount): $SqlioCmd" -Id 1 -PercentComplete ($InvokeNumber/$InvokeCount * 100)
            "{0:s}  SQLIO parameters: $SqlioCmd" -f $([System.DateTime]::Now) >> $ResultFileName
            Invoke-Expression -Command ".\sqlio.exe $SqlioCmd" >> $ResultFileName
            if ($InvokeNumber -ne $InvokeCount) {
              "{0:s}  Step finished. Batch paused 60 seconds to allow I/O idle state." -f $([System.DateTime]::Now) >> $ResultFileName
              Start-Sleep -Seconds 60
            }
          }
        }
      }
    }
  }
}
"`n{0:s}  SQLIO batch finished." -f $([System.DateTime]::Now) >> $ResultFileName

The usage of the script is just calling the script:
./SQLIO_Batch.ps1
Typically a batch run will take about three days and will generate a 1.1 MiB result file.

Parse

This PowerShell script parses a result file from the script above ("SQLIO_Batch.ps1"). The parse script is based on the Perl script from Linchi Shea, "Parse the sqlio.exe Output".
#requires -version 2.0
param ([string]$ResultFileName = $(throw 'The parameter "ResultFileName" requires a file name.'))

$Sqlio = @()
ForEach($line in Get-Content $ResultFileName) {
  $match = $line -match "-k(?<kind>\w) -t(?<threads>\d+) -s(?<seconds>\d+) -d(?<drive>\w*) -f(?<stripe>\w*) -o(?<outstanding>\d+) -b(?<size>\d+)"
  if ($match -eq $true) {
    $Kind = $Matches['kind']
    $Threads = [int]$Matches['threads']
    $Seconds = [int]$Matches['seconds']
    $Drive = $Matches['drive']
    $Stripe = $Matches['stripe']
    $Outstanding = [int]$Matches['outstanding']
    $Size = [int]$Matches['size']
  }
  $match = $line -match "^IOs/sec: *(?<ios_per_sec>\d+\.\d+)"
  if ($match -eq $true) { $IOs = [float]$Matches['ios_per_sec'] }
  $match = $line -match "^MBs/sec: *(?<mbs_per_sec>\d+\.\d+)"
  if ($match -eq $true) { $MBs = [float]$Matches['mbs_per_sec'] }
  $match = $line -match "^Min_Latency\(ms\): *(?<min_latency>\d+)"
  if ($match -eq $true) { $Latency_min = [int]$Matches['min_latency'] }
  $match = $line -match "^Avg_Latency\(ms\): *(?<avg_latency>\d+)"
  if ($match -eq $true) { $Latency_avg = [int]$Matches['avg_latency'] }
  $match = $line -match "^Max_Latency\(ms\): *(?<max_latency>\d+)"
  if ($match -eq $true) {
    $Sqlio += New-Object PSObject -Property @{
      Kind = $Kind
      Threads = $Threads
      Seconds = $Seconds
      Drive = $Drive
      Stripe = $Stripe
      Outstanding = $Outstanding
      Size = $Size
      IOs = $IOs
      MBs = $MBs
      Latency_min = $Latency_min
      Latency_avg = $Latency_avg
      Latency_max = [int]$Matches['max_latency']
    }
  }
}

#$Sqlio | Format-Table -Property IOs,MBs,Latency_min,Latency_avg,Latency_max -AutoSize
$Sqlio | Format-Table -Property Kind,Threads,Seconds,Drive,Stripe,Outstanding,Size,IOs,MBs,Latency_min,Latency_avg,Latency_max -AutoSize

#$Sqlio | Export-Csv SQLIO_Parse.csv
At the end, the script can be altered to give a filtered output in a table or output to a CSV file. A CSV file can be imported into a spreadsheet for further analysis.
Usage of the script requires the name of a result file in the parameter "ResultFileName":
./SQLIO-Parse.ps1 -ResultFileName <file name>
This is an example of a script execution:
./SQLIO-Parse.ps1 -ResultFileName 'SQLIO_Result.2010-12-24.txt'
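As an example of the filtering mentioned above, the last Format-Table line in the parse script could be replaced with something like this to show the ten parameter combinations with the highest IOs/sec:
$Sqlio |
  Sort-Object -Property IOs -Descending |
  Select-Object -First 10 |
  Format-Table -Property Kind,Threads,Drive,Stripe,Outstanding,Size,IOs,MBs,Latency_avg -AutoSize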

Discussion

The amount of data is so large that a deeper analysis could be done using SQL Server Analysis Services.
In my case the results have been rather useful for starting a good talk with the storage guys about storage and its usage and configuration.

History

2011-04-24 : Entry posted.
2013-09-10 : Parse script fixed on threads as described in comment by Manuel (2013-02-27).
2015-02-03 : Solution discussed by David Klee in the blog post "Storage Benchmarking with SQLIO Batch".
2017-08-22 : Solution mentioned in session on PASS Summit 2014 by David Klee about "Right-Sizing Your SQL Server Virtual Machine" (46:50).

2011-04-14

Token-based server access validation failed

After installing the first SQL Server 2008 R2 in a new domain, I got this error when I tried to log on to the database instance:
2011-04-14 13:46:51.35 Logon       Error: 18456, Severity: 14, State: 11.
2011-04-14 13:46:51.35 Logon       Login failed for user 'SANDBOX\myadministrator'. Reason: Token-based server access validation failed with an infrastructure error. Check for previous errors. [CLIENT: <local machine>]

A restart of the database service did not help. Neither did a "Run as administrator" to circumvent UAC.
A discussion on SQL Server Central showed me the way.
I logged off all computers in the domain to refresh my security token, and then a login could be done - also without "Run as administrator".


2011-03-29

Installing DTS Components

I had to work on an installation with DTS packages and was forced to install the DTS components "DTS Designer Components" and "DTS Run-Time Components".

After the installation I could still not open DTS packages. I got an error like:
SQL Server 2000 DTS Designer components are required to edit DTS packages. Install the special Web download, "SQL Server 2000 DTS Designer Components" to use this feature. (Microsoft.SqlServer.DtsObjectExplorerUI)

A quick search gave me the thread "SQL Server 2000 DTS Designer components are required to edit DTS packages - in SQL 2008". I changed the environment variable as described, putting the DTS path before anything from SQL Server 2008 and Visual Studio.
The thread says that restarting Management Studio is enough to pick up the new setting, but I had to restart the workstation - perhaps because my normal user is not a local administrator.
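To verify the change, the machine PATH can be inspected from PowerShell; a small sketch that lists the SQL Server related folders in the order they appear:
[Environment]::GetEnvironmentVariable('Path', 'Machine') -split ';' |
  Where-Object { $_ -like '*Microsoft SQL Server*' }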

Reference

MSDN Library: "Support for SQL Server 2000 DTS in SQL Server 2008 R2"
SQL Server Forums: "SQL Server 2000 DTS Designer components are required to edit DTS packages - in SQL 2008"

2011-03-14

Could not load file or assembly Microsoft.AnalysisServices

After a Service Pack upgrade of a SQL Server 2005 installation from SP2 to SP4, a job failed with:
Executed as user: *****. A .NET Framework error occurred during execution of user-defined routine or aggregate "NameOfUserDefinedStoredProcedure":   System.IO.FileLoadException: Could not load file or assembly 'Microsoft.AnalysisServices, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. Assembly in host store has a different signature than assembly in GAC. (Exception from HRESULT: 0x80131050)  System.IO.FileLoadException:      at nnn.mmm.fff(String s. The step failed.

Searching the internet gave me the MSDN Forum thread "Framework error: Could not load file or assembly 'Microsoft.AnalysisServices, Version=9.0.242.0'". Here I was directed to KB949080 and the handling of assemblies.

A look at the assemblies in the database with
SELECT * FROM sys.assemblies;
and their files with
SELECT * FROM sys.assembly_files;
gave me the details to refresh the assembly registrations with
ALTER ASSEMBLY [Microsoft.AnalysisServices]
FROM N'C:\Program Files (x86)\Microsoft SQL Server\90\SDK\Assemblies\Microsoft.AnalysisServices.DLL';


Now everything is working again.

Reference

MSDN Library: "ALTER ASSEMBLY (Transact-SQL)"
KB949080: "... Assembly in host store has a different signature than assembly in GAC..."

2011-03-11

Using a SMO object as parameter value

It looks like a SMO object is implicitly converted to a String object when it is used in a function call with a parameter name.

I have a function, where the parameter is defined as
function Import-FileGroup {
param ([Microsoft.SqlServer.Management.Smo.Database]$smoDatabase = $(throw 'Value required for parameter "smoDatabase" in function Import-FileGroup.'))

...
}

When I call the function with a parameter name
Import-FileGroup -$smoDatabase $smoDb
it fails with this message:
Import-FileGroup : Cannot process argument transformation on parameter 'smoDatabase'. Cannot convert the "-" value of type "System.String" to type "Microsoft.SqlServer.Management.Smo.Database".
At U:\sqladmin\dn_sqladmin\Development\Import-Database.ps1:***** char:17
+ Import-FileGroup <<<<  -$smoDatabase $smoDb
    + CategoryInfo          : InvalidData: (:) [Import-FileGroup], ParameterBindin...mationException
    + FullyQualifiedErrorId : ParameterArgumentTransformationError,Import-FileGroup


But when I call the function without a parameter name
Import-FileGroup $smoDb
everything is working fine.

Not what I expected, but I have now made the note for future use...

The variable "$smoDb" is created like this:
$smoSrv = New-Object Microsoft.SqlServer.Management.Smo.Server 'SANDY.sqladmin.lan'
$smoDb = $smoSrv.Databases['my_database']

2011-03-09

"BACKUP DATABASE is terminating abnormally."

This morning I had an error in a backup done by a Maintenance Plan on a SQL Server 2000/Windows 2000 installation.
In the SQL Agent job the error was shown as:
Executed as user: *****. sqlmaint.exe failed. [SQLSTATE 42000] (Error 22029).  The step failed.

The Maintenance plan log said:
[11] Database ServicedeskDB: Database Backup...
    Destination: [********201103082203.BAK]
[Microsoft SQL-DMO (ODBC SQLState: 42000)] Error 3202: [Microsoft][ODBC SQL Server Driver][SQL Server]Write on '********201103082203.BAK' failed, status = 112. See the SQL Server error log for more details.
[Microsoft][ODBC SQL Server Driver][SQL Server]BACKUP DATABASE is terminating abnormally.


In the SQL Server Error Log there were these entries:
2011-03-08 22:04:11.52 spid66    BackupMedium::ReportIoError: write failure on backup device '********201103082203.BAK'. Operating system error 112(error not found).
2011-03-08 22:04:11.53 spid66    Internal I/O request 0x6B514990: Op: Write, pBuffer: 0x06350000, Size: 983040, Position: 4470543872, UMS: Internal: 0x103, InternalHigh: 0x0, Offset: 0xA771600, OffsetHigh: 0x1, m_buf: 0x06350000, m_len: 983040, m_actualBytes: 0, m_errcode: 112, BackupFile: ********201103082203.BAK
2011-03-08 22:04:11.53 backup    BACKUP failed to complete the command BACKUP DATABASE [*****] TO  DISK = N'********201103082203.BAK' WITH  INIT ,  NOUNLOAD ,  NOSKIP ,  STATS = 10,  NOFORMAT


From the SQL Server Error Log it could look like a storage failure :-(
Not a nice way to start the day.

When I looked at the Windows System Log, there were no entries about a storage failure.
But looking at the storage I saw that there was 4.15 GB available, and that the last backup took 4.16 GB.

After a quick cleanup and a manual execution of the job - with success - the conclusion is that the error indicates a lack of available storage. Operating system error 112 is ERROR_DISK_FULL ("There is not enough space on the disk").
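A quick way to check the available storage on the backup drive from PowerShell (a sketch; the computer name and drive letter are made up):
Get-WmiObject -Class Win32_LogicalDisk -ComputerName 'OLDSRV01' -Filter "DeviceID='E:'" |
  Select-Object -Property DeviceID,
    @{Name='FreeGB'; Expression={ [Math]::Round($_.FreeSpace / 1GB, 2) }},
    @{Name='SizeGB'; Expression={ [Math]::Round($_.Size / 1GB, 2) }}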

2011-02-24

List of collations

I would like to see a list of the available collations on a given SQL Server installation, and a combination of fn_helpcollations() and COLLATIONPROPERTY() gives a quick answer.
SELECT
 [name] 
 ,COLLATIONPROPERTY([name], 'CodePage') AS [CodePage]
 ,COLLATIONPROPERTY([name], 'LCID') AS [LCID]
 ,COLLATIONPROPERTY([name], 'ComparisonStyle') AS [ComparisonStyle]
 ,COLLATIONPROPERTY([name], 'Version') AS [Version]
 ,[description]
FROM ::fn_helpcollations();
On a SQL Server 2008 R2 (10.50.1600) I get 2397 collations.
I do miss the sort order of the SQL collations, like the table in "Selecting a SQL Server Collation" shows.

A search for collation objects points at the stored procedure sys.sp_tablecollations, which uses the view sys.spt_tablecollations_view. It looks like this view is placed in the database "mssqlsystemresource", but that database I cannot access directly.

Reference

MSDN Library: "Selecting a SQL Server Collation".
MSDN Library: "Collation and Unicode Support".
Alexander Barkov: "MS SQL Server collation charts".
The Unicode Consortium: "Unicode Collation charts".
G-Productions: "sys.sp_tablecollations".

2011-02-10

Windows 2008 Firewall

There are several guides on how to configure the Windows Server 2008 (R2) firewall to allow SQL Server access.
I would like to add some items to the rule definition:

  • Specify both program (sqlservr.exe) and service.
  • Specify the interface type, typically Local area network.
Also I think other issues like the scope should be considered to make the rule as tight as possible.
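The same items can be set from a script through the Windows Firewall COM interface. This is a sketch only; the program path, port and scope are assumptions for a default SQL Server 2008 R2 instance, and it must run elevated:
# Inbound rule limited to program, service, TCP 1433, the domain profile, LAN interfaces and the local subnet
$Rule = New-Object -ComObject HNetCfg.FWRule
$Rule.Name            = 'SQL Server Database Engine'
$Rule.ApplicationName = 'C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\Binn\sqlservr.exe'
$Rule.ServiceName     = 'MSSQLSERVER'
$Rule.Protocol        = 6            # TCP
$Rule.LocalPorts      = '1433'
$Rule.RemoteAddresses = 'LocalSubnet'
$Rule.InterfaceTypes  = 'Lan'
$Rule.Profiles        = 1            # Domain
$Rule.Direction       = 1            # Inbound
$Rule.Action          = 1            # Allow
$Rule.Enabled         = $true
$Policy = New-Object -ComObject HNetCfg.FwPolicy2
$Policy.Rules.Add($Rule)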

Reference

MSDN: "How to: Configure a Windows Firewall for Database Engine Access"
Shawn Hernan: "SQL Server and the Windows Server 2008 Firewall"
Ashish Kumar Mehta: "Configure Windows Firewall for SQL Server 2008 Database Engine in Windows Server 2008 R2"

2011-01-18

Share backup files

When I have to establish a database mirror or restore a database in another environment, I usually restore directly from the original backup file without copying or moving it first. The restore is done with a UNC reference to the backup file.
This cannot be done in the Management Studio GUI, but must be done in T-SQL, as the GUI fails on a UNC reference to the backup file.
The UNC reference is on a share I create on the backup disk. I prefer to name the share "SQLBackup".
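As a minimal sketch of such a restore (the server, share, database and file names are made up), where the database engine's service account must have Read access to the share:
# Requires a session where Invoke-Sqlcmd is available, e.g. the SQL Server 2008 sqlps snap-ins
$Query = "RESTORE DATABASE [my_database] FROM DISK = N'\\SQLPRD42.sqladmin.lan\SQLBackup\my_database.bak' WITH REPLACE, STATS = 10;"
Invoke-Sqlcmd -ServerInstance 'SQLDEV42.sqladmin.lan' -Query $Query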
I have looked around for Best Practices on file shares and security. The best description I found is by Derek Melber (see Reference), but I have some issues...

It makes sense in general to control the details of access and rights in NTFS, and grant a general access on the share. Derek suggests granting access on the share to Authenticated Users in the domain with Full Control rights. This should be done while keeping in mind that a Windows default is that Everyone can Read.
When we are dealing with database backup files, which contain sensitive data, I find this access a little too broad, as it grants access to too many accounts.

Instead I have given specific Read access on the share, and depend on the general Read access for Everyone.
If Everyone is removed - and this I have seen in several organisations - you also have to take care of the file access and rights.

And again: Keep in mind that the backup files are to be handled as sensitive data.

Reference

Derek Melber: "Share Permissions".

2011-01-11

VLF count

The number of Virtual Log Files (VLFs) in a database is discussed by several persons; I have listed some in the Reference section below.
Michelle Ufford (SQLFool) wrote a blog entry that is the inspiration for this entry. Ufford's script is changed to fit my present needs.
The script is in three parts, where one part comes in two editions to handle either one database or all databases in an instance.

Part I (temporary data)

SET NOCOUNT ON;

CREATE TABLE #results(
[database_name] sysname
,[vlf_count] INT
,[log_file_count] INT
,[log_size_mb] numeric(15,1)
);

CREATE TABLE #stage(
[file_id] INT
,[file_size] BIGINT
,[start_offset] BIGINT
,[f_seq_no] BIGINT
,[status] BIGINT
,[parity] BIGINT
,[create_lsn] NUMERIC(38)
);


The temporary table #stage holds the output of DBCC LogInfo as a staging area. The data is transformed into the temporary table #results.

Part IIa (Single database)

INSERT INTO #stage EXECUTE (N'DBCC LogInfo WITH no_infomsgs');
INSERT INTO #results
SELECT
DB_NAME()
,COUNT(*)
,COUNT(DISTINCT [file_id])
,(SELECT SUM([size]*8/1024.) FROM sys.database_files WHERE [type_desc] = 'LOG')
FROM #stage;
DROP TABLE #stage;


The value [size] is the number of database pages and is recalculated to MiB in the column #results.log_size_mb.

Part IIb (All databases)

DECLARE csr_db CURSOR FAST_FORWARD READ_ONLY FOR
SELECT [name]
FROM master.sys.sysdatabases;
DECLARE @dbname nvarchar(128);

OPEN csr_db;
FETCH NEXT FROM csr_db INTO @dbname;

WHILE (@@fetch_status <> -1)
BEGIN
INSERT INTO #stage
EXECUTE (N'DBCC LogInfo([' + @dbname + N']) WITH no_infomsgs');

INSERT INTO #results
SELECT
@dbname
,COUNT(*)
,COUNT(DISTINCT [file_id])
,(SELECT SUM([size]*8/1024.)
FROM master.sys.master_files
INNER JOIN master.sys.databases
ON master.sys.master_files.database_id = master.sys.databases.database_id
WHERE [master_files].[type_desc]=N'LOG' AND [databases].[name]=@dbname)
FROM #stage;

TRUNCATE TABLE #stage;

FETCH NEXT FROM csr_db INTO @dbname;
END

CLOSE csr_db;
DEALLOCATE csr_db;
DROP TABLE #stage;

Ufford used the (undocumented) stored procedure master.sys.sp_MSforeachdb, where I use a cursor to avoid an undocumented object.

Part III (Analysis)

DECLARE @vlfPerGBHighLimit int;
SET @vlfPerGBHighLimit = 50;
DECLARE @vlfPerGBLowLimit int;
SET @vlfPerGBLowLimit = 2;

SELECT
[database_name]
,[vlf_count]
,[log_file_count]
,[log_size_mb]
,([log_size_mb]/1024) AS N'log_size_gb'
,N'maintenance' =
CASE
WHEN [log_size_mb] < 1024 THEN N'Log too small to evaluate.'
WHEN [vlf_count]/([log_size_mb]/1024) <= @vlfPerGBLowLimit THEN N'Too few VLFs. Shrink and grow log in 8 GB chunks.'
WHEN [vlf_count]/([log_size_mb]/1024) >= @vlfPerGBHighLimit THEN N'Too many VLFs. Shrink and grow log in 8 GB chunks.'
ELSE N'VLF count OK :-)'
END
FROM #results
ORDER BY [vlf_count] DESC;

DROP TABLE #results;

This is inspired by a comment from Kendra Little on the blog entry by Michelle Ufford. I rewrote it to use CASE instead of IF statements.
There was a bug in SQL Server when expanding the log in multiples of 4 GB, and this is why Tripp and others recommend expanding by 8000 MB instead of 8192 MB.
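As a sketch of the maintenance hinted at in the script output (the database and logical log file names are made up, and this should be run at a quiet time), the log could be shrunk and then grown again in 8000 MB steps:
$Query = @"
USE [my_database];
DBCC SHRINKFILE (N'my_database_log', 1);
ALTER DATABASE [my_database] MODIFY FILE (NAME = N'my_database_log', SIZE = 8000MB);
ALTER DATABASE [my_database] MODIFY FILE (NAME = N'my_database_log', SIZE = 16000MB);
"@
Invoke-Sqlcmd -ServerInstance 'SANDY.sqladmin.lan' -Query $Query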

Reference

Michelle Ufford (SQLFool): "Check VLF Counts"
Kimberly L. Tripp: "Transaction Log VLFs - too many or too few?"
Linchi Shea: "Performance impact: a large number of virtual log files" - Part I and Part II
Paul S. Randal: "Understanding Logging and Recovery in SQL Server"
David Levy (Adventures in SQL): "A Busy/Accidental DBA's Guide to Managing VLFs"