Showing posts with label SQL Server Agent. Show all posts

2023-02-14

Add SQL Agent job step with tokens in output file name

If you create a new SQL Agent job step with tokens in the output file name using the stored procedure sp_add_jobstep, or want to add a file name with tokens to an existing job step using the stored procedure sp_update_jobstep, you will get an error like

A fatal scripting error occurred.
Variable SQLLOGDIR is not defined.

In this case I was trying to use the token SQLLOGDIR, but it does not matter which token.

Many users of Ola Hallengren's (LinkedIn) Maintenance Solution (link) have seen similar errors. Ola is actually aware of the challenge - he just forgot to promote his solution ;-)

When you call the stored procedure sp_add_jobstep, it in turn calls the internal stored procedure sp_add_jobstep_internal. You can see the definition of both procedures in the system database msdb with SQL Server Management Studio (SSMS).

The root cause is that somewhere along the path from calling sp_add_jobstep to setting the output file name on the job step, the content of the parameter is interpreted. But at that point the execution is outside SQL Agent, and the tokens are not available.

The trick - that Ola forgot to promote - is to build a string with each token element separated.

DECLARE @_token_log_directory NVARCHAR(MAX) = N'$' + N'(ESCAPE_SQUOTE(SQLLOGDIR))';
DECLARE @_token_job_name NVARCHAR(MAX) = N'$' + N'(ESCAPE_SQUOTE(JOBNAME))';
DECLARE @_token_date NVARCHAR(MAX) = N'$' + N'(ESCAPE_SQUOTE(DATE))';
DECLARE @_token_time NVARCHAR(MAX) = N'$' + N'(ESCAPE_SQUOTE(TIME))';
DECLARE @_output_file_name NVARCHAR(MAX) = @_token_log_directory + N'\' + @_token_job_name +  N'_' + @_token_date + N'_' + @_token_time + N'.txt';

This string can then be used as the value of the parameter @output_file_name in the stored procedures for a SQL Agent job step.

EXECUTE msdb.dbo.sp_update_jobstep @job_name=N'IndexOptimize - Db1.Schema1.Tbl1',
  @step_id=1,
  @output_file_name= @_output_file_name;

If you dig into Ola's code to install the maintenance solution and search for the string "@TokenLogDirectory", you will see that Ola builds this string element by element, with careful separation of the $-sign and the token names.

2016-11-09

MS16-136

For the first time in a long time we have a security update for SQL Server with MS16-136.
Some technical details on the update itself are in KB3199641.

The update fixes vulnerabilities in four different components of SQL Server:

  • Database Engine (RDBMS); there are three CVEs spread over the different major versions of the Database Engine. The documentation speaks of "improperly handles pointer casting" without further details.
  • Master Data Services - MDS; there is a cross-site scripting (XSS) vulnerability in the MDS API. This could be in the web application part of the API.
  • Analysis Services - SSAS; in this case the vulnerability is due to "improperly checks FILESTREAM path".
  • SQL Server Agent; the vulnerability lies in "incorrectly check ACLs on atxcore.dll". This file is a part of the SQL Server Agent ActiveX subsystem, which is - finally - removed from SQL Server with the 2016 version.

The update is for SQL Server 2012 and newer. SQL Server 2008 (R2) is not affected by this security issue.
And not a word about SQL Server 2005, 2000 - or older ;-)

History

2016-11-09 Post created with initial references.
2016-11-27 Details on SQL Server components added.

2014-10-09

DBCC CHECKDB with PowerShell

Microsoft Transact-SQL (T-SQL) is really great for many things inside the SQL Server Database Engine, but automated maintenance with file handling, logging and other nice things is more of a challenge. This is why I and other fellow DBAs use PowerShell to automate tasks.

A standard DBA task is to check the integrity of the databases with the console command DBCC CHECKDB.
The output, like the example below, is (somewhat) nicely formatted for human reading, but I would like to catch it inside the automation so that I can process and log the output.

Service Broker Msg 9675, State 1: Message Types analyzed: 14.
Service Broker Msg 9676, State 1: Service Contracts analyzed: 6.
Service Broker Msg 9667, State 1: Services analyzed: 3.
Service Broker Msg 9668, State 1: Service Queues analyzed: 3.
Service Broker Msg 9669, State 1: Conversation Endpoints analyzed: 0.
Service Broker Msg 9674, State 1: Conversation Groups analyzed: 0.
Service Broker Msg 9670, State 1: Remote Service Bindings analyzed: 0.
Service Broker Msg 9605, State 1: Conversation Priorities analyzed: 0.
DBCC results for 'sys.sysrscols'.
There are 1970 rows in 24 pages for object "sys.sysrscols".
DBCC results for 'sys.sysrowsets'.
There are 346 rows in 4 pages for object "sys.sysrowsets".
...
DBCC results for 'Person.vStateProvinceCountryRegion'.
There are 181 rows in 2 pages for object "Person.vStateProvinceCountryRegion".
DBCC results for 'sys.plan_persist_query'.
There are 0 rows in 0 pages for object "sys.plan_persist_query".
CHECKDB found 0 allocation errors and 0 consistency errors in database 'AdventureWorks2014'.
DBCC execution completed. If DBCC printed error messages, contact your system administrator.

(example on DBCC CHECKDB output from the database [AdventureWorks2014])

The PowerShell code is
$DatabaseName = 'AdventureWorks2014'
$Sql = "DBCC CHECKDB ([$DatabaseName]);"
$CheckDbMessages = @()
$CheckDbWatch = [System.Diagnostics.Stopwatch]::StartNew()
SQLCMD.EXE -E -Q $Sql |
ForEach-Object {
  $CheckDbMessages += $_
  "  {0:s}Z  $_" -f ([System.DateTime]::UtcNow)
}
$CheckDbWatch.Stop()

"{0:s}Z  Database [$DatabaseName] integrity check done in $($CheckDbWatch.Elapsed.ToString()) [hh:mm:ss.ddd]." -f ([System.DateTime]::UtcNow)
if ($CheckDbMessages[($CheckDbMessages.Count) - 2] -clike 'CHECKDB found 0 allocation errors and 0 consistency errors in database *') {
  "{0:s}Z  Database [$DatabaseName] integrity check is OK." -f ([System.DateTime]::UtcNow)
}
else {
  "{0:s}Z  Error in integrity check of the database [$DatabaseName]:`n  $($CheckDbMessages[($CheckDbMessages.Count) - 2])" -f ([System.DateTime]::UtcNow)
  throw "Integrity check of the database [$DatabaseName] failed."
}

(example on PowerShell script to execute DBCC CHECKDB)

I have put it inside an (advanced) function, but that is not shown here as it is the DBCC CHECKDB command execution that is the subject. I can only recommend that you take a serious look at PowerShell advanced functions. The best introduction, that I still go back to, is the book "Learn PowerShell Toolmaking in a Month of Lunches" by Don Jones and Jeffery Hicks.
The variable $DatabaseName is a String object containing the name of the database to check. The script above does not check the prerequisites of the DBCC CHECKDB command.

  2014-10-09T17:55:04Z  DBCC results for 'AdventureWorks2014'.
  2014-10-09T17:55:04Z  Service Broker Msg 9675, State 1: Message Types analyzed: 14.
  2014-10-09T17:55:04Z  Service Broker Msg 9676, State 1: Service Contracts analyzed: 6.
  2014-10-09T17:55:04Z  Service Broker Msg 9667, State 1: Services analyzed: 3.
  2014-10-09T17:55:04Z  Service Broker Msg 9668, State 1: Service Queues analyzed: 3.
  2014-10-09T17:55:04Z  Service Broker Msg 9669, State 1: Conversation Endpoints analyzed: 0.
  2014-10-09T17:55:04Z  Service Broker Msg 9674, State 1: Conversation Groups analyzed: 0.
  2014-10-09T17:55:04Z  Service Broker Msg 9670, State 1: Remote Service Bindings analyzed: 0.
  2014-10-09T17:55:04Z  Service Broker Msg 9605, State 1: Conversation Priorities analyzed: 0.
  2014-10-09T17:55:04Z  DBCC results for 'sys.sysrscols'.
  2014-10-09T17:55:04Z  There are 1970 rows in 24 pages for object "sys.sysrscols".
  2014-10-09T17:55:04Z  DBCC results for 'sys.sysrowsets'.
  2014-10-09T17:55:04Z  There are 346 rows in 4 pages for object "sys.sysrowsets".
...
  2014-10-09T17:55:04Z  DBCC results for 'Person.vStateProvinceCountryRegion'.
  2014-10-09T17:55:04Z  There are 181 rows in 2 pages for object "Person.vStateProvinceCountryRegion".
  2014-10-09T17:55:04Z  DBCC results for 'sys.plan_persist_query'.
  2014-10-09T17:55:04Z  There are 0 rows in 0 pages for object "sys.plan_persist_query".
  2014-10-09T17:55:04Z  CHECKDB found 0 allocation errors and 0 consistency errors in database 'AdventureWorks2014'.
  2014-10-09T17:55:04Z  DBCC execution completed. If DBCC printed error messages, contact your system administrator.
2014-10-09T17:55:04Z  Database [AdventureWorks2014] integrity check done in 00:00:04.1890229 [hh:mm:ss.ddd].
2014-10-09T17:55:04Z  Database [AdventureWorks2014] integrity check is OK.

(example on DBCC CHECKDB output from script - AdventureWorks)

Using the good old SQLCMD.EXE sends the output to the default output stream, where it can then be processed in the automation by PowerShell.

The cmdlet Invoke-SqlCmd from the SQL Server PowerShell module SQLPS can also execute the command DBCC CHECKDB, and by setting the -Verbose flag the output is shown - but it is in the Verbose stream and cannot be accessed directly in the automation for further processing.
Streaming the Verbose stream to a file with a redirection operator is possible, but introduces extra complexity...
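As a sketch of that redirection (assuming the SQLPS module and a local default instance; the file path is illustrative), the Verbose stream is stream number 4 and can be redirected or merged - this requires PowerShell 3.0 or later:

```powershell
# Sketch only: catch DBCC CHECKDB output from the Verbose stream of Invoke-SqlCmd.
Import-Module 'SQLPS' -DisableNameChecking

# Merge the Verbose stream (4) into the success stream and capture it...
$Messages = Invoke-SqlCmd -Query 'DBCC CHECKDB ([AdventureWorks2014]);' -Verbose 4>&1

# ...or redirect the Verbose stream to a file instead.
Invoke-SqlCmd -Query 'DBCC CHECKDB ([AdventureWorks2014]);' -Verbose 4> 'C:\Temp\CheckDb.log'
```

With 4>&1 the lines arrive as VerboseRecord objects, not plain strings, which is part of the extra complexity mentioned above.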

With ADO.NET the challenge is the same as with the CmdLet Invoke-SqlCmd, and I guess that it is because the CmdLet is constructed in .NET using ADO.NET.

Actually it is a general challenge to catch T-SQL output that is shown as messages in SQL Server Management Studio. This is because the output is sent asynchronously as events.
It is possible to build a strict .NET solution by using SqlInfoMessageEventHandler, but I find the solution with SQLCMD.EXE simpler, and that I do like.
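A minimal sketch of such a .NET solution from PowerShell, assuming a local default instance with integrated security (the connection string and database name are illustrative only):

```powershell
# Sketch: catch DBCC messages with the ADO.NET InfoMessage event.
$Connection = New-Object System.Data.SqlClient.SqlConnection('Server=(local);Integrated Security=SSPI;')
$Messages = New-Object System.Collections.ArrayList

# The InfoMessage event fires for informational messages from the server.
$Connection.add_InfoMessage([System.Data.SqlClient.SqlInfoMessageEventHandler] {
  param($EventSender, $EventArgs)
  [void]$Messages.Add($EventArgs.Message)
})

$Connection.Open()
$Command = $Connection.CreateCommand()
$Command.CommandText = 'DBCC CHECKDB ([AdventureWorks2014]);'
$Command.CommandTimeout = 0  # a check of a large database can run for hours
[void]$Command.ExecuteNonQuery()
$Connection.Close()

$Messages  # one entry per message block from DBCC CHECKDB
```

Note that the messages arrive when the server flushes them, so they do not necessarily stream line by line during a long-running check.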

The option TABLERESULTS to the command DBCC CHECKDB is not documented by Microsoft, but widely recognized. I do not use the TABLERESULTS option in this solution, as the results are then not delivered during execution but as one result set when the execution is finished. This is the behavior in all cases, also with ADO.NET ExecuteReader.
I have some larger databases where a check takes several hours, and I would like to have a log of the execution. Is everything running? Any errors corrected up till now? And so on...
This is also why I add a timestamp to each message line from DBCC CHECKDB.

2014-10-08

SQL Agent check error count step

I had to create a workaround on a database backup job for a server with 50+ important databases.
To be sure that an attempt is made to take a backup of each database, to have some history easy to access, and also to have an option to reconfigure the job for an extra backup of one or more databases, I set up one job step for the backup of each database.
This job structure could be illustrated like this:

  1. job step 1: Backup database alpha.
  2. job step 2: Backup database beta.
  3. job step 3: Backup database gamma.
  4. etcetera...
Each job step continues to the next whether the job step fails or not. This is to ensure that a backup attempt is made for each database. But this structure will give a false job status if an earlier job step fails while the last job step does not.

To ensure a true job status I have added a final control job step, that check the status of the previous job steps.
The status of the previous job steps I get from the table msdb.dbo.sysjobhistory, where I filter on the job start time by inserting SQL Agent tokens in the T-SQL statement:
DECLARE @error_count INT = (
SELECT COUNT(*)
FROM [msdb].[dbo].[sysjobhistory]
WHERE job_id = $(ESCAPE_NONE(JOBID))
  AND [run_date] = $(ESCAPE_NONE(STRTDT))
  AND [run_time] >= $(ESCAPE_NONE(STRTTM))
  AND [run_status] <> 1);

IF (@error_count > 0)
BEGIN
  DECLARE @error_msg NVARCHAR(2047) = N'Error in Database Full Backup job. Check log file for details. (Error Count = ' + CAST(@error_count AS NVARCHAR(3)) + N' job steps).';
  RAISERROR(@error_msg, 19, 0) WITH LOG;
END
ELSE
BEGIN
  RAISERROR('Database Full Backup job completed with no errors.', 0, 0) WITH NOWAIT;
END

This control job step generates an error if a previous job step in the same job has failed. The plan (hope) is that a general monitoring platform like Microsoft System Center will catch the error and raise an incident to the Service Desk.

2013-05-20

SMO by SQLPS

With SQL Server 2012, SQLPS was changed from a mini-PowerShell to a PowerShell module. This makes it much simpler to import the functionality in a regular PowerShell script.
The examples below are written on a computer running Windows 8 and SQL Server 2012.
SQLPS is imported much more simply than we used to import the SQL Server Management Objects (SMO) assemblies.
Import-Module 'SQLPS'
The parameter -DisableNameChecking can be added to avoid a message about unapproved verbs.
A list of the cmdlets exported by the module can be shown with
(Get-Module 'SQLPS' -ListAvailable).ExportedCmdlets.Values.Name

Backup-SqlDatabase
Add-SqlAvailabilityDatabase
Disable-SqlAlwaysOn
Enable-SqlAlwaysOn
Switch-SqlAvailabilityGroup
Join-SqlAvailabilityGroup
New-SqlAvailabilityGroup
New-SqlAvailabilityReplica
New-SqlHADREndpoint
Remove-SqlAvailabilityGroup
Remove-SqlAvailabilityDatabase
Remove-SqlAvailabilityReplica
Resume-SqlAvailabilityDatabase
Set-SqlAvailabilityGroup
Set-SqlAvailabilityReplica
Set-SqlHADREndpoint
Suspend-SqlAvailabilityDatabase
Test-SqlAvailabilityGroup
Test-SqlAvailabilityReplica
Test-SqlDatabaseReplicaState
New-SqlAvailabilityGroupListener
Set-SqlAvailabilityGroupListener
Add-SqlAvailabilityGroupListenerStaticIp
Invoke-PolicyEvaluation
Restore-SqlDatabase
Invoke-Sqlcmd
Encode-SqlName
Decode-SqlName
Convert-UrnToPath


Server Object

A SMO Server object on a local default SQL Server database instance can be created like
$SmoServer = New-Object Microsoft.SqlServer.Management.Smo.Server '(local)'
SMO only connects to the database instance when you ask for something specific. For a start I would like to know the basics about the installation.
$SmoServer.Information

Parent                      : [TITANIUM]
Version                     : 11.0.3128
EngineEdition               : EnterpriseOrDeveloper
ResourceVersion             : 11.0.3000
BuildClrVersion             : 4.0.30319
BuildClrVersionString       : v4.0.30319
BuildNumber                 : 3128
Collation                   : Latin1_General_100_CI_AS_KS_WS_SC
...
The full set of Server property values can be accessed
$SmoServer | Format-List *

AuditLevel                  : Failure
BackupDirectory             : D:\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Backup
BrowserServiceAccount       : NT AUTHORITY\LOCALSERVICE
BrowserStartMode            : Disabled
BuildClrVersionString       : v4.0.30319
BuildNumber                 : 3128
...

Database Object

A SMO Database object can be created on a database with a known name
$SmoDatabase = $SmoServer.Databases['msdb']
This object can give access to some information
$SmoDatabase | Format-Table DataSpaceUsage,IndexSpaceUsage -AutoSize

DataSpaceUsage IndexSpaceUsage
-------------- ---------------
12080            3136

Table Object

Also, the Database object can be used to create a SMO Table object. This can be done directly on the table name alone
$SmoDatabase.Tables['sysjobhistory']
Or the Table object can be created more specifically using the schema name together with the table name
$SmoTable= ($SmoDatabase.Tables |
Where-Object -FilterScript { $PSItem.Schema -eq 'dbo' -and $PSItem.Name -eq 'sysjobhistory' })


Again the Table object gives access to some basic information
$SmoTable | Format-Table Schema,Name,DataSpaceUsed,IndexSpaceUsed,RowCount -AutoSize

Schema Name          DataSpaceUsed IndexSpaceUsed RowCount
------ ----          ------------- -------------- --------
dbo    sysjobhistory            32             32       84

Stored Procedure Object

The Database object can also be used to create a SMO Stored Procedure object. Like the Table object it can be created in two ways, but I will be (professionally) lazy and only show the creation on the name alone
$SmoStoredProcedure = $SmoDatabase.StoredProcedures['sp_get_job_alerts']
A stored procedure usually has one or more parameters that can be described with
$SmoStoredProcedure.Parameters | Format-Table Name,DataType,DefaultValue,IsOutputParameter -AutoSize

Name      DataType         DefaultValue IsOutputParameter
----      --------         ------------ -----------------
@job_id   uniqueidentifier                          False
@job_name sysname                                   False

Server Job Object

With the installation of the SQL Server database service, SQL Server Agent is also installed; it is the SQL Server scheduler used for maintenance, monitoring and other automated tasks.
A SMO JobServer object can be created from a Server object property and gives quick access to some basic SQL Server Agent information
$SmoServer.JobServer

AgentDomainGroup          : NT SERVICE\SQLSERVERAGENT
AgentLogLevel             : Errors, Warnings
AgentMailType             : SqlAgentMail
AgentShutdownWaitTime     : 15
DatabaseMailProfile       : 
ErrorLogFile              : D:\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Log\SQLAGENT.OUT
...
More interesting is the information on a job, where a SMO Job object can be created from the Server object
$SmoJob = $SmoServer.JobServer.Jobs['syspolicy_purge_history']
This gives access to some basic information on the job and its execution
$SmoJob | Format-Table CurrentRunStatus,LastRunDate,LastRunOutCome,NextRunDate -AutoSize

CurrentRunStatus LastRunDate         LastRunOutcome NextRunDate        
---------------- -----------         -------------- -----------        
            Idle 05-04-2013 02:00:00      Succeeded 21-05-2013 02:00:00

Reference

There is much more on SMO in the MSDN Library; you can start with
Microsoft.SqlServer.Management.Smo Namespace
That SMO is much larger is indicated in the
"SQL Server Management Objects Reference"; take a look at the index to the left.

2013-03-15

SQLPS Clear-Host error

When a ps1-scriptfile contains a Clear-Host statement, which I often use when testing a script in PowerShell ISE, the script fails on SQLPS with an error like this:
A job step received an error at line 3 in a PowerShell script. The corresponding line is '$space.ForegroundColor = $host.ui.rawui.ForegroundColor'. Correct the script and reschedule the job. The error information returned by PowerShell is: 'Exception setting "ForegroundColor": "Cannot convert null to type "System.ConsoleColor" due to invalid enumeration values. Specify one of the following enumeration values and try again. The possible enumeration values are "Black, DarkBlue, DarkGreen, DarkCyan, DarkRed, DarkMagenta, DarkYellow, Gray, DarkGray, Blue, Green, Cyan, Red, Magenta, Yellow, White"."

A quick workaround is to comment out the Clear-Host statement as a preparation to execute it as a PowerShell step in a SQL Server Agent job.

This is on a Windows Server 2008 R2 with SQL Server 2008 R2. The Windows PowerShell is 3.0 while SQLPS is 2.0.

2012-11-28

Job history template using SQL Server Agent tokens

To log the execution and output from all steps in a SQL Server Agent job into a single logfile, you can use SQL Server Agent tokens and macros to create the logfile in the default SQL Server LOG folder, and let each jobstep add its output to the file. The default SQL Server LOG folder is also the folder where SQL Server places the Errorlog files (ERRORLOG.nn) and the trace files (.trc) from the default trace.

The first jobstep is a dummy step that only initializes the logfile on a normal job execution.
The jobstep can be created with
EXECUTE [msdb].[dbo].[sp_add_jobstep]
@job_id=@jobId,
@step_name=N'Job Begin',
@step_id=1,
@cmdexec_success_code=0,
@on_success_action=3,
@on_success_step_id=0,
@on_fail_action=2,
@on_fail_step_id=0,
@retry_attempts=0,
@retry_interval=0,
@os_run_priority=0,
@subsystem=N'TSQL',
@command=N'DECLARE @errmsg nvarchar(2047);
SET @errmsg = LEFT(CONVERT(nvarchar(128), GETUTCDATE(), 127), 22) + N''Z : Job Begin...'';
RAISERROR(@errmsg,0,0) WITH NOWAIT;',
@database_name=N'master',
@output_file_name=N'$(ESCAPE_SQUOTE(SQLDIR))\LOG\MacroTest.$(ESCAPE_SQUOTE(STRTDT))T$(ESCAPE_SQUOTE(STRTTM)).log',
@flags=0;

The output filename is placed in the default SQL Server LOG folder with "$(ESCAPE_SQUOTE(SQLDIR))\LOG\", and the name of the file is created from the jobname, a timestamp and the filetype "log".
I miss two things as tokens: the jobname, and the time when the job began in UTC. Right now I have to enter the jobname manually and take care of the local time when we switch between summer- and wintertime (daylight saving time).
The logfile's initial output is generated by a RAISERROR call. Please notice that I give the error message a UTC timestamp in an ISO 8601 format.

An actual job step will then add its output to the logfile. Such a jobstep could be created with
EXECUTE [msdb].[dbo].[sp_add_jobstep]
@job_id=@jobId,
@step_name=N'Job Execution 01',
@step_id=2,
@cmdexec_success_code=0,
@on_success_action=3,
@on_success_step_id=0,
@on_fail_action=2,
@on_fail_step_id=0,
@retry_attempts=0,
@retry_interval=0,
@os_run_priority=0,
@subsystem=N'TSQL',
@command=N'DECLARE @errmsg nvarchar(2047);
SET @errmsg = LEFT(CONVERT(nvarchar(128), GETUTCDATE(), 127), 22) + N''Z : Job Executing {01}...'';
RAISERROR(@errmsg,0,0) WITH NOWAIT;',
@database_name=N'master',
@output_file_name=N'$(ESCAPE_SQUOTE(SQLDIR))\LOG\MacroTest.$(ESCAPE_SQUOTE(STRTDT))T$(ESCAPE_SQUOTE(STRTTM)).log',
@flags=2;

The value 2 for the parameter @flags appends the output to the logfile.

When the job has executed all (real) steps, the logfile is ended by a dummy step that enters a final timestamp. This can be used for execution time comparisons. The jobstep can be created with
EXECUTE [msdb].[dbo].[sp_add_jobstep]
@job_id=@jobId,
@step_name=N'Job End',
@step_id=3,
@cmdexec_success_code=0,
@on_success_action=1,
@on_success_step_id=0,
@on_fail_action=2,
@on_fail_step_id=0,
@retry_attempts=0,
@retry_interval=0,
@os_run_priority=0,
@subsystem=N'TSQL',
@command=N'DECLARE @errmsg nvarchar(2047);
SET @errmsg = LEFT(CONVERT(nvarchar(128), GETUTCDATE(), 127), 22) + N''Z : Job End.'';
RAISERROR(@errmsg,0,0) WITH NOWAIT;',
@database_name=N'master',
@output_file_name=N'$(ESCAPE_SQUOTE(SQLDIR))\LOG\MacroTest.$(ESCAPE_SQUOTE(STRTDT))T$(ESCAPE_SQUOTE(STRTTM)).log',
@flags=2;


An execution of a job with three steps, where one is the actual job execution, could generate an output like this
Job 'MacroTest' : Step 1, 'Job Begin' : Began Executing 2012-11-28 08:21:51

2012-11-28T07:21:51.95Z : Job Begin... [SQLSTATE 01000]
Job 'MacroTest' : Step 2, 'Job Execution 01' : Began Executing 2012-11-28 08:21:51

2012-11-28T07:21:51.98Z : Job Executing {01}... [SQLSTATE 01000]
Job 'MacroTest' : Step 3, 'Job End' : Began Executing 2012-11-28 08:21:52

2012-11-28T07:21:52.01Z : Job End. [SQLSTATE 01000]

If the job is started from another step than the initial step, the output is still caught in a logfile. In this case a new file is created implicitly by adding output to it. A partial execution could generate an output like this
Job 'MacroTest' : Step 2, 'Job Execution 01' : Began Executing 2012-11-28 09:50:21

2012-11-28T08:50:21.81Z : Job Executing {01}... [SQLSTATE 01000]
Job 'MacroTest' : Step 3, 'Job End' : Began Executing 2012-11-28 09:50:21

2012-11-28T08:50:21.84Z : Job End. [SQLSTATE 01000]


SQL Server Agent tokens and macros are documented in "Use Tokens in Job Steps" in the MSDN Library.

A more complex and dynamic creation of SQL Server Agent jobsteps can be studied in the SQL Server Maintenance Solution from Ola Hallengren.

2012-04-06

Get SQL Server root folder with xp_instance_regread

I often use the standard folders of SQL Server, like the folder "Log" with SQL Server Error Log files and my SQL Server Agent job log files or the folder "JOBS" with my job script files.
These folders are subfolders to the SQL Server root folder, so when I have the root folder, I have the rest.
"One folder to hold them all"
Also I would like to be able to get the root folder on named instances and non-standard installations.
By using the undocumented stored procedure [master].[dbo].[xp_instance_regread], I have a useful solution to my need.
Actually the procedure is placed in the schema "sys" on newer SQL Server versions, but by calling through the "dbo" schema the same statement works on both SQL Server 2000 and 2012.

DECLARE @root_folder nvarchar(128);
EXECUTE [master].[dbo].[xp_instance_regread]
  @rootkey = N'HKEY_LOCAL_MACHINE',
  @key = N'SOFTWARE\Microsoft\MSSQLSERVER\Setup',
  @value_name = N'SQLDataRoot',
  @value = @root_folder OUTPUT;
SELECT @root_folder AS [root_folder];


The answer could be
P:\MSSQL\MSSQL10_50.MSSQLSERVER\MSSQL

It does work on SQL Server 2000 and named instances.

If I place the statement in a PowerShell Here-String in a SQL Server Agent PowerShell job step and use the cmdlet Invoke-SqlCmd, I get the root folder without knowing the name of the server or the instance.

EXECUTE [msdb].[dbo].[sp_add_jobstep]
...,
@subsystem=N'PowerShell',
@command=N'$sql = @"
DECLARE @root_folder nvarchar(128);
EXECUTE [master].[dbo].[xp_instance_regread]
  @rootkey = N''HKEY_LOCAL_MACHINE'',
  @key = N''SOFTWARE\Microsoft\MSSQLSERVER\Setup'',
  @value_name = N''SQLDataRoot'',
  @value = @root_folder OUTPUT;
SELECT @root_folder AS [root_folder];
"@
Invoke-SqlCmd -Query $sql',
...


Very nice, I think, because it makes it possible to use the same job definition on all SQL Server database instances, e.g. in standard backup jobs.

Also, I can use the root folder path directly in PowerShell:
Import-Module 'SQLPS'
$sql = @"
DECLARE @root_folder nvarchar(128);
EXECUTE [master].[dbo].[xp_instance_regread]
  @rootkey = N'HKEY_LOCAL_MACHINE',
  @key = N'SOFTWARE\Microsoft\MSSQLSERVER\Setup',
  @value_name = N'SQLDataRoot',
  @value = @root_folder OUTPUT;
SELECT @root_folder AS [root_folder];
"@
$RootFolder = $(Invoke-SqlCmd -Query $sql).root_folder
":: MSSQL Root Folder = '$RootFolder'."

In this case the answer is
:: MSSQL Root Folder = 'P:\MSSQL\MSSQL10_50.MSSQLSERVER\MSSQL'.

2012-03-13

Log failure in a SQL Agent PowerShell job step

I wanted to have a template for a SQL Server Agent PowerShell job step. A key feature of the template is to make the job step fail in a controlled way with a customized and informative error message.
The job step is defined in a script file (.ps1) called by the SQL Server PowerShell job step. If an error occurs in the script file, it is caught, described and handled. The script exits, and the job step presents the error in the job step history and fails. By failure I also want the value of "run_status" in the table msdb.dbo.sysjobhistory to be set to "0" (zero).
The output from the script and the job step is streamed to a log file defined in the job step. The log file should hold all output and messages of the job step execution, that is both customized output and standard output from components.

The name of the logfile is defined with other job step parameters in a script configuration file for the job step.
$LogName = 'JobStepFailure'
$LogFileFolder = 'D:\MSSQL\MSSQL10_50.MSSQLSERVER\MSSQL\Log'

In the script file the log file is implemented and initialized
$ScriptStartTime = [System.DateTime]::Now
$LogFileName = "$LogFileFolder\$LogName.{0:yyyyMMddTHHmmss}.log" -f $ScriptStartTime


After the script and the logfile are initialized, I move on to reset the error array. Just in case...
$Error.Clear()
and the script execution can start for real
"{0:s}  Begin." -f $([System.DateTime]::Now) > $LogFileName

If an error occurs in the script execution, I usually handle the basics like this
try {$rowsSource = $DataAdpt.Fill($DataSet) }
catch {
  "{0:s}  Exception: $_" -f $([System.DateTime]::Now) >> $LogFileName
  throw "Error in script. Check logfile '$LogFileName' for details."
}
finally { $CnnSource.Close() }


At the end of the script
"{0:s}  End." -f $([System.DateTime]::Now) >> $LogFileName
I check one more time for errors and exit the script
if ($Error) {
  throw "One or more errors in script execution.`nCheck logfile '$LogFileName' for details."
}
else {
  "Script executed with success.`nCheck logfile '$LogFileName' for details." | Write-Output
}


One major argument for creating the log file from the script file, instead of using the parameter @output_file_name of the procedure msdb.dbo.sp_add_jobstep, is that if the PowerShell script throws an exception, the content of the logfile is only the exception.
I do believe that this template can be used as a general solution.
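Pieced together, the template skeleton could look roughly like this sketch (paths, names and the placeholder work are the illustrative values from above):

```powershell
# Sketch of the job step script assembled from the pieces above.
$LogName = 'JobStepFailure'
$LogFileFolder = 'D:\MSSQL\MSSQL10_50.MSSQLSERVER\MSSQL\Log'

$ScriptStartTime = [System.DateTime]::Now
$LogFileName = "$LogFileFolder\$LogName.{0:yyyyMMddTHHmmss}.log" -f $ScriptStartTime

$Error.Clear()
"{0:s}  Begin." -f $([System.DateTime]::Now) > $LogFileName
try {
  # ... the actual work of the job step goes here ...
}
catch {
  "{0:s}  Exception: $_" -f $([System.DateTime]::Now) >> $LogFileName
  throw "Error in script. Check logfile '$LogFileName' for details."
}
"{0:s}  End." -f $([System.DateTime]::Now) >> $LogFileName

if ($Error) {
  throw "One or more errors in script execution.`nCheck logfile '$LogFileName' for details."
}
else {
  "Script executed with success.`nCheck logfile '$LogFileName' for details."
}
```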

2012-02-14

SQL Server Agent PowerShell job step

Some time ago I wrote about this in another blog entry, and I do it again because there are some major changes in what I want to achieve.
First of all, I want the SQL Server Agent jobstep to be defined as a PowerShell job step, not a CmdExec job step.
Also I have some other issues I want to handle.
  1. The job step is defined in a script file, not a script in the job step command field.
  2. The script file takes input by named parameters.
  3. There are one or more spaces in the script file path, e.g. "C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\JOBS".
When I looked around on the internet, two different solutions were used.
One using the PowerShell call operator ampersand (&).
One using the Invoke-Expression cmdlet. The last should be avoided as it has some risks. These risks are described in the book "Windows PowerShell in Action, Second Edition" (ISBN-13 978-1-935182-13-9) by Bruce Payette in section 21.6.4. This book is highly recommended.

Actually it turned out to be quite simple, when I had the quotation marks in place.
& "D:\Microsoft SQL Server\MSSQL10.SANDY\MSSQL\JOBS\SQLPS_test.ps1" -param_00 'Zero' -param_01 'One'
The trick is to put the full name of the script file in quotation marks, and the parameter names and values outside the quotation marks.

When you define the job step using the stored procedure msdb.dbo.sp_add_jobstep, the value of the parameter @command has a lot of quotation marks.
@command=N'& "D:\Microsoft SQL Server\MSSQL10.SANDY\MSSQL\JOBS\SQLPS_test.ps1" -param_00 ''Zero'' -param_01 ''One'''
Please notice that some quotation marks are single and some are double.

The script file is like this.
param(
  [Parameter(Mandatory=$true)][string]$param_00,
  [Parameter(Mandatory=$true)][string]$param_01
)
"{0:s}  Hello." -f $([System.DateTime]::Now)
"{0:s}  Parameter value 00 = $param_00" -f $([System.DateTime]::Now)
"{0:s}  Parameter value 01 = $param_01" -f $([System.DateTime]::Now)
"{0:s}  Goodbye." -f $([System.DateTime]::Now)

and the output is like
2012-02-14T07:50:03  Hello.
2012-02-14T07:50:03  Parameter value 00 = Zero
2012-02-14T07:50:03  Parameter value 01 = One
2012-02-14T07:50:03  Goodbye.

2010-10-11

SQL PowerShell output

There are several CmdLets for output in PowerShell. I got some by the statements
man write-*
and
man out-*

Putting the CmdLets in a script file
$DebugPreference = "Continue"

":: (default)"
":: Out-Null" | Out-Null
":: Out-Default" | Out-Default
":: Out-Host" | Out-Host
#":: Out-File"
#":: Out-Printer"
":: Out-String" | Out-String

Write-Host ":: Write-Host"
#Write-Progress ":: Write-Progress"
Write-Debug ":: Write-Debug"
Write-Verbose ":: Write-Verbose"
Write-Warning ":: Write-Warning"
Write-Error ":: Write-Error"
Write-Output ":: Write-Output"

I have disabled the calls of the CmdLets Out-File, Out-Printer and Write-Progress as I find them irrelevant to this little experiment.

Executing the script file gives in PowerShell (powershell.exe)
:: (default)
:: Out-Default
:: Out-Host
:: Out-String

:: Write-Host
DEBUG: :: Write-Debug
WARNING: :: Write-Warning
H:\Script Library\Output.ps1 : :: Write-Error
At line:1 char:13
+ .\Output.ps1 <<<<
    + CategoryInfo          : NotSpecified: (:) [Write-Error], WriteErrorException
    + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Output.ps1

:: Write-Output


and in SQL PowerShell (sqlps.exe)
:: (default)
:: Out-Default
:: Out-Host
:: Out-String

DEBUG: :: Write-Debug
WARNING: :: Write-Warning
H:\Script Library\Output.ps1 : :: Write-Error
At line:1 char:13
+ .\Output.ps1 <<<<
    + CategoryInfo          : NotSpecified: (:) [Write-Error], WriteErrorException
    + FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,Output.ps1

:: Write-Output

In the script for SQL PowerShell I removed the call to Write-Host, as it fails with
A job step received an error at line 18 in a PowerShell script. The corresponding line is 'Write-Host ":: Write-Host"'. Correct the script and reschedule the job. The error information returned by PowerShell is: 'Cannot invoke this function because the current host does not implement it.

When I execute the script in a PowerShell SQL Agent job step using the call operator (& "H:\Script Library\Output.ps1"), the output file contains
:: (default)
:: Out-String

:: Write-Output

The job script encountered the following errors. These errors did not stop the script:
A job step received an error at line 1 in a PowerShell script. The corresponding line is '& "H:\Script Library\Output.ps1"'. Correct the script and reschedule the job. The error information returned by PowerShell is: ':: Write-Error
'


I was a little surprised to see that $DebugPreference and the CmdLets Out-Default and Out-Host were ignored.
A quick conclusion is that the implicit pipeline and the CmdLet Write-Output are generally usable.
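That conclusion can be illustrated with a small sketch. The function name is hypothetical, but the pattern matches the scripts above: emit text to the pipeline and let SQL Agent capture it in the output file.

```powershell
# Minimal sketch: prefer pipeline output over Write-Host in scripts that
# must also run under sqlps.exe in a SQL Agent job step.
function Get-StatusLine {
    param([string]$Message)
    # Implicit pipeline output - captured by the job step's output file.
    "{0:s}  $Message" -f [System.DateTime]::Now
}

Get-StatusLine -Message 'Job step started.'    # reaches the output file
Write-Output 'Explicit pipeline output.'       # also reaches the output file
# Write-Host 'Host-only output.'               # would fail under sqlps.exe
```

Under powershell.exe the Write-Host line would work, but leaving it out keeps the script valid under sqlps.exe as well.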

2010-08-24

CodePlex novice

I have entered CodePlex.
After some looking around, my first project is published. Please take a look.

The subject is maintaining the database "msdb" by cleaning up history data.

Reference
"SQL Server History Cleanup" (CodePlex)

2010-02-08

SQL Server Agent schedule owner

When a database mirroring session is created using Microsoft SQL Server Management Studio (SSMS), the job "Database Mirroring Monitor Job" is created automatically. But if you create the database mirroring session by script, the job is not created. This is described in more detail in the MSDN Library article "Monitoring Database Mirroring".

A database instance had a mirroring session running against one host, and the mirroring host was recycled. Then the database mirroring sessions were recreated against another host, this time by script.
No problem, I thought.
The job "Database Mirroring Monitor Job" was scripted out in SSMS and then created on the mirroring database instance. It then turned out that the job was not created exactly as on the principal database instance.

The owner of the SQL Agent schedule "Database Mirroring Monitor Schedule" was NULL on the mirroring installation, where it was "sa" on the principal installation.
This does not show in the SSMS windows or by calling the procedure msdb.dbo.sp_help_schedule, but the table msdb.dbo.sysschedules holds the SID of the schedule owner.
SELECT [msdb].[dbo].[sysschedules].[name],[msdb].[dbo].[sysschedules].[owner_sid]
FROM msdb.dbo.[sysschedules]
WHERE [msdb].[dbo].[sysschedules].[name]=N'Database Mirroring Monitor Schedule';
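The owner can also be inspected from outside T-SQL. Here is a hedged sketch using SMO, assuming the SMO assemblies are installed, your SMO version exposes OwnerLoginName on schedule objects, and with a hypothetical instance name:

```powershell
# Sketch: list the owner of a shared SQL Agent schedule through SMO.
# 'SANDY' is a hypothetical instance name - replace with your own.
[void][System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo')
$server = New-Object Microsoft.SqlServer.Management.Smo.Server 'SANDY'
$server.JobServer.SharedSchedules |
    Where-Object { $_.Name -eq 'Database Mirroring Monitor Schedule' } |
    Select-Object Name, OwnerLoginName
```

This is only a convenience for comparing installations; the authoritative fix is still sp_update_schedule below.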

I could not find a way to alter the owner of a given schedule in SSMS. The procedure msdb.dbo.sp_update_schedule does the trick.
EXECUTE [msdb].[dbo].[sp_update_schedule]
@name = N'Database Mirroring Monitor Schedule',
@owner_login_name = N'sa';

You could also do it quick and dirty.
UPDATE [msdb].[dbo].[sysschedules]
SET [owner_sid]=0x01
WHERE [sysschedules].[name]=N'Database Mirroring Monitor Schedule';
Not nice!

Now the schedules are identical and I can disregard the finding.

2008-10-07

SQL Server Agent PowerShell job step

I have had some problems executing a PowerShell script in a SQL Server Agent job step.
The error message I got – several times – was
Executed as user: SYS\ServiceDefaultSQL. The process could not be created for step 1 of job 0x5D25506B0FA13F45A770C7CD8B3BA051 (reason: The system cannot find the file specified). The step failed.
After some experiments I thought there was an NTFS security problem, so that the domain account the SQL Server Agent service was running as didn't have access rights to the PowerShell file structure. In my test environment, PowerShell was installed after the SQL Server services.
While I was browsing the NTFS rights, I asked a friend (Jakob Bindslet). He pointed me right to the solution.

The solution is somewhat awkward, as it's based on calling cmd.exe inside a CmdExec job step with the command
cmd.exe /c %SystemRoot%\system32\windowspowershell\v1.0\powershell.exe -NoLogo -NoProfile -Noninteractive "&'Z:\SQLAdmin\Jobs\HelloWorld.ps1'"
After being pointed in the right direction, I found out that this also works
C:\WINDOWS\system32\windowspowershell\v1.0\powershell.exe -NoLogo -NoProfile -Noninteractive "&'T:\SQLAdmin\Jobs\HelloWorld.ps1'"

Well – actually the documentation in Books Online clearly states that I "must provide the full path to the executable if the executable is not located in a directory specified in the system path or the path for the user that the job step runs as." (Creating Job Steps).
I think a downside could be the static path, which might not be the same on other Windows hosts.
But at my job we are using a single SQL Server Agent service for the entire production environment. This makes the last statement with the full PowerShell path useful.

It does not work when using %ComSpec% or %SystemRoot%\system32\ because the environment variables are not known to the SQL Server Agent job step.
BTW - the variables known to a job step are called tokens and macros. You can read more in Books Online > Using Tokens in Job Steps.

2008-08-14

SQL Server Agent PowerShell job step

This seems like a valid and simple way to execute a PowerShell script in a SQL Server Agent job step.
The script I've been testing with looks like this (ExecuteablePS.ps1):
param( [string]$script:myParam = $( throw "Error: Command statement is missing." ) )

Write-Output "Hello to output."
Write-Host "Hello to host."

Write-Output "The parameter 'myParam' was given the value '$myParam'."
It is on purpose that the script requires a parameter, as most - if not all - of my automation scripts are parameterized on the physical elements like server (host).
The script also demonstrates that both the Output stream and the Host stream are caught by SQL Server Agent.

The SQL Server Agent job step is defined as a CmdExec step.
The step only has one line:
powershell.exe -NoLogo -NoProfile -Command "&'C:\SQLAdmin\Script\ExecuteablePS.ps1' 'CoolValue'"
I spent some time getting the " and ' right...

When the job is executed, the history is:
Executed as user: Sandbox\Frodo. Hello to output. Hello to host. The parameter 'myParam' was given the value 'CoolValue'. Process Exit Code 0. The step succeeded.

I use CmdExec instead of PowerShell as the job step type, so that the solution can also be used on SQL Server versions before SQL Server 2008.
Also, I like to use powershell.exe instead of sqlps.exe. This makes the solution valid on PowerShell v2.