Installing Ola Hallengren Maintenance Solution

Ola Hallengren Maintenance Solution (OHMS) is, with good reason, a well-known and widely used solution for SQL Server database backups, maintenance of indexes and statistics, and database integrity checks. The solution is recommended by many great DBAs and SQL Server MVPs. Personally I have used the solution with great success in several financial organisations.

Installing OHMS is quite simple. In this text I will look into some details when installing OHMS in a scalable SQL Server infrastructure. The installation is here done in five steps:
  1. Create DBA database
  2. Configure OHMS installation
  3. Run OHMS installation
  4. Configure OHMS
  5. Test OHMS
With dbatools (dbatools.io) there is the PowerShell cmdlet Install-DbaMaintenanceSolution, which installs or updates OHMS in a given database. Check the details of the code on GitHub. I have not personally used dbatools, but it looks fine and I might take a deeper look later.
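A minimal sketch of such a dbatools installation, assuming the instance and database names used later in this text; check the cmdlet help for the full parameter list before relying on it:

```powershell
# Hypothetical instance and database names - adjust to your environment.
# Install-DbaMaintenanceSolution is the dbatools cmdlet mentioned above;
# see Get-Help Install-DbaMaintenanceSolution for all parameters.
Install-DbaMaintenanceSolution -SqlInstance '(local)\SQLSERVER' -Database 'sqladmin' -InstallJobs
```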

Create DBA Database

Create a separate user database on each instance for SQL Server administration, e.g. [sqladmin] with a prefix of the organisation. Use underscore for space in the name, like [nielsgrove_sqladmin].

This is a general solution that can be used for other database administration solutions.
This database does not require OHMS, but you should still do backups and maintain indexes.
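Creating the database can be sketched like this; the file names, paths and sizes are examples only and should be adjusted to your environment:

```sql
CREATE DATABASE [sqladmin]
ON PRIMARY
  ( NAME = N'sqladmin_data0', FILENAME = N'C:\Data\sqladmin_data0.mdf', SIZE = 8MB, FILEGROWTH = 64MB )
LOG ON
  ( NAME = N'sqladmin_log0', FILENAME = N'C:\Data\sqladmin_log0.ldf', SIZE = 8MB, FILEGROWTH = 64MB );
```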

Configure installation

Put OHMS in a separate filegroup, so the database can be used for other purposes now and in the future.

ALTER DATABASE [sqladmin] ADD FILEGROUP [OHMS];
ALTER DATABASE [sqladmin] ADD FILE ( NAME = N'ohms_data0', FILENAME = N'C:\Data\sqladmin_ohms_data0.ndf' , SIZE = 8MB , FILEGROWTH = 64MB ) TO FILEGROUP [OHMS];

The filegroup [OHMS] is marked as default during the OHMS installation.
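Marking the filegroup as default before running the installation script is one statement:

```sql
ALTER DATABASE [sqladmin] MODIFY FILEGROUP [OHMS] DEFAULT;
```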

OHMS is using the schema [dbo] by default. You could change this, but it would require a rather detailed rewrite of the code. As the code in the script file MaintenanceSolution.sql is more than 8000 lines, I would not recommend changing the schema.

Download the script from ola.hallengren.com. Define the database name from above in the USE statement around line 22.
Create a cmd or PowerShell file to implement the configuration. The simple solution used here is a one-line cmd script, as in the next section. You should always script even simple things, as you are then prepared to scale out with the same quality.
If you are building an automatic installation and upgrade, you should get the precise URLs from the general .com URL. That might require a more complex configuration.

Run OHMS installation

Run sqlcmd on the command line using the configuration above. If you create a more complex configuration, the execution itself will have to be considered in detail.

C:\>sqlcmd.exe -S "(local)\SQLSERVER" -E -d "sqladmin" -i "C:\SysAdmin\MaintenanceSolution.sql"
Changed database context to 'sqladmin'.

And change the default filegroup back to [PRIMARY].
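Reverting the default filegroup is a single statement:

```sql
ALTER DATABASE [sqladmin] MODIFY FILEGROUP [PRIMARY] DEFAULT;
```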

Configure OHMS

The SQL script does create SQL Agent jobs, but the jobs have no schedules. Add a schedule to each job with the stored procedure [msdb].[dbo].[sp_add_jobschedule]. This is a quick example:

DECLARE @schedule_id int;
EXECUTE [msdb].[dbo].[sp_add_jobschedule]
  @job_name=N'CommandLog Cleanup',
  @name=N'OHMS CommandLog Cleanup',
  @freq_type=4,             -- daily; a frequency is required for the schedule to be valid
  @freq_interval=1,         -- every day
  @active_start_time=10000, -- 01:00:00; adjust to your maintenance window
  @active_end_time=235959, @schedule_id = @schedule_id OUTPUT;
SELECT @schedule_id;

OHMS creates these 11 jobs in SQL Agent:
  • CommandLog Cleanup
  • DatabaseBackup - SYSTEM_DATABASES - FULL
  • DatabaseBackup - USER_DATABASES - DIFF
  • DatabaseBackup - USER_DATABASES - FULL
  • DatabaseBackup - USER_DATABASES - LOG
  • DatabaseIntegrityCheck - SYSTEM_DATABASES
  • DatabaseIntegrityCheck - USER_DATABASES
  • IndexOptimize - USER_DATABASES
  • Output File Cleanup
  • sp_delete_backuphistory
  • sp_purge_jobhistory
You can get a list of the jobs from the table [msdb].[dbo].[sysjobs]:
SELECT * FROM msdb.dbo.sysjobs
WHERE [description]=N'Source: https://ola.hallengren.com'
ORDER BY [name];

All OHMS jobs are placed in the SQL Agent job category Database Maintenance. You could change the category with the stored procedure [msdb].[dbo].[sp_update_job], but I think the benefits would be rather limited.

Log files are placed in the SQL Server ErrorLogPath, where the SQL Server ErrorLog and default traces are placed by default. If you want the OHMS log files in another place, then change the job steps with the stored procedure [msdb].[dbo].[sp_update_jobstep]. You should only do this if you are forced to by stupid policies.

Put configuration in script file(-s) and make it idempotent. Also put everything in version control, with the documentation.
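An idempotent configuration can be sketched like this, guarding the schedule creation with an existence check so the script is safe to re-run (job and schedule names are the examples from above):

```sql
IF NOT EXISTS (
  SELECT * FROM [msdb].[dbo].[sysschedules]
  WHERE [name] = N'OHMS CommandLog Cleanup'
)
BEGIN
  EXECUTE [msdb].[dbo].[sp_add_jobschedule]
    @job_name = N'CommandLog Cleanup',
    @name = N'OHMS CommandLog Cleanup',
    @freq_type = 4,     -- daily
    @freq_interval = 1; -- every day
END;
```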

If you want to remove the OHMS jobs then this query on the table [msdb].[dbo].[sysjobs] can generate the statements:
SELECT N'EXECUTE [msdb].[dbo].[sp_delete_job] @job_name=N''' + [name] + N''', @delete_unused_schedule=1;'
FROM [msdb].[dbo].[sysjobs]
WHERE [description]=N'Source: https://ola.hallengren.com';


Run each job, starting with the database integrity checks, then the full database backups, moving on to differential backups and closing with transaction log backups, with the index and statistics maintenance jobs in between.
Check the output of each job and check the SQL Server errorlog after each SQL Server Agent job execution.

You can generate the statements for the stored procedure [msdb].[dbo].[sp_start_job] with a query on the table [msdb].[dbo].[sysjobs] like above:
SELECT N'EXECUTE [msdb].[dbo].[sp_start_job] @job_name=N''' + [name] + N''';'
FROM [msdb].[dbo].[sysjobs]
WHERE [description]=N'Source: https://ola.hallengren.com';


  • The installation script handles both installation and update. Add some identification of which version is installed and which version is about to be deployed.
  • Integrate the installation with your Definitive Software Library (DSL).
  • More robust error handling, especially in the tests.


Sandbox Active Directory in VMware Workstation

This is to create an isolated sandbox where network, storage and patching can be isolated to the given sandbox. The idea is also to be able to create several independent sandboxes for different purposes, and to create sandboxes of different complexity, from a single database installation to a full environment in a sandbox.
I am sure this could also be accomplished with Microsoft Hyper-V or another virtualisation platform, but I am currently using VMware, so that is also the case here.

This starts as a very simple description of the process with some central configurations. Later I intend to expand this to a scripted installation. The final goal is to be able to provide a complete AD forest automatically.

Clone Windows Server

I have a stand-alone Windows Server 2019 that I clone in its current state where it is shut down (cold), and the clone is a full clone as opposed to a linked clone. I started this recipe working on Windows Server 2016 and believe that with minor changes this is still valid for Windows Server 2016.

Generalize Windows with sysprep

Microsoft Docs: "Sysprep (System Preparation) Overview"

Alternative - Create new server

  1. Create virtual server.
  2. Install Windows Server from ISO-image.
  3. Update Windows Server.
This can hopefully be automated with a VMware tool. More on this when I have looked into the subject.

Prepare virtual network

  • Custom network (VMnet13, host-only)
  • Subnet IP:

Prepare server

Network profile:
  • IP :
  • Subnet :
  • Preferred DNS : (this host)
  • Alternate DNS : (the vmnet)
  • Default gateway: None (host-only)
Steps to prepare the server:
  1. Rename computer, e.g. "DC00".
    • Start PowerShell as administrator.
      Rename-Computer -NewName 'DC00' -Restart
  2. Danish keyboard - as a Dane I usually end up with a Danish keyboard.
  3. Windows location: UK to set for UTC time. This is to avoid issues with daylight savings and other local tricks.
  4. English UK ISO-8601'ish date time format.
  5. Windows SmartScreen enabled.
  6. Set password on Administrator user and rename user, e.g. "Ragnar". Password never expires.
  7. Personal administrator user, e.g. "SuperNiels". Password never expires.
  8. vmxnet3 Ethernet Adapter:
    • Do not allow computer to turn off device to save power.
    • See the post "vmxnet3 network adapter" about vmxnet paravirtualized network adapter.
    • Remove Ethernet0 after vmxnet3 configuration.
    • Rename adapter, e.g. to "Ethernet42".
  9. Remove old network adapter. Show hidden devices in Device Manager and remove Intel Gigabit Network Connection.
  10. Static network definition; IP and DNS. This should be done after changing the ethernet adapter, as the configuration is assigned to the adapter.
  11. Print Spooler service stopped and disabled
    • Start PowerShell as administrator.
      Stop-Service -Name 'spooler'
      Set-Service -Name 'spooler' -StartupType 'Disabled'
  12. Activate Windows Server. This can be done offline as described in the post "Activate Windows Server offline". Or you can temporarily change the virtual network to a network with NAT to the internet.
Also you might want to consider disabling the screensaver and keeping the screen always on.
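The static network definition in step 10 can be sketched in PowerShell; the adapter name and addresses are hypothetical example values and must match your vmnet configuration (no default gateway, as the network is host-only):

```powershell
# Hypothetical adapter name and addresses - adjust to your VMnet subnet.
New-NetIPAddress -InterfaceAlias 'Ethernet42' -IPAddress '192.168.13.10' -PrefixLength 24
Set-DnsClientServerAddress -InterfaceAlias 'Ethernet42' -ServerAddresses '192.168.13.10'
```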

Configure Domain Controller

Domain: sandbox.lan
  1. Add Windows Server roles: Active Directory Domain Services (AD DS) and DNS Server. Both with all features and tools.
  2. Promote the server to Domain Controller (dcpromo) in a new forest.
  3. Specify domain controller capabilities; DNS server and Global Catalog on the domain "sandbox.lan".
  4. Enter password for DSRM.
  5. Do not delegate DNS.
  6. Accept the NetBIOS name "SANDBOX".
  7. Accept the default folders. If you are installing a DC in Production this is a configuration to consider.
The generated script can be viewed, but it is actually only one cmdlet call, and because of the limited coverage I find the "script" rather incomplete.
Import-Module ADDSDeployment
Install-ADDSForest -CreateDnsDelegation:$false -DatabasePath "C:\Windows\NTDS" -DomainMode "WinThreshold" -DomainName "sandbox.lan" -DomainNetbiosName "SANDBOX" -ForestMode "WinThreshold" -InstallDns:$true -LogPath "C:\Windows\NTDS" -NoRebootOnCompletion:$false -SysvolPath "C:\Windows\SYSVOL" -Force:$true

Review the validation and start installation of AD DS.
When the installation of AD DS is complete you will be logged off and the server will be restarted.

Verify Domain Controller

The Server Manager now also shows the items "AD DS" and "DNS" in the menu to the left.
In the System window the full computer name is now "DC00.sandbox.lan". Also the workgroup field is now a domain field where the domain name "sandbox.lan" is shown.
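The promotion can also be verified in PowerShell; a small sketch assuming the domain and computer names from above:

```powershell
# Verify the new domain and that the DNS server resolves the domain controller.
Get-ADDomain -Identity 'sandbox.lan' | Select-Object DNSRoot, NetBIOSName
Resolve-DnsName -Name 'DC00.sandbox.lan'
```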

Patch Windows Server offline

  1. Get the latest cumulative update from Microsoft Update
  2. Copy installation set to the virtual server
  3. Run installation on the virtual server as administrator


2018-12-28  Post migrated to SQLAdmin blog.
2018-04-30  Post released on AzureAdmin blog.


Excel datetime from Nagios CSV export

When you export a counter set like Processor % Privileged Time from Nagios XI to a CSV file, you get a file that begins with something like this:

'timestamp','\Processor($)\% Privileged Time_value'

The timestamp column is just an integer, but I would like a human readable timestamp. At first I looked at a conversion from a file time, but the integer from Nagios is too small to give a correct datetime.

A good colleague then introduced me to the Epoch timestamp, and with this formula in Excel the Nagios timestamp is converted correctly:
=A2/86400+25569
  • "A2" is the reference to the cell holding the Nagios timestamp integer.
  • 86400 is the number of seconds in a day, converting the Unix timestamp in seconds to days.
  • 25569 is the Excel serial number for the first day of the year 1970 (1970-01-01), calculated with =DATE(1970;1;1).
When you enter the formula in Excel the first result is not that human readable, e.g. 43404.52083. But by formatting the cell as a timestamp or a user defined type like
yyyy-mm-dd hh:mm:ss,000
you get a nice and precise timestamp:
2018-10-31 12:30:00,000

The number in the second column can also be formatted to something more readable by formatting the cells to a number, e.g. with one decimal.
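The conversion can be cross-checked outside Excel; a small Python sketch using a Unix timestamp that corresponds to the example value above:

```python
from datetime import datetime, timezone

nagios_ts = 1540989000  # example Unix timestamp as exported from Nagios

# Excel serial date: 25569 is the serial number for 1970-01-01,
# and 86400 seconds per day converts the timestamp to days.
excel_serial = nagios_ts / 86400 + 25569
print(excel_serial)  # ~43404.52083

# The same timestamp in human readable form (UTC).
print(datetime.fromtimestamp(nagios_ts, tz=timezone.utc).strftime('%Y-%m-%d %H:%M:%S'))
```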

Please notice that I am working in a Danish Excel where the decimal delimiter is comma (,). I have tried to convert to the more international decimal delimiter dot (.), but the conversion is done manually in this text and not tested. So you could experience some errors on this minor subject, but I am quite confident that you can fix them quite fast.

You can play with Epoch conversions back and forth on EpochConverter.


TFS Backup

A backup share is required by the TFS backup wizard. Actually the wizard checks if the share is on the local host. To be sure that the rights are correct for a SQL Server backup, the SQL Server service account should be owner of the folder that is shared. Subfolders should be created with the (undocumented) extended stored procedure xp_create_subdir to ensure that the SQL Server service account has the correct rights on the given folder.
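Creating a subfolder with xp_create_subdir is a single call, executed in the context of the SQL Server service account; the path here is a hypothetical example:

```sql
-- Hypothetical backup path - adjust to your backup share.
EXECUTE [master].[dbo].[xp_create_subdir] N'\\BackupHost\TfsBackup\Full';
```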

Backups are run by TFS Job Tasks defined in the TFS Administration Console (TfsMgmt.exe). Please be aware that files are written in two contexts - the TFS Service (TFSService) and the SQL Server service:
  • Recovery Model on all TFS and SSRS databases is set to Full in the context of the administrator.
  • Backup configuration is written in xml-file in TFS backup folder (BackupSettings.xml) in the context of the TFS Service.
  • Backup of Reporting Services encryption keys are done by the wizard - not by a TFS backup job. This is why there only is one snk-file on a key backup in the TFS backup share.
Actual process in backup job:
  1. Verify installation, running status and capacity.
  2. Prepare databases for synchronized backup
  3. Grant permissions to backup path
  4. Full database backup of TFS databases; Configuration, Collection(-s) and Warehouse. Also SSRS databases; ReportServer and ReportServerTempDB. Backup files are created and written in the context of the SQL Server service - like a normal SQL Server backup.
  5. Marking TFS databases using stored procedure prc_SetTransactionMark in each database; EXEC prc_SetTransactionMark TfsMarkTfpt
  6. Backup transaction log on TFS and SSRS databases. Backup files are created and written in the context of the SQL Server service.
Log files on the TFS job task are in the folder "%ProgramData%\Microsoft\Team Foundation\Server Configuration\Logs".
Also configuration and log files on each TFS backup run in the TFS backup folder.
  • Transactional_<number>.log; TFS job log on transaction log backup.
  • <database name>_<number>L.trn; SQL Server transaction log backup file.
  • Delete_<number>.log; TFS job log on deletion of old files.
  • Full_<number>.log; TFS job log on full database backup.
  • <database name>_<number>F.bak; SQL Server full database backup file.
  • BackupSets.xml; TFS backup history on existing backup files.
  • RSKey<number>.snk; SQL Server Reporting Services encryption key backup file.
The log file BackupSets.xml holds information about the existing backup files and their history.
Existing backup files are managed by the TFS Delete job, which deletes backup files older than the given retention period; the recommended minimum retention is seven days.

TFS Scheduled Backups do not
    • back up system databases; master, msdb, model. You could use Ola's SQL Server Maintenance Solution (GitHub).
    • back up Service Master Key (SMK).
    • back up Database Master Key (DMK) on master.
    • verify backup files.
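The service master key, and the database master key in [master] if one exists, can be backed up manually; a minimal sketch with hypothetical paths and placeholder passwords (store both files and passwords securely):

```sql
-- Hypothetical file paths and passwords - adjust to your backup share.
BACKUP SERVICE MASTER KEY
  TO FILE = N'\\BackupHost\TfsBackup\SMK.key'
  ENCRYPTION BY PASSWORD = N'<strong password>';

USE [master];
BACKUP MASTER KEY
  TO FILE = N'\\BackupHost\TfsBackup\master_DMK.key'
  ENCRYPTION BY PASSWORD = N'<strong password>';
```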


    Access to vmware shared folders in PowerShell administrator session

    When you define shared folders to a vmware guest as a drive, they are also available to a PowerShell session as a drive - usually the Z-drive. You can see this with the command "net use":

    PS> net use
    New connections will be remembered.

    Status       Local     Remote                    Network
                 Z:        \\vmware-host\Shared Folders
                                                    VMware Shared Folders
    The command completed successfully.

    But if the PowerShell session is an administrator session, you get something like a network error. This is because the share is not defined in the administrator session. If "net use" is run and the result compared with the one from the normal user session, you can see that the Z-drive is missing.

    PS> net use
    New connections will be remembered.

    There are no entries in the list.

    The drive can then be mapped in the session. A new administrator session will also have the Z-drive mapped.

    PS> net use Z: '\\vmware-host\shared folders'
    The command completed successfully

    It is possible to persist the mapping with the parameter "persistent", but you might experience that the persistence is a varying subject. I have not looked deeper into the details on this.
    /persistent:{yes | no}

    The short help on "net use" is available with the command:
    net use /?
    The longer help is available with the command:
    net use /help

    This is not only a PowerShell thing, but a general Windows UAC thing. You will have somewhat the same experience in other tools like the Windows Shell (cmd) or Windows Scripting Host (WSH).

    Setting the Z-drive with the PowerShell cmdlet New-PSDrive does not make the vmware shared folders available to administrator sessions.