2024-11-19

List orphan database users

Some days ago I was asked - politely - to move some databases between two SQL Server installations. Unfortunately the service had failed on the source, so I had no access to the login definitions.
This gave me a lot of orphan database users.
To make things more entertaining, some of the databases have compatibility level 80, and this cannot be changed.
To get a list of orphan database users, I have to combine two methods, because the compatibility level 80 databases do not have the system view sys.database_principals.

DECLARE @db_name AS nvarchar(128); -- Database name
DECLARE @_msg AS nvarchar(2047); -- Execution message
DECLARE @_cmd AS nvarchar(MAX); -- Dynamic command
CREATE TABLE ##orphan_user (
[db_name] nvarchar(128) NOT NULL,
[user_name] nvarchar(128) NOT NULL
);

DECLARE db_cursor CURSOR FAST_FORWARD FOR
SELECT [name]
FROM [master].[sys].[databases]
WHERE [database_id] > 4;

OPEN db_cursor;
FETCH NEXT FROM db_cursor INTO @db_name;
WHILE @@FETCH_STATUS = 0
BEGIN
IF (SELECT [compatibility_level] FROM [master].[sys].[databases] WHERE [name] = @db_name) < 90
BEGIN
SET @_msg = N'The database [' + @db_name + N'] is SQL Server 2000 or older.';
SET @_cmd = N'INSERT INTO ##orphan_user ' +
N'SELECT ''' + @db_name + N''' AS [db_name], [sysusers].[name] AS [user_name] ' +
N'FROM [' + @db_name + N'].[dbo].[sysusers] ' +
N'LEFT OUTER JOIN [master].[dbo].[syslogins] ON [sysusers].[sid] = [master].[dbo].[syslogins].[sid] ' +
N'WHERE [sysusers].[uid] > 4 AND [sysusers].[issqlrole] = 0 AND [sysusers].[isapprole] = 0 AND [master].[dbo].[syslogins].[sid] IS NULL;';
END
ELSE
BEGIN
SET @_msg = N'The database [' + @db_name + N'] is SQL Server 2005 or newer.';
SET @_cmd = N'INSERT INTO ##orphan_user ' +
N'SELECT ''' + @db_name + N''' AS [db_name], [database_principals].[name] AS [user_name] ' +
N'FROM [' + @db_name + N'].[sys].[database_principals] ' +
N'WHERE [database_principals].[type] IN (''S'',''U'',''G'') ' +
N'AND [database_principals].[is_fixed_role] = 0 ' +
N'AND [database_principals].[name] NOT IN (N''dbo'', N''guest'', N''sys'', N''INFORMATION_SCHEMA'');';
END
RAISERROR(@_msg,0,0) WITH NOWAIT;
EXECUTE (@_cmd);

FETCH NEXT FROM db_cursor INTO @db_name;
END

CLOSE db_cursor;
DEALLOCATE db_cursor;


SELECT * FROM [##orphan_user];

SELECT DISTINCT [user_name] FROM [##orphan_user]
ORDER BY [user_name] DESC

DROP TABLE [##orphan_user];


Usually I execute only the part of the script that fills the temporary table. Then I can filter and sort the data to handle the logins.
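
When a matching login exists - or has been recreated - the orphan database user can be remapped to it. A minimal sketch, assuming a database [MyDatabase] with an orphan user and a login both named MyAppUser:

USE [MyDatabase];
GO
-- Remap the orphan database user to the login with the same name.
ALTER USER [MyAppUser] WITH LOGIN = [MyAppUser];

On older installations the legacy procedure sp_change_users_login can do the same, but it is deprecated.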

2024-11-16

DBA Career Levels

Starting from the first day the career levels could be something like this.

Trainee

Assist on tasks and issues. Can be given trivial tasks with some training. A mentor is assigned to introduce them to the profession.

Junior (Level I)

The Junior DBA resolves issues while developing professional skills. Handles assigned tasks. Checks test results. Will mostly work individually, but will have a close colleague as mentor (Sensei).

Usually with up to two years of experience. But the period length is not defining - it is the capabilities.

Specialist (Level II)

Works on troubleshooting, review and test. Now has solid fundamental skills.

Between two to five years of experience. Usually depending on the organisation, capabilities and the mentor (Sensei).

Senior (Level III)

Now works as an individual contributor with deep understanding. Ownership of technical components like SQL Server components. Key resource in problem analysis. Gives process recommendations. Comprehensive knowledge of practices and organization structures, both formal and informal (goat paths).

Develop and present proposals, also to customers. Negotiate project and product technical terms and requirements with vendors and customers. Participate in pre-sales. Professional mentor (Sensei). Key business knowledge. Recognized for technical skills on international level with influence on wide organizational level.

Typically more than five years of experience. Often more than ten years.

Principal

Designing and presenting technical strategies. Setting department goals and creating implementation plans. Setting and managing engineering budgets. Develop and implement process improvements.

Provide expert advice. People management in research and design. Professional and career mentor (Sensei) but not as primary priority. Deep business knowledge.

Now it is no longer a matter of years but more of capabilities and ambitions. Usually has at least a bachelor's degree.

Master

Guiding and training professional teams and individuals. This is the primary focus. Deep knowledge and experience in both tech and processes. Can also be called Sensei in a Lean context.

Executive

Part of senior management. Set budgets. Define products and projects.

Driven by career ambitions.

Very rare title. Most people have left database administration many years before this level is relevant, to pursue a more focused business career.



History

2024-11-16: Post created.

2024-12-23: Master level added.

2024-11-07

Reduce OS memory with BCDEdit

This is a quick description of how to reduce OS memory on a virtual Windows Server 2016 from 128 GB to 48 GB without changing the VM configuration. This can be done with the command BCDEdit.exe which is somewhat described by Microsoft in the documentation on bcdedit and the documents "BCDEdit Command-Line Options" and "BCDEdit Options Reference". The tool is a default tool in Microsoft Windows and is placed in the directory "C:\Windows\System32\".

The name of the tool refers to Boot Configuration Data, which is described somewhat in an old MSDN Vista document BCD.docx. Boot Configuration Data is a firmware independent data store with boot data. The store is placed in the system partition.

Reducing the memory available to the OS is sometimes done when you have an application that is not using all available memory, but is licensed by the amount of memory available to the OS. And this is sometimes made worse by a computer with way too much memory that cannot be reduced because of ridiculous agreements.

Before

System Information (msinfo32.exe) on the guest VM. All installed memory is available to the OS:

Reduce OS Memory

Calculation: The number needed for the command is the amount of removed memory. In this case 128 GB - 48 GB = 80 GB. As the number must be in MB the conversion is 80 GB * 1024 MB/GB = 81920 MB.

Start console (cmd.exe) as Administrator. PowerShell does not work. Enter the command line:

bcdedit /set {current} removememory 81920

  • bcdedit is shorthand for BCDEdit.exe.
  • /set is an option to BCDEdit described in BCDEdit /set.
  • {current} is a reference to the current boot entry.
  • removememory removes memory available to the Windows operating system.
  • 81920 is the amount of memory to remove from the OS in MB. See calculation above.

Reboot VM.

After

System Information on guest VM:

Task Manager on guest VM:

It shows both 128 GB and 48 GB even though only 48 GB is available to the OS. This will confuse most operators and administrators.

Roll-Back

Start console as Administrator. And enter the command line:

bcdedit /deletevalue {current} removememory

Reboot VM.

And all the memory is available to the VM as shown by System Information.

Notes on System Configuration

Windows has the standard tool System Configuration (msconfig.exe) where it looks like you can also reduce OS memory. Let me point out that this does not work like bcdedit /set removememory but like bcdedit /set truncatememory, which is not as effective.

Disclaimer

These are not my own findings; they were given to me by my good colleague Ib Thornøe (LinkedIn).

2024-10-28

Put SQL Server PVS in own filegroup

Putting the Persisted Version Store (PVS) in its own filegroup is done to improve the performance of SQL Server Accelerated Database Recovery (ADR). ADR was introduced with SQL Server 2019 and got some improvements with SQL Server 2022. This is described in the Microsoft documentation "Accelerated database recovery". Putting the PVS in its own filegroup is recommended and described in the Microsoft document "Manage accelerated database recovery".

This example is on the AdventureWorks database. The script is in the SQLAdmin GitHub repository.

Please notice that the Microsoft documentation does not create the filegroup before using it for PVS, but this is required.
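
A minimal sketch of the steps on the AdventureWorks database; the filegroup name [PVS] and the file path are only examples, and ADR must be off while the PVS filegroup is changed:

ALTER DATABASE [AdventureWorks] ADD FILEGROUP [PVS];
GO
ALTER DATABASE [AdventureWorks]
ADD FILE (NAME = N'AdventureWorks_PVS', FILENAME = N'D:\MSSQL\Data\AdventureWorks_PVS.ndf')
TO FILEGROUP [PVS];
GO
-- Turn ADR off before pointing it at the new filegroup.
ALTER DATABASE [AdventureWorks] SET ACCELERATED_DATABASE_RECOVERY = OFF;
GO
ALTER DATABASE [AdventureWorks] SET ACCELERATED_DATABASE_RECOVERY = ON (PERSISTENT_VERSION_STORE_FILEGROUP = [PVS]);
GO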

2024-10-06

DBA Questions

These questions are relevant on your first day on the job as Database Administrator. The questions are also useful to revisit on a regular basis, like a couple of times every year.

Q1: Where are the backups?

This means all types of backup files: full, differential and log backup files. And for all databases.

Also the backup files for the Service Master Key (SMK) and the Database Master Keys (DMK) in the system database [msdb] and in the user databases with encrypted columns. This might require some investigation of encrypted columns.

This is most likely the most important question the first day.
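
A quick way to see where recent backups were written is the backup history in [msdb]; a minimal sketch, assuming the history has not been purged:

SELECT TOP (100)
  [backupset].[database_name],
  [backupset].[type],  -- D = full, I = differential, L = log
  [backupset].[backup_finish_date],
  [backupmediafamily].[physical_device_name]
FROM [msdb].[dbo].[backupset]
INNER JOIN [msdb].[dbo].[backupmediafamily]
  ON [backupset].[media_set_id] = [backupmediafamily].[media_set_id]
ORDER BY [backupset].[backup_finish_date] DESC;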

Q2: How are the backups created?

Get concrete quickly and ask about tools and schedules. Get to know all the tools all the way to the last storage area. If you want to be a little more confrontational you could ask for the definitions of the processes to handle SQL Server recovery. I would wait a couple of weeks at least and prepare the formulation.

If the backups are created by SQL Server Maintenance Plans you should consider that as an anti-pattern and be cautious.

Q3: How are the backups used?

At first you should focus on database restore. Ask for a recovery plan and restore guides. The restore guides are the most important.

Be aware of the tools used for restore. Are they generally available or are you to install them yourself? In what context are the tools to be used? Is a special user or other access required? Is a privileged access device required?

Then you can open up and ask for a report from the last recovery test. Also you can ask about your new colleagues' personal experiences with recovery. Both in general and in the actual organisation.

Please remember that it is not you but the SQL Server service accounts that require access to the backup files.

Q4: How do I get administrative access to the SQL Server installations?

Do I have to apply for AD groups? Or apply for an administrative account? Or a combination? Is the access permanent or only by request? This is also your first peek into the security structure built around the SQL Server installations. The approach to SQL Server security is usually unique to organizations and people. A lot of people say that there is a strict role-based (RBAC) security model, but usually the reality is way different and with several variations.

Q5: What are the three most important systems using SQL Server?

Limit the question to three systems. These you can look into on the first day. Do the rest later.

Ask about an inventory of these systems. What databases, servers, contact persons et cetera.

Q6: What is your phone number?

Ask this question to your new colleagues, Incident Management, line manager and the IT manager. Enter the numbers in your personal contact list. Decide for yourself who will have your private phone number. Personally I always have two phones, one for the job and one private. Also I am rather picky with who has my private number. Especially if I am not paid to take calls outside office hours.

2024-09-09

SQL Server Workstation

Some notes from a SQL Server 2022 Developer Edition installation on a Windows 11 Pro workstation. The infrastructure is VMware Workstation Pro.

General Preparations

If you clone an existing VM then you should rename the new SQL Server workstation to something different. And do it before anything else.

As I usually have a sandbox for different versions of SQL Server, the name could be something like "MSSQL 2022 Workstation". Consider if you should differentiate between sandbox, workstation, devbox or anything else.

Make sure the Windows installation is up to date.

PowerShell

Install PowerShell and then Windows Terminal from Microsoft Store.

Open Windows Terminal and pin it to the task bar. Then close Windows Terminal.

PS > Install-Module -Name SqlServer
PS > Install-Module -Name dbatools

Storage

Move CD-ROM to Z–drive.

Drive  Name      Size    NTFS     Path  Comments
C      System    64 GB   default        Windows and tools (SSMS)
P      Program   64 GB   default        SQL Server
D      Data      64 GB   64 KB          User database (mdf & ndf)
L      Translog  32 GB   64 KB          User db & tempdb (ldf)
T      tempdb    32 GB   64 KB          tempdb (mdf & ndf)
X      Backup    128 GB  64 KB          Full, diff & log backup
  1. Create one disk at a time: One-Piece Flow (Lean). Must be done manually in VMware Workstation; can be scripted in vCenter with PowerCLI.
  2. In VMware Workstation NVMe can only be added offline.
  3. Split virtual disk into multiple files.
  4. Do not allocate all space now.
  5. „MSSQL Workstation_<Drive>“
  6. All partitions are GPT.
  7. Only use Simple Volume.
  8. Quick format.

SQL Server Installation

  1. Connect VM CD-drive to SQL Server ISO-file
  2. Run SETUP.EXE
  3. Select „SQL Server Developer Edition“
  4. Select "Database Engine Services" and "Full-Text…"
  5. Place "Instance Root Directory" on another physical drive than Windows, e.g. "P:\Program_Files\Microsoft_SQL_Server\". This is to spread preassure on (local) storage units.
  6. Name the (Database Engine) instance, e.g. „SSDB00“. This will allow side-by-side installation of different SQL Server versions. On the other hand I usually does not recommend instanse name with version information. The instance name in shared installations should be generic – like a dumb database key.
  7. Change Service Startup Type to "Manual". When running several instances this will help you control local resources on your workstation. With virtual workstations I usually create one VM to each instance.
  8. Set „Grant Perform Volume Maintenance Task priviledge to SQL Server Database Engine Service“. This is to speed up database file (auto-)growths which might be when expanding the physical data model. This is really relevant on a workstation where you experiment. On the other hand I would never use Auto Shrink (AUTO_SHRINK) even in a sandbox.
  9. Collation tab: Use collation "Latin_General_100 AS KS WS" maybe with UTF-8.
  10. Add Current User as SQL Server administrator. Adding (Local) Administrators is not enough.
  11. Set Directories. Define seperate paths for
    • Data
    • TransactionLog
    • Backup
    • tempdb
      where [tempdb] transaction log is with the other user database transaction logs.
  12. Set tempdb data and log
  13. Set MAXDOP to 4
  14. Set memory to Recommended
  15. Run installation
  16. Close SQL Server Installation
  17. Disconnect CD-drive
  18. Restart VM

SQL Server Update

  1. Copy latest CU to VM. I use „C:\temp\“
  2. Start PowerShell as Administrator. That is Microsoft PowerShell (v7) not Windows PowerShell (v5)
  3. Change directory to „C:\temp\“
  4. Run the SQL Server CU exe-file
  5. Complete SQL Server CU installation
  6. Delete SQL Server CU exe-file from „C:\temp\“
  7. Restart VM

SSMS Installation

  1. Copy latest SSMS installation (SSMS-Setup-ENU.exe) to „C:\temp\“
  2. Start PowerShell as Administrator
  3. Change directory to „C:\temp\“
  4. Run SSMS Setup exe-file
  5. Complete SSMS installation
  6. Delete SSMS Setup exe-file from „C:\temp\“
  7. Start SSMS
  8. Cancel suggested connection
  9. Pin SSMS to taskbar
  10. View Registered Servers (Ctrl + Alt + G)
  11. Select properties on SSDB00
    • Server name: (local)\SSDB00
    • Trust server certificate
    • Registered server name: SSDB00
    • Connection Properties tab…
    • Connect to database: tempdb
    • Connection time-out: 5 seconds
    • Use custom color: green
    • Save
  12. View Object Explorer Details (F7)

ODBC

If SQL Server ODBC 17 is already installed, it will most likely generate an error like "The feature you are trying to use is on a network resource that is unavailable.".
The solution is to uninstall SQL Server ODBC 17. This can be done in Control Panel > Remove Program.

SSDB Configuration

Query: ALTER DATABASE [model] SET RECOVERY SIMPLE WITH NO_WAIT;

Query:
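-- Turn off the deprecated 'remote access' option (stored procedure execution from or to remote servers).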

EXECUTE master.sys.sp_configure
	@configname = N'remote access',
	@configvalue = N'0';
GO
RECONFIGURE WITH OVERRIDE;
GO

Query:
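-- Create the backup folder on the X-drive.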

EXECUTE master.dbo.xp_create_subdir N'X:\MSSQL\Backup';

Query:
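-- Default to compressed backups with checksum, and set the instance default backup directory in the registry.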

EXECUTE master.sys.sp_configure
	@configname = N'backup compression default',
	@configvalue = N'1';
GO
EXECUTE master.sys.sp_configure
	@configname = N'backup checksum default',
	@configvalue = N'1';
GO
RECONFIGURE WITH OVERRIDE;
GO
EXECUTE master.dbo.xp_instance_regwrite
	@rootkey = N'HKEY_LOCAL_MACHINE',
	@key = N'Software\Microsoft\MSSQLServer\MSSQLServer',
	@value_name = N'BackupDirectory',
	@type = N'REG_SZ',
	@value = N'X:\MSSQL\Backup';
GO

Query:
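-- 'cost threshold for parallelism' is an advanced option, so 'show advanced options' is switched on temporarily while the options are set.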

EXECUTE master.sys.sp_configure
	@configname = N'show advanced options',
	@configvalue = N'1';
RECONFIGURE WITH OVERRIDE;
GO
EXECUTE master.sys.sp_configure
	@configname = N'cost threshold for parallelism',
	@configvalue = N'50';
EXECUTE master.sys.sp_configure
	@configname = N'backup compression default',
	@configvalue = N'1';
EXECUTE master.sys.sp_configure
	@configname = N'backup checksum default',
	@configvalue = N'1';
RECONFIGURE WITH OVERRIDE;
GO
EXECUTE master.sys.sp_configure
	@configname = N'show advanced options',
	@configvalue = N'0';
RECONFIGURE WITH OVERRIDE;
GO

AdventureWorks Installation

See SQLAdmin post on Restore AdventureWorks on SQL Server 2019.


Plan Explorer Installation

  1. Download Plan Explorer from www.solarwinds.com/free-tools/plan-explorer. This requires registration with e-mail and phone#.
  2. Start Plan Explorer
  3. Pin to Start

History

2024-10-23 – Details added.
2024-09-22 – Element on administrators added. And a few typographic corrections.
2024-09-09 – Post created.

 

2024-07-16

File as parameter to PowerShell function

Quite often I find it useful to have a file as parameter to a PowerShell function. And most times I want to make sure the file exists when the function is called. In a PowerShell advanced function this can be done quickly and concisely with a validation. More precisely by a ValidateScript attribute on the parameter.

An example of a function with a file parameter validated for existence could be something like

function Write-LogFile {
    <#
    .DESCRIPTION
    #>
    [CmdletBinding()]
    [OutputType([void])]
    Param(
        [Parameter(Mandatory=$true, ValueFromPipeLine=$true, HelpMessage='...')]
        [ValidateScript({$_.Exists})]
        [System.IO.FileInfo]$LogFile,

Running a call to the function with a file object that does not exist

[System.IO.FileInfo]$theLogFile = "$PSScriptRoot\log.txt"
Write-LogFile -LogFile $theLogFile

… will generate an error like

PS C:\Scripts> .\FileParameter.ps1
Write-LogFile: C:\Scripts\FileParameter.ps1:26
Line |
  26 |  Write-LogFile -LogFile $theLogFile …
     |                         ~~~~~~~~~~~
     | Cannot validate argument on parameter 'LogFile'. The "$_.Exists" validation script for the argument with value
     | "C:\Scripts\log.txt" did not return a result of True. Determine why
     | the validation script failed, and then try the command again.

There are many other ways to validate if a file exists, but I like to use a FileInfo object as it gives the functionality without much custom code.
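
For comparison, a minimal sketch of a string-path alternative validated with Test-Path; the function and parameter names are only examples, not part of my own scripts:

function Write-LogLine {
    [CmdletBinding()]
    [OutputType([void])]
    Param(
        # Hypothetical example: the path is validated with Test-Path instead of FileInfo.Exists.
        [Parameter(Mandatory=$true, HelpMessage='Path to an existing log file.')]
        [ValidateScript({ Test-Path -Path $_ -PathType Leaf })]
        [string]$LogFilePath,

        [Parameter(Mandatory=$true, HelpMessage='Text to append to the log file.')]
        [string]$Message
    )
    Process {
        Add-Content -Path $LogFilePath -Value ("{0:s} : {1}" -f (Get-Date), $Message)
    }
}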

2024-07-03

Stored Procedure Guidelines (T-SQL)

Here are some personal guidelines on writing and using a stored procedure. They are mentioned in a somewhat random order. The list and each recommendation will most likely change over time with new experiences, feedback and other new knowledge.

SET NOCOUNT ON

… to avoid unnecessary activity and communication on row count.

Use schema name

... on used tables. Put procedure in custom schema without data.

Use tables

... not views, other procedures and triggers. And only functions with careful big-O considerations

Use sets

... not cursors or sub-selects as SQL Server is set-based. Cursors are fine when on DB2 or Oracle. Sub-selects are often seen on SAS.

Use functional name

... not sp_ or other Hungarian naming. It doesn't really matter what type the object is.

Use sp_executesql

... for dynamic SQL. Not EXECUTE. In general you should avoid dynamic SQL. This includes prepared statements.

Use specific columns

... only take data that are required. Network traffic is then reduced to the required amount.

Use order of columns

... as given in the tables. One table a time. Re-arranging the columns could add significant resource consumption on each call.

Use keys

... and indexes with care. Use index on foreign keys. Check for covering index. Index usage might change with change of data in used tables.

Use table variables

... not temporary tables. This will keep activity in the buffer pool and away from storage.

Be direct

... and avoid temporary cached or staged data. Do not use dirty reads!

Be short

... by short transactions. Avoid costly isolation levels such as Repeatable Read. Also you can save some network traffic on often used stored procedures by short names on the procedure and the parameters.

Use effective parameters

... and avoid parameter sniffing. Focus on parameters where the values change a lot. Convert these parameters to local variables when suitable.

Check values

... before usage. Especially parameter values! Do the value check inside the procedure before usage.

Use effective types

... and avoid custom types. Custom types will add resource consumption. And in some situations increase blocking on metadata.

Use effective operators

... avoid costly operators such as LIKE. Be careful with COALESCE, string operations and aggregated values. T-SQL is for finding and sorting data. Use other languages for complex calculations and text processing.

Use explicit type cast

... not implicit. Then you are still somewhat in control.

Use EXISTS

... not COUNT, as this is more efficient in T-SQL. See the small sketch below.
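
A small sketch of the difference, using a hypothetical [app].[Order] table:

DECLARE @CustomerId int = 42;
DECLARE @HasOrders bit = 0;

-- Less effective: counts every matching row just to test for existence.
IF (SELECT COUNT(*) FROM [app].[Order] WHERE [CustomerId] = @CustomerId) > 0
    SET @HasOrders = 1;

-- More effective: stops at the first matching row.
IF EXISTS (SELECT 1 FROM [app].[Order] WHERE [CustomerId] = @CustomerId)
    SET @HasOrders = 1;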

Use effective error handling

... by testing before exception handling with TRY - CATCH. Consider exception handling as last-line-of-defence after tests and checks. Even when using exception handling this should be kept inside the procedure. Throwing an exception outside the procedure could be rather costly in resource consumption.

Use MS_Description

... extended property. This helps development with Visual Studio, see SQLAdmin on Descriptions.

Document

... procedure, each parameter and output/return value in code, detailed and with examples. Also when you are the one and only developer.

Set security explicit

... on procedure through custom database roles on Active Directory groups. Using default database roles is outside your control.

Use ALTER PROCEDURE

... not DROP - CREATE PROCEDURE. Security and other external configurations will be lost on each change.

Monitor

... usage and performance with DMVs, XEvents etc. See Monitoring Performance of Natively Compiled Stored Procedures.

Call procedure effective

... with CommandType.StoredProcedure and SqlParameter objects on the SqlCommand object. Use a carefully crafted connection string with secure authentication (SSPI) like Kerberos.
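
To tie a few of the guidelines together, a minimal sketch of a procedure skeleton; the schema, table, column and parameter names are only examples:

CREATE PROCEDURE [app].[GetCustomerOrders]  -- functional name in a custom schema
    @CustomerId int
AS
BEGIN
    SET NOCOUNT ON;  -- no row count chatter

    -- Check parameter values before usage.
    IF @CustomerId IS NULL OR @CustomerId < 1
    BEGIN
        RAISERROR(N'Invalid @CustomerId.', 16, 1);
        RETURN;
    END;

    -- Specific columns, schema-qualified table, set-based access.
    SELECT [OrderId], [OrderDate], [TotalAmount]
    FROM [app].[Order]
    WHERE [CustomerId] = @CustomerId;
END;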

Reference

Kimberly Tripp, SQLSkills: Building High Performance Stored Procedures.

 

Top 10 stored procedure performance tuning tips in SQL server

2024-05-13

Buffer Pool scan

In a SQL Errorlog we saw a lot of events like this.

Buffer Pool scan took 10 seconds: database ID 6, command 'CHECKPOINT', operation 'FlushCache', scanned buffers 341183509, total iterated buffers 345933594, wait time 75 ms. See 'https://go.microsoft.com/fwlink/?linkid=2132602' for more information.

The referenced article "Operations that trigger a buffer pool scan may run slowly on large-memory computers" states that

Prior to SQL Server 2022, there was no way to eliminate this problem.

But then another event came around where we looked at Indirect Checkpoint configuration. It turned out that the database originally was so old that the "Target Recovery Time" was 0 (zero).

When we changed this configuration to 60, the Buffer Pool scan messages no longer showed up in the SQL Errorlog.
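
A minimal sketch of checking and changing the setting; the database name [MyDatabase] is only an example:

-- Check the current setting (0 means legacy automatic checkpoints).
SELECT [name], [target_recovery_time_in_seconds]
FROM [master].[sys].[databases];

-- Enable indirect checkpoints with a 60 second target recovery time.
ALTER DATABASE [MyDatabase] SET TARGET_RECOVERY_TIME = 60 SECONDS;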

Another benefit was that spikes and average on Disk Write Latency were reduced significantly.

Some time ago Kendra Little wrote a useful article (link) with good references and details on when the Target Recovery Interval changed default value.

2024-03-31

Disassembling a hard disk

Hard disks with spinning platters are going away very fast. During easter cleanup in my workshop I found an old 750 GB hard disk, and decided not only to take it to the local recycling station but to disassemble it to see how it is constructed. During the disassembly I took some photos with my phone - they are nowhere near professional quality but they are still useful with the comments.

The drive is a Seagate Barracuda ES.2 (data sheet) with a SATA 3Gb/s interface. It runs at 7200 RPM which made it rather useful when it was released. I stopped using the drive when I really switched to SSD in my workstation. There is nothing wrong with the drive - it just went obsolete to me.

To remove the cover there are at first six T9 screws. But hidden behind the sticker there are three more screws, also T9.
With the cover removed there is direct access to the inner components. Also this is the point of no return for using the drive again, as the disk is now filled with dirt from the normal atmosphere.
On the backside of the hard disk the circuit board is placed. It is fastened with T6 screws, none of them hidden. The connection from the board to the inner drive is through some contacts. The picture below shows two sets of contacts with one set of three contacts for the electric motor spinning the platters and another set of four by five contacts for the head.
A closer look at the board shows chips from Seagate and Samsung. The external connections are direct from the board to the physical interfaces.
Under the board and a metallic sticker there is another hidden T9 screw. This holds the head.
The connection for the head is on a small board. But it is quite easily pulled out and disconnected from the head.
The head unit movements are controlled with a coil on the back of the arms and two powerful static magnets. The top magnet is only held in place by the magnetic force and can be removed with a small screwdriver.
The arm with heads and coil is one unit together with a small board. This head unit is removed without any tools.
A closer look at the head unit from another angle shows a small board and the arms with the physical heads to transport electro-magnetic signals to and from the platters.
Going from bottom to top on the picture below it looks like the bottom arm only holds one head to communicate with the lower side of the lowest platter.
The next two arms hold two heads each to communicate with the upper side of the platter below the arm and the lower side of another platter placed above the arm.
The fourth arm holds only one head to communicate with the upper side of the top platter.
There is a fifth arm but this holds no heads and is not connected on the board.
I guess that the head unit was designed and partly constructed for up to four platters where this hard drive only has three platters as shown later.
The bottom magnet that was under the head unit is also quite easily removed without any tools. I guess that the small plastic spring to the right on the picture below is some sort of anti-vibration or shock absorption.
The spindle holding the platters on the axle is removed with six T9 screws.
The first platter is removed without tools. Between the platters there is an open ring of metal fastened to the chassis. I guess this is to stabilize the platters when they are spinning. Also there is an inner ring to give space for the heads.
After all three platters are removed the chassis looks quite empty. I have tried to lay out the components to show the stack from bottom to top. Between the platters there is an inner ring and an open outer ring.
The outer part of the electric motor spinning the platters is removed with brute force as indicated by the tools on the picture below. The outer part is the static magnetics where the inner part is the coils. How the outer part is fastened I can't see. There are no obvious traces of glue.
The three-pin connector for the motor on the back of the chassis is glued on but easily removed with a small screwdriver.
The inner part of the motor with the coils is removed with brute force and another set of tools.
All in all I count about 55 major components in the hard drive. Of those about 31 are screws.
Looking at modern storage there have certainly been some major changes. Compared to a NVMe storage unit with only one major component on a single board and no mechanics, the technological span is amazing. But also compared to an older SAS SSD drive, typically with one board, around three screws and two metal half-shells, the technological evolution is significant.

In general and looking back, flash storage was a paradigm shift where all our notes on spindles, RPMs and configuration tricks like short-stroking suddenly were of no use. With NVMe many rules of thumb became obsolete. As PCIe (PCI-SIG) evolves we will see new fantastic features also on NVMe and CXL. But the next paradigm shift does not come from evolving the current mainstream technology but when a completely new technology matures and pushes the mainstream tech aside.

2024-02-03

PowerShell Advanced Function Template

I have made this simple template to start a new PowerShell script file with an advanced function. It helps me to get the basics right from the beginning. Also it supports my effort to write all scripts with advanced functions.

The code for the template for an advanced function is on SQLAdmin GitHub (link).
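
To give an idea of the shape, here is a minimal sketch of such a template; the function name, help text and parameter are only placeholders, not the actual template from the repository:

function Invoke-Something {
    <#
    .SYNOPSIS
    Short description of the function.
    .DESCRIPTION
    Longer description of the function.
    .PARAMETER InputValue
    Description of the parameter.
    .EXAMPLE
    Invoke-Something -InputValue 'Hello'
    #>
    [CmdletBinding()]
    [OutputType([string])]
    Param(
        [Parameter(Mandatory=$true, ValueFromPipeline=$true)]
        [string]$InputValue
    )
    Begin {
        Write-Verbose -Message "Starting $($MyInvocation.MyCommand.Name)."
    }
    Process {
        # The actual work goes here.
        "Processed: $InputValue"
    }
    End {
        Write-Verbose -Message 'Done.'
    }
}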

PowerShell script documentation

For PowerShell there is a defined syntax for comment-based help in scripts (about_Comment_Based_Help). There are many more keywords than I have used above, but my personal PowerShell style is covered with these keywords. Feel free to create your own script template with your own personal selection of standard keywords.
Using only standard keywords ensures that your scripts can be presented in a central place with nice and useful documentation. And the presentation can be automatically updated without redundant documentation tasks.

Michael Sorens has on SimpleTalk posted a series of great posts about automatic documentation on a library of PowerShell scripts. The first post is called "How To Document Your PowerShell Library" where the basics are presented in a brilliant way.

Advanced function

If a script is evolving to become a central part of a process then you should elevate it by refactoring it to an advanced function.
The template above is for an advanced function, but not all possibilities are included. See the documentation on advanced functions (about_Functions_Advanced) and on parameters to advanced functions (about_Functions_Advanced_Parameters).

A very important part of an advanced function is the CmdletBinding function attribute (about_Functions_CmdletBindingAttribute) that will make a function work like a compiled CmdLet. You can in most cases just use it blank and it will make a function work really nicely as a CmdLet, but if you take some time to read and experiment you can create some awesome advanced functions. If you are looking into very large amounts of data you should take a look at the possibilities with CmdletBinding. More technically, CmdletBinding is a PowerShell accelerator for the .NET class CmdletBindingAttribute that is a part of the namespace System.Management.Automation.

One thing that you will benefit from rather quickly is the use of -Verbose or -Debug, which will be passed on through advanced functions. Then you will get extended information not only from your own code but also from real CmdLets and .NET objects.

There are several articles that cover various facets of advanced functions well, but my personal reference is, beside the documentation, the book „PowerShell In Depth“ (e.g. chapter 32) by Don Jones (@concentrateddon), Richard Siddaway (@RSiddaway) and Jeffery Hicks (@JeffHicks). Look for the latest edition.

PowerShell script module

When you have refined the script, I think it is time to move the functionality to one or more PowerShell script modules. The general concepts about PowerShell modules are listed in the article „Windows PowerShell Module Concepts“ and shortly described in the MSDN article „Understanding a Windows PowerShell Module“. Right now I will focus on a script module as this text is about PowerShell scripting. There is a short and nice introduction in the MSDN article „How to Write a PowerShell Script Module“.

In the script template above I have indicated how to dynamically import a script module placed in a given path. If the script module is not signed and the PowerShell Execution Policy is somehow restricted – which it should be! – you can't import a module placed outside the local computer, e.g. on a file share through a UNC path.

PowerShell script modules are defined and stored in psm1-files, but look a lot like plain script files. If you use the script template above for a script module the module import and the final lines with invocation should be removed. The invocation should be in the script file importing the script module.

There are several benefits of using PowerShell script modules. Many of them you will see when you start using them...
One thing that surprised me was the value of unit testing the script code.

Michael Sorens (again) has on SimpleTalk written the great article „Further Down the Rabbit Hole: PowerShell Modules and Encapsulation“ on PowerShell modules and the surroundings with some very valuable guidelines.

History

2017-03-15 : Blog post created on SQLAdmin blog.
2017-05-17 : Section about script documentation added. Inspired by comment from my old friend and colleague Jakob Bindslet.
2017-06-14 : Section about PowerShell script module added.
2017-07-04 : Section about PowerShell Advance function added.
2017-11-10 : Blog post text migrated from SQLAdmin blog and code to Github.
2024-02-03 : Blog post migrated (back) to SQLAdmin. Code migrated to SQLAdmin Github.

Path environmental variable with PowerShell

See defined paths
$env:path.Split(';')

Add new path
[string]$JdbcAuthPath = 'C:\temp\Install-Manual\sqljdbc6220\sqljdbc_6.2\enu\auth\x64'
[string]$OldPath = [System.Environment]::GetEnvironmentVariable('Path', [System.EnvironmentVariableTarget]::Machine)
[string]$NewPath = $OldPath + ';' + $JdbcAuthPath
[Environment]::SetEnvironmentVariable('Path', $NewPath, [System.EnvironmentVariableTarget]::Machine)

Test new path
(new session)

Remove path
[string]$OldPath = [System.Environment]::GetEnvironmentVariable('Path', [System.EnvironmentVariableTarget]::Machine)
[string]$NewPath = $OldPath.Replace(';' + $JdbcAuthPath, '')
[Environment]::SetEnvironmentVariable('Path', $NewPath, [System.EnvironmentVariableTarget]::Machine)

Test new path
(new session)

DSC Environment Resource

(https://docs.microsoft.com/en-us/powershell/dsc/environmentresource) 

History

2024-02-03 Migrated from Azure Admin blog.

2024-01-28

Integrated Security = SSPI

The term Integrated Security = SSPI is an element in a connection string for an application to get access to a data source using an OLEDB or SqlClient provider. Usually it is just used because it is recommended by someone trustworthy.

SSPI is an abbreviation of Security Support Provider Interface which is a general implementation of the Generic Security Services Application Program Interface (GSSAPI). The interesting thing is that SSPI makes it possible to get the most secure authentication in a given context. In Windows it is at least a Kerberos or NTLM authentication. When possible you will get the more secure - and faster - Kerberos authentication. This is done by a negotiation, where some parameters like rank can be configured in Windows. The Windows configuration can be through Group Policies, both local and domain.

In relation to SSPI, Kerberos and NTLM are Security Support Providers (SSPs) speaking Windows lingo. The documentation from Microsoft says that you can write your own SSP, but I couldn't find many details on the subject.

Speaking about Microsoft documentation, the SSPI documentation is spread across Windows Server documentation on Windows Authentication with "Security Support Provider Interface Architecture" and Windows App Development on Remote Procedure Call (RPC) with "Security Support Provider Interface (SSPI)". In general I find it somewhat difficult to find documentation on the involved components in a login and authentication process.

Using SSPI is done with different parameters and values on each driver or provider.
With the SqlClient the keyword to the ConnectionString property is "Integrated Security" and the value is "SSPI" - which is „strongly recommended“ by Microsoft. Please notice that with SqlClient 5 the namespace is changed from System.Data.SqlClient to Microsoft.Data.SqlClient as the SqlClient component is removed from the .NET framework and is now available as a NuGet package.
With the ODBC driver the keyword is "Trusted_Connection" and the value is "Yes", which is an example of the big variation in both keywords and values. If you are using a driver from another vendor than Microsoft or a driver for another environment you should be very aware of these details as they can really bite you.
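
Two example connection strings to illustrate the difference; the server, instance and database names are made up:

Server=MyServer\SSDB00;Database=MyDatabase;Integrated Security=SSPI;                                       (SqlClient)
Driver={ODBC Driver 18 for SQL Server};Server=MyServer\SSDB00;Database=MyDatabase;Trusted_Connection=Yes;  (ODBC)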

 

2024-01-23

PowerShell SMO by SqlServer module

Install SqlServer module:

Install-Module -Name sqlserver

Typical warning:

Untrusted repository
You are installing the modules from an untrusted repository. If you trust this repository, change its
InstallationPolicy value by running the Set-PSRepository cmdlet. Are you sure you want to install the modules from 'PSGallery'?
[Y] Yes  [A] Yes to All  [N] No  [L] No to All  [S] Suspend  [?] Help (default is "N"): y


See installed module:

Get-Module -Name SqlServer -ListAvailable


Import SqlServer module in PowerShell session:

Import-Module -Name SqlServer


Create SMO Server object:

$smoServer = New-Object Microsoft.SqlServer.Management.Smo.Server '(local)\SSDB2019'

See SMO Server object information:

$smoServer.Information
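
A small follow-up, assuming the same $smoServer object, listing the databases on the instance:

# List database names, sizes (MB) and recovery models through the SMO object.
$smoServer.Databases |
    Select-Object -Property Name, Size, RecoveryModel |
    Format-Table -AutoSize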

2024-01-01

Top-ten posts 2023

2023 was – again – a year with far fewer posts than planned. Fortunately some older posts still generate some activity on this blog.

The top-ten 2023 post list is something like Top-ten posts 2022.

Rank  Move  Rank 2022  Title                                       Views  Created
1     0     1          SqlBulkCopy with PowerShell                 1.950  2017-07-30
2     >9    >10        High performance script timer               814    2020-01-03
3     +1    4          ISO 8601 date formatting using PowerShell   479    2008-08-14
4     >7    >10        ISO 4217 Currency list                      425    2012-06-26
5     +3    8          SQL Server Agent schedule owner             360    2010-02-08
6     >5    >10        Team Foundation Server on a single server   350    2018-09-19
7     -1    6          DBCC CHECKDB with PowerShell                306    2014-10-09
8     >3    >10        Start shared HTA with PowerShell            271    2012-09-13
9     >2    >10        VLF count                                   247    2011-01-11
10    0     10         Audit Log for Analysis Services             204    2014-01-08

This year there are several new items on the Top-ten. My personal biggest surprise is the item on ISO 4217.
But – my favourite ISO is still 8601. The oldest post is more than ten years old, which I find both interesting and disturbing.

The posts in this blog are personal notes rather than public presentations. This is why I am fine with the low number of views.