I have a change implemented by several (10+) T-SQL script files, ranging in size from about 200 KB to 739 MB. Executing a 739 MB T-SQL script in Microsoft SQL Server Management Studio (SSMS) fails, and typing the many sqlcmd statements by hand would be tedious and error-prone.
By placing all the T-SQL script files in one folder, a single PowerShell statement can generate all the sqlcmd statements quickly and consistently:
ls *.sql | %{"`n@ECHO $($_.BaseName)...`nsqlcmd.exe -E -S %DbInstanceName% -i `"$($_.Name)`" -o `"$($_.BaseName).log`" -r1 -b`n@ECHO :: ErrorLevel = %ERRORLEVEL%`n@ECHO."}
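For a folder containing, say, a file named Create_Tables.sql (a name used here only for illustration), each iteration of the one-liner emits a block like this:

@ECHO Create_Tables...
sqlcmd.exe -E -S %DbInstanceName% -i "Create_Tables.sql" -o "Create_Tables.log" -r1 -b
@ECHO :: ErrorLevel = %ERRORLEVEL%
@ECHO.

One block per .sql file, ready to paste into a command script.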
I can then copy the generated sqlcmd statements into a Windows command script file (.cmd), where the variable DbInstanceName is defined by
@SET DbInstanceName="sandbox.sqladmin.lan\SSDB01"
It is deliberate that the Windows command script continues on error - in this specific case.
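If a script instead should abort on the first failing file, a line like the following could be added after each sqlcmd statement (a sketch; :Failure is a hypothetical label defined at the end of the script):

@IF ERRORLEVEL 1 GOTO Failure

Because each generated sqlcmd statement uses the -b option, a T-SQL error with severity above 10 makes sqlcmd exit with ERRORLEVEL 1, which this line would then catch.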
The log file defined by the parameter -o in each sqlcmd statement holds the output: results (e.g. SELECT), messages (e.g. PRINT) and errors (e.g. RAISERROR).
This is verified with a small T-SQL test script
SELECT @@VERSION AS [sql_version];
GO
PRINT N':: Hello from PRINT';
GO
RAISERROR( N':: Hello from RAISERROR', 18, 0 );
and calling this from a sqlcmd statement generated by the PowerShell statement above. Note that because the RAISERROR severity (18) is higher than 10 and the generated sqlcmd statement uses the -b option, sqlcmd exits with ERRORLEVEL 1 on this test script.
2010-02-10