
Allen White

Use the Best Tool for the Task - Part 2

Yesterday I posted about how Transact-SQL was better suited than SMO for gathering details about the database files of a database with a very large number of files. Today I'd like to point out that creating the database I used to test that fix was far easier with PowerShell and SMO than with any other method.

SQL Server, and Transact-SQL specifically, is very good at set-based activity, and that's a good thing, because relational databases are built on a set model. To create all the files I needed for my test, though, I needed an iterative process, and while Transact-SQL can do that, other languages are better at it.

I needed over 380 filegroups to match what my client's database looked like, and each filegroup had to have 32 files in it. Before running the script I created a small database (2MB data, 1MB log) called "TestLots". I also figured out through trial and error that I could create a database file as small as 512KB, but not 256KB. (I didn't try to find the absolute minimum file size I could create.)
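
If you want to script that initial step as well, a minimal SMO sketch along these lines should work (the same SQLTBWS\INST01 instance is assumed here, and note that SMO's Size properties are expressed in KB):

# Load SMO and connect to the instance
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null
$s = new-object ('Microsoft.SqlServer.Management.Smo.Server') 'SQLTBWS\INST01'

# Define the database with its PRIMARY filegroup and a 2MB primary data file
$db = new-object ('Microsoft.SqlServer.Management.Smo.Database') ($s, 'TestLots')
$fg = new-object ('Microsoft.SqlServer.Management.Smo.FileGroup') ($db, 'PRIMARY')
$db.FileGroups.Add($fg)
$dbf = new-object ('Microsoft.SqlServer.Management.Smo.DataFile') ($fg, 'TestLots')
$fg.Files.Add($dbf)
$dbf.FileName = $s.MasterDBPath + '\TestLots.mdf'
$dbf.Size = [double](2048.0)	# 2MB data file (Size is in KB)
$dbf.IsPrimaryFile = 'True'

# Add a 1MB log file, then create the database
$log = new-object ('Microsoft.SqlServer.Management.Smo.LogFile') ($db, 'TestLots_log')
$db.LogFiles.Add($log)
$log.FileName = $s.MasterDBLogPath + '\TestLots_log.ldf'
$log.Size = [double](1024.0)	# 1MB log file (Size is in KB)
$db.Create()

Here's the script I used to add all the additional filegroups and files to that database.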

# Load SMO and connect to the SQL Server instance
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null
$s = new-object ('Microsoft.SqlServer.Management.Smo.Server') 'SQLTBWS\INST01'
$p = $s.MasterDBPath
$db = $s.Databases['TestLots']
# Create 382 filegroups, named Test_1 through Test_382
for ($i=1;$i -lt 383;$i++) {
	$fgname = 'Test_' + [string]$i
	$fg = new-object ('Microsoft.SqlServer.Management.Smo.FileGroup') ($db, $fgname)
	$db.FileGroups.Add($fg)

	# Add 32 fixed-size files to each filegroup, named Test_n_1 through Test_n_32
	for ($j=1;$j -lt 33;$j++) {
		$dbfname = $fgname + '_' + [string]$j
		$dbfpname = $p + '\' + $dbfname + '.ndf'
		#write-host $dbfname $dbfpname
		$dbf = new-object ('Microsoft.SqlServer.Management.Smo.DataFile') ($fg, $dbfname)
		$fg.Files.Add($dbf)
		$dbf.FileName = $dbfpname
		$dbf.Size = [double](512.0)	# SMO sizes are in KB, so this is the 512KB minimum
		$dbf.GrowthType = 'None'
		}
	}
# Send all the accumulated changes to the server
$db.Alter()

First I load the SMO library and connect to the SQL Server instance. Then I get the directory that holds the master database files (because no default data directory has been set up on my system) and set the $db variable to the TestLots database object. A for loop iterates 382 times, creating a filegroup called Test_n, where n is the number of the filegroup. After each filegroup is added, an inner for loop iterates 32 times, creating files named Test_n_m, where n is the filegroup number and m is the file number within the filegroup. The final call to $db.Alter() sends all of those additions to the server.
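
If you want to verify the result, a quick check along these lines will refresh the SMO collections and count what the script created:

# Re-read the filegroup collection from the server and tally the files
$db.FileGroups.Refresh()
$total = 0
foreach ($fg in $db.FileGroups) { $total += $fg.Files.Count }
write-host 'Filegroups:' $db.FileGroups.Count 'Files:' $total

That should report 383 filegroups (the 382 Test filegroups plus PRIMARY) and 12,225 files (382 times 32, plus the primary data file).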

As I mentioned yesterday, using the right tool for the task is important, and having a variety of tools at your disposal truly helps get the job done faster.

Allen

Published Thursday, September 03, 2009 10:28 AM by AllenMWhite


