
Server Inventories

OK, I’ve been promising this to people for a while, so it’s time to get at it.  Brace yourself for a long one.

As enterprise DBAs, we usually have many servers to manage.  Whether it’s around ten, a hundred, or a thousand (or more!), tracking this manually is a bear.  However, many shops still maintain convoluted spreadsheets and other manual documentation to track their inventory.  Manual sucks.  As John Sansom (@SQLBrit) says, the best DBAs automate everything, including their server inventories.

There are a lot of approaches and third-party tools you can use for your own environment.  In my mind, I’m totally cool with spending the money on a third-party tool because of the depth it can provide in my tracking and monitoring.  A lot of times, though, we might not have access to those tools, or we may need more information than they provide.  It’s at this point we sit down at our keyboard, crack our fingers, and start banging out some code.  Of course, the code I started banging out was Powershell (shocking, I know!).

 I didn’t start from scratch, though.  When putting my own script together (which we’ll get to later), I found two great resources that got me about 80% of the way:

 Allen White (@SQLRunr) – Let PowerShell do an Inventory of your Servers
Colleen Morrow (@ClevelandDBA) – Building A SQL Server Inventory

These are both great resources that will get you started, but I needed a little more.  The biggest challenge I had was the need to poll cluster information and to collect SQL instance information separately from my machine information.  I also wanted to do some consolidation of my data sets.  So I borrowed heavily from both Allen and Colleen, then molded their work to my own purposes.

Process

Before we dig into the code, I first want to touch on the collection process and how I handle it.  It’s not complex, but code doesn’t make sense until you understand the “why” of the different components.

What I’m collecting

  • SQL Instance information – Whether it’s an instance living on a cluster node, a physical standalone, or a VM host, I want the SQL instance without really caring about the machine it lives on.  At least not meshed in directly with the machine, though I DO collect the physical host name so I can identify which machine information I need.

  • Physical Machine information – Since I’m not tying the machine info directly with the SQL Instance collection, I want the physical information for the servers my instances live on.  This is where I get everything that isn’t tied directly to the SQL Instance.

How I’m collecting it

  • I’ve got four tables that I use for the collection: two live tables and two stage tables.  Nothing out of the ordinary: an instance table and a machine table, each with a matching stage table (there’s a rough sketch of this layout after this list).

  • I use a Powershell script to perform the collection itself.  It uses the instance table as its primary driver: I provide the initial population of instance names to track, and the inventory process fully populates the rest of the information.

  • When the Powershell script completes, it executes a stored procedure to load the data from the stage tables into the live tables.  The load is simple: the instance table is updated with data from its stage table, and the machine information is deleted and replaced.

  • The Powershell script is run by a SQL Agent job running under a specific monitoring Active Directory account created as a credential in SQL Server.  The gotcha here is that the monitoring account needs domain access to the clusters and machines it’s going to be querying.
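
To make that layout concrete, here’s a minimal sketch of what the four tables and the load procedure could look like.  The object, database, and server names (dbo.InstanceList, dbo.InstanceStage, dbo.MachineList, dbo.MachineStage, dbo.LoadInventory, INVENTORYSERVER, DBAInventory) and the trimmed column lists are placeholders of my own, not the actual definitions from the downloadable SQL objects below.

#Placeholder DDL for the inventory schema (trimmed column lists; the real stage
#tables carry one column per property the collection functions emit)
$tableSketch = @'
CREATE TABLE dbo.InstanceList (
	InstanceId   int IDENTITY(1,1) PRIMARY KEY,
	InstanceName sysname NOT NULL,	-- seeded by hand; drives the collection
	SQLVersion   varchar(25) NULL,
	Port         varchar(10) NULL,
	PhysicalHost sysname NULL);

CREATE TABLE dbo.InstanceStage (
	InstanceId   int NOT NULL,
	SQLVersion   varchar(25) NULL,
	Port         varchar(10) NULL,
	PhysicalHost sysname NULL);

CREATE TABLE dbo.MachineList (
	MachineName  sysname NOT NULL,
	Model        varchar(100) NULL,
	LogicalCPUs  int NULL,
	MaxMemory    numeric(12,2) NULL,
	Cluster      varchar(100) NULL);

CREATE TABLE dbo.MachineStage (
	MachineName  sysname NOT NULL,
	Model        varchar(100) NULL,
	LogicalCPUs  int NULL,
	MaxMemory    numeric(12,2) NULL,
	Cluster      varchar(100) NULL);
'@

$procSketch = @'
CREATE PROCEDURE dbo.LoadInventory
AS
BEGIN
	-- Instances: update the live rows in place from the stage rows
	UPDATE i
	   SET i.SQLVersion = s.SQLVersion,
	       i.Port = s.Port,
	       i.PhysicalHost = s.PhysicalHost
	  FROM dbo.InstanceList i
	  JOIN dbo.InstanceStage s ON s.InstanceId = i.InstanceId;

	-- Machines: simple delete/replace
	DELETE dbo.MachineList;
	INSERT dbo.MachineList SELECT * FROM dbo.MachineStage;
END
'@

Invoke-Sqlcmd -ServerInstance 'INVENTORYSERVER' -Database 'DBAInventory' -Query $tableSketch
Invoke-Sqlcmd -ServerInstance 'INVENTORYSERVER' -Database 'DBAInventory' -Query $procSketch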

The Code

I won’t list the full scripts in this post, but you can download and review them here:

SQL Objects
Powershell script

 Let’s first talk about the dependencies:

  • The SMO, so you’re going to need either your script or your profile to load the 2008 snap-ins or import the 2012 sqlps module (see the loader sketch after this list).

  • Chad Miller’s Out-DataTable and Write-DataTable (I combined them into one file, DataTables.ps1).

  • The FailoverClusters module, which comes with the Remote Server Administration Tools if you’re not running the code on a machine with a Windows Server OS, or with the Failover Clustering features if you are.
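
For reference, here’s a rough sketch of a preamble that covers all three dependencies.  Keep whichever SMO branch matches your version, and treat the DataTables.ps1 path as a placeholder for wherever you keep Chad Miller’s functions.

#Load SMO: either the 2012 sqlps module or the 2008/R2 snap-ins plus assemblies
if (Get-Module -ListAvailable -Name sqlps)
{
	Import-Module sqlps -DisableNameChecking
}
else
{
	Add-PSSnapin SqlServerCmdletSnapin100
	Add-PSSnapin SqlServerProviderSnapin100
	[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.Smo') | Out-Null
	[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SqlWmiManagement') | Out-Null
}

#Cluster cmdlets, from RSAT or the Failover Clustering feature
Import-Module FailoverClusters

#Chad Miller's Out-DataTable/Write-DataTable, dot-sourced from wherever you keep them
. 'C:\Scripts\DataTables.ps1'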

There are two key functions I use, Get-Instance to gather the SQL instance information and Get-Machine to gather the machine information:

#Collects and returns SQL Instance information
function Get-Instance([string]$instcoll,[int]$id,[string]$name)
{
try
{
	$smo = new-object ('Microsoft.SqlServer.Management.Smo.Server') $name
	$sname = $smo.NetName
	$iname = $smo.InstanceName
	if($iname -eq $null -or $iname.Length -eq 0) { $iname = "MSSQLSERVER" } #Default instances report an empty instance name

	$managedcomp = new-object ('Microsoft.SqlServer.Management.Smo.WMI.ManagedComputer') $sname
	$output = New-Object System.Object

	#Grab the TCP port from the SQL WMI provider and resolve the host's IPv4 address
	$port = $managedcomp.ServerInstances[$iname].ServerProtocols["Tcp"].IPAddresses["IPAll"].IPAddressProperties["TcpPort"].Value
	$ip = (Test-Connection $sname -count 1).IPV4Address.ToString()

	$output | Add-Member -type NoteProperty -name InstanceId -value $id
	$output | Add-Member -type NoteProperty -name SQLVersion -value $smo.VersionString
	$output | Add-Member -type NoteProperty -name SQLVersionDesc -value $smo.ProductLevel
	$output | Add-Member -type NoteProperty -name SQLEdition -value $smo.Edition
	$output | Add-Member -type NoteProperty -name MemoryMinMB -value $smo.Configuration.MinServerMemory.RunValue
	$output | Add-Member -type NoteProperty -name MemoryMaxMB -value $smo.Configuration.MaxServerMemory.RunValue
	$output | Add-Member -type NoteProperty -name MAXDOPVal -value $smo.Configuration.MaxDegreeOfParallelism.RunValue
	$output | Add-Member -type NoteProperty -name IP -value $ip
	$output | Add-Member -type NoteProperty -name Port -value $port
	$output | Add-Member -type NoteProperty -name PhysicalHost -value $smo.ComputerNamePhysicalNetBIOS
	return $output
}
catch
{
	write-host "Error collecting $name"
	return $null
}
}#Get-Instance

#Get host machine information via WMI
function Get-Machine([string]$name,[string]$clst)
{
 try
{
	$comp = gwmi Win32_ComputerSystem -Computer $name | select Model,Manufacturer,TotalPhysicalMemory
	$proc = gwmi Win32_Processor -Computer $name | select NumberOfLogicalProcessors,MaxClockSpeed
	$os = gwmi Win32_OperatingSystem -Computer $name | select OSArchitecture,Name,Version,ServicePackMajorVersion,ServicePackMinorVersion

	$output = New-Object System.Object

	$output | Add-Member -type NoteProperty -name MachineName -value $name
	$output | Add-Member -type NoteProperty -name Model -value $comp.Model
	$output | Add-Member -type NoteProperty -name Manufacturer -value $comp.Manufacturer
	$output | Add-Member -type NoteProperty -name Architechture -value $os.OSArchitecture
	$output | Add-Member -type NoteProperty -name PhysicalCPUs -value $(if(!$proc.Length){"1"}else{$proc.Length})
	$output | Add-Member -type NoteProperty -name LogicalCPUs -value ($proc | Measure-Object NumberOfLogicalProcessors -sum).Sum
	$output | Add-Member -type NoteProperty -name CPUSpeed -value ($proc | Measure-Object MaxClockSpeed -max).Maximum
	$output | Add-Member -type NoteProperty -name MaxMemory -value ($comp.TotalPhysicalMemory/1MB)
	$output | Add-Member -type NoteProperty -name OSName -value $os.name.split("|")[0]
	$output | Add-Member -type NoteProperty -name OsVersion -value $os.Version
	$SPMaj = $os.ServicePackMajorVersion
	$SPMin = $os.ServicePackMinorVersion
	$output | Add-Member -type NoteProperty -name SPVersion -value "$SPMaj.$SPMin"
	$output | Add-Member -type NoteProperty -name Cluster -value $clst

	return $output
}
catch
{
	write-host "Error collecting $name"
	return $null
}
}#Get-Machine

Both leverage the SMO and WMI to get the relevant info.  I’ll let you sort through the individual elements I’m pulling; it’s fairly standard config info.  Because I think in terms of tables, data gathering becomes a collection of objects that are essentially rows: I append each new object, with all the appropriate properties, to a larger collection, then use Write-DataTable to push that data directly into the related stage table.
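
To give you a feel for the full script, here’s a condensed sketch of how the driving loop might string these pieces together.  The server, database, and table names are the same placeholders from the schema sketch earlier, the cluster polling in the real script is left out, and Write-DataTable is called with the parameter names from Chad Miller’s published version.

#Condensed sketch of the driving loop (placeholder connection info; cluster polling omitted)
$connParams = @{ServerInstance = 'INVENTORYSERVER'; Database = 'DBAInventory'}

#The instance table drives everything: grab the names that were seeded into it
$instances = Invoke-Sqlcmd @connParams -Query 'SELECT InstanceId, InstanceName FROM dbo.InstanceList'

#Collect instance-level info, one object (row) per instance
$instcoll = @()
foreach ($row in $instances)
{
	$inst = Get-Instance -id $row.InstanceId -name $row.InstanceName
	if ($inst -ne $null) { $instcoll += $inst }
}

#Collect machine-level info once per distinct physical host
$machines = @()
foreach ($hostname in ($instcoll | Select-Object -ExpandProperty PhysicalHost -Unique))
{
	$mach = Get-Machine -name $hostname -clst ''
	if ($mach -ne $null) { $machines += $mach }
}

#Bulk load both collections into the stage tables, then merge stage into live
Write-DataTable @connParams -TableName 'InstanceStage' -Data ($instcoll | Out-DataTable)
Write-DataTable @connParams -TableName 'MachineStage' -Data ($machines | Out-DataTable)
Invoke-Sqlcmd @connParams -Query 'EXEC dbo.LoadInventory'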

Look at the entire script for my full process.  I’ve been pretty happy with it; it’s been chugging away in my environments for about 8-9 months now without a whole lot of interaction from me.  If we stand up a new server, I just pop that name in the instance table, run the job, and everything gets updated.  What made me most proud was a recent meeting with a new VP where we pulled up the SSRS report built off of this information to give him an idea of what our environments look like.  His comment was “This is the most documentation I’ve seen since I got here.”  The beauty is that it was provided without a whole lot of manual maintenance on my part.

This is very much an evolving work on my part, but hopefully gives you some insight into how I manage my environments.  I’ve still got lots of ideas on improvements, including automatic population (or other use) of Central Management Server.  If you have any ideas on how this can be improved, I’d love to hear them.

T-SQL Tuesday #41: The Hook #tsql2sday

Bob Pusateri (@SQLBob) is this month’s host for T-SQL Tuesday, with a topic I can definitely relate to.  Bob asks bloggers to talk about their presenting experiences: how they got introduced to it and why they keep doing it.  Since I’m right on the heels of giving my Server Core talk at SQL Saturday 197, it’s perfect timing.

To put my presentation experiences in context, let’s first talk a bit about some of my performance philosophy.  I’ve written about my musical background before and how it relates to giving technical talks.  One of my chief theories of performance (and art, for that matter) is the requirement of an audience, that art is not really art until you have an audience to appreciate it.  It’s all well and good for me to practice and play by myself, honing my skills and rehearsing pieces, but none of this becomes music until there are people to listen to it and hear my message.  Art is about communicating with that audience, sharing something of yourself through your performance.

This is a philosophy that directly translates to the presentations we give in the SQL community.  The main driver is for us to share our technical knowledge with our peers, to create an educational conversation with those who do what we do.  For many, presenting is intimidating because they think they have nothing to share.  When we realize that we can teach our audience something new, it’s an epiphany of what our impact can really be.  This was exactly the “hook” that got me into presenting.

It was March 2011.  I had recently read Brent Ozar’s (@BrentO) landmark post: Rockstars, Normal People, and You.  I wasn’t sure about presenting, but I figured I’d give it a bash, so I reached out to the Denver User Group to see if I could sneak in for a slot.  After initially being told that my first chance would probably be sometime in the summer, I got a call from the VP of Events to see if I could give a short talk at the March meeting.  Apparently, the regularly scheduled speaker had to cancel and the group needed someone to fill in on short notice.

I had about two weeks.  In retrospect, that is a TON of time, but as a new speaker I felt like I was cramming last minute for an exam.  I put together a short presentation on database security, built around this cool extended stored procedure I had found: xp_logininfo.  The night of the meeting came along, and I went to the podium to warm up the room for Doug Lane (@thedouglane) with my “dinky little presentation.”  The 30 minutes flew by, partly, I think, because my nerves made me talk quickly, but everything went fine.  My demos worked, no one laughed at me, and my biggest sin was not speaking up so the back of the room could hear me.

Then came the “hook”.  As I was packing up for the evening, Tom Norman (@ArmorDBA) came over to talk with me.  Tom has been a regular at the user group for a while and has given his own share of presentations.  To this day, I remember what he said:

“I’ve been a DBA for over twenty years.  You taught me something new tonight.”

Needless to say, I was flattered, but it took a couple of days to sink in.  When it did, it hit me: these were people who benefited from my presentation, an audience that enjoyed my performance.  I was able to take my technical knowledge and mold it into something more.  Two years later, I’m a regular speaker in the mountain west area as well as VP of Events for the Denver group.  I’ve had the opportunity to speak at SQL Rally and many other PASS events.  Presenting has been so much fun, and it’s opened countless doors and started numerous friendships.

I want to thank Bob for giving speakers a chance to share their experiences.  My biggest hope is that we can encourage those who haven’t started speaking to do so.  If you’re reading these T-SQL Tuesday posts and you haven’t given a talk yet, go talk to your user group right now.  The SQL community is always looking for speakers and, whether you believe it or not, you have something I want you to teach me.