
July 2012

SQL Superstars

Michael Jordan.  Peyton Manning.  Joe Sakic.  Kobe Bryant.  Tom Brady.  Andy Pettitte.

Sports superstars.  We watch them throughout the year, enjoying their athletic performances and last-minute heroics.  Whether you love them or hate them, their abilities are amazing.  But there’s more to it than that.  We also watch as these special athletes become a focus for their team, a linchpin for their teammates, a fulcrum upon which their organization’s efforts are leveraged.

I typically listen to the local sports talk radio on my way to work.  One day, the hosts were discussing superstar qualities, such as the ability to make plays and to elevate the play of their teammates.  Since I was heading in to work, I started wondering how that translated to my job and whether I was bringing these same superstar elements to the database court.  After all, I’ve always considered myself smart and able to handle any technical task in front of me, but being a true superstar is more than just being good; it’s also about being a leader and a force that drives the people around you to success.

I don’t think it’s quite that far of a leap from sports superstardom to the corporate world.  While we can’t do what these guys do (or get paid millions of dollars to do it), we can approach our jobs with the same mindset.  Here are some of the qualities that bounce around in my head and how they can be brought to our jobs:

  • Talent – Successful athletes all have some basic level of talent.  You don’t have to be the best, but you do need to be good.  I think all people in the technical field have this capacity, since we wrap our brains around some pretty involved concepts daily, so this is the easiest to attain.
  • Drive – It’s not enough to be ok.  It’s not enough to do the minimum.  Superstars are ambitious, want more, and strive to be the absolute best they can be.  For the technical field, this means we embrace the constant change, the continual learning, and the desire to find better ways to get our day-to-day work done.
  • Leadership – Now it gets harder.  Superstars realize that they’re on a team and have to be a part of that team.  But true superstars step forward and lead the team; they set the tone for the people around them.  As data professionals, this means we look for projects and weaknesses in our environments and improve them.  Don’t ask for permission, don’t look for approval, just lead.  Is there a better way to run your backups?  Do you see poor performance due to bad query or table design?  Wherever you see a place where something can be made better, lead the way and those around you will follow.
  • Make those around you better – Once people start following, you need to help lift them up.  Your team is only as strong as the weakest link and, as a superstar, it’s incumbent upon you to strengthen those links.  Maybe a teammate is struggling with escape characters in dynamic SQL or they have questions about creating a server side trace.  Whatever the case, withholding your knowledge or doing the work for them doesn’t help the team.  True superstars raise up those they work with, because they know that the success of the team is dependent on the ability of the team, not the ability of its superstars.

The work we do every day is a team sport.  We can be good as individuals, but we are rarely viewed that way.  Take John Elway and Dan Marino, two of the greatest quarterbacks of their generation.  In almost all cases, Elway is seen as the better quarterback because of his team’s success.  While he’s recognized as a proficient individual talent, the Super Bowls he led the Broncos to set him apart.  That’s the key: he led the Broncos.  His superstar status is more a result of what he did with his team than what he did by himself.

We want to have the same effect.  While there might not be a championship on the line, our own talents and abilities are enhanced by what we do for those around us.  These people might be team members, co-workers, or even the extended SQL community.  Note that a lot of the superstar SQL contractors we know (Brent, Paul, Denny, etc.) are heavily involved in sharing knowledge with the rest of us.  Their sharing makes us stronger, resulting in a better database community.

The next time you watch a major sporting event, look at how that team is being led.  Consider the superstars and what you can do to emulate them.  While we may not have the money, we can certainly have that same success by choosing to make their habits our own.

T-SQL Tuesday #32 – A Day in the Life (#tsql2sday)

This month’s T-SQL Tuesday is brought to us by one of the more statistically important data professionals out there, Erin Stellato (b|t).  The assignment is simple enough: record your day on either Wednesday 7-11 or Thursday 7-12.  Easy enough, but leave it to me to screw it up, right?  Anyway, I was travelling on Thursday (heading down to Albuquerque to present at the local 505 user group), so I cheated and recorded my activities for Tuesday, 7-10.  It was an average enough day, so a good cross section for the series.  So, without further ado:

  • 6:55 AM – 7:10 AM Check on I/O trace – I can work remotely, so commonly when I get up I’ll check in on things just to make sure everything is ok.  This time, I had set a Profiler trace to run overnight to give me some info on I/O issues we were having.  All I did here was log in and make sure it ran; I was going to drill into the details later.
  • 8:15 AM – 8:30 AM Review alerts from previous night – Still at home, but I took a quick glance over the alerts from last night just to make sure there weren’t any fires.  Everything was cool, so I hit the road to get into the office.
  • 9:00 AM – 9:20 AM Arrive in the office, get my day started – This is administrative time, responding to emails and getting my tea (I hate coffee.  There.), settling in for the day.  This bleeds into the next part….
  • 9:20 AM – 9:40 AM General maintenance work – This was basically some file clean up and responding to some of the alerts I saw from over night.  Nothing major.
  • 9:40 AM – 10:40 AM I/O research – So we’re having an I/O issue in our lower environments.  We’ve got a LUN on one of our instances that is getting slammed.  This was what I was using my trace to research, and I discovered that a whole lot of work was going through tempdb.  I spent this time reviewing my data and then talking with the relevant developers and QA engineers.  Once I had my info collected, I reported out to the systems team, the DBA team, and the dev guys.  Unfortunately, this is a situation where not much can be done.  There really weren’t any alternatives for spreading out the I/O load (at least none worth pursuing for a lower environment system), and the proper way to fix it was to have the dev team file things away for code fixes.  Still, with the info I collected we could come back to this with a better strategy.
  • 10:40 AM – 11:00 AM TempDB cleanup – Got some additional storage for one of our dev instances to allow us more tempdb space, so I cleaned that up and arranged the files.
  • 11:00 AM – 12:00 PM CLR Research – So I’ve never really done much CLR work.  We had a legacy sproc that was reporting incorrectly, so I was doing some research as to why.  I really didn’t have much luck, but since I was used to working with WMI in PowerShell, I figured I’d try to rewrite the CLR logic using that (there’s a quick sketch of the flavor after this list).
  • 12:00 PM – 1:00 PM Lunch – Every Tuesday we go to this awesome Thai place down the road.  Basil chicken for the win!
  • 1:00 PM – 5:00 PM CLR Research – I basically spent the rest of my day fighting with CLR.  Keep in mind, I’m a DBA with a sys admin background.  I’ve dabbled in .Net code, but I’m very rusty and my code is less than elegant.  However, it was a good learning experience, and taught me several things:
    1. CLR only supports a limited set of libraries.
    2. The System.Management libraries apparently have a lot of dependencies on forms and drawing (I have no idea why).
    3. CLR is a real pain to debug, depending on local security policies.
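
To give the WMI comparison some flavor, here’s the sort of query that is trivial in PowerShell but takes real plumbing inside a CLR stored procedure.  This is purely illustrative; Win32_Volume is a stand-in class, not the legacy sproc’s actual logic:

#Purely illustrative: a disk space check via WMI. In PowerShell this is a
#few lines; wrapping the equivalent call in a CLR stored procedure is where
#I spent my afternoon. Win32_Volume is a stand-in, not the actual sproc logic.
Get-WmiObject -Class Win32_Volume -ComputerName "localhost" |
    Where-Object {$_.DriveType -eq 3} |
    Select-Object Name, Label,
        @{Name="FreeGB";Expression={[math]::Round($_.FreeSpace/1GB,2)}},
        @{Name="CapacityGB";Expression={[math]::Round($_.Capacity/1GB,2)}}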

Honestly, this was one of my lighter days.  Probably because we had just come out of a holiday week where we had locked systems down and allowed minimal change, meaning we also didn’t have much breaking.  This is what makes the job enjoyable: not every day is a fire drill, and the ones that aren’t afford me an opportunity to experiment and learn.  Because of this day, I’m a whole lot more comfortable with the concepts of CLR (even though I still haven’t built a successful CLR function) and it’s made me a stronger DBA.

Thanks to Erin for hosting T-SQL Tuesday #32!   Make sure you visit the master post for other great blogs!

Improved Laziness

I’m really digging PowerShell the more I work with it.  It’s nice to have so much rich functionality within the operating system and not have to fuss with compilers or syntax.  One of my more recent projects was automated restore testing for our servers.  Now, while I won’t be showing that to you (yet), working with that script got me more time with the SMO.

The SMO is incredibly powerful.  You have so much functionality that it can be a little overwhelming.  There’s also a lot of stuff you can do in the SMO that you can also do in standard T-SQL.  As with anything, it’s a matter of using the right tool.  I’m finding that what makes PowerShell the right tool is its ability to work outside of the database, avoiding some of the restrictions placed (rightly so) on the engine and how it interacts with the OS.  That makes PowerShell ideal for things like file manipulation and talking with the OS.  It also gave me a slightly cleaner way to generate restore scripts.
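
As a quick aside to make the file manipulation point concrete: pruning old backup files is a couple of lines from PowerShell, where the same task from inside the engine usually means xp_cmdshell or a maintenance plan.  (A throwaway sketch; the path and retention window are made up.)

#Throwaway sketch: delete backup files older than 7 days.
#The path and the 7-day window are made up for illustration.
Get-ChildItem "C:\database_backups" -Include *.bak,*.trn -Recurse |
    Where-Object {$_.LastWriteTime -lt (Get-Date).AddDays(-7)} |
    Remove-Item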

If you have seen my previous restore building script, it was handy that it worked in T-SQL, but because it had no visibility into the file system, I had no real way to logically combine fulls, diffs, and transaction logs to build a single script.  Sure, I could probably assemble something looking at msdb.dbo.backupset, but that only works if I’ve got access to the server the backups were done on.  By leveraging PowerShell and the SMO, I can now read the file system, interrogate each backup file to see that it fits in the LSN sequence, and build out a SQL script.

The steps are simple:

  1. Point the script at a target directory where all the backup files live and a SQL instance that has access to the file directory (typically, this is the instance you’ll run the restore on).  You can also declare a database name (if you want something different) and an output directory (which will otherwise default to your My Documents folder).
  2. The script will then use the SMO to read in the backup files, coordinating the LSNs to order them properly for the restore.
  3. The entire restore script will be written to the output directory.

Some caveats:

  • The script identifies files based on extension (.bak for Full, .dff for Differential, and .trn for Transaction Log).  This is based on how I write out my backup files, so this is the key piece of logic that will need to be altered to make this more generic (see the sketch after this list for one extension-agnostic approach).
  • The full restore will “move” the files to their original location.  If restoring to a different server, you’ll want to alter these file locations.
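
If you want to make the identification extension-agnostic, one option is to read each file’s backup header and branch on the BackupType value that RESTORE HEADERONLY reports (1 = full, 2 = transaction log, 5 = differential).  A sketch of that approach, using the same SMO objects the script already works with:

#Sketch: identify a backup file by its header instead of its extension.
#BackupType from ReadBackupHeader: 1 = full, 2 = log, 5 = differential.
function Get-BackupType{
    Param($srv,$backupfile)

    $rs=new-object("Microsoft.SqlServer.Management.Smo.Restore")
    $rs.Devices.AddDevice($backupfile.FullName, "File")

    switch($rs.ReadBackupHeader($srv).Rows[0].BackupType){
        1 {return "Full"}
        2 {return "Log"}
        5 {return "Differential"}
        default {return "Other"}
    }
}#Get-BackupType

You could then group the files on the returned type instead of the extension before picking out the newest full and diff.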

Now, I’m by no means a PowerShell guru, but I do like making my life easier through scripting.  There are probably more elegant ways to do this, but it’s clean enough for my purposes.  Hopefully it makes your life easier too!

(Feel free to download the script here.)

<#
.SYNOPSIS
Builds Database Restore script, coordinating Full, Diff, and Log backups.
.DESCRIPTION
Generates a database restore .sql script for restoring a database.
Mike Fal (http://www.mikefal.net) 2012-07-09
.PARAMETER dir
    (REQUIRED) Target directory where the backup files reside.
.PARAMETER server
    (REQUIRED) Target SQL instance used by the SMO to build the restore script.  Should be the server you want to restore to.
.PARAMETER database
    Database name to restore.  If blank, the database name from the backup will be used.
.PARAMETER outputdir
    Output directory for the script.  If empty, the user's My Documents folder will be used.
.EXAMPLE
    .\Build-Restore.ps1 -dir "C:\database_backups" -server "localhost"
.NOTES
    Script acquires files based on extension:
        .bak = Full
        .dff = Differential
        .trn = Transaction log
#>
param([parameter(Mandatory=$true)][string] $dir,
      [parameter(Mandatory=$true)][string] $server,
      [string] $database,
      [string] $outputdir=([Environment]::GetFolderPath("MyDocuments")))

#load assemblies
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | out-null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null

#processing functions
function Get-Restore{
    Param($db,
          $backupfile)

    $rs=new-object("Microsoft.SqlServer.Management.Smo.Restore")
    $rs.Devices.AddDevice($backupfile.FullName, "File")
    $rs.Database=$db
    $rs.NoRecovery=$true
    $rs.Action="Database"

    return $rs
}#Get-Restore

function Get-Header{
    Param($rs,$srv)

    $dt=$rs.ReadBackupHeader($srv)
    return $dt.Rows[0]
}#Get-Header

#START SCRIPT
$sqlout = @()
$smosrv = new-object ('Microsoft.SqlServer.Management.Smo.Server') $server

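#grab the most recent full and differential backups, plus every log backup in time order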
$full = gci $dir | where {$_.name -like "*.bak"} | Sort-Object LastWriteTime -desc | Select-Object -first 1
$diff = gci $dir | where {$_.name -like "*.dff"} | sort-object LastWriteTime -desc | select-object -first 1
$trns = gci $dir | where {$_.name -like "*.trn"} | sort-object LastWriteTime

#initialize and process full backup
$restore=Get-Restore $database $full
$hfull=Get-Header $restore $smosrv
if($database.Length -eq 0)
{
    $database=$hfull.DatabaseName
    $restore.Database=$database
}

#track the restore chain: the next backup applied must cover this LSN
$LSNCheck=$hfull.LastLSN

#read the file list from the full backup and script MOVE clauses to the original paths
$files=$restore.ReadFileList($smosrv)
foreach($file in $files){
    $newfile = New-Object("Microsoft.SqlServer.Management.Smo.RelocateFile") ($file.LogicalName,$file.PhysicalName)
    $restore.RelocateFiles.Add($newfile) | out-null
}

$sqlout+="/****************************************************"
$sqlout+="Restore Database Script Generated $(Get-Date)"
$sqlout+="Database: "+$database
$sqlout+="****************************************************/"
$sqlout+="--FULL RESTORE"
$sqlout+=$restore.Script($smosrv)

#process differential backups
if($diff -ne $null){
    $restore=Get-Restore $database $diff
    $hdiff=Get-Header $restore $smosrv

    #a differential's base LSN matches the CheckpointLSN of the full it was taken against
    if($hdiff.DifferentialBaseLSN -eq $hfull.CheckpointLSN){
        $sqlout+="--DIFF RESTORE"
        $sqlout+=$restore.Script($smosrv)
        $LSNCheck = $hdiff.LastLSN
    }
}

#process transaction log backups
if($trns -ne $null){
    $sqlout+="--TRN LOG RESTORE"

    foreach ($trn in $trns){
        $restore=Get-Restore $database $trn
        $htrn=Get-Header $restore $smosrv
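        #apply this log only if its LSN range covers the point we've restored to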
        if($htrn.FirstLSN -le $LSNCheck -and $htrn.LastLSN -ge $LSNCheck){
            $sqlout+=$restore.Script($smosrv)
            $LSNCheck = $htrn.LastLSN
        }
    }
}

#Write final recovery line
$sqlout+="`r`n"
$sqlout+="--COMPLETE RESTORE/ONLINE DB"
$sqlout+="RESTORE DATABASE "+$database+" WITH RECOVERY"

#output script file
$sqlout | Out-File "$outputdir\restore_$database.sql"
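
For reference, here’s what an invocation using the optional parameters might look like (the paths and database name here are made up):

#hypothetical invocation overriding the database name and output directory
.\Build-Restore.ps1 -dir "C:\database_backups" -server "localhost" -database "orders_test" -outputdir "C:\temp"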