Using -ExcludeProperty in Select-Object

There are some properties that I just don’t want to see…

Nice, short, simple blog post today.

Head, Meet Wall

Working on getting data file information from the SQL Server SMO objects:

foreach ($DataFile in $db.FileGroups.Files) {

    $DataFile

}

 

[Screenshot: the original output, no pipe]
ExecutionManager = ExecutionManager…that’s helpful!

Hmm, Properties, ExecutionManager, and URN (hidden from view) are not needed. That is not a problem, I’ll just pipe them to Select-Object and include them in the -ExcludeProperty parameter.

Head, Meet Keyboard

So we change our code to the following:


foreach ($DataFile in $db.FileGroups.Files) {

    $DataFile |
        Select-Object -ExcludeProperty Urn,
                                       Properties,
                                       ExecutionManager

}
[Screenshot: Select-Object output – no change]
These suck at Hide ‘n Seek…

What the…? I excluded these guys! Do they not know the meaning of “Exclude”?!

Shh…Calm Yourself

As Kevin Feasel ( blog | twitter | CuratedSQL ) once said, “Get-Member early and Get-Member often”, I’d like to add to that:

Get-Member early and Get-Member often…and don’t forget to Get-Help!

help Select-Object -ShowWindow
[Screenshot: Get-Help window for Select-Object]
-ShowWindow cause it feels fancy!

Now I’m only interested in the -ExcludeProperty parameter, but I read the full thing; it’s not that big anyway. Thankfully the help, well, helps me.

[Screenshot: help text for the -ExcludeProperty parameter]
Gotta have properties to exclude them I guess…

Main point here is:

This parameter is effective only when the command also includes the Property parameter.

Bruce Banner, Not HULK.

I change up my code to include -Property * and see how it works…

foreach ($DataFile in $db.FileGroups.Files) {

    $DataFile |
        Select-Object -Property * -ExcludeProperty Urn,
                                                   Properties,
                                                   ExecutionManager

}
[Screenshot: it works!]
mwahahah I’m the greatest!!! eh I mean…yes, that’ll do!

Clear-Host

Now, there is nothing here that I would consider a ground-breaking, earth-shattering revelation. I had a problem, resolved it, and decided to share it.

  • Time to fix: ~3 minutes.
  • Time to write: ~7 minutes.
  • Overall time taken: ~10 minutes.

This is quite possibly the shortest post I’ve ever written and there’s a reason for that: #SQLNewBlogger.

Blog posts don’t have to have these big revelations for you. It seems to help, yeah, but that’s mainly because if you feel passionate about something, it comes across in your writing. Besides, what you consider trivial, other people may never have thought of or encountered before. So go on! Give it a go!

Got a problem, write it up.
Got an opinion, voice it.
Got a script, share it.

Got it? Good!

 

Table Column Differences with T-SQL and PowerShell

Where I start off with one idea, then veer widely towards another…much like me normally

As part of my post for the latest T-SQL Tuesday, I talked about community scripts. One of these scripts was by Jana Sattainathan ( blog | twitter ) and it was to do with finding tables with similar table structure. This is a great script for initially finding the tables and something that I am not sure that I would have thought of!

However, do me a favour and in your favourite search engine, do a search for “stackoverflow sql server wide tables” (Google search included for your benefit).

For some reason, people have a slight fascination with wwwwwiiiiiiiiddddeeeee tables!

So thanks to the excellent work done by Jana, you have now identified the tables with similar structure, but what if we want to know which column names match exactly?

I recently had this problem when consolidating copies of medium-to-wide tables in a database, and I needed to know exactly which columns matched. So I created a little snippet of code to help me out and I present it to you, in all of its unashamed vulnerability, in the hope that it can help you out.

Be warned: Thar be PIVOTs ahead!

Ahoy, matey!

So we’ve got two tables that are slightly different, not completely different (hey, they could be) but different enough to be similar without matching… (differently similar?)

Our task is to find out which columns are in both tables, which columns are only in 1 table, and which columns are only in the other.

Now this isn’t so bad manually, especially if you only need to do this once, maybe twice. What happens though if you need to do it with multiple pairs of tables? Or multiple pairs of wide tables like our search engines showed us above?

So let us do what all DBAs should do when they have a repetitive, manual task stopping them from doing more fun important things: Automate it away!

Avast Ye!

Our two tables are as follows:

CREATE TABLE dbo.DifferenceTable01
(
    col1 int,
    col2 int,
    col4 int,
    col6 int
);
GO

CREATE TABLE dbo.DifferenceTable02
(
    col2 int,
    col3 int,
    col4 int,
    col5 int
);
GO

Now we can use the sys.columns catalog view to check out the different columns in each table, but the results are, well, meh…

SELECT 
    OBJECT_NAME([object_id]) AS TableName,
    [name] AS ColumnName
FROM sys.columns 
WHERE [object_id] IN
(
    OBJECT_ID(N'dbo.DifferenceTable01', N'U'),
    OBJECT_ID(N'dbo.DifferenceTable02', N'U')
);
GO
[Screenshot: sys.columns results]
Even if I ordered it, it would still be “meh”…

That is not really going to work out for us…
So I’m not liking the look of this; going through them, these results are just not useful. This isn’t the computer’s fault – it’s done exactly what I’ve told it to do – but a more useful result would be a list of columns and then a simple ‘Yes’ or ‘No’ for each table.

There’s syntax for this…PIVOT

Thar She Blows!

As anyone who has seen me dance can attest to, I can neither shake, rattle, nor roll. And I definitely do not normally PIVOT. However, as I’m trying to know my tools, I do know that this is the syntax that I need.

PIVOT rotates a table-valued expression by turning the unique values from one column in the expression into multiple columns in the output, and performs aggregations where they are required on any remaining column values that are wanted in the final output.

So after looking up the syntax for this once (ok, 5 times!) I managed to come out with a script that I’m reasonably happy with.
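
A rough sketch of the kind of query I mean – assuming the two example tables above, with the table names hard-coded into the IN list and the ‘Yes’/‘No’ labels being purely my own preference – goes something like this:

SELECT
    pvt.ColumnName,
    CASE WHEN pvt.DifferenceTable01 > 0 THEN 'Yes' ELSE 'No' END AS DifferenceTable01,
    CASE WHEN pvt.DifferenceTable02 > 0 THEN 'Yes' ELSE 'No' END AS DifferenceTable02
FROM
(
    SELECT
        [object_id],
        OBJECT_NAME([object_id]) AS TableName,
        [name] AS ColumnName
    FROM sys.columns
    WHERE [object_id] IN
    (
        OBJECT_ID(N'dbo.DifferenceTable01', N'U'),
        OBJECT_ID(N'dbo.DifferenceTable02', N'U')
    )
) AS src
PIVOT
(
    -- COUNT ends up as 1 if the column exists in that table, 0 if it does not
    COUNT([object_id])
    FOR TableName IN ([DifferenceTable01], [DifferenceTable02])
) AS pvt
ORDER BY pvt.ColumnName;
GO

The PIVOT spins each table name out into its own column, grouping on ColumnName, and the outer CASE just turns the 1/0 counts into something human-readable.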

And the results are a lot easier to read 🙂

[Screenshot: the human-readable format]
Can also be extended to more than 2 tables!

So much better! This way, no matter how long the tables, I can easily figure out what columns are in what table(s) based on their names.

Shiver Me Timbers!

Isn’t it always the way that, after you’ve done something, you realise there’s a much easier way to do it?

The Old Seadog!

I’ve talked about automation in this post and I have yet to mention PowerShell. I bow my head in shame.

Especially when it has a native command like Compare-Object.

help Compare-Object -ShowWindow

Synopsis
Compares two sets of objects.

Description
The Compare-Object cmdlet compares two sets of objects. One set of objects is the “reference set,” and the other set is the “difference set.”

The result of the comparison indicates whether a property value appeared only in the object from the reference set (indicated by the <= symbol), only in the object from the difference set (indicated by the => symbol) or, if the IncludeEqual parameter is specified, in both objects (indicated by the == symbol).

If the reference set or the difference set is null ($null), this cmdlet generates a terminating error.

So the question we have to ask ourselves now is “Can we do what we did with the PIVOTs easier?”

The Old Salt.

I’ve recently found out about splatting so, of course, I’ve started to use it EVERYWHERE!

Let’s “splat” our two parameters

$Table01Param = @{
    ServerInstance = 'localhost\SQLDEV2K14'
    Database       = 'master'
    Query          = "SELECT OBJECT_NAME([object_id]) AS TableName, [name] AS ColumnName FROM sys.columns WHERE [object_id] = OBJECT_ID(N'dbo.DifferenceTable01', N'U');"
}

$Table02Param = @{
    ServerInstance = 'localhost\SQLDEV2K14'
    Database       = 'master'
    Query          = "SELECT OBJECT_NAME([object_id]) AS TableName, [name] AS ColumnName FROM sys.columns WHERE [object_id] = OBJECT_ID(N'dbo.DifferenceTable02', N'U');"
}

And we now save ourselves the trouble of writing out the parameters for each call.

Invoke-Sqlcmd @Table01Param
Invoke-Sqlcmd @Table02Param
[Screenshot: the splatted Invoke-Sqlcmd results]
SPLAT-ATTACK!!

Since everything is now set up, we can just pass those results into 2 different variable holders and use our Compare-Object.

$Table01 = Invoke-Sqlcmd @Table01Param
$Table02 = Invoke-Sqlcmd @Table02Param

Compare-Object -ReferenceObject $Table01 -DifferenceObject $Table02 -Property ColumnName -IncludeEqual
[Screenshot: the same call without splatting]
It’s annoyingly wide without splatting…

And for anyone saying “Yeah, but who knows what ‘SideIndicator’ means!”, I advise you to read the help in PowerShell. It helps a lot!

The result of the comparison indicates whether a property value appeared only in the object from the reference set (indicated by the <= symbol), only in the object from the difference set (indicated by the => symbol) or, if the IncludeEqual parameter is specified, in both objects (indicated by the == symbol).

If you are still complaining about it – “You can’t have the HumanReadableFormat column like you did in T-SQL” – then please stop. There are experts out there who can make you go “wow” with what they do. I’m still learning but even I have an answer to that.

Compare-Object -ReferenceObject $Table01 -DifferenceObject $Table02 -Property ColumnName -IncludeEqual |
    Select-Object -Property *, @{
        Name       = 'HRF'
        Expression = {
            switch ($_.SideIndicator) {
                '==' { 'Both' }
                '=>' { 'Table 2 only' }
                '<=' { 'Table 1 only' }
            }
        }
    }
[Screenshot: results with the HRF column]
Go on! Compare this with T-SQL

Land-ho!

I’m not trying to argue about who would win between T-SQL and PowerShell; that was definitely not my intention with this post. T-SQL is my first language love, and PowerShell is very quickly becoming my second (and that’s not just because it’s the only other language I know).

They both accomplish the same thing!

It’s just a matter of preference, that’s all, not a matter of competition. You want to work in SSMS, PIVOT it up! You like PowerShell, hammer that Compare-Object nail!
Whatever works for you.

My first idea for this was T-SQL but it turns out for me that PowerShell is the easier option. Don’t rule out one, just because the other was the first thing to pop into your head.

Now I’m going to go before I make a joke about Pirates, SQL and the R language…

 

T-SQL Tuesday #89 – The times they are a-changing: Inline Out-Sourcing.

It’s T-SQL Tuesday time! T-SQL Tuesday time! T-SQL Tuesday, T-SQL Tuesday, T-SQL Tuesday and a baseball bat!
Ahem…sorry about that…

[Image: T-SQL Tuesday logo]

Koen Verbeeck ( blog | twitter ) is hosting this month’s T-SQL Tuesday experience, and wow does he really ask us to search deep within ourselves and ponder our future…

Technology has changed a lot in the past years, especially with cloud/globalization/automation. What an impact has this had on your job? Do you feel endangered? Or do you have more exciting features/toys to work with? Do you embrace the change and learn new skills? Or do you hide in your cubicle and fear the robot uprising?

Now my knowledge of these subjects is somewhat limited; I have yet to research the cloud beyond a rudimentary “poking it with a stick”, so I leave that topic to people like Arun Sirpal ( blog | twitter ) and others.

Globalization is another new topic for me. I consider it to be along the lines of being able to fill positions with people from all around the globe (but I’m probably wrong about this). There is a veritable plethora of people online who are excelling at this, so I leave it up to you to seek them out.

Automation…well I have some dealings with this, so I was considering it as my topic. Then I realised that I could write about something along the same lines. Out-sourcing, but not in the conventional sense. More like…

Inline Out-sourcing.

Start-Transcript.

There’s never enough hours in the day for everything I need to do!

How many times have we heard a complaint similar to that? Especially nowadays, when DBAs are tasked with looking after more and more servers and instances. I cannot remember the last time I heard of a DBA taking care of servers in the single digits.

The work of the DBA keeps increasing but the amount of time that we have remains the same. How do we combat this? How do we make it so we are not sprinting just to keep up?

The only answer I have to this problem is this.

Don’t try to re-invent the wheel…let someone else do it.

SQL Community.

The SQL community has been hard at work creating awesome content for everyone to use.

A short list off the top of my head includes:

This doesn’t include the tools available from companies like

And have you seen some of the scripts created by individual members of the SQL community?

Hmm, maybe this won’t be as short a list as I thought…

You know what this blog post needs? More links!

Can I stop now? NO!!! MOAR LINKS!!!

And with Microsoft’s advancements with SQL Server and PowerShell, porting them to work with Linux and with Azure, it is like Microsoft are actively trying to get you to automate; actively trying to make your life easier!

Stop-Transcript.

So yes, technology has changed but we, as a SQL community, have worked so that you can use these tools, automate away the drudgery, and embrace these changes.

As long as you embrace this “inline out-sourcing” that we have.

Now I could wax lyrical about all the different aspects of the community and how, with these resources, you shouldn’t need to re-invent the wheel but I’m not going to.

These people have poured hours of effort into these scripts, all for you to use. They have poured their hearts, souls, and RAM into these scripts just so that they can help out the community.

I would ask a favour of people reading this post though. For all the time, effort, and sacrifice that these people have put in…

Thank them.

It means more than you know.

P.S. Andy Mallon ( blog | twitter ) has a beard…he may win this blog post…

Dealing with System.Data.DataRow.

 

Expert Opinion.

I had been sitting on this blog post for a while, but then came a recent blog post by Mike Fal ( b | t ) that defended the use of Invoke-Sqlcmd. Well, it turns out that Mike’s post was in response to Drew Furgiuele’s ( b | t ) blog post condemning it!

If that wasn’t bad enough, I then came across an article by Steven Swenson ( b | t ) that was in response to Mike’s article. Guess what? Another condemnation of  Invoke-Sqlcmd!

It seems that Invoke-Sqlcmd is the Marmite of the PowerShell/SQL Server world. That’s the equivalent of the Crunchy Peanut Butter versus Smooth Peanut Butter debate for my American readers. (Hi Aunt Kate and Uncle Tom!)

Now if you want some real concise, knowledgeable, and professional opinions on the pros and cons of this command, I encourage you to check out those blog posts. I’ve linked to them and I’ve read them all, each with a blend of “oh yeah” and “huh, good point” comments thrown in.

Let’s Get Personal.

The reason that I wanted to throw in my thoughts in this debate is because, as much as I love Mike’s article, it doesn’t deal with the biggest problem that I had with Invoke-Sqlcmd.

Dealing with those stupid, annoying System.Data.DataRow

[Screenshot: System.Data.DataRow output]
Look at them there…taunting us!

I eventually  figured out how to deal with these and wanted to pass the information on.

The Set Up.

For all those playing along at home, I’ve got a SQL Server 2016 Developer Edition with a copy of WideWorldImporters, as well as PowerShell version 5.

Let’s see how many customers we have…

SELECT COUNT(*)
FROM Sales.Customers;
[Screenshot: the customer count]
I am not adding 2 more customers, no matter what!

Now I don’t know about you but when I query stuff in a SQL database, it’s to do something to/with the results. They could be a list of servers that I monitor, they could be a list of databases that I want to check the recovery model of, or it could be a list of tables that I want to see how much space they are using. The main point is that I want to do something with the results.

But for this simple case, I just want to list out the customer name from this table. Simple? Yes, but this is just a test case to prove a point.

So let’s PowerShell this!

And so our problems begin.

Now, the basic premise is this:

For each customer name, I just want to output the line “Currently working on” & the customer name.

Now this is based on a real-world example where it was a list of servers and I wanted to include each name in a Write-Debug message.

Pain 1.

Invoke-Sqlcmd -ServerInstance localhost -Database WideWorldImporters -Query @"
SELECT CustomerName AS Name
FROM Sales.Customers;
"@ | ForEach-Object {
  "Currently working on $_"
}

Nice and simple PowerShell command, what I would call a “Ronseal” (it does exactly what it says on the tin), but when we run it…

[Screenshot: System.Data.DataRow output]
grr!

I’m just going to follow this up with code and pictures of what I tried to do to get this to work…Hopefully you’ll get some amusement out of this…

Pain 2.

In this case I figured maybe I should put the results into a variable first and then see if it could work.

$Employees = Invoke-Sqlcmd -ServerInstance localhost -Database WideWorldImporters -Query @"
SELECT CustomerName AS Name
FROM Sales.Customers;
"@

foreach ($employee in $Employees) {
    "Currently working on $employee"
}
[Screenshot: System.Data.DataRow output]
Nope!

Pain 3.

Well I know that PowerShell arrays start at 0, and I know that I can get the count of elements in an array by using <variable>.count so maybe that will work?

0..$Employees.Count | 
    ForEach-Object {
        [int]$i = $_

        $employeeRange = $Employees[$i]

        "Currently working on $employeeRange"
    }

 

 

[Screenshot: 0-based array attempt]
Nope!

Pain 4.

A quick check on Google points me to using ItemArray with my loops so I try that.

0..$Employees.Count | 
    ForEach-Object {
        [int]$i = $_

        $employeeRange = $Employees[$i].ItemArray

        "Currently working on $employeeRange"
    }
[Screenshot: 0-based array output]
YES!!! Wait…what the?

Ahhh! I know that PowerShell arrays are 0-based, but I forgot that 0..$Employees.Count runs from 0 to Count inclusive, so it loops one index past the end of the array and gives me an extra blank row! Plus that’s a few too many lines for my liking. All that just to output a customer name? Nah, let me try again.

Pain 5.

for ($i = 0; $i -lt $Employees.Count; $i++) {

    $EmployeeFor = $Employees[$i].ItemArray

    "Currently working on $EmployeeFor"
}
[Screenshot: for loop output]
FORtunately FOR gets me the FORenames (get it?)

The Real Solution.

If only I had run this…

$Employees | Get-Member

You know, there’s a reason that they say the 3 best commands are Get-Help, Get-Command, and Get-Member.

It’s because they save so much time if you just look at them.

$Employees | Get-Member
[Screenshot: Get-Member output for $Employees]
If I may direct your attention to the MemberType of “Property”…

As it turns out, there is a much easier way to get the data values back from Invoke-Sqlcmd:

if you want the data, just change $_ to $_.<property>

Let’s see if it works for us.

Pleasure 1.

If we “correct” our original code…

Invoke-Sqlcmd -ServerInstance localhost -Database WideWorldImporters -Query @"
SELECT CustomerName AS Name
FROM Sales.Customers;
"@ | ForEach-Object {
    "Currently working on $($_.Name)"
}

 

[Screenshot: the customer names]
Oh…that’s lovely!

Pleasure 2.

And what about with variables?

foreach ($employee in $Employees.Name) {
    "Currently working on $employee"
}
[Screenshot: the customer names]
Brings a tear to my eye, it does 🙂

It looks like we finally have a proper Ronseal moment.

Final Thoughts.

I have absolutely no problem with Invoke-Sqlcmd, so I suppose I fall into Mike’s side of the camp.

Do I use it all the time though? Not really.

The SMO objects have an amazing amount of information that is just too difficult to get with Invoke-Sqlcmd so I’ve started to use the SMO more and more.

But Invoke-Sqlcmd is a tool, just like everything else. There’s no point in throwing away a tool just because it isn’t the most optimal anymore, especially when it is so useful in adhoc situations.

There are some cases where a small handheld screwdriver is more useful than an electric one, just like there are some cases where Invoke-Sqlcmd is more useful than the SMO objects.

Just know your tools…

DBA Fundamentals Social Media

This is going to be a short message but since it counts as my first SQL Family post, rather than a SQL or PowerShell one, I felt it still deserved a little post on its own.

DBA Fundamentals.

I’m helping out with the DBA Fundamentals Virtual Group’s social media presence (I told you it was going to be short 🙂 ).


What’s the Full Story Smartass?

What happened?

‘”What Happened” with Query Store’, actually?

At the start of February, I was watching a recording of that webinar on the DBA Fundamentals Virtual Group’s YouTube channel. Since my normal commute times and the times that webinars are usually on clash so often, recordings are how I normally have to watch them.
During the webinar Steve Cantrell ( t ), the host and Group Leader, mentioned that they were looking for someone to volunteer with helping them out with their social media.

Why?

Well, I had steadily become more and more…addicted, for want of a better word, to the various social media outlets out there, like Slack, Twitter, YouTube, and LinkedIn. From these, I have gained so much knowledge, insight and joy from the different ways to connect to the SQL Family.

How?

I’ve been wanting a way to give back for all that I have learned so far so I took a chance and emailed Steve about the volunteer work, fully realising the very real possibility that someone who had watched the webinar in real time may have already offered and been accepted.

As you’ll find out with nearly all members of the SQL Community, there was no instant rejection. We emailed back and forth, Steve detailing who they are, what they currently do, and what they think they need to do better.

I replied with who I am, what I currently do, and what I thought I could help out with.
I think my suggestions were as long as that last line.

On February 10th, expecting an apologetic yet negative reply to my last email, I got an email from Steve. He said he had talked it over with the Co-Group Leaders, Mike Brumley ( t ) and Niraj Mehta, and then proceeded to write the most eloquent piece of literature I have so far read.

You are in.

Why now?

I’m writing this now because I’ve had a month to try my hand in that arena, to give it a go, and see what it’s like.

I like it!

The Group Leaders have put a tremendous amount of effort into the webinars and secured well-known names like Kimberly Tripp ( b | t ), Paul Randal ( b | t ), and more, all giving their expertise free of charge to everyone who attends.

Plus the cat was already out of the bag at this stage, since it had been mentioned in the pre-webinar slides.

What now?

From yourselves?

Enjoy the great content, hit us up with any questions or suggestions, and please forgive me if I ever come across as annoying on social media. I promise to try and not do that.

As for ourselves, it’s going to be pretty damn busy.

The Group Leaders are continuing to procure great talents for the webinars (believe me I’ve seen a sneak of what’s to come), we’re going to be updating the Virtual Group page to go with PASS’ new branding, and also try and get all avenues of social media to a standard where anyone can take pride in what they see.

How now brown cow?

Yeah so…a longer post than I expected to write…my bad.

 

T-SQL Tuesday #88 – The daily (database related) WTF! The Biggest Danger to your Database: Me.

That is more of a blurb than a title…and this is more an apology than a blog post…

Kennie Nybo Pontoppidan ( blog | twitter ) has the honour of hosting this month’s T-SQL Tuesday and has decided to base this month’s topic on ‘The Daily (database related) WTF‘.

Now I have great time for Kennie and T-SQL Tuesday since my very first blog post was in reply to a T-SQL Tuesday and it happened to be a topic where Kennie blogged about the exact same thing!

Now, truth be told, I wasn’t planning on participating in this one, and not because I didn’t have a WTF moment but rather because I had too many of them. However, reading through most of the entries, I see the vast majority of them are about moments well in the past, caused by other parties.

This is not the case for me. My WTF moment happened recently and the culprit was … myself.

Sorry Kennie 😦

Friday:

A request came in from our Developers about a slow-performing query, and my Senior DBA identified an index that could be safely modified to improve this one’s performance.
So a Maintenance Window was set and it fell to me, in my role as Junior DBA, to create a SQL Agent Job to create this index.

No worries so far right?

I create a once-off SQL Agent Job to create this index, schedule it appropriately, and I’m off on my merry way for the weekend.

Monday:

I come in on Monday morning, check my email, and I see an alert in my inbox about my job as well as an email from my Senior DBA; he’s not angry…WTF?

My whole job had failed!

Unable to connect to SQL Server ‘(local)’. The step failed.

[Screenshot: SQL Agent error]
WTF!

He is not angry as he has seen this error message before, has dealt with it before, and sees it as a case of “well you’ve seen it now, investigate it and you won’t fall for it again”.

A quick investigation later pointed to this in the Error Log the moment before the SQL Agent Job Step was supposed to run:

[165] ODBC Error: 0, Connecting to a mirrored SQL Server instance using the MultiSubnetFailover connection option is not supported. [SQLSTATE IMH01]

[Screenshot: Error Log message]
WTF?

Long sub-story short (i.e. Google-fu was involved), the main reason that this failed is that the SQL Agent Job Step had been configured to use a database that is currently mirrored.
And SQL Agent does not like it when you try to start off a step in a database that is mirrored.

[Screenshot: the wrong database set-up]
WTF is wrong with this?

So the solution for me was to set the Job Step property ‘Database’ to a non-mirrored database (preferred: [master]), then include a “USE [<mirrored database>]” in the ‘Command’ property.

[Screenshot: the right database set-up]
WTF!
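
In other words, the job step ends up looking something like this (just a sketch – the database, table, and index names below are placeholders rather than the real ones):

-- Job Step property 'Database': [master] (or any non-mirrored database)
-- Job Step property 'Command':
USE [MyMirroredDatabase];   -- placeholder for the mirrored database

CREATE NONCLUSTERED INDEX IX_Placeholder    -- the real index definition goes here
    ON dbo.SomeTable (SomeColumn);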

Knowing what to do now, and having identified another maintenance window for the next morning, I make the required changes to the job step and continue on with my day.

Tuesday:

I come in on Tuesday morning, check my email, and I see an alert in my inbox about my job as well as an email from my Senior DBA; he’s angry…WTF?

My final job step had failed!

CREATE INDEX failed because the following SET options have incorrect settings: ‘QUOTED_IDENTIFIER’. Verify that SET options are correct for use with indexed views and/or indexes on computed columns and/or filtered indexes and/or query notifications and/or XML data type methods and/or spatial index operations. [SQLSTATE 42000] (Error 1934).  The step failed

[Screenshot: the second SQL Agent error]
WTF!

Now I’m angry too since I count these failures as personal and I don’t like failing, so I get cracking on the investigation.
Straight away, that error message doesn’t help my mood.
I’m not indexing a view!
I’m not including computed columns!
It’s not a filtered index!
The columns are not xml data types, or spatial operations!
And nowhere, nowhere am I using double quotes to justify needing to set QUOTED_IDENTIFIER on!

SO WTF SQL SERVER, WHY ARE YOU GIVING ME THESE ERRORS???

SET QUOTED_IDENTIFIER must be ON when you are creating or changing indexes on computed columns or indexed views. If SET QUOTED_IDENTIFIER is OFF, CREATE, UPDATE, INSERT, and DELETE statements on tables with indexes on computed columns or indexed views will fail.

I’ve talked about stupid error messages before… but in my current mood I wail, beat my breast, and stamp my feet!
The error message above was not complaining about the index I was creating; it was complaining about indexes already on the table!
In my case, we had filtered indexes already created on the table and, as such, every single index created on this table from then on requires SET QUOTED_IDENTIFIER ON.

USE [TEST];
SET QUOTED_IDENTIFIER ON;

CREATE ...

Third Time’s the Charm?

No, not this time.

Luckily the Senior DBA had come in while the maintenance window was still running and manually ran the create index script.

He wasn’t angry that my job step failed. He was angry that my first job step succeeded!

Are you going “WTF? Why is he angry about that?” Let me enlighten you…

Remember at the start of this blog post I said that he had identified an index that could be safely modified?
Well, on Monday, in my haste to fix my broken job, I had focused too much and thought too granularly.
My second job step, the one that created the index, had failed, but my first job step, the one that dropped the original index, had succeeded.

There’s not really much more to say on this. In my rush to fix a broken job, I created a stupid scenario that was luckily caught by the Senior DBA.

Wrap Up:

Yeah…so I thought it would be a nice, little counter-example to the other posts out there about third parties coming along and wreaking havoc, and the DBAs swooping in to save the day.

I could make up excuses and say that, as a Junior DBA, I’m expected to make mistakes but I’m not going to.

It should be the aspiration of every Junior DBA to strive to improve and move upwards, and one of the key aspects of this is responsibility.

You should be responsible for looking after the data, looking after the jobs, and looking after the business.
And if all else fails, you should be responsible for your actions.

I have been properly chastised by my Senior and am still chastising myself for this. It’s been a long week so far…

… and it’s only Tuesday…wtf?