
The North Texas SQL Server User Group is currently looking for speakers for 2019. If you are a speaker searching for events near where you are headed for business, or if you're looking to fill your calendar with different user groups across the country, NTSSUG should be at the top of your list!

When do you meet?

We meet the third Thursday of each month. The following are the available dates:

What topics do members want to see?

Well, that's a good question. Last year we took a survey to find out what members wanted to see at our meetings. The results were quite interesting.

With NTSSUG, interests vary.

Why should I speak for NTSSUG? Give me one good reason!

I will do you better than that…I will give you a few reasons.

First of all, we have one of the largest groups in the south central region, with over 1,800 members. Our meetings have an average attendance of 78 people, and occasionally we see spikes in attendance – just this month we had a record-breaking meeting with 141 people! Our meetings are also quite active, and you will typically get lots of questions from our membership.

In January we had Mexican food! We typically do something different in January in lieu of a December meeting or holiday party. For all other meetings we typically have pizza…but we are talking about maybe changing things up and maybe doing something different once a quarter.

We have our SQL Saturday coming up on June 1st. If you can’t make that then consider us for a user group speaking engagement. Or do both!

Dallas has two major airports – Love Field and DFW. Both are within reasonable proximity to our meeting location, and you can typically get low-cost fares from the numerous air carriers that serve our area.

You are coming to the mecca of all that is BBQ (don’t listen to those other people…they don’t speak the truth…TEXAS is where it’s at!). Pecan Lodge. Make the time. Go. Eat. And experience some truly epic BBQ.

The North Texas area is a destination for many performers and traveling shows. Wanna know if one of your favorites is coming to the area and when? Check out GuideLive for concerts and events across the DFW Metroplex.

In August the North Texas Fair and Rodeo is going on. Not to be confused with the State Fair of Texas, this staple of Denton, Texas is not to be missed – while smaller, it is still plenty of fun, with rides, vendors, midway games, live concerts and an actual rodeo!

If you have an interest in beer and German cuisine, Addison Oktoberfest is in September. Come speak at our meeting and then stick around an extra day or two and head over to Addison for this annual shindig.

Around September–October there is the State Fair of Texas. Our October meeting would coincide with the fair the best, but if you have never been before, it is definitely worth checking out if you are going to be in town at all while it is going on.

If you are interested in speaking at NTSSUG, or have questions, shoot me an email at programs@ntssug.com.

On May 31st, I was shocked to find out one of my submitted sessions was selected for PASS Summit 2018. My response?

I didn’t see this coming. At all. I was stunned – walking around at the office, saying “Oh my gosh…I got picked” while co-workers looked at me like I had lost my mind. Literally.

I thought I would be going to PASS Summit this year, but not as a speaker. I thought I would be an attendee, learning a lot and having an amazing time. I thought it was far more likely that we would see 10 feet of snow this winter in Texas than that I would be selected for this year's PASS Summit.

I am sincerely humbled and grateful to be selected. The competition was stiff. I should know – just like last year, I pulled all the data while it was still available. Once the selections were announced, I was able to isolate the selected sessions from those submitted, and also the ones that were not selected. I was able to determine one particular number – how many sessions were selected out of total submissions.

90. Ninety sessions were selected out of approximately 600 submitted. I might be off by a few, but I checked this against the published sessions that weren't in my data pull from immediately after submissions closed. For the most part, the published sessions that didn't match a submission are pre-cons or sessions from what I assume were invited speakers.

I’m not saying any of this is bad. What I am saying is that the competition among PASS Summit submissions is growing and becoming more fierce. I see this as a positive for the community – we are developing more and more excellent speakers with awesome content to share.

I queried my data set further, looking for the names of the people selected and those not selected. The latter contained many speakers I know personally who would make amazing PASS Summit speakers – some of whom already have been in the past.

I pulled the list of the speakers that had sessions in the "Best Of" from PASS Summit 2017. Out of 36 speakers on that list, 13 are currently part of the 2018 PASS Summit lineup. While it could be argued that all of these people should be on this year's roster because they were part of last year's "Best Of", I think the real takeaway from this number is that no one is guaranteed a speaking spot at PASS Summit, no matter who you are.

Are there invited speakers? Yes, but like the regular submissions selection, that list likely has some subjectivity to it as well. Maybe an invited speaker gets red carded in the previous year following their session (heaven forbid, but you never know) – do you think they would be invited back? I would hope not, regardless of the content they would provide.

As speakers in the SQL community, we are largely a very goal oriented, driven bunch of people. We might even be a tad competitive. It’s not lost on me how fortunate I am to have been selected this year. Looking back to previous years when my submissions didn’t make the cut, I know now that while this was something that I wanted, I wasn’t ready. I might have been grumpy about it, but that didn’t make me any more ready for this monumental task. For all the seasoned PASS Summit speaker veterans that were not selected, you of all people know how competitive this is. You also know that your time will come again.

Selected or not, I hope to see everyone at PASS Summit this year. If you haven't registered yet and need a discount code, I've got one of those for ya: LSDIS97QR

The price for PASS Summit goes up after June 29th so if you are planning on attending, get registered now!

Note: I am on the board of directors for the North Texas SQL Server User Group and this is our code. Registering with this code benefits our group. There are other discount codes out there but I just gave you this one. Like, right now. See? There it is. So go register and use it!

When performance issues rear their pretty little heads I tend to look at the simplest things last. I am the person who is looking at the code first – because in many cases, if your code was better we wouldn’t be having this discussion. Whether it’s a series of nested views or one single user-defined function that doesn’t play well with the optimizer, there is usually something in the code itself that is contributing to the performance issue.

But what about the times when it isn't the code, or you can't touch the code (I'm looking at you, NHibernate…and all of your red-headed step-child offshoots and fellow ORMs)? What about the cases when the poor performance only occurs periodically – or once a week on the same day, every week – and while you can see the slowdowns in your queries, and all of your waits and plans, the source of the problem isn't any particular smoking gun? It's lots of guns, firing off at random times, and the casualties are in your interfaces that depend on the data from the queries.

The one thing I don't immediately go to is UPDATE STATISTICS – and I should be looking at this. It's easy to run, and if you still have issues afterward, you have at least ruled it out.

Overestimation of rows = bad times for the optimizer.

In this case, I ran UPDATE STATS on one table in particular and immediately saw a marked improvement in my test query.

Ahhhh….much better.

This morning I ran the same query and things were back to where they were with performance. This was somewhat disheartening, but the troubleshooting must continue.

But what about auto-update stats? Is this on? Yes, but auto-update statistics only updates based on a percentage sample of the records in the table. If your table contains millions of records, you may not be updating your statistics sufficiently to see an improvement. I am updating statistics with a FULL SCAN of the table. For larger tables this may take some time, but you may also see a difference in your execution plan versus only updating statistics based on a sample.
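For comparison, a sampled update versus a full scan looks like this – the table name here is just a placeholder:

```sql
-- Default sampled update: SQL Server chooses a sample size based on table size
UPDATE STATISTICS dbo.MyLargeTable;

-- Full scan: reads every row, giving the optimizer the most accurate histogram
UPDATE STATISTICS dbo.MyLargeTable WITH FULLSCAN;
```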

In an effort to rule out whether or not statistics are definitely a factor, I want to UPDATE STATS on all the tables in my query, and at a specific time – sometime the day before we expect our slowdown to occur. I also want to be able to easily see how long the process ran and the duration of the update for each table. I could write all this to a table, and maybe I will do this later, but viewing this from job history is what I want right now – it is easy to look at and easy for everyone else to find and see as well.

Creating a job with multiple steps can sometimes be a bit painful and tedious, if the job is going to have A LOT of steps. With PowerShell, a server name and a database, I can dynamically create this job.

PowerShell


```powershell
<#
Name: create update stats job for specified tables.ps1
Author: Amy Herold
Date: 8 February 2018
Purpose: With a specified server, database and list of tables, create a job to update stats.
#>
```

The result from this is a job that will update statistics on your list of tables, one at a time. All you have to do from there is give it a schedule.

All those steps…courtesy of the magic that is POWERSHELL!

For this script, I was getting my list of tables based on a specific query that was problematic. If you are doing this for a stored procedure, you can query for the list of dependent tables instead, and instead of hard coding the tables in this code, pass in the name of your stored procedure and get the list that way.
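The body of the script itself didn't survive the move to this page – only the header comment. As a rough sketch of the approach (server, database, and table names below are placeholders, and the original script may have differed in its details), an SMO-based version could look like this:

```powershell
# Sketch: create an Agent job with one UPDATE STATISTICS step per table.
# Assumes the SqlServer module (SMO); all names below are placeholders.
Import-Module SqlServer

$serverName   = 'MyServer'
$databaseName = 'MyDatabase'
$tables       = @('dbo.Orders','dbo.OrderDetails','dbo.Customers')

$server = New-Object Microsoft.SqlServer.Management.Smo.Server $serverName
$job = New-Object Microsoft.SqlServer.Management.Smo.Agent.Job `
    ($server.JobServer, "Update Stats - $databaseName")
$job.Create()

$stepId = 1
foreach ($t in $tables) {
    $step = New-Object Microsoft.SqlServer.Management.Smo.Agent.JobStep `
        ($job, "Update stats - $t")
    $step.DatabaseName = $databaseName
    $step.Command = "UPDATE STATISTICS $t WITH FULLSCAN;"
    # Continue to the next step on success so every table gets updated;
    # the last step keeps the default action (quit with success).
    if ($stepId -lt $tables.Count) {
        $step.OnSuccessAction = 'GoToNextStep'
    }
    $step.Create()
    $stepId++
}
$job.ApplyToTargetServer($server.Name)
```

From there, you just give the job a schedule, and each step's duration shows up in job history.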

I hope if you have a situation where some one-off stats updates are needed you find this script to be useful. As always, comments and feedback are appreciated. 🙂

Wednesday morning I walked into the office and immediately heard that CHECKDB was the source of issues on one of the servers and the reason behind some errors that had been happening. While I don't think this is the case (it might look like it on the surface, but something else is the actual cause), I also wanted to find out which database CHECKDB was running against at the time the errors occurred.

I needed information on when CHECKDB ran for each database. A search for how to find when CHECKDB was last run turns up a couple of helpful blog posts on grabbing this info. While these were very informative, they covered one database at a time. I needed this for all the databases so I could not only find out when each one ran, but also use the time stamps to figure out the duration.

I took the code from Jason's post and made a few changes, running it for all the databases with sp_MSForEachDB.

Transact-SQL


```sql
-----create the table - added dbname------
CREATE TABLE #temp (Id INT IDENTITY(1,1)
    ,ParentObject VARCHAR(255)
    ,[Object] VARCHAR(255)
    ,Field VARCHAR(255)
    ,[Value] VARCHAR(255)
    ,dbname VARCHAR(50)
    )
-----using sp_MSforeachdb, grab the data for each database and update the dbname field--------
```
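Only the first part of the query survived here. Based on the DBCC DBINFO approach from the posts referenced above, the remainder likely read something like this sketch (details may differ from the original):

```sql
EXEC sp_MSforeachdb N'USE [?];
INSERT INTO #temp (ParentObject, [Object], Field, [Value])
EXEC (''DBCC DBINFO WITH TABLERESULTS'');
UPDATE #temp SET dbname = ''?'' WHERE dbname IS NULL;';

-----pull the last known good CHECKDB for each database--------
SELECT dbname, [Value] AS LastKnownGoodCheckDB
FROM #temp
WHERE Field = 'dbi_dbccLastKnownGood';

DROP TABLE #temp;
```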

The results gave me some good information to go on – the database where the errors occurred was not the same database where CHECKDB was running during that time frame. There were also some FlushCache messages in the log – some suspect that this is also tied to the errors that are happening. Since all this ties back to NHibernate code, I suspect something else is going on and further digging is needed.

I discovered something last week – I had not blogged about little things that I thought I had blogged about. What the heck does that mean? It means that I tried to reference my blog for something because I thought “I totally blogged about that”…and found out that was not the case.

Starting now, I am fixing this situation. There was something that popped up today that called for a PowerShell script and the Get-ADGroupMember cmdlet – get a list of users from a list of groups. Some users are in there more than once so this needs to be a distinct list, unless you are into manually cleaning up things like this, and then I will be sad for you. Because that is kinda sad.

I originally wrote a script with two arrays (one for the initial list and one for the de-duped list of users), but even though this is quick and dirty, that was a little too dirty. Enter the Group-Object cmdlet – it takes this list of names and groups them. No black magic this time. Just a cmdlet that comes baked into PowerShell, giving me what I need.

What? You wanted the code too? Oh, OK.

PowerShell


```powershell
<#
Name: activedirectory get list of users from multiple groups.ps1
Author: Amy Herold
Date: 24 October 2017
Purpose: Get distinct list of users from multiple AD groups.
#>

$groups = @('Admins','Managers','Some Other Group')

$people = New-Object System.Collections.ArrayList;
$people.Clear();

#--------------get list of users from list of groups--------------
foreach ($g in $groups)
{
    $people.Add(@(Get-ADGroupMember -Identity $g | select name)) | Out-Null;
}

#------------use group-object and get a distinct list of names--------------
$people.Name | Group-Object | select name
```

There you have it – quick, dirty and to the point. Enjoy. 🙂

UPDATE: Mathias Jessen tweeted a one-liner for this…so no need for the array! Woohoo!

PowerShell


```powershell
('Admins','Managers','Some Other Group' | Get-ADGroupMember | Group-Object -Property Name -NoElement).Name
```

I was trying to do this but was also just trying to get it done – when in doubt, I slap things in arrays. Thanks Mathias!

Next week I am returning to Orlando, not only to visit the mouse at his house, but also to speak at SQL Saturday Orlando! Last year was my first time there and I had a great time – and that was after the event was pushed back to November 2016 because of Hurricane Matthew.

This year I will be talking about deadlocking and blocking – something that is an issue for so many DBAs. Whether the issue is indexes or bad code…or you have been trying to use indexes to cover up bad code, blocks and deadlocks can happen in even the best environments under the right circumstances. If you are going to be at SQL Saturday Orlando, come to my session where we will talk about detecting and preventing these arch enemies of the DBA!

Before I worked on server migrations to Azure, I had only worked with Azure for a SQL Saturday session, and then only as a host for the virtual machines (VMs) I used to demonstrate PowerShell scripts running on a Windows server that also accessed Active Directory. My exposure to this technology was limited and focused on what I could do with the VM – not on how the VM was created or how I could use PowerShell to deploy one.

That professional experience with Azure finally came, and I got to work on PowerShell scripts that could be used to create VMs in Azure. With a script you have something that is consistent and less prone to human error, but even then, sometimes something is set up wrong that can't be changed once the VM has been provisioned. Alternatively, maybe you are only using Azure for testing an application or project; when you are done, the VM is no longer needed. You could simply decommission the VM, but if there is no need for it to still exist, you probably want to delete it and all the parts associated with it.

The first of these happened to me – there were three VMs where the configuration wasn't right, and it wasn't something that could be changed from the portal. The VMs would have to be deleted and re-provisioned. I modified the creation script to have the right configuration for these VMs and got them created so work could continue. At this point I started to work on a script that would remove VMs based on the name and resource group, including removal of all the parts tied to the VMs, such as the virtual network and the storage account.

Fast forward to today – I am working on my session for PASS Summit 2017. Instead of creating every part separately with the PowerShell cmdlets, PowerShell is used to apply a template that contains all the parameters and variables for a VM. If you need to make a change, it is made to the JSON template, not to the PowerShell. Since I am testing, I am creating VMs and then blowing them away right after. I attempted to use the PowerShell script I had used before to remove VMs, but that failed – enough time had passed that the cmdlets had been updated and what I ran before was no longer valid. I tried to revise the existing script and then ran across the OMG-I-can't-believe-I-didn't-do-this-before answer – just delete the resource group.

PowerShell


```powershell
Remove-AzureRmResourceGroup -Name 'rgsqlkitten' -Verbose -Force
```

If you have contained all parts of the VM in the same resource group, and there is nothing else in the resource group other than the VM, cleanup is this simple. From what I recall about the server migration, there were multiple VMs associated with the resource group, and I was also removing disks from each of the VMs ahead of removing the VMs themselves, so I had to run a script for that first. Looking back, if everything had been in one resource group, and had there been nothing else in there, I could have run this one line and been done.

For now, this solution works for me, but I know a more specific cleanup script may be more useful – one that will remove a VM and all of its parts based on the VM and the resource group. This will have to happen when I am done with my other testing, but in the meantime, this one line of code allows me to create and blow away VMs while using my personal Azure account…and not run up a ton of charges. Let me know if you find this useful or if you have any suggestions here – would love to have feedback!
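As a starting point for that more specific cleanup script, something like this sketch could work – the VM and resource group names are placeholders, it assumes managed disks, and error handling is omitted:

```powershell
# Sketch: remove a single VM and its attached parts without deleting
# the whole resource group (AzureRM module; names are placeholders).
$rg     = 'rgsqlkitten'
$vmName = 'sqlvm01'

# Grab the VM object first so we still know its NICs and disks after removal
$vm = Get-AzureRmVM -ResourceGroupName $rg -Name $vmName
Remove-AzureRmVM -ResourceGroupName $rg -Name $vmName -Force

# Remove each network interface that was attached to the VM
foreach ($nicId in $vm.NetworkProfile.NetworkInterfaces.Id) {
    Remove-AzureRmNetworkInterface -ResourceGroupName $rg `
        -Name ($nicId.Split('/')[-1]) -Force
}

# Remove the OS disk (managed disk assumed)
Remove-AzureRmDisk -ResourceGroupName $rg `
    -DiskName $vm.StorageProfile.OsDisk.Name -Force
```

The same pattern extends to data disks, public IPs, and the virtual network if nothing else is using them.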

In case you have been under a rock, or buried in work, or otherwise occupied, the PASS Board elections are going on and end TODAY! There are some amazing candidates (thanks to the selection committee for all of their work in bringing us this fine list of folks) running for a handful of seats. Big question – have you voted yet?

When you vote, you are exercising your voice and opinion on who from the SQL community best represents your views on the direction and future of the PASS organization. That said, the total number of people voting during each election cycle has steadily declined, with fewer than 1,000 people voting last year.

With this being one of the easiest ways to be involved with PASS, you would think that these numbers would be increasing. If you have not voted yet, go check out the candidates. If you look at all of them, and just can’t decide who to vote for, give that Chris Hyde a second look – while it might seem like personal bias, I honestly feel he is well equipped for this position and will represent PASS and the SQL community with the same enthusiasm and vigor he has brought to the Albuquerque user group and all the other locations where he has spoken for user groups or SQL Saturday events (including the North Texas SQL Server User Group).

The voting deadline is 12 noon PST today (2pm CST). If you haven’t voted yet then put that query down and get to it!

I've been teaching my 17-year-old SQL Server and various other topics. I have quite a bit of experience with technical training and speaking, but it never fails: after about 5 minutes his eyes glaze over, he nods and "umhums", and then starts checking his phone.

I have one nephew that is already intrigued by the PowerShell that I am doing and seems to have a genuine desire to write code. This makes me SOOOOOOOO HAPPY! My hope is that he is able to continue what he is learning now and migrate into a skill set that will afford him a good career. And if he happens to maybe “hack” into something that he shouldn’t have, I will help with his bail, smother him with hugs and kisses and tell him “OMG…I AM SO PROUD OF YOU!!!”

Check out Daniel’s blog post, download the game and give it a whirl. I plan on showing my nephews this to get the opinions of the younger, not-a-teenager set. I doubt my niece will be interested in this right now…unless there was a Disney princess in there with the sea monsters. I will leave it to Daniel to incorporate that in the 2.0 version. 🙂

For the first time ever, I will be doing a general session at PASS Summit this year.

Waiting for an email with news one way or the other last week was angst-ridden, to say the least. After receiving no notification with the first wave on Tuesday, I got even more anxious and tried to occupy myself with work and other things. I didn't want to get excited over something that wasn't even a thing yet.

It didn't sink in right away. I had myself so convinced I wasn't going to get selected. Maybe my head and heart were still bracing for a disappointment that was not to happen. Why would I be thinking this way? This is the third time I have submitted and the first time I have had a general session accepted. I did have the opportunity to do a lightning talk last year. I also got to compete in Speaker Idol, which was an amazing and valuable experience that is only available to those who have not done a general session before. More on this later.

The first year I submitted I had been writing abstracts the same way I normally did for SQL Saturday events, not knowing there was more involved for a PASS Summit abstract. I logged in and was about to fill out everything for each of my sessions when I saw all these fields to fill out that I didn’t know about. I quickly got things written for them, along with my main abstracts, and submitted.

But I didn’t get selected.

The next year PASS offered a program for feedback on Summit abstracts. I thought “Yes, this is great! I will write my abstracts, get feedback, and I will be speaking at Summit this year!”

But I didn’t get selected for a general session.

Both times I was disappointed. Both times I did a fair amount of sulking and grumbling, but I am only human. In these cases, my best was beat out by others that were better. Kind of a hard pill to swallow, but I think these challenges are what make us who we are – IT professionals that are constantly striving to improve our own skills while creating content that will convey knowledge and better the SQL Server community.

Last year, however, I was selected for a lightning talk and I finally got to participate in Speaker Idol. Doing my lightning talk on How to Keep Your DBA from KILLING YOU was fun and a great experience.

Speaker Idol? Wow…how do I put this? If you are a new speaker, and have not spoken at PASS Summit before (general session or better), throw your name in the hat for this! If selected, you will have the opportunity to get up and present something in five minutes, and get immediate feedback from a panel of judges, who also happen to be well known and experienced speakers in the SQL community. When this was first done in 2014, I heard about the brutality of the first round, so I made sure I was there for the subsequent rounds. Your audience for this? Mostly speakers or aspiring speakers. Don’t be surprised if the final round is standing room only. Or if there are acrobatics (lookin’ at you, Rob).

Moving on to 2017…

Earlier this year it was announced that the Summit selection process was changing yet again. There would be no feedback to speakers and you were also limited to submitting three sessions. There was also a list of topics, with some marked as “hot topics” or topics that PASS really wanted content on. It was also stated that they wanted new content. That session you had been peddling to every SQL Saturday that would have you? Not new content.

I took all this info and combined it with a ramped-up abstract writing and review process. What is ramped-up? Reading the sessions that were selected last year, and having a reviewer that is a seasoned PASS Summit speaker. It also helped that my reviewer was super critical. Once it was announced that submissions were being accepted I logged in and grabbed all the sections that I would need to fill out and got started on my abstracts, outlining details before writing anything. There was even one session I outlined while on a plane, that I looked at later to write up and decided it was COMPLETE GARBAGE…AND WHAT WAS I THINKING???

What did I learn from the PASS Summit submission and selection process this year?

Review, review, review. Write them early, review, come back later and review again. Have other experienced PASS Summit speakers review your abstracts. Don’t like their feedback or think they are being mean? Ummmm….they are trying to help you get your abstract to a level that is worthy of PASS Summit. LISTEN TO THEM!!! And buy them a beer.

Nothing is guaranteed. In a previous blog post I told y'all how to pull all the submitted sessions into a table and even gave you some PowerShell code to do it. I was able to compare that to the list of speakers selected, and there were some amazing people that have spoken at PASS Summit before who didn't make the cut this year. Sessions are reviewed and selected by human beings. This is a gamble, and regardless of how good an abstract you have, you are at the mercy of the session review committee and what they feel is good enough for Summit. This is a monumental task, but they hunker down and get it done so that all of us can have an amazing selection of sessions to pick from.

I hope I get to see everyone at PASS Summit this year, whether or not they are speaking. Out of the conferences that I have been to over the years, I have found PASS Summit to not only be the best of them, but it was where I discovered that the #SQLFamily thing is real. If you are open to it, you can get hooked into groups that will welcome you and include you all because you are there for the same reason they are – to expand your knowledge and career in the area of SQL Server.