John Morehouse
sqlrus.com // dba // bat whisperer

Feeling Stunned
Mon, 16 Oct 2017
If you work in the Microsoft technology space, you are probably aware of the mass of people who refresh their inboxes like madmen on the first of the month. It used to be the first of every quarter, but that has since changed. I’ll admit that I was one of those individuals for some time. For those who might not know, I’m referring to the Microsoft MVP award program, an annual award that recognizes technology community leaders worldwide. These leaders actively share their knowledge and expertise with others across many different facets of the Microsoft technology stack. Award recipients have a deep commitment to the community and display a willingness to help others.

To be honest, I had never really understood that feeling of receiving the email because, well, until now I had never gotten it.

I get it now.

I’m excited to more officially announce that as of October 1st, 2017, I have been awarded the Microsoft MVP award for 2018-2019. This is my first time receiving this award, and I’m extremely humbled and honored that the people at Microsoft felt I was deserving of it.

I assure you that it’s a feeling of stunned disbelief. A feeling that makes you double-check the email several times during the course of the day, and quite possibly into the following day and upcoming week.

Nominate Others

As I have done previously, I challenge you to nominate someone in your circle who is worthy. Even if it isn’t for the MVP award, nominate them for something, anything. Even if you simply email them to say thank you, I assure you it will make a difference to them. Getting any type of recognition is a big thing, so make it happen.

If you do want to nominate someone for the Microsoft MVP award, you will need to know their email address and can submit the nomination here.

Thank You

I want to take a moment and thank the friends & family who supported me. You know who you are, and I deeply appreciate all of the support you’ve given over the years. It hasn’t gone unnoticed.

I always have a special place in my heart for the SQL Server user groups of the Midwest. I ran the Omaha chapter for several years, so there’s a special fondness for groups in the areas surrounding Nebraska that provide training & networking for their communities. Don’t get me wrong, if a user group anywhere wants me to speak, I’m more than happy to do so if my schedule allows. But if a group from Nebraska or Iowa asks, I’ll make it happen.

With that said, when the Quad Cities PASS User group asked me to speak, I absolutely said yes. So tonight, Thursday August 9th, 2017, at 6/7PM (CST/EST) I’ll be doing a remote presentation for the group. It’s one of my favorite sessions to present and I’m really looking forward to it!

Do you find yourself constantly putting out “fires”? Can’t take any more heat in the data center? We as DBAs can easily become burned out dealing with these daily blazes. Wouldn’t it be grand to learn how to prevent these fires from scorching us in the first place? In this session, we will jump into things you can start implementing as “fire prevention” to stop the “fire fighting”. We will talk about several things that you can take back to your shop and easily put into action, ranging from code tuning and backup maintenance to monitoring and performance tuning!

If you are in the area, please make sure to RSVP so that the organizers have an accurate head count for food!!

Start Your Engines with Startup Procedures

After reviewing some recent audit results for a couple of our SQL Servers, I noted that the audit checked for startup procedures. This caught my attention, as I was not fully aware of what those were. Naturally, I decided to investigate further.

Startup procedures automatically execute whenever SQL Server is started. Where would you use this? One example is if you had an application that required the use of a global temporary table, you could use a startup procedure to create it. This would allow the table to be immediately accessible to any application that requires it.
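As a rough sketch (the procedure and table names here are hypothetical, not from the original post), such a startup procedure might look like this:

USE master;  -- startup procedures must live in the master database
GO
-- Create a global temp table at instance startup so it is
-- immediately available to any application that connects.
CREATE PROCEDURE dbo.CreateGlobalLookupTable
AS
BEGIN
    CREATE TABLE ##AppLookup
    (
        LookupID    int          NOT NULL,
        LookupValue varchar(100) NOT NULL
    );
END;
GO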

Another possibility is that you could “warm up” the cache so that data is already cached when an application needs it. This would help prevent that initial I/O hit the first time the query is called, thus potentially improving initial performance.
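A minimal sketch of that idea, assuming a hypothetical hot table, could be as simple as scanning the table once so its pages land in the buffer pool:

-- Hypothetical warm-up procedure: read the hot table once at startup
-- so its pages are cached before the first real query arrives.
CREATE PROCEDURE dbo.WarmUpCache
AS
BEGIN
    DECLARE @RowCount bigint;

    SELECT @RowCount = COUNT_BIG(*)
    FROM SalesDB.dbo.Orders;  -- SalesDB.dbo.Orders is a placeholder name
END;
GO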

Gotchas

Just like with anything within SQL Server, there are some things that you need to be aware of:

Each startup procedure consumes a worker thread while executing

The stored procedure must live in the Master database

Only system administrators can set a procedure as a startup procedure (this is a good thing actually). The SA account must also own the procedure

It cannot have any input or output parameters

The procedure executes with the same permissions as the sysadmin role

It should not return any result sets. It gets executed by the instance, so there isn’t anywhere for the results to go

How To

So how do you configure this wizardry, you ask? There are two methods to enable this.

Use sp_configure

Use sp_procoption

Let’s walk through both of them.

Using sp_configure is simple. If we look at sys.configurations, we can see whether or not it is already enabled. It is disabled by default.
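A quick way to check (sys.configurations and these columns are standard; value is the configured setting, value_in_use is what the running instance is actually using):

-- Check the current state of the 'scan for startup procs' setting.
SELECT [name], value, value_in_use
FROM sys.configurations
WHERE [name] = N'scan for startup procs';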

From the above, we can see that it is disabled for this particular instance. This is an advanced configuration setting, so we have to ensure that ‘show advanced options’ is enabled. Once that is completed, we can then enable the scan for startup procedures.
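Something along these lines:

-- 'scan for startup procs' is an advanced option, so expose it first.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Enable the scan for startup procedures.
EXEC sp_configure 'scan for startup procs', 1;
RECONFIGURE;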

Hold the phone. Even after setting it to be enabled, it STILL shows disabled.

This is another gotcha. You have to restart the SQL Server service in order for the configuration change to take effect. So, plan accordingly if you have a need to use this feature.

As a side note, if you start SQL Server with the -f flag (which starts the instance with a minimal configuration), the startup procedures will not be executed. You can also use trace flag 4022 as another option to skip them upon startup.
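For example, a default instance can be started with the trace flag from an elevated command prompt using NET START MSSQLSERVER /T4022 (the MSSQLSERVER service name applies only to a default instance; named instances will differ).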

The other option is to use the system procedure sp_procoption. Its sole purpose is to set or clear a procedure for automatic execution. Just like sp_configure, this procedure is fairly simple to use as well:

exec sp_procoption @ProcName = 'dbo.LoadAllOfTheThingsIntoCache', @OptionName = 'startup', @OptionValue = 'off';
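Turning it on is the mirror image (the procedure name here is the same hypothetical one):

-- Mark the procedure for automatic execution at instance startup.
EXEC sp_procoption @ProcName = 'dbo.LoadAllOfTheThingsIntoCache', @OptionName = 'startup', @OptionValue = 'on';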
As a side note, if ‘scan for startup procs’ is not enabled when you use sp_procoption, it will be automatically enabled for you.

Just remember that any procedure that is configured to be executed upon startup has to live in the Master database.
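If you ever need to verify which procedures are currently flagged for startup, sys.procedures exposes this via the is_auto_executed column:

-- List all procedures in master that are set to run at startup.
USE master;
GO
SELECT [name], create_date, modify_date
FROM sys.procedures
WHERE is_auto_executed = 1;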

Real World

Have I ever used startup procedures? No, I have not. I believe that it’s an edge case, however it is always good to know what options you have available to you. I do think that using it to warm up the cache for highly transactional systems might be useful.

With that being said, here’s the obligatory warning:

This was informational only. Do NOT go wild and create 100+ startup procedures on your SQL Servers. Doing so will cause problems. As we have learned from Indiana Jones, choose wisely.

If anybody is using startup procedures, I would love to hear about your experience in the comments!

I’m Speaking! SQL Saturday #653 – Columbus
Mon, 17 Jul 2017

I’ll admit it, every time I think of Ohio I have flashbacks to the Drew Carey show. The show was set in Cleveland, Ohio. Don’t ask me why.

I’m excited that this weekend I’ll be traveling to Columbus, Ohio (not Cleveland) for their annual SQL Saturday. This will be my first time attending their event. It is only three or so hours away, so it’s an easy drive for me!

They have a great schedule of sessions put together. Make sure to check out the schedule and get registered for the event!!

Here is the session that I’ll be presenting in room Roush ‘Fisher 1’ at 9:45 AM, which is one of my favorites:

Have you ever taken apart a toaster or an alarm clock just to see how it worked? Ever wondered how that database actually functions at the record level, behind the scenes? SQL Server Databaseology is the study of SQL Server databases and their structures down to the very core of the records themselves. In this session, we will explore some of the deep inner workings of a SQL Server database at the record and page level. You will walk away with a better understanding of how SQL Server stores data and that knowledge will allow you to build better, faster databases.

Honestly, is it ever too early to learn about internals? I think not! Bring your coffee and let’s dive in!

How Can You Beat FREE Training?

Remember that these events are a FREE day of awesome SQL Server training and professional development. The only cost to you is if you decide to purchase lunch! Check out the schedule and register. In today’s economy, how can you afford to pass up FREE training?!?! The answer is that you can’t. Go register, get out there and learn. I triple dog dare you.

If you make it out to see my session, make sure to come up and introduce yourself as I’d love to meet you!

I’m Speaking! Evansville Technology Group
Wed, 12 Jul 2017

SQL Server user groups are a backbone of the SQL Server community. I currently help run the Louisville SQL Server user group, and prior to that I ran the Omaha SQL Server user group for many years. I know how difficult it can be to get in-person speakers. Needless to say, when I was asked to speak at a group within driving distance, I jumped at the opportunity.

So on Thursday July 20th, 2017, I’ll be making the drive over to Evansville, Indiana, where I’ll be presenting for the Evansville Technology Group. This group is not just SQL Server, but rather a combination of SQL Server, PowerShell, and .NET. I’m really looking forward to presenting for them! Lunch will be provided!

Continuous Integration & Delivery isn’t a new concept. Application Developers have been doing it for quite a while and now it’s time for Database Professionals to catch up. Whether you are a database administrator, database developer, or even an application developer, database continuous delivery can help you find your salvation. I’ll show you how to apply these concepts to areas of security, collaboration, testing, support and deployments. We will look at what continuous delivery means, demonstrate some tools that can help you easily get it into place, and dive into why it’s important.

If you are in the Evansville, Indiana area over the lunch hour on Thursday July 20th, please stop by!! You can register for the event here. If you do decide to go, please make sure to RSVP so that the organizers have an accurate head count for food.

I’m Hosting Idera’s July #sqlchat

Come join me on Wednesday July 19th at 11:00AM CST for Idera’s #sqlchat on Twitter!! We will be discussing database continuous delivery and questions that DBAs might have about it! If you aren’t familiar with #sqlchat, every month Idera (B|T) hosts an online Twitter chat session. From Idera:

This #SQLChat will allow members from the SQL community to discuss SQL Server’s biggest topics with IDERA product experts, and have a chance to win a giveaway prize. This month’s giveaway prize is a Google Home!

Database Continuous Delivery

Database continuous delivery can be a scary topic for database administrators. It requires the DBA to start trusting tools & processes, and that’s like pulling whitening from a bull ox. However daunting, this is becoming a hot new buzz term in conjunction with “devops”.

T-SQL Tuesday #91 – Start Talking
Tue, 13 Jun 2017

T-SQL Tuesday is a monthly blog gathering for the SQL Server/data professional community. It is the brainchild of Adam Machanic (B|T) and is not limited to just things around the SQL Server database engine. Each month a blogger hosts the event, and anybody who wants to contribute can write a post about that month’s topic. You can find a list of all topics at http://tsqltuesday.com/.

This month’s T-SQL Tuesday topic is about DevOps. It is being hosted by Grant Fritchey (B|T).

Grant asks some specific questions in this month’s posting:

How do we approach DevOps as developers, DBAs, report writers, analysts and database developers?

How do we deal with data persistence, process, source control and all the rest of the tools and mechanisms, and most importantly, culture, that would enable us to get better, higher functioning teams put together?

We’ll discuss each one, but first a recap.

What exactly is DevOps? Wikipedia tells us: “DevOps (a clipped compound of “software DEVelopment” and “information technology OPerationS”) is a term used to refer to a set of practices that emphasize the collaboration and communication of both software developers and information technology (IT) professionals while automating the process of software delivery and infrastructure changes.”

Great. Now we know what it is. It is intended to be a set of practices that stress the collaboration and communication of basically everybody (and I mean everybody) who might be involved in application development and delivery. Ironically, this set of practices has morphed into an actual job title over the years. My employer just recently hired a “DevOps Engineer”.

So, let’s go back to Grant’s questions.

How do we approach DevOps as developers, DBAs, report writers, analysts and database developers?

Honestly, I think it’s time for us data professionals to get out of the data center and get on board with the DevOps movement. Let us take off the cloak of invisibility and get our hands dirty. If we look at the overall “DevOps” role, we should want to have DevOps in our world. While it may be a hard journey, in the long run it will make our jobs easier. How would you like to have push button deployments or less manual code reviews? I know that I do.

How do you do this? Easy. Talk. Communicate. Collaborate. Break it down now. Go talk to your respective teams (app dev, infrastructure, DBAs, management, etc.) and get them talking. Everybody will have their own opinion, and that is okay. The important part is to get everybody talking.

If you need a starting point, ask this simple question: “What if we could implement a process to deliver application changes to the wild that could potentially have little to no impact on our customers?” Think about that. Do you work in an environment where deployments cause outages for your customers? I bet for the majority of us, that is true.

If your customers have a better end user experience, meaning little to no down time, isn’t that the name of the game?

DevOps helps to answer that question.

How do we deal with data persistence, process, source control and all the rest of the tools and mechanisms, and most importantly, culture, that would enable us to get better, higher functioning teams put together?

In my opinion, the answer is simple. Trust. As data professionals, we often question the process because once you involve data (the data is the business), we get very distrustful of changes. This is probably rightfully so. How many times has a deployment gone sideways for you? Ever have to roll back a deployment because it barfed? I know that I have been there, and usually it’s not that much fun. Hopefully every DBA has a recovery strategy in place to handle such events.

In order to reach a true “DevOps” method of delivering application changes into the wild, we have to get our hands off of it. And I mean OFF. This means tools must be in place to facilitate this. It also means that we have to TRUST the tools to do their job and do it well. Tools such as Octopus Deploy, TeamCity, Jenkins, TFS, Red Gate, etc. All of these third-party vendors pour money, time, and effort into making them as rock solid as possible.

Trusting the tools is difficult for most database administrators. We want to see the guts of the deployment, making sure dangerous things do not happen to our precious databases and their contents.

Experience

I was a part of a database continuous delivery project at Farm Credit Services of America in conjunction with Alex Yates (B|T) and Bob Walker (B|T), who have both blogged about the experience. I have also been pushing a similar project at Farm Credit Mid-America.

I’ll admit that when the project started up at FCSA, I was that distrusting, skeptical DBA who thought this would never work. Yes, it was hard. It was cumbersome and clunky in the beginning. Then it got easier and the pieces fell into place. Iterations to deliver changes to the wild got easier. Quickly, I began to trust the process and how it was going to work.

I highly recommend Bob’s series of blog posts about the process. You can find them here. Alex also has several excellent posts around the process here.

Summary

In short, I’m a big supporter of the DevOps movement, especially concerning database lifecycle management (DLM) and continuous delivery. While the road may be difficult, there are organizations out there that can help you get there. I assure you that the end results will be worth every penny.

I’m Speaking! Capital Area SSUG
Mon, 12 Jun 2017

I like it when I get referred to speak at various functions. Whether it’s a user group, conference or just something informal, I really enjoy speaking (yes, I’m crazy) and giving back to the community.

This time I’ll be speaking for the Capital Area SQL Server User group out of Albany, New York. This will be a remote session and I’ll be speaking on Continuous Delivery & Your Salvation.

Continuous Integration & Delivery isn’t a new concept. Application Developers have been doing it for quite a while and now it’s time for Database Professionals to catch up. Whether you are a database administrator, database developer, or even an application developer, database continuous delivery can help you find your salvation. I’ll show you how to apply these concepts to areas of security, collaboration, testing, support and deployments. We will look at what continuous delivery means, demonstrate some tools that can help you easily get it into place, and dive into why it’s important.

If you are in the Albany, New York area tonight from 5:30PM to 7:30PM(ish) please stop by!! You can register for the event here. If you do decide to go, please make sure to RSVP so that the organizers have an accurate head count for food.

Speaking of food, the only sad thing about this is that the group is having BBQ this evening. Why is that sad? Because I love good BBQ and I’m missing the food!!

Reverse It
Tue, 06 Jun 2017

Recently I had a conversation with a colleague who wasn’t aware of the REVERSE function in SQL Server. I had sent them a code snippet and they inquired as to what the purpose of the function was. Essentially, it’s a function that reverses any string value.

For example, the phrase:

The brown dog jumped over the lazy fox

reversed looks like this

xof yzal eht revo depmuj god nworb ehT

Or in Management Studio:
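It’s a one-line query:

SELECT REVERSE('The brown dog jumped over the lazy fox') AS Reversed;
-- Returns: xof yzal eht revo depmuj god nworb ehT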

Awesome. Where can I use this? I use this function when I need to get the file names of a database’s data or log files. These are the steps to do this:

Reverse the full path of the file name

Use the LEFT function to return all of the characters left of the first instance of “\”
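Putting those steps together, and reversing the result once more to restore the original character order, a sketch against sys.master_files looks like this:

-- Extract just the file name from each database file's full path:
-- reverse the path, take everything left of the first '\' (which is
-- the file name, reversed), then reverse again to restore the order.
SELECT physical_name,
       REVERSE(LEFT(REVERSE(physical_name),
                    CHARINDEX('\', REVERSE(physical_name)) - 1)) AS FileName
FROM sys.master_files;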

Ola Hallengren: Output File
Tue, 30 May 2017

I’m a huge fan of Ola Hallengren’s free utility scripts. If you aren’t familiar with them, they are a great tool for managing backups, index maintenance, and the overall health of your SQL Servers. They are utilized around the globe and are completely free.

Although his script is very thorough, I still like to tweak things. One of the things that I tend to change is the way that the log file for each respective job is named. By default, the file name is configured in the following format:

JobName_JobID_StepID_YYYYMMDD_HHMMSS.txt

which results in file names that look like this:

If you have never looked, these files contain information about any failures or issues the job(s) might have encountered. The difficult part is that, when looking at the list, it’s hard to tell which file belongs to which job. In the example above, one is a FULL backup, another a DIFF backup, and the third a TRAN LOG backup, but it is difficult to determine which is which. However, you can see that each JobID is a unique identifier, which correlates to the job ID of the specific job.

There are two ways to adjust this.

Update the maintenance script itself to adjust the file names. In the script, configuration of job logging starts at approximately line 4127; your mileage may vary depending on which version you have. You can search for the phrase “Log completing information” in the file to determine where it starts. Once you find its location, you can adjust the file name however you want.

If you don’t want to adjust the script, you can use the script below to manually adjust the output file name. This script will produce the appropriate parameters for sp_update_jobstep. You can then copy/paste the commands into a query window and execute them in a controlled manner.
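As a rough sketch of that idea (the C:\SQLLogs folder is a placeholder, and this is an assumption about the approach rather than the author’s exact script), something like this would generate the commands:

-- Generate sp_update_jobstep calls that embed the job name in the
-- output file name, alongside SQL Agent tokens for step, date, and
-- time. Review the generated commands before executing any of them.
SELECT 'EXEC msdb.dbo.sp_update_jobstep '
     + '@job_id = ''' + CONVERT(varchar(36), js.job_id) + ''', '
     + '@step_id = ' + CONVERT(varchar(10), js.step_id) + ', '
     + '@output_file_name = N''C:\SQLLogs\' + REPLACE(j.[name], ' ', '')
     + '_$(ESCAPE_SQUOTE(STEPID))_$(ESCAPE_SQUOTE(STRTDT))_$(ESCAPE_SQUOTE(STRTTM)).txt'''
       AS UpdateCommand
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobsteps AS js ON js.job_id = j.job_id
WHERE js.output_file_name IS NOT NULL;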