TSQL2sday #68 – Defaults

A big thank you goes out to Andy Yun (b|t) for hosting this month’s TSQL2sday party. This month’s topic is Just Say No to Defaults.

I have a Bachelor’s degree in Mathematics with a minor in Computer Science. So when I say I’m a fairly smart person, it’s not bragging, it’s simply the truth. So when people say, “Any monkey can be a SQL Server DBA”, I find it offensive. While that statement may be close to true, it’s not the whole truth. Yes, Microsoft made SQL Server easy to use right out of the box, but if you want SQL Server to run well, you better have one of those really smart monkeys.

I can’t tell you how many times I have been approached by friends who are developers, DBAs for other RDBMSs, or SysAdmins, asking me to “fix” their SQL Server environment. They are really smart people, but they made the mistake of thinking they could simply install SQL Server, accept all the defaults, be off and running, and never look back. The biggest complaint I hear from these same people is that “SQL Server is slow”. My response is usually something like, “Hmm, I can’t imagine why that would be”, in my most snicker-y voice.

There are so many things that can be tweaked in SQL Server to improve performance, but there are two things you can change right off the bat that will make a world of difference. They are the defaults for database file location and max memory. In fact, if you don’t change the default location for your database files and the max memory setting, a lot of the other performance tweaks won’t make that big of a difference.

Database File Location

When the SQL Server install runs, it asks where you want to put your database files. The default location is on the same drive where SQL Server is installed, which is typically the same drive as the OS installation. Do NOT accept this default, just say NO! If you have a high-volume transactional system, this will cause competition with the OS, and guess who loses? You do. You should take this a step further and separate your data files from your log files. And tempdb should have a drive all to itself. (Note: When I say drive, I am referring to physically separate disks, not a single disk that has been partitioned into multiple drives. If you’re using a SAN, make sure you coordinate with your SAN administrator to get your drives created from the correct “LUN pools”.)
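If you already installed with the defaults, you can still fix it after the fact. Here is a minimal sketch of one commonly used way (the drive letters are made up for illustration; the registry write is the same undocumented one SSMS performs under the covers, and the new defaults and the tempdb file moves only take effect after a service restart):

-- Check where new databases will land today (SQL Server 2012 and later).
SELECT SERVERPROPERTY('InstanceDefaultDataPath') AS DefaultDataPath,
       SERVERPROPERTY('InstanceDefaultLogPath')  AS DefaultLogPath;

-- Point the defaults at dedicated drives.
EXEC xp_instance_regwrite N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\MSSQLServer\MSSQLServer',
    N'DefaultData', REG_SZ, N'D:\SQLData';
EXEC xp_instance_regwrite N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\MSSQLServer\MSSQLServer',
    N'DefaultLog', REG_SZ, N'L:\SQLLogs';

-- Give tempdb its own drive (the files move at the next restart).
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = N'T:\TempDB\tempdb.mdf');
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = N'T:\TempDB\templog.ldf');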

Max Memory

After SQL Server has been installed, the default max memory setting basically allows SQL Server to use as much memory as it wants. Sounds like a good idea on the surface, but just say NO! SQL Server is very greedy when it comes to memory; it will take every byte your server has to offer, leaving your OS starved. My general rule of thumb is to allocate ¾ of the total memory to SQL Server, leaving the rest for the OS, but never less than 2GB and no more than 4GB.
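Changing it is a one-minute job. A minimal sketch, assuming a hypothetical server with 16GB of RAM where we leave 4GB for the OS:

-- Cap SQL Server at 12GB on a 16GB server, leaving 4GB for the OS.
-- The value is in MB: 12288MB = 12GB. Adjust for your hardware.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 12288;
RECONFIGURE;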

These are not the only defaults you should change, but these two will get you the most bang for your buck. They are easy to change, and the implications/consequences of these changes are easy to understand. If you are a beginner with SQL Server, start with these two things; they will make you look like a very smart monkey.

TSQL2sday #66 – Monitoring

A big thank you to Cathrine Wilhelmsen (blog | twitter) for hosting this month’s TSQL2sday party. Monitoring is this month’s topic and it’s a very important one. It could mean the difference between having a job and looking for a job.

When I started working with SQL Server (a long time ago, in a galaxy far far away) there were no commercial monitoring tools available and quite often I would get called or paged (yes, it was a long time ago) in the middle of the night by a very angry boss because there was something “wrong” with the database. Or worse yet, I would get no call at all and show up at work the next morning with a line of angry people waiting for me when I got off the elevator. It only took a couple of these encounters for me to realize that I needed to be much more proactive or change my line of work (I had heard that underwater basket weaving was an easy gig).

I started looking at the reasons I was being called and discovered most of them were things that could easily have been avoided if I had known about them earlier. Things like database and transaction log files filling up, running out of disk space, and processes/queries that were taking longer and longer. Since there were no commercial monitoring tools out there, I decided I needed to essentially roll my own.
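In modern terms, the simplest check at the heart of it looks something like this (a sketch using sys.master_files, available from SQL Server 2005 on; my original version predated it and poked at the old system tables directly):

-- List every database file with its current size and growth cap, so a
-- file about to hit its limit shows up before the angry phone call does.
SELECT  DB_NAME(mf.database_id) AS database_name,
        mf.name                 AS logical_file_name,
        mf.type_desc,
        mf.size * 8 / 1024      AS current_size_mb,
        CASE mf.max_size
            WHEN -1 THEN NULL   -- NULL means unrestricted growth
            ELSE mf.max_size * 8 / 1024
        END                     AS max_size_mb
FROM sys.master_files AS mf
ORDER BY database_name, mf.type_desc;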

I had to start looking under the covers to find what I was looking for. This gave me even greater exposure to how SQL Server worked. Did I mention that this was before Google? I couldn’t just search for easy answers, I had to really dig into the system databases to find what I wanted. This was, in fact, one of the best things that could have happened to me so early in my career as a DBA. I was forced to learn how SQL Server worked on my own.

To this day, I still “carry” around my homegrown monitoring solution in my toolbox. I have updated and expanded it through the years to accommodate newer versions and functionality, and made it more efficient along the way. Not all shops have the budget for monitoring tools, and even if they do, a lot of the time they are only willing to spend that money on production servers, not development or test (don’t get me started, that’s an entirely different blog post).

My little monitoring solution has come in handy over the years because it has afforded me the opportunity to discover what’s under the covers of the newest versions and features of SQL Server, and to provide a no-cost basic monitoring solution to my employers when the budget is tight or non-existent. If you don’t have your own monitoring solution, I would highly recommend you create one, if for nothing more than the reasons I stated above.

Don’t get me wrong, I love the commercial monitoring tools that I have access to now, but knowing the how and why of SQL Server will only make you a better DBA and could possibly mean the difference between having a job and looking for a job.

Managing Security – TSQL2sday #63

A big thank you goes out to Kenneth Fisher (b|t) for hosting this month’s TSQL2sday party. Security is a big deal. How many times have you opened the paper (I’m dating myself, I know – no one reads an actual newspaper anymore, it’s all online now) in the last 6 months and seen a story about another security breach, more records compromised or flat out stolen? Too many. While securing your data is probably the key to keeping your current employment status, there’s also a piece of security that is quite often overlooked and could be the reason for a resume-generating event: recovering from a failed server when you don’t use any of the HA features that are now available.

The scenario:
Your production server has failed and you don’t use any of those new fancy HA features like Always On Availability Groups, Log Shipping or even Database Mirroring. Your server hosts a standalone instance for the HR/Payroll department. Payroll must be processed in the next two hours or your company will be out of compliance with Federal Regulations and face heavy fines, not to mention all the really mad employees who won’t get their paychecks on time. I don’t know about you, but I do NOT want to be responsible for every employee not getting a paycheck, including myself.

You have a good backup plan in place, you take full, differential and log backups on a schedule that meets the minimum required data loss SLA and send those backups to a remote SAN data store. Your Sysadmin stands up a new standalone server for you in 30 minutes. You install and configure SQL Server in about 60 minutes (those pesky service packs and cumulative updates can take quite a bit of time). Now you are left with 30 minutes to get your databases restored and functioning. No sweat! Easy as 1..2..3, right? Wrong!
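The restore chain itself goes off without a hitch; sketched out, it looks something like this (the database name and backup paths are made up for the example):

-- Full, then differential, then log backups, in order. NORECOVERY
-- leaves the database ready for the next restore in the chain; the
-- final RECOVERY brings it online.
RESTORE DATABASE HRPayroll
    FROM DISK = N'\\SANStore\Backups\HRPayroll_Full.bak'
    WITH NORECOVERY;

RESTORE DATABASE HRPayroll
    FROM DISK = N'\\SANStore\Backups\HRPayroll_Diff.bak'
    WITH NORECOVERY;

RESTORE LOG HRPayroll
    FROM DISK = N'\\SANStore\Backups\HRPayroll_Log.trn'
    WITH RECOVERY;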

You restore your database only to discover that all your logins no longer exist on your brand new server. No problem, just recreate the logins and give them brand new passwords (SQL Authentication). All will be right with the world. You give your HR/Payroll department the okay to proceed and you catch your breath with 20 minutes to spare. The phone rings 5 minutes later: it’s HR/Payroll and it’s not working. They are getting invalid login errors. You have that momentary flashback to when you helped with the application install 4 years ago – the vendor hard-coded the password into their application code, so you can’t just change it or give it a new password. That’s when you remember that you created a job to script the logins with their passwords on a weekly basis and saved the results off to a file on that same remote SAN data store as the backups. Hallelujah! You find your script on the remote SAN data store, clean up the logins you created, then execute the script with the logins and their passwords. HR/Payroll is back up and running with 4 minutes to spare.

Paychecks for everyone!

While some of this may seem far-fetched, it’s based on an actual incident very early in my career. I may have embellished a little, but you get the point. You need to make sure you can recreate any login on your server at any time due to disaster/failure. If you can’t, you may just be looking for a new job.

To this day I still script the logins on all my servers on a weekly basis. I store that file in a secure location on a remote server. I’ve never had to use one since this original incident, but it’s nice to know that I can recreate the logins if I ever need to. Can you?
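If the answer is no, here is a minimal sketch to start from, in the spirit of Microsoft’s sp_help_revlogin script from KB 918992 (it covers SQL logins only, since Windows logins carry no password to capture):

-- Generate CREATE LOGIN statements for every SQL login, preserving
-- the password hash and the SID so restored database users map back
-- to their logins on the new server without orphaning.
SELECT 'CREATE LOGIN ' + QUOTENAME(name)
     + ' WITH PASSWORD = '
     + CONVERT(varchar(514), CONVERT(varbinary(256), LOGINPROPERTY(name, 'PasswordHash')), 1)
     + ' HASHED, SID = '
     + CONVERT(varchar(514), sid, 1)
     + ';' AS create_login_stmt
FROM sys.sql_logins
WHERE name NOT LIKE '##%';  -- skip internal certificate-based logins

Schedule that in a job, save the output somewhere safe and remote, and you can recreate your logins at any time.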

Giving Back – T-SQL Tuesday #61

First off I wanted to thank Wayne Sheffield (Twitter | Blog) for hosting this month’s T-SQL Tuesday party and Adam Machanic (Twitter | Blog) for starting this party five years ago. I can’t believe it’s been five years.

This month’s theme is Giving Back to the SQL Community.

This is a great topic; it’s kind of like Dickens’ holiday classic, A Christmas Carol. It gives me an opportunity to reflect on what I have done in the past, what I am doing now, and what more I could be doing to give back to the community that has helped me so much in my career.

Past Giving

Member of the Abstract Review Committee (2010-2014)
Member of the Nomination Committee (2012)
“Speaker Wrangler” for Triad SQL, the local PASS chapter in Greensboro, NC (2012-2014)
PASS Ambassador at Summit (2010-2014)
24 Hours of PASS moderator (2012-2014)
PASS Summit Buddy (2013)
Chosen as a mentor in Steve Jones’ and Andy Warren’s The Mentoring Experiment (2012)

Present Giving

“Speaker/Sponsor/Swag Wrangler” for Triad SQL, the local PASS chapter in Greensboro, NC
Program Manager with the Program Committee for the PASS Summit.

Future Giving

I will continue my work (as long as they will have me) with the Program Committee. I absolutely love this volunteer position, it allows me to do something I enjoy while helping the community. It’s a win-win. I will also continue my position with Triad SQL. I have stepped it up a notch by wrangling not only speakers but sponsors and swag as well for 2015.

I enjoy moderating the 24 Hours of PASS too. It always gives me an opportunity to connect with some of the speakers that I’ve never had interaction with before. Just this year, I moderated for Gail Shaw (Twitter | Blog) and then got to meet her in person at Red Gate’s SQL in the City event in Seattle just before the Summit. How cool is that?!

I have a blog, in fact you’re reading it right now (wink wink), but I don’t write nearly enough. Part of me feels, “Surely someone has written about this before, so why should I clutter up cyberspace with my drivel?” and the other part of me feels, “Maybe my post can help someone who couldn’t quite make heads or tails of something they found via Google.” So for next year, I am setting a goal to write/blog at least once a month. I bet T-SQL Tuesday can provide the subject matter for those months where nothing interesting happens at work.

My favorite way, by far, to give back to the SQL Community is by being a PASS Ambassador during the PASS Summit. I get to see all the eager faces ready to stuff their brains to the point of overflowing. I get to help someone find the registration desk so their adventure can begin. I also get some of the first hugs of the Summit just by smiling and answering questions. It really is the best. So, as long as this program exists and I’m attending the Summit, I will continue to be a PASS Ambassador.

Lastly, I may even try speaking this year.  This is a huge deal for me, so I make no promises, other than to think about speaking.

How are you giving back?

The Long and Winding Road

“The Long and Winding Road” is one of my favorite Beatles songs; it reminds me of my dad, who was a huge Beatles fan.  My dad passed away almost five years ago, but I still miss him like it was yesterday, and I still remember the advice he gave me when I was a kid: “Work hard and don’t let anybody tell you you can’t do something, there’s always a way.  It may not be easy, but there’s always a way.”  I credit that advice for the success I’ve had in my life, both personal and professional.

I started working with SQL Server back on Windows 3.11 (version 4.21a).  I know I’m dating myself here, but that’s kind of the point of this post.  We thought Windows 3.11 was so cool after having to deal with DOS for so many years, we didn’t think it could get any cooler.  But it did.  We got Windows 95, then 98, XP, 7 and now 8.  I can’t wait to see what they come up with next.

The same goes for SQL Server.

We started out with what I thought was a great tool, then they made it better.  We went from having to create devices to hold our database files and keep track of every file’s growth, to just being able to create our database files and let them grow on their own.  I don’t know about anybody else, but that made me so happy.  No more running sp_helpdb every night to make sure I could recreate my databases if needed in a disaster.

It used to be very hard to write T-SQL code if you weren’t familiar with the database objects; the old ISQL query window didn’t have an object browser, so you couldn’t see a list of your tables, let alone column names.  Now we have IntelliSense built right into the query window.  This feature alone is worth its weight in gold.  It allows me to be so much more productive: type the first few letters and hit tab.  “BAM!” as Emeril would say.

BCP really used to be the only way to get data in and out of your database, then they gave us DTS.  Once again, it was like going from DOS to Windows.  Then they came out with SSIS.  Holy Cow, I thought I’d won the lottery after writing my first package in under 5 minutes.  It truly was an amazing transformation (pun definitely intended).

There are so many other features that have improved along the way, too many in fact to list them all here.

It’s been a long road,

The wild and windy night
That the rain washed away
Has left a pool of tears
Crying for the day.
Why leave me standing here?
Let me know the way.

There were definitely many tears shed along the way to where I am today, tears of frustration and of joy.  But I wouldn’t change my long and winding road for anything; it’s made me who and what I am today.

SQL Server let me know the way…