The Whirlwind That Is October – Part 2

October has been such a whirlwind of PASS activity for me. Two SQL Saturdays and the PASS Summit. This post is about the second SQL Saturday I attended in Charlotte, SQL Saturday #452, on October 17, 2015, where I was a member of the organizing team. You can read about my first SQL Saturday here.

I have been an attendee and volunteer for the SQL Saturday in Charlotte that is put on by CBIG for the last 3 years. This year I was approached by one of their fearless leaders, Rafael Salas (B | T), to help out as one of the organizers. How cool is that?!

Let me start by saying I love, absolutely love, helping out as an organizer. But with that said, wow, it’s a whole lot of work! So the next time you attend a SQL Saturday, be sure to find the organizers and thank them for the event. They do this on their own time and don’t get paid for any of it.

Since I’ve volunteered for registration at several SQL Saturdays (in addition to Charlotte) over the past 4 years and I’ve been on the PASS Program Committee for six years, I decided I could be of most use helping out with speaker selection, session scheduling and registration on the day of the event.

We had a great team of organizers this year, Rafael Salas (B | T), Javier Guillen (B | T), Jason Thomas (B | T), Elizabeth Ricks (T), Don Sparks and me. In the past Melissa Coates (B | T) has been one of the organizers and even though she is not on the team this year she definitely deserves an honorable mention as she put a lot of the processes in place to help make sure things run smoothly. Thank you Melissa!

One of the reasons SQL Saturday was developed was to encourage local speakers. Well, the CBIG team believes in that mission wholeheartedly. We were able to accept sessions from all the local speakers plus some great regional/national speakers.

The event was held at Central Piedmont Community College (CPCC) again this year. It’s a great venue with lots of light and room. All the sponsors were wonderful and had great giveaways for the attendees. We only had one speaker cancel in the week prior to the event, and that was beyond his control – clients, what can you do?! Andy Leonard (B | T) was so gracious and stepped up to fill the empty slot without hesitation. Thank you, Andy!

We changed things up for the speaker dinner on Friday night. Instead of having just speakers, we included our sponsors and some of our very dedicated volunteers and called it our Appreciation Dinner. The food was delicious and the venue was perfect. I had the opportunity to talk with some speakers and sponsors I’d never met before. Thank you to both groups for all that you do for the community, we wouldn’t be able to put on such a great event without your support.

There were 588 people registered and 328 people actually attended. That’s a 56% turnout, which isn’t great, but it is pretty good. We had some amazing volunteers there helping us out both the night before for setup and during the event on Saturday. This is the fourth year I’ve been involved with this SQL Saturday and it really does just keep getting better every year.

Needless to say, the day of the event went by in the blink of an eye. One minute my alarm was going off at 5:45 a.m., the next we were headed out the doors to the after event party. Several speakers and volunteers joined us at the after event party at Grapes Bistro & Wine Bar. The food was spectacular and the setup of the venue allowed for an easy flow of conversation. Slava Murygin (B | T) was there to take lots of pictures too. BTW – You can view them here for the Appreciation Dinner and here for the Event and after event party. Thank you, Slava, for all the pics; you did a great job.

I’m kind of sad that it’s over, but I really do need to catch up on some sleep. I am looking forward to next year and hope that we will see you there.

Well done CBIG!

The Whirlwind That Is October – Part 1

October has been such a whirlwind of PASS activity for me. Two SQL Saturdays and the PASS Summit. This post is about the first SQL Saturday I attended in Raleigh, SQL Saturday #445, on October 10, 2015, where I was both volunteer and attendee.

This event is put on by the Triangle Area SQL Server Users Group (TriPASS). Brett Tomson (L | T) has long been the leader of this group but recently handed the reins over to Kevin Feasel (L | T). Kevin and Brett did a great job. They had awesome speakers, great sponsors and wonderful volunteers (yes, I am including myself in that).

This year’s event was at a new venue, William Peace University. It was a smaller venue than last year, but it worked. It’s a beautiful campus with very friendly staff who were willing to help out in any manner they could.

Being the morning person that I am, I got up at 4:30 a.m. on Saturday to make the 90+ mile drive to Raleigh. I helped out with registration in the morning, which is always one of my favorite volunteer activities. You get to see EVERYONE that’s attending: speaker, sponsor and attendee alike. It gives me a chance to give #sqlhugs to my #sqlfamily to start the day off. What could be better than that?!

I was able to attend two sessions, which were fabulous. Then I finished off my volunteer duties by taking lunch tickets at the Dining Hall. That’s right, I said Dining Hall. Because we were at WPU, the organizers were able to take advantage of the campus Dining Hall and the people that staff it. There was quite the selection from the salad bar, sandwich bar and dessert bar. The only bar missing was the bar with alcohol, but we still had afternoon sessions, so I guess that was for the best. This is the first SQL Saturday that I’ve attended where lunch was not brought in. I’m sure it was nice for the organizers because they didn’t have to worry about getting volunteers to help with lunch setup, wrangling coolers for soda/water, or cleanup, and attendees had a very wide selection to choose from.

After lunch I was able to go visit the vendors and chat with them while attendees were in session. I even snagged one of the coveted bacon scented T-shirts from Micron (be forewarned, even after three washings it still smells like bacon).

After getting up so early, I was spent, and since I had to make the 90+ mile trip home, I decided to head home early. I missed the end of the day raffle and after event party, but I hear they were both entertaining, to say the least.

Huge thank you and congratulations goes out to the organizers of SQL Saturday #445, job well done.

Fall Means Election Season

While it’s not officially Fall yet, it is definitely Election Season. Elections for the PASS Board of Directors open on September 17, 2015. Campaigning starts today with the announcement of the Official Slate of Candidates.

It’s a great slate and it will be hard to choose. I appreciate all the hard work the NomCom has done to provide us with such a great list of candidates. Please take the time to read each candidate’s application. This is the first year the NomCom scores have been made public. Members of the NomCom spend countless hours researching, interviewing and agonizing over how the scores are reached, so pay attention to them.

PASS has also come up with some community events for you to get to know your candidates. Take advantage of them to help you make your decision.

This is your community. Do your research then cast your vote. This is your opportunity to make a difference.

TSQL2sday #68 – Defaults

A big thank you goes out to Andy Yun (b|t) for hosting this month’s TSQL2sday party. This month’s topic is Just Say No to Defaults.


I have a Bachelor’s degree in Mathematics with a minor in Computer Science. So when I say I’m a fairly smart person, it’s not bragging; it’s simply the truth. And when people say, “Any monkey can be a SQL Server DBA”, I find it offensive. While that statement may be close to true, it’s not the whole truth. Yes, Microsoft made SQL Server easy to use right out of the box, but if you want SQL Server to run well, you better have one of those really smart monkeys.

I can’t tell you how many times I have been approached by my friends who are developers, DBAs for other RDBMSs, or SysAdmins and asked to “fix” their SQL Server environment. They are really smart people, but they made the mistake of thinking that they could simply install SQL Server, accept all the defaults, be off and running, and never look back. The biggest complaint I hear from these same people is that “SQL Server is slow”. My response is usually something like, “Hmm, I can’t imagine why that would be”, in my most snicker-y voice.

There are so many things that can be tweaked in SQL Server to improve performance, but there are two things you can change right off the bat that will make a world of difference. They are the defaults for database file location and max memory. In fact, if you don’t change the default location for your database files and the max memory setting, a lot of the other performance tweaks won’t make that big of a difference.

Database File Location

When the SQL Server install runs, it asks where you want to put your database files. The default location is on the same drive where SQL Server is installed, which is typically the same drive as the OS installation. Do NOT accept this default, just say NO! If you have a high volume transactional system, this will cause competition with the OS and guess who loses? You do. You should take this a step further and separate out your data files from your log files. And your tempdb should have a drive all to itself. (Note: When I say drive, I am referring to physically separate disks, not a single disk that has been partitioned into multiple drives. If you’re using a SAN, make sure you coordinate with your SAN administrator to get your drives created from the correct “LUN pools”.)
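If you’ve already installed with the defaults, it’s easy to see where your files actually landed, and moving them is straightforward. Here’s a rough T-SQL sketch — the drive letters and paths below are made-up examples, so adjust for your environment (and note that tempdb file moves only take effect after the next service restart):

```sql
-- See where every database file currently lives
SELECT DB_NAME(database_id) AS database_name,
       type_desc,        -- ROWS = data file, LOG = log file
       physical_name
FROM sys.master_files
ORDER BY database_name, type_desc;

-- Example only: point tempdb's files at a dedicated drive
-- (tempdev and templog are the default logical file names)
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = 'T:\TempDB\tempdb.mdf');
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = 'T:\TempDB\templog.ldf');
```

The first query is also a quick audit for servers you inherit: if everything comes back on the C: drive, you know where to start.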

Max Memory

After SQL Server has been installed, the default max memory setting basically allows SQL Server to use as much memory as it wants. Sounds like a good idea on the surface, but just say NO! SQL Server is very greedy when it comes to memory; it will take every byte your server has to offer, leaving your OS starved. My general rule of thumb is to allocate ¾ of the total memory to SQL Server, never leaving less than 2GB, but not more than 4GB, for the OS.
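Changing the setting is just a couple of lines. As a sketch, on a hypothetical server with 16GB of RAM, my ¾ rule of thumb works out to 12GB (12288 MB) for SQL Server, leaving 4GB for the OS:

```sql
-- 'max server memory' is an advanced option, so expose it first
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Cap SQL Server at 12 GB (the value is in MB); takes effect immediately
EXEC sp_configure 'max server memory (MB)', 12288;
RECONFIGURE;
```

No restart is required, which makes this one of the safest fixes you can make on a server that’s already in trouble.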

These are not the only defaults you should change, but these are two that will get you the most bang for your buck. They are easy to change and the implications/consequences of these changes are easy to understand. If you are a beginner with SQL Server, start with these two things, they will make you look like a very smart monkey.

The Whole Pie

Today should be such an exciting day, the day the speakers are announced for the 2015 PASS Summit. But as someone who has been involved with the process for the last 6 Summits, it’s a day that I usually turn my Twitter feed off because there are always those one or two people who don’t get selected, or who don’t like the selections that were made, and throw a very public tantrum. Then the insults and accusations start flying.

There has been a long history of criticism of the abstract review process for the PASS Summit. Speakers complain that they don’t get feedback or the feedback they get is not helpful and that the process takes too long. Community members complain that certain topics weren’t included, that the process is a black box or there is a “white” and/or “black” list of who gets to present and who doesn’t.

When I started working on the Program Committee six years ago as an abstract review team member, it was a lot of work but very rewarding. When I was asked to step up my game two years ago and become a team lead, it was more work, but even more rewarding. Being part of the process that builds the program for the Summit is a great honor and a tremendous responsibility. So when I was asked to be a Program Manager this year, I had to consider what it would mean. It would mean even more work but I wasn’t sure if it would be more rewarding. But I decided I was up for the challenge and accepted the role.

In years past I was only responsible for a very small piece of the pie. I had my track that I had to think about and that was it. As a review team member, you read the abstracts for the track that you are assigned, you rate them, and then you provide the team lead with your rankings. That was it. The rest of the process was really a black box to me.

As a team lead there are a few more pieces of the pie that you get to sample. Not only do you get to read all the abstracts in your track, but you also get to wrangle reviewers (something akin to herding cats) and make tough decisions based on input from your team members. You have to take into account things like topic distribution, session level and speaker distribution (speakers are limited to only two general sessions). Being a team lead is a very time-consuming role. Last year, in an effort to provide better feedback to submitters, a report was introduced that allowed team leads to see the comments reviewers had made. This gave us a bit more insight into what the team members were thinking when they scored an abstract. The flow of the data wasn’t perfect, but it was tremendously helpful to me as a team lead.

As a Program Manager you have to look at the whole pie. You have to do all the things that a team lead does, but now you have to do it for ALL the tracks, not just the one that you were assigned to review. Then you have to set out to “build the program”. Building the program is like a super-mega-charged game of Jenga. You move one piece and it can cause such a ripple effect, you might spend thirty minutes trying to “fill the gap” you just created. I have a whole new appreciation for the process after being a Program Manager.

Whether you are a speaker getting an email, or you are a community member looking over the sessions that were selected, remember that feedback is a GIFT. Everyone loves gifts, so think twice before you speak/blog/tweet about the process. That was a giant pie we just made and we are very proud of it.

Earning my MCSE: Business Intelligence Certification

I earned my MCSE: Business Intelligence Certification on May 27, 2015. It was a long road, but I did it. Back in May of 2013, I wrote about being Certifiable and wasn’t really interested in pursuing any certifications. What made me change my mind you ask? The short answer is, being a speaker.

Last summer I was invited to speak for the Triad SQL BI User Group in Winston-Salem. I did a very introductory class on Integration Services. I was asked a question that seemed simple, but I didn’t know the answer. That got me thinking, if I don’t know the answer to that, what else don’t I know?

I started doing some research on the question and decided, if I am going to do this research, why not get something other than just an answer, there had to be other things that I didn’t know. I looked at the MCSA certification path again. I looked through the topics that the three exams covered and got really excited. There were so many pieces of the technology that I had never used or hadn’t used in years. This was a real learning opportunity. I decided I needed to get my SQL learnin’ on.

I did a little bit more research on the exams and what study guides were available and discovered the Microsoft Training Kit. It consists of three books, each dedicated to an exam and each book has its own practice exams. It seemed like the best candidate so I ordered it from Amazon and had it delivered in two short days (Thank you Amazon Prime!).

The MCSA certification consists of three exams: 70-461, 70-462 & 70-463. The first exam, 70-461, is all about querying SQL Server. I’ve been querying SQL Server for almost 20 years, so it didn’t take much effort for me to pass this exam. I read through the questions at the end of every lesson in each chapter and the case studies. For the questions I got wrong, I went back and read the lesson, re-answered the questions correctly and that’s it. I passed exam 70-461 on December 24, 2014.

Exam 70-462 was a bit more involved for me. It is focused on Administering SQL Server. I had never used Always On and it has been years since I worked with replication so I figured the best place to start was by taking a practice exam to see where I needed to focus. I failed that first practice exam, but it provided me with a road map of what I actually needed to focus on. On January 30, 2015, I passed exam 70-462.

Exam 70-463 is about implementing a data warehouse. I followed the same approach for 70-463 as I did for exam 70-462. That approach paid off and on February 20, 2015, I passed the exam and earned my MCSA for SQL Server 2012.

I was going to stop at the MCSA, but after I completed that with relative ease, I decided I needed a bit more of a challenge. The question came down to MCSE: Database Professional or MCSE: Business Intelligence. Since most of the work that I do now is BI related, I decided on the latter. I looked at the topics that were covered in the exams and realized there were going to be some huge gaps. I don’t use Reporting Services in SharePoint integrated mode, nor do I do any work with the Tabular model for Analysis Services. I’ve only been using Analysis Services on a regular basis for about 2 1/2 years now, so I am certainly no expert; I definitely needed some work there as well.

There are two exams needed to earn your MCSE: Business Intelligence after your MCSA: 70-466 and 70-467. Since there are no Training Kits for these last two exams, I decided to take Microsoft up on its Second Shot offer. For a limited time, it allowed a person a second chance to take a qualifying exam for free if they failed it the first time. I figured, what did I have to lose? At best, I’d pass the first time around. At worst, I’d fail the exam, but would gain valuable experience in how the exam is structured and what it covers, and learn where I needed to focus my studies. Then I could retake the exam for free. I failed exam 70-466 the first time I took it, as I expected I would. But I did much better than I thought I would, so I knew there was hope of earning my MCSE.

I went out to Microsoft Virtual Academy (MVA) and found the training video for 70-466. I also found the video for Tabular Model training. In addition to MVA, I also used PluralSight and various other books. I studied up on the stuff that I had never seen or worked with before. Then I went through a few refresher videos on the stuff I already knew (but had forgotten) and retook the exam, passing the second time around with flying colors on May 6, 2015.

The last exam, 70-467, was the most nerve-racking. You basically have to take all your knowledge from the previous four exams and apply it to what seems like an endless barrage of case studies. If you were no good at story problems in school, then this exam is definitely going to challenge you. I passed the exam on my first try, but I really wish I hadn’t waited three weeks between it and 70-466. Since I do not use the Tabular data model or Reporting Services in SharePoint integrated mode, I forgot a lot of the material in the three weeks between the two exams. You are given 150 minutes to take the exam, and I finished with only three minutes to spare because I had to rack my brain for those nuggets of information that I hadn’t had the opportunity to use out in the wild. I think that if I had taken the exam within a week of 70-466, I would have done much better and had more time remaining.

Overall it was a good experience. I plan on taking some of the things I learned (and “relearned”) and implementing them at work to provide a better experience for our users. I know they will be grateful and I will know that I’ve done the best possible job that I could for them.

The certification isn’t why I started this journey. I started this journey because there was something that I didn’t know. Don’t let certification be the only reason you take this journey, make it one of the many rewards when you reach the end.

TSQL2sday #66 – Monitoring

A big thank you to Cathrine Wilhelmsen (blog | twitter) for hosting this month’s TSQL2sday party. Monitoring is this month’s topic and it’s a very important one. It could mean the difference between having a job and looking for a job.

When I started working with SQL Server (a long time ago, in a galaxy far far away) there were no commercial monitoring tools available and quite often I would get called or paged (yes, it was a long time ago) in the middle of the night by a very angry boss because there was something “wrong” with the database. Or worse yet, I would get no call at all and show up at work the next morning with a line of angry people waiting for me when I got off the elevator. It only took a couple of these encounters for me to realize that I needed to be much more proactive or change my line of work (I had heard that underwater basket weaving was an easy gig).

I started looking at the reasons I was being called and discovered most of them were things that could easily have been avoided if I had known about them earlier. Things like database and transaction log files filling up, running out of disk space, and processes/queries that were taking longer and longer. Since there were no commercial monitoring tools out there, I decided I needed to essentially roll my own.
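My home-grown scripts have grown a lot over the years (and started life against the old system tables), but the heart of the file-fullness check is just a few lines. Today I’d write a minimal sketch of it against the catalog views, something like this, run in the database you want to check:

```sql
-- How full is each data/log file in the current database?
-- size and SpaceUsed are reported in 8 KB pages, so divide by 128 for MB.
SELECT DB_NAME() AS database_name,
       name AS logical_file_name,
       type_desc,        -- ROWS or LOG
       size / 128 AS size_mb,
       FILEPROPERTY(name, 'SpaceUsed') / 128 AS used_mb
FROM sys.database_files;
```

Schedule something like that through SQL Server Agent, compare used_mb to size_mb, and send yourself an alert before the file fills up, instead of letting your boss page you after it does.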

I had to start looking under the covers to find what I was looking for. This gave me an even greater exposure to how SQL Server worked. Did I mention that this was before Google? I couldn’t just search for easy answers; I had to really dig in the system databases to find what I wanted. This was, in fact, one of the best things that could have happened to me so early in my career as a DBA. I was forced to learn how SQL Server worked on my own.

To this day, I still “carry” around my home grown monitoring solution in my toolbox. I have updated it and expanded it through the years to accommodate newer versions and functionality and made it more efficient based on both of those things. Not all shops have the budget for monitoring tools and even if they do, a lot of the time they are only willing to spend that money on production servers, not development or test (don’t get me started, that’s an entirely different blog post).

My little monitoring solution has come in handy over the years because it has afforded me the opportunity to discover what’s under the covers of the newest version/features of SQL Server and provide a no cost basic monitoring solution to my employers when the budget is tight or non-existent. If you don’t have your own monitoring solution I would highly recommend you create one, if for nothing more than the reasons I stated above.

Don’t get me wrong, I love the commercial monitoring tools that I have access to now, but knowing the how and why of SQL Server will only make you a better DBA and could possibly mean the difference between having a job and looking for a job.

Becoming a PASS Chapter Leader – Challenge Accepted!

My first real introduction to PASS was back in 1998. I was an “Accidental DBA” and needed some training. I heard about this conference in Chicago and my boss agreed it was the best place to start. Little did I know how that conference would influence the rest of my career.

Fast forward to 2010, after taking a slight hiatus from SQL Server, I was ready to get back into it. What better place than a local user group? After moving over 1,600 miles (a couple of years earlier) from where I started my career in SQL Server, I needed to make some new local connections. That’s when I was introduced to the Triad SQL Server User Group. At the time, Miguel Cebollero (B | T) was the chapter leader. Miguel was so friendly and welcoming and I remember Kevin Goode was presenting a session on CLR and how it really is NOT the devil’s work, but a great tool if used correctly. I was hooked.

In early 2011, Miguel decided to move back to his home state of Florida and handed the reins over to Kevin. Kevin moved the meeting location from Winston-Salem to Greensboro and I attended meetings on a semi-regular basis, but let’s face it, life just gets in the way. By 2012, our meeting attendance started to drop off and a new BI Group had started up in Winston-Salem. It became obvious that something had to change. Kevin had the foresight to work with the new BI group and asked for volunteers to help him out with the chapter. I decided to step up as “Speaker Wrangler” and we worked out a schedule with the new BI group that would be mutually beneficial.

Now it’s 2015 and Kevin has taken on a new role in his company which means he has less and less time to devote to the user group. When Kevin approached me about taking over the user group I was both honored and excited (and a little scared) that he considered handing the reins over to me. I have accepted the challenge and I hope that I can continue to lead the group as well as both Kevin and Miguel did and provide a much needed resource to local SQL Server professionals.

We will be a little more active on social media (we tweet at @TriadSQL and use the #TriadSQL hashtag), and we welcome any suggestions that will make your user group more useful to you.

How to Add Users to Roles in SSAS, a.k.a., Where the Heck are the Buttons?

I’ve been responsible for administering Analysis Services (SSAS) on a regular basis for the past 7-8 years via SQL Server Management Studio (SSMS), but developing is a whole different ball game (and set of tools). I only recently started doing development work with SSAS on a regular basis. Prior to that I dabbled once every two to three years, enough to be able to say, “Yes, I can do that, but I can’t remember exactly how.” Fortunately the tool used for SSAS development is a tool that I am very familiar with, Visual Studio (or as we data professionals now call it, SQL Server Data Tools).

I recently had to change some security for our cube and was tasked with modifying the role membership. Now this most likely seems like a no-brainer, and it is, as long as you know where the buttons are. What buttons am I referring to? Those precious Add and Remove buttons that allow us to modify the members. When doing development work in SQL Server Data Tools (SSDT), there are all these nice menus and toolbars at the top or on the left hand side. So when I went to do some work on the role membership, imagine my dismay when I could find no way to add/remove users to the role via the menu or toolbar.

I must have spent a good 10 minutes looking through all the menu options and toolbars, only to discover there is no way to add/remove members in any of them. So of course, freak-out mode ensued: I must have the wrong version of SSDT, I must not have the right permissions to add/remove users, the list goes on and on. I took another look. I mean, look at all that screen real estate! You have that HUGE white space that is just dying to be used and no way to use it.


Then it happened, I saw the light (or buttons, as it were), hiding way down there at the bottom of my screen.


Like I said, this may seem like a no-brainer, especially for someone who does SSAS development work on a daily basis, but for those who are new or only use it every so often, it’s frustrating. After seeing someone ask where the buttons were via #sqlhelp on Twitter, I decided I needed to write this up.

I hope this saves some other unsuspecting soul the frustration that I experienced.

Achievement Unlocked: 24 Hours of PASS Session Submitted

I have finally submitted a session for a speaking engagement. Those who know me know that this is a very BIG deal. I have a horrible fear of public speaking, but I am slowly starting to embrace that fear. Last year I presented at the Triad SQL BI User Group and I didn’t hyperventilate; no one was more surprised than me.

I submitted the same session that I did at the Triad SQL BI User Group, Introduction to Integration Services (SSIS), to the upcoming 24 Hours of PASS: Growing Our Community.

I wanted to be a teacher when I was in college until I discovered my fear of public speaking. Now, more than 20 years later, maybe I can finally reach that goal.

Fingers crossed that my session is accepted.