Did you know that it’s election season for PASS? Did you know that you can pose questions to the potential candidates via the Election Forums? The NomCom is busy interviewing candidates and is offering you the opportunity to ask the candidates questions, via the NomCom. Do you have a question burning a hole in the back of your mind? You can ask anything from platform-related issues to who their favorite sports team is, and just about anything in between.
This is your community; get involved.
I’m putting my money where my mouth is, with respect to my last blog post. I am officially a mentor with The Mentoring Experiment. I am so honored to have been selected as a mentor and am so excited to begin. I’ll let you know how it turns out when the experiment ends.
My company has recently put forth an effort to hire fresh college graduates and interns. I think this is a great thing. A long time ago, in a galaxy far far away, I was a fresh graduate with no real-world experience, and my first employer took a big chance in hiring me. I was really lucky: I had a couple of fabulous mentors who taught me not just technical skills, but life skills that I still use today. Employers are getting in on the ground floor, and it’s in their best interest to invest resources in what will hopefully be a long, successful and rewarding career. “Paying it forward”, as it were.
As a data professional, there are some essentials that will make your life so much easier down the road. The first one is standards and best practices. Establish these early and they will become habit. Now, I could drone on forever about particular standards and best practices, but that’s not my point. My point is that if you start off your career by adhering to standards and best practices, it won’t come as a shock to you later when they are forced upon you. You may not like the particulars of them, but it won’t feel so icky the first time you have to use them if you are used to following something. In working with our new employees, I always take the time to point them to our standards and explain why we have them. They are always appreciative of this, especially the explanation.
Another essential is code review. This is a big bang for your buck; it’s a learning experience for everyone involved. We use an Agile methodology in our development life cycle, so our teams are fairly small. They usually consist of two to three general developers, a technical architect, and a data architect or DBA-type person. Following the Agile methodology, anybody should be able to work on anything in the product/iteration backlog. General developers can usually write T-SQL, but most of the time it won’t be nearly as efficient (or dare I say pretty) as if the data architect/DBA-type person wrote it. Code review allows the team to become familiar with a task they are not directly involved with and provides valuable feedback to the person who wrote it. They may not be as familiar with the subtleties of the business rules, or have no idea that when you use three nested select statements in the predicate, you will most likely end up doing a table scan on that table with 30 million records, three times! All they know is that their query returns, and it returns the correct answer.
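To make the nested-select point concrete, here’s a sketch using hypothetical tables (Orders with ~30 million rows, Customers, Regions — none of these are our actual objects):

```sql
-- Nested SELECTs buried in the predicate; depending on indexes, the
-- optimizer may end up scanning the large Orders table repeatedly:
SELECT o.OrderID, o.OrderTotal
FROM dbo.Orders AS o
WHERE o.CustomerID IN (SELECT c.CustomerID
                       FROM dbo.Customers AS c
                       WHERE c.RegionID IN (SELECT r.RegionID
                                            FROM dbo.Regions AS r
                                            WHERE r.RegionName = 'West'));

-- Same correct answer, rewritten as joins that the optimizer can
-- usually satisfy with index seeks instead of repeated scans:
SELECT o.OrderID, o.OrderTotal
FROM dbo.Orders AS o
JOIN dbo.Customers AS c ON c.CustomerID = o.CustomerID
JOIN dbo.Regions AS r ON r.RegionID = c.RegionID
WHERE r.RegionName = 'West';
```

Both queries return the same rows; the difference only shows up in the execution plan, which is exactly the kind of thing a code review catches.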
Take the time to “pay it forward”, it can only benefit you in the long run and the person you “paid” will be that much better for it.
So, we’ve had this issue lately at work with our TFS build/deploys causing a significant amount of contention and only partially succeeding.
A little about our environment:
- SQL Server 2008 R2 Enterprise Edition, 64-bit.
- Visual Studio 2010
- Team Foundation Server (TFS)
- Using Transactional Replication
- Using partial projects for replicated objects
- 31 database projects in our solution
When we looked at the build log, we saw that several of our largest tables were being rebuilt, but not on a consistent basis. At a cursory glance we could not see why; they appeared to be the same as what was in TFS. After a couple of weeks we discovered that the tables were only being rebuilt right after a replication snapshot had been re-initialized. The tables being rebuilt were tables with identity columns that were being replicated. It turned out we had some mismatched configurations between TFS and replication. When using partial projects in TFS to manage replicated objects, you cannot use the NOT FOR REPLICATION attribute on your identity columns. You must also “manually” manage the identity ranges in your replication publications.
Tips/reminders when using TFS and transactional replication:
- Create a partial project in the publication database project containing all the tables and their respective attributes that you want to replicate.
- Add a reference to the subscriber database project by importing the partial project created in step one.
- Do NOT use the NOT FOR REPLICATION attribute on the identity column in the tables that are being replicated. This is counterintuitive, but necessary if you don’t want your tables being rebuilt every time between a snapshot initialization and your TFS build.
- When you create the replication publications, you will need to set the identity column management property to manual for the articles that have identity columns. Again, counterintuitive, but necessary. Another thing we had to do was remove our users’ permissions to directly update the tables in the subscriber databases. Our database supports a web-based application that uses a generic service account to access the database, and we had issues when new/inexperienced support staff would directly modify data in the subscriber database.
- In your .sqldeployment file(s) check the option named IgnoreNotForReplication.
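Putting the table definition and publication pieces together, a minimal sketch (the table and publication names are placeholders, not our actual objects):

```sql
-- 1. In the database project (and therefore in the partial project),
--    define the identity column WITHOUT the NOT FOR REPLICATION attribute:
CREATE TABLE dbo.Customer
(
    CustomerID   int IDENTITY(1,1) NOT NULL,  -- no NOT FOR REPLICATION here
    CustomerName varchar(100)      NOT NULL,
    CONSTRAINT PK_Customer PRIMARY KEY CLUSTERED (CustomerID)
);

-- 2. When adding the article to the publication, set identity range
--    management to manual so replication leaves the identity handling
--    to you instead of altering the column:
EXEC sp_addarticle
    @publication = N'MyPublication',
    @article = N'Customer',
    @source_owner = N'dbo',
    @source_object = N'Customer',
    @identityrangemanagementoption = N'manual';
```

With this combination, the table script in TFS matches what actually exists on the publisher after a snapshot, so the build no longer sees a difference and rebuilds the table.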
The reason for all this hoop jumping is that the Visual Studio database project tooling does not support transactional replication, nor does it support Change Data Capture (CDC). These are great features, but when they are not supported in the tools that are available, it makes it very hard to want to use them.