Don’t Forget the Keys

I was recently given the nod to upgrade my monitoring server from SQL Server 2012 (SP3) to SQL Server 2016.  This came none too soon, as SQL Server 2012 SP3 goes out of mainstream support on July 11, 2017.

We decided to go with a brand new box since the existing one was on Windows Server 2012 (not R2) and had been having issues lately.  So my SysAdmin guy stood up a shiny new Windows Server 2016 box for me.  This will be the first in our domain.  I get to be the guinea pig – WooHoo!

I got SQL Server 2016 installed on the new box without issue.  This box serves as my monitoring server and my personal sandbox, so it has Reporting Services (SSRS) installed on it as well as the database engine.  Since we went with a brand new box, there was no need to shut the old one off before turning this one on, which is nice.  I can migrate things when I have time.

My first order of business was to migrate my SSRS databases to the new box and get it configured.  Now, it’s been a while since I’ve migrated an instance of SSRS, so of course I forgot something.  Otherwise you would not be reading this post.

I remembered to back up BOTH databases and the encryption key.  Once I had the databases restored on the new server, I started the Reporting Services Configuration Manager so I could restore the encryption key and not lose all my credentials and other security-sensitive information (BTW – Here’s a great reference for migrating SSRS).  After I restored my encryption key, I wanted to generate a new key, but the Backup button was not enabled on the Encryption Keys tab.  I didn’t think much of it; I just restarted SSRS and figured it would be available after restarting.  Of course it wasn’t.  I tried navigating to the URL for the SSRS Web Portal (which replaces Report Manager) and got an error saying Reporting Services was not configured correctly.  Interesting.  It took me about 15 minutes to realize/remember the last step in restoring the encryption key.  Do you remember what it is?  Without Googling it?

Okay, I’ll tell you, since you’ve made it this far in the post.  First, some context: I was not the one who installed/configured SSRS on the old server, so I was not aware that whoever did had configured the database for a scale-out deployment, even though it was not actually being used as one <sigh>.  When you have a database that has been configured for scale-out deployment, you have to clean up the entries in the Keys table in the ReportServer database (the link above has a note about this very thing about three quarters of the way down).
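If you run into the same thing, the cleanup itself is small.  What follows is a minimal sketch rather than the exact script I ran: check the column names against your own dbo.Keys table first, and OLDSERVER is just a placeholder for the retired report server’s machine name.  Take a backup of the ReportServer database before deleting anything.

-- Run against the migrated ReportServer database.
-- Look at what is in the Keys table first; each report server that has
-- registered with the catalog gets its own row.
-- (Column names can vary slightly between versions, so trust the SELECT
--  output over this sketch.)
SELECT *
FROM dbo.Keys;

-- Remove the leftover row(s) for the old server so the new instance no
-- longer thinks it is part of a scale-out deployment.
-- 'OLDSERVER' is a placeholder for the machine name you saw above.
DELETE FROM dbo.Keys
WHERE MachineName = N'OLDSERVER';

If you would rather stay out of the catalog tables, the Scale-out Deployment page in Reporting Services Configuration Manager will do the same thing: select the old server and click Remove Server.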

Lesson learned:  Always look at the existing configuration very closely.  I totally didn’t notice (and hadn’t noticed for 3+ years) that the database had been configured for scale-out deployment.  Shame on me!

My shiny new SSRS 2016 instance is up and running now.

Working on my Microsoft Professional Program for Data Science

Microsoft announced a new program last year to help you understand the skills that a Data Scientist needs in their daily life.  It consists of nine courses and a final project; you can get all the details about it on the Microsoft Academy site.  I started working on this at the end of 2016 when things were slow at work and at home.  I completed seven of the courses before things started to pick up at work and home.  I’ve been midway through the eighth course for almost four months now, having to go back to the beginning of the course a few times due to being pulled away.

The program has been very informative so far, providing courses on statistics and probability, Machine Learning, Power BI, R (and Python), and general data science concepts.  I’m hoping things will slow down a bit so I can complete the program by the end of summer.

If you are at all curious about what a Data Scientist does, I highly recommend this program.  The great thing about it is that you can take all the courses for free: that’s right, I said free, gratis, no dough required, etc.  However, if you do opt for the free route, you don’t earn that beautiful certificate that you can share with others; you just get the satisfaction of completing the courses and broadening your horizons.  Either way, it’s a good way to get started in the field of Data Science.

Import Export Wizard Mapping Files

Recently I had to copy data from our AS400 to a SQL Server database.  Normally I would whip up a fancy SQL Server Integration Services (SSIS) project and get it hammered out pretty quickly.  Unfortunately there were over 4,000 tables that I needed to pull data from, and there was no way in HELL I was going to manually create 4,000+ packages.  Now most of my BIML friends would say, “I could BIML that for you in 2 hours,” and if my BIML weren’t so rusty, I probably could have done it myself too.  I didn’t have to do any fancy transformations on the data; I just had to copy it.  So I decided to take the “easy way out” and use the Import Export Wizard in SQL Server Management Studio.  Shoot, all I would have to do is a few clicks and be done with it, right?  Boy, was I wrong.

This post talks about the issue I ran into with SSIS Mapping Files.

We currently run DB2 on an IBM iSeries AS400 for our ERP system.  I was tasked with copying data from the AS400 to a SQL Server database for some consultants to use.  The C-Suite didn’t want to give the consultants access to our AS400, so this was the workaround that was put forth and accepted (and no, no one asked me before I was “voluntold” for the task).  Since this would essentially be a “one-time” thing, I chose to use the Import Export Wizard, but I would save the package just in case they wanted the process repeated.

I fired up the Import Export Wizard and selected my source, the IBM DB2 for i IBMDA400 OLE DB Provider.  Now, before you can select this data source you must install the IBM DB2 drivers.  You can find out more about them here; unfortunately, you have to have a maintenance contract and an account with IBM before you can download them <sigh>.  It’s a straightforward install once you have the installation package.

EZ peazy, lemon squeezy.

I selected my destination, SQL Server Native Client 11.0, of course!

On a roll now, should only be another 30 seconds and I can put this project to bed.  Well, we all know that’s not what happened; otherwise you wouldn’t be reading this.

When I clicked on the Edit Mappings button in the wizard to make sure all the data types had been mapped successfully, I got “<unknown type>” for every single column.  WTH?!  This doesn’t happen when I’m working in Visual Studio with my SSIS projects.  After some frantic googling, I found a couple of very good articles on the Mapping Files for the Import Export Wizard:

Data Type Mapping

Import Export Wizard can show numbers..

I took the advice of the articles and made copies of my Mapping Files before I modified them.  I made my modifications to include the column types and their respective mappings, courtesy of the Data Type Mapping article, and saved my changes.  I made sure the Import Export Wizard was closed, then started it again.  This isn’t so hard, no big deal, they’ll all populate correctly now… WHAT?!  Still <unknown type> for all columns!  At this point it had become a matter of principle: I was going to solve it using this method, and I would NOT resort to brushing up on my BIML.

After many attempts I finally figured out what was going on.  There were actually two issues: first, the order in which the Import Export Wizard searches through the Mapping Files; second, the SourceType within the Mapping File.

According to the Import Export Wizard, my source provider is IBMDA400 and it can’t find any mapping file for it.  But notice the mapping file for my destination…

For the first issue, a little background on how the Import Export Wizard works.  When you select a source and destination, the wizard has to know how to map the data types from source to destination so you don’t end up with gobbledygook in your destination.  So it searches through all the files in the following directories, depending on your architecture (I’m using SQL Server 2016, hence the 130 in the path):

C:\Program Files\Microsoft SQL Server\130\DTS\MappingFiles

C:\Program Files (x86)\Microsoft SQL Server\130\DTS\MappingFiles

The key word there is ALL the files in the directory, so if you just copy your original files to the same folder (with the famous “ – copy” so courteously appended by Windows Explorer), it will search through both your original AND the modified copy of the respective Mapping File.  In my case it was finding a source/destination match in the original Mapping File and completely ignoring my modified copy <sigh>.  Easy enough to fix: I moved my original “copies” to a completely different folder.

For the second issue, the SourceType within the Mapping File.  Now, I will admit that I had been working on this for about 5 hours straight and had become so obsessed with making this work with the Import Export Wizard that I started to not pay attention to detail.  I want to see how long it takes you to find the issue; this is my file that I thought should work:

[Screenshot: the mapping file I thought should work]

This is the file that actually works:

[Screenshot: the mapping file that actually works]

Did you find it?  How long did it take you?  It took me about an hour to figure this out.

In case you still haven’t found the issue, the answer is:  the SourceType in the first file is IBMDADB2*, while the second file uses IBMDA*.  Since our source provider is IBMDA400 and we were using the first file (IBMDADB2*), there is no match on the source.  As soon as we change the SourceType to IBMDA* (the * is used as a wildcard), we get a match and it works.  Three little letters, that’s all it took for me to waste half a day.

Now, what I ended up doing instead of modifying the original mapping file was creating a copy of it, renaming it to something meaningful to me (but still following the naming convention of the Mapping Files), changing the SourceType value to IBMDA*, and adding all the data types that were missing.  This way there will be no conflict if I ever need to use a mapping file with the IBMDADB2 SourceType.
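To give you an idea of what that looks like, here is a trimmed-down sketch of such a custom mapping file, not my actual one: the file name, the DestinationType value, and the single VARCHAR mapping are only illustrative, and the real content should start life as a copy of the stock IBM DB2 mapping file on your own machine, with the SourceType changed and the missing data types added.

<?xml version="1.0" encoding="utf-8"?>
<!-- Saved under a meaningful (hypothetical) name such as
     "IBMDA400 to MSSQL.xml" in both MappingFiles folders listed above,
     alongside the stock files. -->
<dtm:DataTypeMappings
    xmlns:dtm="http://www.microsoft.com/SqlServer/Dts/DataTypeMapping.xsd"
    SourceType="IBMDA*"
    MinSourceVersion="*"
    MaxSourceVersion="*"
    DestinationType="SQLOLEDB;SQLNCLI*"
    MinDestinationVersion="*"
    MaxDestinationVersion="*">

  <!-- One entry per DB2 data type, copied from the stock file, plus
       entries for any types that were showing up as <unknown type>. -->
  <dtm:DataTypeMapping>
    <dtm:SourceDataType>
      <dtm:DataTypeName>VARCHAR</dtm:DataTypeName>
    </dtm:SourceDataType>
    <dtm:DestinationDataType>
      <dtm:CharacterStringType>
        <dtm:DataTypeName>varchar</dtm:DataTypeName>
        <dtm:UseSourceLength/>
      </dtm:CharacterStringType>
    </dtm:DestinationDataType>
  </dtm:DataTypeMapping>

</dtm:DataTypeMappings>

The only part that mattered for my problem was that SourceType attribute; the rest of the file is whatever the stock mapping file already contains, plus the data types that were missing from it.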

I hope this helps someone else.  There are tons of posts out there about data mapping, but none of them tell you to pay special attention to the issues I had.  Granted, my issues were self-created, but they were issues nonetheless.