It’s hard to believe it’s over. It felt like a whirlwind while I was in Seattle for my 7th PASS Summit, but now that I’m back home it feels like it was ages ago. I think time moves more quickly when you’re with friends and that’s where I was, with friends.
I got to reconnect with old friends and meet new ones. I didn’t attend nearly as many sessions as I would have liked, because let’s face it, cloning technology isn’t quite where it needs to be as Michael Keaton found out in Multiplicity. With my luck my “Number Four” would have attended one of Paul Randal’s sessions and I would have wound up doing God knows what to my servers when I got back.
I also got to meet people that I have “worked” with for quite a while virtually, but never met in person. I must say it’s always refreshing when their “in person” exceeds your expectations. There are so many genuinely nice people in our community, I am truly in awe.
In years past I have not been able to participate in most of the after-hours activities due to Summit happening right before a big annual swim meet, which meant I couldn’t take a break from training. This year, my swim meet was the week before Summit so I didn’t need to get up at 4:30 a.m. every morning to make it to practice before breakfast. I got to see how the “other half” lived at Summit this year. I must say it was eye opening and entertaining. They don’t have next year’s swim meet on the calendar yet, but I have the Summit dates, so next year’s meet just may have to go on without me.
If you’ve ever attended a PASS Summit, you know what I’m talking about when I say I’ve already started the countdown until next year’s Summit. If you’ve never attended a Summit, what are you waiting for?
Data Driven Subscriptions On A Budget – Part 2
Yes, this is Part 2, you can find Part 1 here to get background information.
This post covers Point 3: distributing a report to a fluctuating list of subscribers.
Distribute a report to a (fluctuating) list of subscribers
When using email as your method of delivery for a Data Driven Subscription, best practice is to use metadata to drive that process. Usually a simple table that contains the email address of the recipient and the report name does the trick. This part of the process is no different if you don’t have Data Driven subscriptions. I usually create a table similar to this:
CREATE TABLE dbo.SSRSEmailSubscribers
(
     EmailAddress varchar(128) NOT NULL
    ,ReportName varchar(128) NOT NULL
);
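Populating the table is just ordinary INSERT and DELETE work, which is exactly what makes the list “fluctuating.” A couple of hypothetical rows (the addresses are placeholders) for the report used in this example:

```sql
-- Hypothetical subscribers; add or remove rows to change who gets the report
INSERT INTO dbo.SSRSEmailSubscribers (EmailAddress, ReportName)
VALUES ('alice@example.com', 'Daily Sales Report')
      ,('bob@example.com',   'Daily Sales Report');
```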
Let’s say I have a sales report that needs to go out on a daily basis and the standard format for this report is Excel. Because we don’t have data driven subscriptions, we can’t just query the table and use the resulting list to email the report. Instead we need to create a File Share subscription that generates this report and saves it to a file share. From there we can “pick up” the newly generated Excel file and email it to the recipients.
- Create a subscription to your Daily Sales Report in Report Manager, schedule it for a one-time execution a few minutes in the future, and remember the execution time. (This creates the SQL Agent job in SQL Server.)
- Take a look at your SQL Agent jobs in SQL Server. If you have never seen a job created by SSRS, you may wonder where yours went: SSRS does not use human-readable names for its jobs, it uses those pesky GUIDs as names, ugh! If your server has lots of jobs, you may need to sort by Category to group all the “Report Server” jobs together. Find the job that executed at the time you scheduled (this is why you needed to remember the time!); this is the job you will reference in the next step.
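If hunting through GUID-named jobs gets old, the ReportServer catalog can translate them for you. As a sketch (not part of the original steps, and assuming the default ReportServer database name), a query along these lines maps each agent job back to its report via the standard ReportSchedule, Subscriptions, and Catalog tables:

```sql
-- Map GUID-named SSRS agent jobs back to their reports.
-- Assumes the default catalog database name "ReportServer".
SELECT j.name        AS JobName      -- the GUID you see in SQL Agent
      ,c.Name        AS ReportName
      ,s.Description AS SubscriptionDescription
FROM msdb.dbo.sysjobs j
JOIN ReportServer.dbo.ReportSchedule rs
    ON j.name = CAST(rs.ScheduleID AS varchar(36))
JOIN ReportServer.dbo.Subscriptions s
    ON rs.SubscriptionID = s.SubscriptionID
JOIN ReportServer.dbo.Catalog c
    ON s.Report_OID = c.ItemID;
```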
- Create a new SQL Agent job and add a Transact-SQL script step. In this new step, execute the SQL Agent job that was created back in step 1:
exec msdb.dbo.sp_start_job N'B514C05F-07D5-4C0B-9600-666E9980C7C3'
where B514C05F-07D5-4C0B-9600-666E9980C7C3 is the GUID from the job that SSRS created.
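One caveat worth knowing: sp_start_job is asynchronous, so it returns as soon as the job starts, and the next step could fire before the Excel file has been written. If that is a concern, a follow-up Transact-SQL step can poll msdb until the SSRS job finishes. Here is a sketch (my own addition, using the standard msdb schema and the same GUID job name from above):

```sql
-- Wait up to ~5 minutes for the SSRS job to finish before the next step runs.
-- Polls msdb.dbo.sysjobactivity: a running job has a start time but no stop time.
DECLARE @i int = 0;
WHILE @i < 60
BEGIN
    IF NOT EXISTS (
        SELECT 1
        FROM msdb.dbo.sysjobactivity ja
        JOIN msdb.dbo.sysjobs j ON j.job_id = ja.job_id
        WHERE j.name = 'B514C05F-07D5-4C0B-9600-666E9980C7C3'
          AND ja.start_execution_date IS NOT NULL
          AND ja.stop_execution_date IS NULL)
        BREAK;                      -- job is no longer running
    WAITFOR DELAY '00:00:05';
    SET @i = @i + 1;
END;
```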
- Next, add a new job step for PowerShell. In this newly created step, write a PowerShell script that picks up the file generated by the previous step, retrieves your list of email addresses, and sends the email with the report attached. There are several ways you can do this, but I chose to use PowerShell. I wrote my SQL query to return a double-quoted, semicolon-delimited list of the email addresses; you could just as easily use PowerShell cmdlets to format the list. Here’s my PowerShell script:
$FilePath = "c:\temp\"
$smtpServer = "10.0.0.4"
$smtpFrom = "email@example.com"

# Build the query that returns a double-quoted, semicolon-delimited address list
$AddressQuery = "DECLARE @List varchar(MAX);"
$AddressQuery = $AddressQuery + "SELECT @List = COALESCE(@List + '"";""', '') + EmailAddress "
$AddressQuery = $AddressQuery + "FROM dbo.SSRSEmailSubscribers "
$AddressQuery = $AddressQuery + "WHERE ReportName = 'Daily Sales Report'; "
$AddressQuery = $AddressQuery + "SELECT '""' + @List + '""' AS EmailList;"

# Invoke-Sqlcmd (from the SQLPS/SqlServer module) returns rows; capture the list
$smtpTo = (Invoke-Sqlcmd -Query $AddressQuery -ServerInstance "MyServer" -Database "MyDatabase").EmailList

$messageSubject = "Daily Sales Report was executed"
$body = "Attached is the Daily Sales Report"

# Grab the most recently generated Excel file from the file share
$latest = Get-ChildItem -Path $FilePath -Filter "*.xlsx" | Sort-Object CreationTime -Descending | Select-Object -First 1
$FullFileName = $latest.FullName

# Send-MailMessage expects an array of addresses, so split the delimited list
Send-MailMessage -From $smtpFrom -To ($smtpTo.Trim('"') -split '";"') -Subject $messageSubject -Body $body -SmtpServer $smtpServer -Attachments $FullFileName
Now schedule this newly created SQL Agent job for the time you need your Daily Sales Report to run and voilà, you now have a data driven subscription that distributes a report to a fluctuating list of email addresses.