Thursday, August 28, 2014

SSRS report: Multi-value parameter going after Data-driven subscription

It is always good to enrich SSRS reports with multi-value parameters; users get more flexibility to combine different reporting scenarios without running a report separately for each parameter value. Developers, in turn, get to sharpen their skills at embedding multi-value parameters into report datasets; referencing rows, labels and values from those parameters in custom controls within reports is another competence that can be picked up quickly.

Alright, the report is developed and deployed for users. They requested report subscriptions and now enjoy the luxury of scheduled delivery of results in different formats. Report Manager in Native mode and SharePoint in SSRS Integrated mode read report parameter metadata pretty well and allow you to set new values for multi-value parameters or keep the default settings.

However, when you're dealing with a data-driven subscription, it's not that easy to pass data query values to multi-value parameters. You have a choice:
1) Be happy with the report's default values and change nothing in the subscription
2) Stay happy and create a regular subscription, setting the parameter values manually
3) Programmatically define parameter values using the CreateDataDrivenSubscription method of the ReportService2010 web service
4) Or be crazy enough to build a set of XML parameter tags in the Parameters column and update the dbo.Subscriptions table of your Report Server instance.

But still, none of that lets you link a data element from the data-driven subscription query to your multi-value parameter; how would you pass several values in one column of a single record? I think you've already guessed one of the right answers: a delimited string of values!

Here is how I made it work for a project where my client wanted to trigger report subscriptions from their web UI, and each SSRS report execution could have different multi-value parameter values.

1) I've created two multi-value parameters in my SSRS reports:
        a. ReportParameter; made it Hidden, Available Values – None, Default Values – Specified values with just 2 or 3 entries (the number of them is not really important at this point)
        b. ReportParameterActual; kept it Visible, Available Values – Query from a dataset, Default Values – Specify values with the following expression:
=Split(@ReportParameter, ",")
2) Once this report was deployed, I created a data-driven subscription where my data query looked similar to this: select ReportParameter = 'Value1,Value2,Value3' from MyTable
3) And then I mapped ReportParameter from the report to the ReportParameter column from this query.

This was the whole solution to the problem. ReportParameterActual was the visible parameter; users could interact with it and set different values themselves outside the subscription definition. ReportParameter was the actual interface and gateway for default values, and with a data-driven subscription it became a dynamic control over the set of values supplied to the real multi-value parameter (ReportParameterActual) in different subscription scenarios. And Split(@ReportParameter, ",") was the whole trick :-)
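If you go the programmatic route (option 3 above), this is roughly where the delimited string plugs in. A minimal sketch only, against the ReportService2010 endpoint; the site URL, parameter name and field alias are illustrative assumptions:

# A sketch only: map the delivery-query column to the hidden multi-value parameter.
$rs = New-WebServiceProxy -Uri "http://yoursite.com/_vti_bin/ReportServer/ReportService2010.asmx" -UseDefaultCredential
$ns = $rs.GetType().Namespace

# The subscription data query returns one column: ReportParameter = 'Value1,Value2,Value3'
$fieldReference = New-Object "$ns.ParameterFieldReference"
$fieldReference.ParameterName = "ReportParameter"   # the hidden report parameter
$fieldReference.FieldAlias    = "ReportParameter"   # the column from the subscription data query

# This array becomes the Parameters argument of CreateDataDrivenSubscription;
# inside the report, =Split(@ReportParameter, ",") turns the delimited string back into multiple values.
$subscriptionParameters = @($fieldReference)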

Update (2014-09-03): Richard D'Angelo asked me to share the RDL with the solution. I couldn't upload the actual client report, but I've made a sample one based on the [AdventureWorksDW2014].[dbo].[DimGeography] table. You can get this sample through this link - MultiValueParameterReport.rdl.

Happy data adventures!

Friday, July 25, 2014

Power View report migration from SharePoint 2010 to 2013

With the latest changes to Microsoft Power BI and the ability to create Power View reports inside an Excel file (starting with Excel 2013), perhaps there will be no big need to migrate old stand-alone Power View reports that were based on Power Pivot data models created in Excel 2010. You can just recreate your Power View analytics in Excel 2013 and off you go: users can see and interact with the new reports using either the desktop or Office 365 version of Excel, with no need to worry about a SharePoint deployment platform.

However, if you really want to bring your old Power View reports from SharePoint 2010 to SharePoint 2013, it is still possible. While working on this task I couldn't find any automated tools that would migrate both the Power Pivot model file and its related Power View report, so all the techniques described in this post are pure file upgrading and manipulation (if you do find such tools, I would really appreciate hearing about them).

Basically, in the old SharePoint 2010 and Excel 2010 world you created a Power View report out of two artifacts: (1) a Power Pivot data model in an Excel 2010 file and (2) a Power View report based on that model.

The first thing I did was to upgrade the Power Pivot data model from Excel 2010 to Excel 2013: I simply opened my old data model file in Excel 2013 and consented to upgrading the data model (here is the official information on this: Upgrade Power Pivot Data Models to Excel 2013). The upgrade took less than a minute in my case; it may take longer depending on the size of your model. I then deployed the upgraded Power Pivot data model file to SharePoint 2013.

I could have done the same thing with the Power View .rdlx file by just copying it to the new SharePoint location; however, the report was still connected to the old Power Pivot model file, and this needed to be fixed.

1) I've changed the file extension of my Power View .rdlx report to .zip and opened it as an archive file:

[Screenshot: the .rdlx report renamed to .zip and opened as an archive]

2) Then I went to the [Reports] folder, where my Power View report definition resided in the [report.rdl] file:

[Screenshot: the Reports folder inside the archive, containing the report.rdl file]

3) And then I just edited the data source connection string and pointed it to the upgraded Power Pivot data model file, which had already been saved in the new SharePoint 2013 instance:

[Screenshot: the data source connection string inside report.rdl, pointed to the upgraded Power Pivot model]

4) Then I saved the [report.rdl] file in the opened .zip archive, changed the .zip extension back to .rdlx and saved the report in my new SharePoint 2013 Power View gallery.
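If you have many reports to fix, the rename-edit-repack cycle can also be scripted. Here is a minimal PowerShell sketch, assuming illustrative file paths and an illustrative old/new model URL (replace them with your own):

# Sketch: repoint a Power View .rdlx to the upgraded Power Pivot model (all paths and URLs are illustrative).
Add-Type -AssemblyName System.IO.Compression.FileSystem

$source   = "C:\Temp\MyPowerViewReport.rdlx"
$workZip  = "C:\Temp\MyPowerViewReport.zip"
$workDir  = "C:\Temp\MyPowerViewReport"
$newModel = "http://sp2013/PowerPivotGallery/UpgradedModel.xlsx"   # the upgraded Excel 2013 model

Copy-Item $source $workZip -Force
[System.IO.Compression.ZipFile]::ExtractToDirectory($workZip, $workDir)

# Edit the connection string inside Reports\report.rdl
$rdlPath = Join-Path $workDir "Reports\report.rdl"
$rdl = Get-Content $rdlPath -Raw
$rdl = $rdl -replace "http://sp2010/PowerPivotGallery/OldModel.xlsx", $newModel   # old model URL is an assumption
Set-Content -Path $rdlPath -Value $rdl

# Repack and rename back to .rdlx
Remove-Item $workZip
[System.IO.Compression.ZipFile]::CreateFromDirectory($workDir, $workZip)
Rename-Item $workZip "MyPowerViewReport_2013.rdlx"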

There were a few gotchas that I needed to address in my upgraded Power Pivot data model after this. During the data model upgrade all my custom column renaming and sorting were lost; this broke some of the relationships within the data model, so the adjusted Power View report didn't show all the charts as they were in the old SharePoint 2010 environment. But with a few adjustments to the data model, the new Power View report looked exactly like the original one, with the only difference that it now lived in the SharePoint 2013 environment.

It worked, and it worked well for me :-) 

Happy data adventures!

Tuesday, July 22, 2014

SSRS report subscription: Fire Event from SSIS package

My previous two posts about SSRS reports deployed to a SharePoint site along with their subscriptions didn't stop me from exploring SSRS report subscriptions further, regardless of the deployment environment: native or SharePoint mode.

After exploring different aspects of the subject (Subscriptions and Delivery), you'll become familiar with setting up delivery types, extensions and other metadata for your subscriptions (data-driven subscriptions bring even more flexibility :-). The only thing that puzzled me was how I could test each of the subscriptions: schedule, schedule and nothing but the schedule; you don't even have a UI option to initiate or trigger a subscription on demand.

However, Reporting Services has a way to create an event that will trigger an SSRS report subscription no matter what its schedule definition is. You can simply call the report server stored procedure dbo.AddEvent and provide two parameters: @EventType = 'TimedSubscription' and @EventData = the subscription ID (the primary key from the dbo.Subscriptions table). Whatever action is defined in your subscription, it will get performed!
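For a quick test, that call can be made from PowerShell against the ReportServer database; a small sketch, where the server name, database name and subscription ID are placeholders:

# Sketch: fire the 'TimedSubscription' event for one subscription via dbo.AddEvent.
$connectionString = "Server=MyReportServerHost;Database=ReportServer;Integrated Security=SSPI"
$subscriptionId   = "00000000-0000-0000-0000-000000000000"   # dbo.Subscriptions.SubscriptionID

$connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
$command    = $connection.CreateCommand()
$command.CommandText = "EXEC dbo.AddEvent @EventType = N'TimedSubscription', @EventData = @SubscriptionID"
[void]$command.Parameters.AddWithValue("@SubscriptionID", $subscriptionId)

$connection.Open()
[void]$command.ExecuteNonQuery()
$connection.Close()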

There is another way to create the 'TimedSubscription' event within a report server: the FireEvent method of the ReportService2010 web service, which you can call from anywhere. So let's be crazy enough to test this method from an SSIS Script Component that triggers SSRS report subscriptions hosted on a SharePoint-mode report server. Here is how it worked for me:

1) I created a proxy class from the ReportService2010 web service (the file location and SharePoint site will be different on your side)
C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\Bin\wsdl.exe /out:ReportService2010.cs http://yoursite.com/_vti_bin/ReportServer/ReportService2010.asmx

2) Then I compiled ReportService2010.cs into a DLL:
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\csc.exe /target:library ReportService2010.cs

3) Signed this DLL file (added a strong name to the library)

Details for this step are taken from here:

   a) Generate a KeyFile  
       sn -k keyPair.snk

   b) Get the MSIL for the assembly   
       ildasm ReportService2010.dll /out:ReportService2010.il  

   c) Build a new assembly from the MSIL output and your KeyFile 
       C:\Windows\Microsoft.NET\Framework\v2.0.50727\ilasm.exe ReportService2010.il /dll /key=keyPair.snk

   d) Install the DLL into the GAC 
       gacutil -i ReportService2010.dll

4) Created a Script Component in my SSIS Data Flow task

[Screenshot: the Script Component in the SSIS Data Flow task]

5) Diverted the source column with the Subscription ID data to this component

[Screenshot: the Subscription ID source column mapped as input to the Script Component]

6) Added a reference to the ReportService2010 DLL in the script component

[Screenshot: the ReportService2010 DLL reference added in the script project]

7) Placed the code to execute the FireEvent method of the ReportService2010 web service

[Screenshot: the script component code that calls the FireEvent method]

and executed my SSIS package! :-)
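The script component itself is C#, but the proxy call is the same from anywhere; for reference, here is a hedged PowerShell equivalent of the FireEvent call from step 7 (the site URL and subscription ID are placeholders):

# Sketch: trigger a subscription through the ReportService2010 web service (SharePoint mode).
$siteUrl = "http://yoursite.com"
$rs = New-WebServiceProxy -Uri "$siteUrl/_vti_bin/ReportServer/ReportService2010.asmx" -UseDefaultCredential

$subscriptionId = "00000000-0000-0000-0000-000000000000"   # dbo.Subscriptions.SubscriptionID

# Arguments: EventType, EventData (the subscription ID) and the SharePoint site URL.
$rs.FireEvent("TimedSubscription", $subscriptionId, $siteUrl)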

In a few moments I was able to see the results: emails were sent out and report files had been created, and I didn't even have to wait for the schedules.

I hope these seven steps were detailed enough; let me know if you need more details on this.

Happy data adventures!

Tuesday, July 8, 2014

SSRS reports in SharePoint mode, Saga #2: PowerShell generates email subscriptions

Life is full of opportunities; it only takes noticing and using them. About 3 months ago I wrote a post about deploying reports using a PowerShell script (PowerShell, help me to publish my SSRS reports to SharePoint!), and at my new project I proposed applying the very same technique of deploying reports to different environments. There was only one new thing: each of the SSRS reports at this new project had several data-driven subscriptions (emails, file shares), and recreating them manually would take a big chunk of your lifetime.

So that was my new opportunity that I didn't want to miss, and I decided to enhance my original PowerShell script even further. Why not? Let's utilize PowerShell's potential and bring SSRS and SharePoint even closer.

Having said this, here is the complete PowerShell script (PS_Subscriptions) that:
- Deploys SSRS data sources, datasets and reports,
- Maps all data sources and datasets for reports where necessary,
- and Creates report data-driven subscriptions.

The main knowledge on how to automate SSRS subscriptions came from the CreateDataDrivenSubscription method of the ReportService2010 web service; also be sure to check this Microsoft knowledge base article (http://support.microsoft.com/kb/842854), which shows some of the corrections that were applied to the original examples of the CreateDataDrivenSubscription method.

Let's start with the main changes to the original script. I created two new functions that remove old subscriptions and create new ones (the CreateDataDrivenSubscription method doesn't overwrite an old subscription with the same name; it just creates a same-named subscription with a different subscription ID):

[Screenshot: the PowerShell functions that remove old subscriptions and create new ones]

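Since the screenshot didn't survive, here is a minimal sketch (not the original functions) of the clean-up side; the site URL, report path and description are illustrative. The creation side follows the five steps listed further below.

# A sketch only: delete existing subscriptions with a given description before recreating them.
$rs = New-WebServiceProxy -Uri "http://yoursite/_vti_bin/ReportServer/ReportService2010.asmx" -UseDefaultCredential

function Remove-ExistingSubscription
{
    param ([string]$ReportPath, [string]$Description)

    # ListSubscriptions returns every subscription the caller can see for the item.
    foreach ($subscription in $rs.ListSubscriptions($ReportPath))
    {
        if ($subscription.Description -eq $Description)
        {
            $rs.DeleteSubscription($subscription.SubscriptionID)
            Write-Host "Removed old subscription:" $subscription.SubscriptionID
        }
    }
}

Remove-ExistingSubscription -ReportPath "http://yoursite/ReportLibrary/MyReport.rdl" -Description "Daily sales email"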
Also, don't forget to specify the delivery extension you need for your SSRS reports, because there are 3 of them available (Report Server Email, Report Server DocumentLibrary, Report Server FileShare):

[Screenshot: setting the delivery extension in the script]

The main steps to create a new SSRS report subscription are the following (a sketch putting them together follows this list):
1) Set the delivery extension settings
2) Create the data source and query (with its fields) for the delivery data query
3) Define the subscription schedule
4) Set the report parameter values
5) Execute the CreateDataDrivenSubscription method
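Putting those five steps together, here is a heavily trimmed sketch of the creation side. Everything project-specific below (report path, delivery query, credentials, schedule XML, parameter names) is an illustrative placeholder rather than my actual script, and the MatchData XML in particular is worth comparing with what your own report server stores:

# A sketch only: create a data-driven email subscription; $rs is the ReportService2010 proxy from the clean-up sketch above.
$ns = $rs.GetType().Namespace
$reportPath = "http://yoursite/ReportLibrary/MyReport.rdl"

# 1) Delivery extension settings (setting names and values are case sensitive).
$toField = New-Object "$ns.ParameterFieldReference"
$toField.ParameterName = "TO"
$toField.FieldAlias    = "EmailAddress"

$includeLink = New-Object "$ns.ParameterValue"
$includeLink.Name  = "IncludeLink"
$includeLink.Value = "True"

$renderFormat = New-Object "$ns.ParameterValue"
$renderFormat.Name  = "RenderFormat"
$renderFormat.Value = "PDF"

$extensionSettings = New-Object "$ns.ExtensionSettings"
$extensionSettings.Extension       = "Report Server Email"
$extensionSettings.ParameterValues = @($toField, $includeLink, $renderFormat)

# 2) The delivery (data-driven) query and the data source it runs against.
$deliveryDataSource = New-Object "$ns.DataSourceDefinition"
$deliveryDataSource.Extension     = "SQL"
$deliveryDataSource.ConnectString = "data source=MySqlServer;initial catalog=MySubscriptionDb"
$deliveryDataSource.CredentialRetrieval = "Store"           # data-driven subscriptions need stored credentials
$deliveryDataSource.UserName = "DOMAIN\ssrs_subscriptions"  # placeholder account
$deliveryDataSource.Password = "placeholder-password"
$deliveryDataSource.WindowsCredentials = $true
$deliveryDataSource.Enabled = $true
$deliveryDataSource.EnabledSpecified = $true

$emailField = New-Object "$ns.Field"
$emailField.Name  = "EmailAddress"
$emailField.Alias = "EmailAddress"

$query = New-Object "$ns.QueryDefinition"
$query.CommandType = "Text"
$query.CommandText = "SELECT EmailAddress FROM dbo.ReportRecipients"

$dataSet = New-Object "$ns.DataSetDefinition"
$dataSet.Fields = @($emailField)
$dataSet.Query  = $query

$dataRetrievalPlan = New-Object "$ns.DataRetrievalPlan"
$dataRetrievalPlan.DataSet = $dataSet
$dataRetrievalPlan.Item    = $deliveryDataSource

# 3) Subscription schedule (MatchData); compare it with the XML your report server stores for a manually created schedule.
$matchData = "<ScheduleDefinition><StartDateTime>2014-07-08T08:00:00-04:00</StartDateTime>" +
             "<WeeklyRecurrence><WeeksInterval>1</WeeksInterval>" +
             "<DaysOfWeek><Monday>True</Monday></DaysOfWeek></WeeklyRecurrence></ScheduleDefinition>"

# 4) Report parameter values (real parameter names, not their labels).
$regionParameter = New-Object "$ns.ParameterValue"
$regionParameter.Name  = "RegionCode"
$regionParameter.Value = "CA"

# 5) Create the subscription.
$subscriptionId = $rs.CreateDataDrivenSubscription($reportPath, $extensionSettings, $dataRetrievalPlan,
                      "Daily sales email", "TimedSubscription", $matchData, @($regionParameter))
Write-Host "Created subscription:" $subscriptionId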

It's been a great experience applying PowerShell techniques to creating SSRS report subscriptions. I've spent a few days writing this code, though I'm sure more experienced PS developers could accomplish the task in a few hours. Nevertheless, I've learned a few lessons and would like to share them with you.

Lessons learned:

a) If you create a new data source, use DataSourceDefinition; if it's a shared data source, use DataSourceReference.
b) Don't use parameter labels (as they're shown in SharePoint); use the real parameter names:

[Screenshot: subscription parameters defined with real parameter names rather than labels]

c) The MatchData schema is barely explained anywhere; look at your ReportServer backend tables for XML examples in order to understand how a schedule is created.

d) Delivery extension parameter values are case sensitive, and they differ between Email and SharePoint file locations: for example, "IncludeLink" for Email and "RENDER_FORMAT" for a SharePoint file.

e) If you get a weird error about a method not being supported, check all the variables and parameter formats and values, then test your script again.

I've been using the PowerGUI tool (http://en.community.dell.com/techcenter/powergui/m/bits/) to develop and test all of my PowerShell scripts; perhaps you can suggest other PS GUI environments. In the meantime, I continue wishing you happy data adventures!

Wednesday, April 30, 2014

And I wonder, where that column is?

I wouldn't have decided to write this post if I hadn't seen my boss at one of my previous jobs being really persistent and careful in the way he wrote his T-SQL code. It started when I heard him hitting the space bar on his keyboard a lot; I came to his desk and saw him lining up all of his select statement columns with equal numbers of space characters before the comma delimiter and the actual column names, like this:

Select Column1
     , Column2
     , Column3
       …
     , ColumnN
From TableA


I questioned him about this tedious work, and he simply explained that some text editors show tab characters differently by default (sometimes a tab is expanded to several space characters and sometimes it's shown as a single one). In that case our SQL code may end up looking like this:

Select Column1 
, Column2 
     , Column3 
       … 
 , ColumnN 
From TableA

It's still working code; however, it takes more time to read and comprehend its scope: the different elements joined across various database objects, calculated or concatenated column aliases, etc. That example of orderly space characters from my boss left a lasting impression on me about the need for well-written code that can easily be read by others.

There are many ways to make your T-SQL code more readable, and several free add-ons and plugins exist for SSMS. However, I would like to emphasize the big value of where you manually place your column aliases within your SQL code.

Let's take an example of the [dbo].[vTimeSeries] view from the [AdventureWorksDW2012] SQL Server sample database, more specifically the first section of the select statement with all the columns listed:

[Screenshot: the SELECT column list of [dbo].[vTimeSeries] written as expression AS column_alias]

and now compare the same code with this:

[Screenshot: the same SELECT column list written as column_alias = expression]

where I can easily identify the column definitions for TimeIndex, ReportingDate and the other columns. It's basically all about how we prefer to write a column alias: one way is expression [AS] column_alias, the other is column_alias = expression.

I personally like the second approach, even if someone may say it mixes result set column naming with variable assignment. I just like it because it saves me time locating element names within the code, so I can easily answer the original question of where that column is :-)

Enjoy your day and happy data adventures!

Monday, March 10, 2014

PowerShell, help me to publish my SSRS reports to SharePoint!

Every time I heard the word 'PowerShell' it was like a call to Terra Incognita: you know nothing about it, and the longer you stay away the more interesting it becomes to learn. You think of people who can create PS scripts to install and set up complicated systems in a single run, and maybe of some others who dream of starting their car engines with the help of PowerShell (as I heard at one of our local Toronto SQL Server user group meetings).

There are many different PowerShell training materials available on the internet, and it only takes time and a little effort to start learning and practicing this technology in every walk of your life (I've read that, and it made me laugh at how PowerShell has become a panacea for everything :-). Anyway, I had kept telling myself for a very long time that someday, somehow, I would get into this and gain at least enough knowledge to start applying it to some of my database-related activities.

So that day came when, at my company, we were deploying a large number of SSRS reports to a SharePoint application site. I know you can deploy reports to SharePoint from Visual Studio and manually publish RDL files too; the latter skill I've mastered with no basic training at all :-) Our SharePoint developers handed me a PowerShell script that did exactly what I was looking for: publishing all the data sources, datasets and reports. However, all the mappings between them were left unset, and you had to manually specify all the datasets and data sources for the already published reports; and that does take a lot of time!

With the help of a few articles on the internet:
PowerShell:Deploying SSRS Reports in Integrated Mode
SSRS SharePoint Foundation 2010 Integration - Deployment Query

and the MSDN resource (http://technet.microsoft.com/en-us/library/reportservice2010.aspx) that lets you explore all the details of the ReportService2010 web service, I was able to understand how PowerShell could help me.

The main issue I had was that the ReportService2006 web service didn't have a method to map datasets for reports, but the next version of this web service now has this option.

[Screenshot: the ReportService2010 method for mapping report datasets]

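That next version is ReportService2010, and the method that maps shared datasets to a report is SetItemReferences. A small hedged sketch, with the item URLs made up for illustration:

# Sketch: map a report's shared dataset reference to an already published shared dataset.
# $rs is a ReportService2010 proxy (New-WebServiceProxy), as used throughout this script.
$ns = $rs.GetType().Namespace

$datasetReference = New-Object "$ns.ItemReference"
$datasetReference.Name      = "SalesDataset"                                             # dataset name inside the report
$datasetReference.Reference = "http://yoursite/ReportLibrary/Datasets/SalesDataset.rsd"  # deployed shared dataset

$rs.SetItemReferences("http://yoursite/ReportLibrary/SalesReport.rdl", @($datasetReference))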
After my long trials and tribulations to build a single script that would perform all the steps of SSRS report deployment in a single run, the main section of my script looked like this:

[Screenshot: the main section of the deployment script]

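Since that screenshot is gone too, here is a hedged reconstruction of what such a main section can look like; the library URL, file paths and item names are purely illustrative, and the downloadable script below is the real thing:

# A sketch of the main deployment flow; not the downloadable script itself.
$siteUrl    = "http://yoursite"
$libraryUrl = "$siteUrl/ReportLibrary"
$rs = New-WebServiceProxy -Uri "$siteUrl/_vti_bin/ReportServer/ReportService2010.asmx" -UseDefaultCredential
$ns = $rs.GetType().Namespace
$warnings = $null

# 1) Shared data source.
$dsDefinition = New-Object "$ns.DataSourceDefinition"
$dsDefinition.Extension           = "SQL"
$dsDefinition.ConnectString       = "data source=MySqlServer;initial catalog=AdventureWorksDW2012"
$dsDefinition.CredentialRetrieval = "Integrated"
$rs.CreateDataSource("SalesDataSource.rsds", $libraryUrl, $true, $dsDefinition, $null) | Out-Null

# 2) Shared dataset, published like any other catalog item.
$datasetBytes = [System.IO.File]::ReadAllBytes("C:\Deploy\SalesDataset.rsd")
$rs.CreateCatalogItem("DataSet", "SalesDataset.rsd", $libraryUrl, $true, $datasetBytes, $null, [ref]$warnings) | Out-Null

# 3) The report itself.
$reportBytes = [System.IO.File]::ReadAllBytes("C:\Deploy\SalesReport.rdl")
$report = $rs.CreateCatalogItem("Report", "SalesReport.rdl", $libraryUrl, $true, $reportBytes, $null, [ref]$warnings)

# 4) Map the report's data source reference to the published shared data source.
$dataSource  = New-Object "$ns.DataSource"
$dataSource.Name = "SalesDataSource"                 # data source name inside the report
$dsReference = New-Object "$ns.DataSourceReference"
$dsReference.Reference = "$libraryUrl/SalesDataSource.rsds"
$dataSource.Item = $dsReference
$rs.SetItemDataSources($report.Path, @($dataSource))

# 5) Shared dataset references are then mapped with SetItemReferences, as in the earlier snippet.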
The complete script can be downloaded from here: DeploySSRS_DataSource_DataSet_Report

You can adjust this script any way you need: set the variable values your own way, change the routine logic, adjust the output messages, etc. And again, I didn't create most parts of this script from scratch; however, after reading numerous articles and forum threads related to PowerShell, I made it run against the 2010 version of the ReportService web service and successfully tested it in several SharePoint environments.

I hope you'll find this script useful and will contribute further improvements and modifications.

Enjoy your day and happy data adventures!


PS. Link to my post with the next developments of this script - SSRS reports in SharePoint mode, Saga #2: PowerShell generates email subscriptions.

Thursday, February 13, 2014

SQL Server and MS Access 2013 are still friends in Azure!

With more and more features being added to SQL Server, I started thinking that its friendship with MS Access 2013 had begun to fade away. Originally, their long relationship was solidified over the course of both products' maturity, from SQL Server 7.0 to 2008 R2 and MS Access 2000 to 2010, with the introduction of ADP (Access data projects). However, recent changes in Access 2013 introduced some challenges for those who had heavily invested their application infrastructure in this couple of SQL Server and MS Access ADP. Many people got really upset about this, and I understand both sides of the situation (you can read all the comments from the last link).

My first MS Access ADP project was developed in 2001 with SQL Server 7.0 and Access 2000, and at that time I realized all the potential this framework had despite all the skeptical criticism. During my last engagement with a different client (approximately 10 years later) I built a solution using SQL Server 2008 R2 and MS Access 2010, and it was a very stable couple, I would say :-)

Nevertheless, I decided to test a proof of concept: can a Windows Azure SQL Database still find friendship with MS Access 2013, and if so, to what degree?

1) I need to create a Windows Azure SQL Database; let's just name it Windows_Azure_Access:

[Screenshot: creating the Windows_Azure_Access database in the Windows Azure portal]

this is how I see this database in my Windows Azure portal:

[Screenshot: the Windows_Azure_Access database listed in the Windows Azure portal]

2) Then let's add a new dbo.TableTest table and populate it with some values:

[Screenshot: the T-SQL that creates and populates the dbo.TableTest table]

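The table itself was trivial; something along these lines does it from PowerShell (the exact schema and values below are my assumptions, and the server name and credentials are placeholders):

# Sketch: create and populate dbo.TableTest in the Azure database (schema and values are illustrative).
$connectionString = "Server=tcp:yourserver.database.windows.net,1433;Database=Windows_Azure_Access;" +
                    "User ID=youruser@yourserver;Password=yourpassword;Encrypt=True"

$sql = @"
CREATE TABLE dbo.TableTest (ID int NOT NULL PRIMARY KEY, TestValue nvarchar(50) NULL);
INSERT INTO dbo.TableTest (ID, TestValue) VALUES (1, N'First value'), (2, N'Second value'), (3, N'Third value');
"@

$connection = New-Object System.Data.SqlClient.SqlConnection($connectionString)
$command    = $connection.CreateCommand()
$command.CommandText = $sql
$connection.Open()
[void]$command.ExecuteNonQuery()
$connection.Close()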
I especially like the new look of the query execution plan: fewer colors and more details, I think:

[Screenshot: the query execution plan]

3) So now we have the data in the Windows Azure SQL Database; let's try to connect to the table we've created from a desktop MS Access application. Since the MS Access ADP technology is no longer available in the 2013 version, we will go the ODBC way of connecting to data.

We create a blank MS Access 2013 database. Then, through the ODBC connection console, I add a new SQL Server connection: the server name for my SQL Azure database with the account and password to access it (make sure to use the 32-bit version of odbcad32.exe; for some reason the 64-bit version didn't allow me to create a new DSN).

[Screenshot: the ODBC DSN created for the Azure SQL Database server]

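Once the DSN (or a plain connection string) is in place, you can sanity-check it outside of Access; a quick hedged PowerShell test over ODBC, with the driver name, server and credentials as placeholders:

# Sketch: query the Azure table over ODBC, the same way the linked Access table will.
Add-Type -AssemblyName System.Data   # System.Data.Odbc lives here

$odbcConnectionString = "Driver={SQL Server Native Client 11.0};" +
                        "Server=tcp:yourserver.database.windows.net,1433;Database=Windows_Azure_Access;" +
                        "Uid=youruser@yourserver;Pwd=yourpassword;Encrypt=yes"

$connection = New-Object System.Data.Odbc.OdbcConnection($odbcConnectionString)
$command    = $connection.CreateCommand()
$command.CommandText = "SELECT ID, TestValue FROM dbo.TableTest"

$connection.Open()
$reader = $command.ExecuteReader()
while ($reader.Read()) { Write-Host $reader["ID"] $reader["TestValue"] }
$connection.Close()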
4) Then I create a new ODBC connection within the MS Access database and link my dbo.TableTest table from Azure; after that I can see both the table and its data:

[Screenshot: the linked dbo.TableTest table and its data in MS Access 2013]

The same way as I can see this data in SSMS:

[Screenshot: the same dbo.TableTest data in SSMS]

and in the native Azure environment:

[Screenshot: the same data queried in the Windows Azure portal]

For sure, T-SQL in Windows Azure SQL Databases has some limitations, but it still provides a way to build database solutions; and as a result of this simple test I can say that Windows Azure SQL Database and MS Access 2013 can still be friends, maybe not as close as they used to be, but Azure data can still be retrieved.


Enjoy your day and happy data adventures!