Wednesday, December 24, 2014

PowerPivot Saga 2: DAX OR function (OR there is something better)

I didn't realize that the DAX OR function accepts only two arguments, while the similar function in Excel accepts up to 255. This difference only became apparent to me when I received an error in my PowerPivot data model after trying to pass more than two arguments to the OR function; only then did I decide to check this DAX function's syntax:

DAX: OR(<logical1>,<logical2>) vs. Excel: OR(logical1,logical2,...)

The MSDN documentation has a little remark:
The OR function in DAX accepts only two (2) arguments. If you need to perform an OR operation on multiple expressions, you can create a series of calculations or, better, use the OR operator (||) to join all of them in a simpler expression.
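In other words, with more than two conditions you either nest OR calls or chain the || operator. A hypothetical sketch over an assumed [Color] column shows both forms:

```dax
// Nested OR calls, two arguments at a time:
= IF ( OR ( OR ( [Color] = "Red", [Color] = "Blue" ), [Color] = "Black" ), 1, 0 )

// The same logic chained with the || operator, in a flatter expression:
= IF ( ( [Color] = "Red" ) || ( [Color] = "Blue" ) || ( [Color] = "Black" ), 1, 0 )
```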

I liked the use of the word “better” in this section, and I made a little test with the data from the AdventureWorksDW2014 database. I pulled a small set of data from the dbo.DimCustomer table into my PowerPivot table:

SELECT [CustomerKey]
  FROM [AdventureWorksDW2014].[dbo].[DimCustomer]

Also, for this list of customers I added a column that would show whether a customer had finished school. Actually, I added two columns that showed the same thing using two different forms of OR logic (one with the OR function and the other with the || operator):

Finished School A:=IF(OR([Education]="Partial College", [Education]="Partial High School"), "No", "Yes")
Finished School B:=IF(([Education]="Bachelors") || ([Education]="High School") || ([Education]="Graduate Degree"), "Yes", "No")

Then I used the [Finished School B] column to build one of the sample Power View reports with the customer count data:

So basically, this whole argument-count limit of the DAX OR function was a lesson for me to always check function syntax; however, why this limit differs from the Excel version of the function is still a mystery to me :-)

Happy data adventures!

Monday, November 3, 2014

PowerPivot Saga: Scheduled Refresh more than once a day

PowerPivot data models in Excel are a great way to quickly pull data from various sources, build relationships between different tables, and create various calculations. Once a model is built, it can be further utilized in reporting.

Users want their reports to show up-to-date information, which means the PowerPivot data model that supports those analytics should hold updated data, i.e. all the tables sourced from different data sources need to be refreshed.

We can refresh data in PowerPivot in two ways:
- manually or interactively, by refreshing all or a specific data connection;
- on a schedule, if the PowerPivot model is deployed to SharePoint.

That's where a slight disappointment starts to develop, when we realize that the most frequent data refresh cycle can be scheduled only once a day. Some people have tried to interfere and tweak the tables in the PowerPivot SQL Server database in SharePoint that are responsible for data model refresh schedules:

A request was made at the Microsoft Connect website to enhance this functionality and add more granularity to how often a data model can be refreshed. However, this request was closed as "by design" in December 2012:

At the recent Power BI Saturday event at the Microsoft office in Mississauga, I asked a program manager from the SSRS team about this very aspect of data refresh frequency, and he confirmed that there was no plan to bring that functionality to PowerPivot models in SharePoint; however, he suggested a workaround for the problem. The very same approach (also suggested by one of my colleagues) is to store the tabular model on a SQL Server instance, where I can have more control.

So this is what I needed to do:
1) Import the PowerPivot data model from the Excel file into a Tabular Model solution
2) then deploy this solution to an SSAS instance
3) create a SQL Agent job to process the newly deployed SSAS tabular database on my own schedule (hours, minutes, seconds... anything you want)
4) create a BISM data connection in SharePoint for the newly deployed SSAS tabular database
5) and finally, re-point or start developing all my Power View reports based on that new BISM data connection in SharePoint.
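For step 3, the job step can use the "SQL Server Analysis Services Command" type with an XMLA Process command along these lines; this is a minimal sketch, and the database ID "MyTabularModel" is a placeholder for your deployed tabular database:

```xml
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Type>ProcessFull</Type>
  <Object>
    <DatabaseID>MyTabularModel</DatabaseID>
  </Object>
</Process>
```

The job schedule then gives you the hours/minutes/seconds granularity that the SharePoint refresh schedule lacks.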

And this is a viable solution because a PowerPivot BI Semantic Model Connection (.bism) supports two data source types:
- a tabular model database running on a standalone Analysis Services instance in tabular server mode;
- a PowerPivot workbook saved to SharePoint.

In the end, users were able to analyze their reports with the updated data.

If a daily data refresh is sufficient for business users' reports, then we can use the out-of-the-box SharePoint feature; however, if more granular time intervals for the PowerPivot model data refresh are required, then we can:
A) design a PowerPivot data model in Excel
B) import it into an SSAS tabular database
C) schedule more frequent processing jobs for the new SSAS tabular database and use it as a source for Power View reports.

Happy data adventures!

PS. This material was also presented at the Toronto's SQL Server User Group on 2014-Oct-30.

Thursday, October 16, 2014

SSRS report rendered in EXCELOPENXML then something went missing

Recently a client asked me to look into an issue with one of their SSRS reports. They had one report with a text box that had some padding on the left side. All seemed to be OK: the report showed the indented text values in both the RPL and PDF rendering extensions; however, when the report was exported to Excel, those indentations were lost.

I removed the left padding and substituted it with hard-coded extra spaces in the text box expression. This tweak didn't solve the problem: now my report server didn't show those extra spaces (and the Excel export of the report didn't show them either).

Then I removed the extra spaces from the SSRS text box expression and added them to the dataset that sourced the text box. Now the report server showed all the indented text boxes; however, the Excel export again didn't show those extra spaces. What?!

Let's get crazy then: I removed all the extra spaces from the data query and built my text box expression using a conditional check on Globals!RenderFormat.Name="EXCELOPENXML" to add the extra spaces only when the report gets exported to Excel; and this… solved the problem: the RPL, PDF and EXCELOPENXML versions of the report all showed the left-padded text box as intended.
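The conditional expression looked roughly like this sketch (the field name and the number of spaces are placeholders):

```
=IIF(Globals!RenderFormat.Name = "EXCELOPENXML",
     "    " & Fields!MyField.Value,
     Fields!MyField.Value)
```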

Then I noticed another text box attribute (LeftIndent) that I hadn't tried to adjust. After adjusting the LeftIndent value, all my rendering extensions worked perfectly, so I no longer needed the conditional check for the Excel rendering extension.

It's funny that I've dedicated this whole blog post to the EXCELOPENXML rendering extension, which I didn't need to check at all; but… now I know how to solve this problem in a different way :-)

Wednesday, October 15, 2014

SSIS Pivot makes you stronger

At the recent Toronto SQL Server User Group meeting, Michael J. Swart (@MJSwart) gave a good presentation based on 10 of his blog posts. One of them (Forget About PIVOT, You Don't Need It) got my attention: first, I remembered that I had read it a while ago, and second, I thought that I could add my two cents to it.

Very early in my career I was fascinated by the MS Access TRANSFORM statement; then, in SQL Server 2005, the PIVOT/UNPIVOT operators came out, which made me even more excited: now we could forget about all the custom T-SQL code and write a simple statement to turn a data set any way we want.

I agree with Michael that the available reporting tools are much better for preparing a final data set for business users. However, I've used PIVOT many times, and I like using it in a middle data layer, where one data set gets transformed (or, in Excel terms, "transposed") into another form and then moved further along the way.

The SSIS Pivot task, with its UI, gives some additional flexibility for transforming an input data set; however, I still prefer the T-SQL method, and I can explain why.

Here is my sourcing query, pulling the data I want to transform from the AdventureWorksDW2014 database:

In SSIS I can do the same thing and push the sourcing query through the Pivot task:

However, when I execute it, it fails due to a duplicate-key violation on the Pivot Key of the Pivot task:

So I had to change the sourcing query and add an aggregate function to it:

and then it worked:

So, with a few hiccups, I was able to adopt the SSIS Pivot task, but I still prefer writing the PIVOT T-SQL statement manually: more control, more flexibility :-)

Lessons learned:
1) The sourcing data set for the SSIS Pivot task needs to be pre-aggregated.
2) If you don't like the Pivot output columns (based on the Pivot Key and Pivot Value), you can always rename them in the Advanced Editor.
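Since the original query screenshots aren't reproduced here, this is only a hypothetical T-SQL sketch against AdventureWorksDW2014 (the table and column choices are my assumptions, not the original query). Note one nice difference from the SSIS Pivot task: the T-SQL PIVOT operator applies its own aggregate to duplicates, so it doesn't fail the way the task did:

```sql
-- Hypothetical example: yearly Internet sales amounts pivoted per sales territory.
SELECT SalesTerritoryKey, [2011], [2012], [2013], [2014]
FROM (
    SELECT d.CalendarYear,
           s.SalesTerritoryKey,
           s.SalesAmount
    FROM dbo.FactInternetSales AS s
    JOIN dbo.DimDate AS d
        ON d.DateKey = s.OrderDateKey
) AS src
PIVOT (
    SUM(SalesAmount)
    FOR CalendarYear IN ([2011], [2012], [2013], [2014])
) AS p
ORDER BY SalesTerritoryKey;
```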

Working with SSIS Pivot task didn't destroy my attitude but made me stronger :-)

Happy data adventures!

Thursday, August 28, 2014

SSRS report: Multi-value parameter going after Data-driven subscription

It is always good to enrich SSRS reports with multi-value parameters; users get more flexibility to combine different reporting scenarios without having to run the report separately for different parameter values. On the other hand, developers can sharpen their skills by embedding multi-value parameters into report datasets; referencing rows, labels and values from those parameters in custom controls within a report is another competence that can be quickly acquired.

Alright, the report is developed and deployed for users. They have requested report subscriptions and now enjoy the luxury of scheduled result delivery in different formats. Report Manager in native mode and SharePoint in SSRS integrated mode read the report parameter metadata pretty well and allow setting new values for multi-value parameters or keeping the default settings.

However, when you're dealing with data-driven subscriptions, it's not that easy to pass data query values to multi-value parameters. You have a choice:
1) Be happy with the report's default values and change nothing in the report subscription
2) Still be happy and create a regular subscription, setting the parameter values manually
3) Programmatically define parameter values using the CreateDataDrivenSubscription method of the ReportingService web service
4) Or be crazy enough to build a set of XML parameter tags in the Parameters column and update the dbo.Subscriptions table of your report server instance.

But still, none of these lets you link a data element from the data-driven subscription query to your multi-value parameter; how would you pass several values in one column of a single record? And I think you've already guessed one of the right answers: a delimited string of values!

Here is how I made it work for a project where my client wanted to trigger report subscriptions from their web UI, and each SSRS report execution could have different multi-value parameter values.

1) I've created two multi-value parameters in my SSRS reports:
        a. ReportParameter: made it Hidden; Available Values: None; Default Values: specified values, just 2 or 3 lines for my reports (the number is not really important at this point)
        b. ReportParameterActual: kept it Visible; Available Values: a query from a dataset; Default Values: specified with the following expression:
=Split(@ReportParameter, ",")
2) Once this report was deployed, I created a data-driven subscription where my data query looked similar to this: select ReportParameter = 'Value1,Value2,Value3' from MyTable
3) And then I mapped the report's ReportParameter with the ReportParameter column from this query.

This was the whole solution to the problem. ReportParameterActual was the visible parameter that users could interact with, setting different values themselves outside the subscription definition. ReportParameter was the actual interface and gateway for default values, which with the data-driven subscription became a dynamic control over the set of values supplied to the multi-value ReportParameterActual in different subscription scenarios. And Split(@ReportParameter, ",") was the whole trick :-)

Update (2014-09-03): Richard D'Angelo asked me to share the RDL with the solution. I couldn't upload the actual client report, but I've made a sample one based on the [AdventureWorksDW2014].[dbo].[DimGeography] table. You can get this sample through this link: MultiValueParameterReport.rdl.

Happy data adventures!

Friday, July 25, 2014

Power View report migration from SharePoint 2010 to 2013

With the latest changes to Microsoft Power BI and the ability to create Power View reports within an Excel file (starting with Excel 2013), perhaps there will be no big need to migrate old stand-alone Power View reports that were based on Power Pivot data models created in Excel 2010. You can just recreate your Power View analytics in Excel 2013 and off you go: users can see and interact with the new reports using either the desktop or Office 365 version of Excel, with no need to worry about the SharePoint deployment platform.

However, if you really want to bring your old Power View reports from SharePoint 2010 to SharePoint 2013, it is still possible. While working on this task I couldn't find any automated tools that would migrate both the Power Pivot model file and its related Power View report, so all the techniques described in this post are pure file upgrading and manipulation (if you find such a tool, I would really appreciate hearing about it).

Basically, in the old SharePoint 2010 and Excel 2010 world, a Power View report consisted of two artifacts: (1) a Power Pivot data model in an Excel 2010 file and (2) a Power View report based on that model.

The first thing I did was upgrade the Power Pivot data model from Excel 2010 to Excel 2013, by simply opening my old data model file in Excel 2013 and consenting to upgrade the data model (here is the official documentation for this: Upgrade Power Pivot Data Models to Excel 2013). The upgrade took less than a minute in my case; it may take longer depending on the size of your model. Then I deployed the upgraded Power Pivot data model file to SharePoint 2013.

I could have done the same thing with the Power View .rdlx file by just copying it to the new SharePoint location; however, the report would still be connected to the old Power Pivot model file, and this needed to be fixed.

1) I've changed the file extension of my Power View .rdlx report to .zip and opened it as an archive file:

2) Then I went to the [Reports] folder where my PowerView report definition resided in the [report.rdl] file:

3) And then I just edited the data source connection string and pointed it to the upgraded Power Pivot data model file, which had already been saved in the new SharePoint 2013 instance:

4) Then I saved the [report.rdl] file back into the opened .zip archive, changed the .zip extension back to .rdlx, and saved the file in my new SharePoint 2013 Power View gallery.
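For step 3, the edited data source in [report.rdl] ends up pointing at the upgraded workbook's URL; the element looks something like this sketch (the server name, library and file name are placeholders, not the original values):

```xml
<ConnectString>Data Source=http://sharepoint2013/PowerPivotGallery/MyModel.xlsx</ConnectString>
```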

There were a few gotchas I needed to adjust in my upgraded Power Pivot data model after this. What happened was that during the data model upgrade all my custom column renamings and sort settings were lost, which broke some of the relationships within the data model, and thus my adjusted Power View report didn't show all the charts as they were in the old SharePoint 2010 environment. But with a few adjustments to the data model, the new Power View report looked exactly like the original one, the only difference being that it now lived in the SharePoint 2013 environment.

It worked, and it worked well for me :-) 

Happy data adventures!

Tuesday, July 22, 2014

SSRS report subscription: Fire Event from SSIS package

My previous two posts about SSRS reports deployed to a SharePoint site along with their subscriptions didn't stop me from exploring the whole subject of SSRS report subscriptions, regardless of the deployment environment, native or SharePoint mode.

After exploring different aspects of the subject (Subscriptions and Delivery), you'll become familiar with setting up delivery types, extensions and other metadata for your subscriptions (data-driven subscriptions bring even more flexibility :-). The only thing that puzzled me was how to test each of the subscriptions: schedule, schedule and nothing but the schedule; there isn't even a UI option to initiate or trigger a subscription.

However, Reporting Services has a way to create an event that will trigger an SSRS report subscription no matter what its schedule definition is. You can simply call the report server stored procedure dbo.AddEvent and provide two parameters: @EventType='TimedSubscription' and @EventData=@SubscriptionID (the primary key from the dbo.Subscriptions table). And whatever action is defined in your subscription, it'll get performed!
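A minimal T-SQL sketch of such a call, run against the report server catalog database (the database name and the WHERE filter are placeholders; locate your subscription however suits you):

```sql
USE ReportServer;  -- placeholder: your report server catalog database

DECLARE @SubscriptionID nvarchar(260);

-- Hypothetical filter: pick the subscription you want to trigger.
SELECT TOP (1) @SubscriptionID = CAST(SubscriptionID AS nvarchar(36))
FROM dbo.Subscriptions
WHERE Description LIKE '%My report subscription%';

EXEC dbo.AddEvent
    @EventType = 'TimedSubscription',
    @EventData = @SubscriptionID;
```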

There is another way to create the 'TimedSubscription' event on a report server: the FireEvent method of the ReportingService web service, which you can call from anywhere. So let's be crazy enough to test this method from an SSIS Script Component that will trigger SSRS report subscriptions hosted on a SharePoint-mode report server. Here is how it worked for me:

1) I created a proxy class from the ReportingService web service (the file location and SharePoint site will be different on your side):
C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\Bin\wsdl.exe /out:ReportService2010.cs

2) Then I compiled ReportService2010.cs into DLL:
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\csc.exe /target:library ReportService2010.cs

3) Signed the DLL file (added a strong name to the library)

Details for this step are taken from here:

   a) Generate a KeyFile  
       sn -k keyPair.snk

   b) Get the MSIL for the assembly   
       ildasm ReportService2010.dll /out:ReportService2010.il

   c) Build a new assembly from the MSIL output and your KeyFile 
       C:\Windows\Microsoft.NET\Framework\v2.0.50727\ilasm.exe ReportService2010.il /dll /key=keyPair.snk

   d) Install the DLL in to the GAC 
       gacutil -i ReportService2010.dll

5) Created a Script Component in my SSIS Data Flow task

6) Diverted the source column with the subscription ID data to this component

7) Added a reference to the ReportingService DLL in the script component

8) Placed code in the component to execute the FireEvent method of the ReportingService web service
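A minimal sketch of the Script Component code for that step (the endpoint URL, the site URL and the input column name Row.SubscriptionID are placeholders; ReportingService2010 is the proxy class generated in step 1):

```csharp
// Fire the TimedSubscription event for each subscription ID in the data flow.
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    var rs = new ReportService2010.ReportingService2010();

    // Placeholder endpoint; point it to your SharePoint-mode report server.
    rs.Url = "http://mysharepoint/_vti_bin/ReportServer/ReportService2010.asmx";
    rs.Credentials = System.Net.CredentialCache.DefaultCredentials;

    // EventData carries the subscription ID; SiteUrl identifies the SharePoint site.
    rs.FireEvent("TimedSubscription", Row.SubscriptionID, "http://mysharepoint");
}
```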

and executed my SSIS package! :-)

In a few moments I was able to see the results: emails were sent out and report files had been created; and I didn't even have to wait for the schedules.

I hope those steps were detailed enough; otherwise, let me know if you need more details on this.

Happy data adventures!