
Wednesday, December 24, 2014

PowerPivot Saga 2: DAX OR function (OR there is something better)

I didn't realize that the DAX OR function accepts only two arguments, while the similar function in Excel allows up to 255. This difference only surfaced when I received an error in my PowerPivot data model after trying to pass more than two arguments to OR; only then did I decide to check the DAX function syntax:

DAX: OR(<logical1>,<logical2>) vs. Excel: OR(logical1,logical2,...)

The MSDN documentation has a little remark:
The OR function in DAX accepts only two (2) arguments. If you need to perform an OR operation on multiple expressions, you can create a series of calculations or, better, use the OR operator (||) to join all of them in a simpler expression.
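For example, chaining the || operator keeps a multi-condition test flat, while the function form has to be nested; a sketch with a hypothetical [Color] column:

```dax
// Function form: OR takes only two arguments, so a third condition forces nesting
= IF ( OR ( OR ( [Color] = "Red", [Color] = "Blue" ), [Color] = "Green" ), "Yes", "No" )

// Operator form: || chains across any number of conditions
= IF ( [Color] = "Red" || [Color] = "Blue" || [Color] = "Green", "Yes", "No" )
```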

I liked the use of the word “better” in this remark, so I made a little test with data from the AdventureWorksDW2014 database. I pulled a small set of data from the dbo.DimCustomer table into my PowerPivot table:

SELECT [CustomerKey]
     , [EnglishEducation] AS [Education]  -- aliased to match the [Education] column used below
  FROM [AdventureWorksDW2014].[dbo].[DimCustomer]

Also, for this list of customers I added a column that would show whether a customer had finished school. Actually, I added two columns that showed the same thing using two flavors of OR logic (one with the OR function and the other with the || operator):

Finished School A:=IF(OR([Education]="Partial College", [Education]="Partial High School"), "No", "Yes")
Finished School B:=IF(([Education]="Bachelors") || ([Education]="High School") || ([Education]="Graduate Degree"), "Yes", "No")

Then I used the [Finished School B] column to build one of the sample PowerView reports with the Customer count data:

So basically, this whole “number of arguments” limit of the DAX OR function was a lesson for me to always check function syntax; however, why this limit differs from the Excel version of the function is still a mystery to me :-)

Happy data adventures!

Monday, November 3, 2014

PowerPivot Saga: Scheduled Refresh more than once a day

PowerPivot data models in Excel are a great way to quickly pull data from various sources, build relationships between different tables and create various calculations. Once a model is built, it can be further used in reporting.

Users want their reports to show up-to-date information, which means the PowerPivot data model that supports those analytics should hold current data, i.e. all data tables sourced from different data sources need to be refreshed.

We can refresh data in PowerPivot in two ways:
- manually (interactively), by refreshing all or specific data connections;
- on a schedule, if the PowerPivot model is deployed to SharePoint.

That's where a light disappointment sets in, when we realize that the most frequent data refresh cycle can be scheduled only once a day. Some people have tried to tweak the tables in the PowerPivot SQL Server database in SharePoint that are responsible for data model refresh schedules:

A request was made at the Microsoft Connect website to enhance this functionality and add more granularity to how often a data model can be refreshed. However, this request was closed as "by design" in December 2012:

At the recent PowerBI Saturday event at the Mississauga Microsoft office I asked a program manager from the SSRS team about this very aspect of data refresh frequency, and he confirmed that there was no plan to bring that functionality to the PowerPivot model in SharePoint; however, he suggested a workaround. The very same approach (also suggested by one of my colleagues) is to store the tabular model in SQL Server Analysis Services, where I can have more control.

So this is what I needed to do:
1) Import the PowerPivot data model from the Excel file into a Tabular Model solution
2) Deploy this solution to an SSAS instance
3) Create a SQL Agent job to process the newly deployed SSAS tabular database on my own schedule (hours, minutes, seconds... anything you want)
4) Create a BISM data connection in SharePoint for the newly deployed SSAS tabular database
5) Finally, change the source of (or start developing) all my Power View reports based on that new BISM data connection in SharePoint.
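For step 3, the SQL Agent job step can use the "SQL Server Analysis Services Command" step type with an XMLA Process command; a minimal sketch (the DatabaseID is a placeholder for your deployed tabular database):

```xml
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Type>ProcessFull</Type>
  <Object>
    <DatabaseID>MyTabularModel</DatabaseID>
  </Object>
</Process>
```

The job schedule then gives you full control over the refresh frequency.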

And this is a viable solution because a PowerPivot BI Semantic Model Connection (.bism) supports two data source types:
- a tabular model database running on a standalone Analysis Services instance in tabular server mode;
- PowerPivot workbooks saved to SharePoint.

In the end, users were able to analyze their reports with updated data.

If a daily data refresh is sufficient for business users' reports, then we can use the out-of-the-box SharePoint feature; however, if more granular time intervals for PowerPivot model data refresh are required, then we can:
A) design a PowerPivot data model in Excel
B) import it into an SSAS tabular database
C) schedule more frequent database processing jobs for the new SSAS tabular database and use it as a source for Power View reports.

Happy data adventures!

PS. This material was also presented at the Toronto SQL Server User Group on 2014-Oct-30.

Thursday, October 16, 2014

SSRS report rendered in EXCELOPENXML then something went missing

Recently a client asked me to look into an issue with one of their SSRS reports. The report had a text box with some padding on the left side. All seemed OK: the report showed the indented text values in both the RPL and PDF rendering extensions; however, when the report was exported to Excel, the indentation was lost.

I removed the Left Padding setting and substituted it with hard-coded extra spaces in the text box expression. This tweak didn't solve the problem; now the Report Server didn't show those extra spaces (so the Excel export of the report didn't show them either).

Then I removed the extra spaces from the SSRS text box expression and added them to the dataset that sourced the text box. Now the Report Server showed all the indented text boxes, but the Excel export again didn't show those extra spaces; what?!

Let’s be crazy: I then removed all the extra spaces from the data query and built my text box expression with a conditional check on Globals!RenderFormat.Name="EXCELOPENXML", to add extra spaces only when the report gets exported to Excel; and this… solved the problem: the RPL, PDF and EXCELOPENXML versions of the report all showed the left-padded text box as intended.
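The conditional expression looked along these lines (a sketch; the field name is illustrative):

```
=IIF(Globals!RenderFormat.Name = "EXCELOPENXML",
     "    " & Fields!MyField.Value,
     Fields!MyField.Value)
```

If I recall correctly, the Globals!RenderFormat collection is only available starting with SSRS 2008 R2.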

Then I noticed another text box attribute (LeftIndent) that I hadn't tried to adjust. After adjusting the LeftIndent value, all rendering extensions worked perfectly, so I no longer needed the conditional check for the Excel rendering extension.

It’s funny that I’ve dedicated a whole blog post to the EXCELOPENXML rendering extension, which I didn’t need to check at all; but … now I know how to solve this problem in more than one way :-)

Wednesday, October 15, 2014

SSIS Pivot makes you stronger

At a recent Toronto SQL Server User Group meeting, Michael J. Swart (@MJSwart) gave a good presentation based on 10 of his blog posts. One of them (Forget About PIVOT, You Don’t Need It) got my attention; first, I remembered that I had read it a while ago, and my second thought was that I could add my two cents to it.

Very early in my career I was fascinated by the MS Access TRANSFORM statement; then the 2005 version of SQL Server introduced the PIVOT/UNPIVOT operators, which made me even more excited: now we could forget about custom T-SQL code and write a simple statement to turn a data set any way we want.

I agree with Michael that the available reporting tools are much better for preparing a final data set for business users. However, I've used PIVOT many times and I like using it in a middle data layer, where one data set gets transformed (or, in Excel terms, 'transposed') into another form and then moved further along.

The SSIS Pivot task with its UI gives some additional flexibility for transforming an input data set; however, I still prefer the T-SQL method, and I can explain why.

Here is my sourcing query, which pulls the data I want to transform from the AdventureWorksDW2014 database:

In SSIS I can do the same thing and push the sourcing query through the Pivot task:

However, when I execute it, it fails due to a unique key constraint violation on the Pivot Key of the Pivot task:

So I had to change the sourcing query and add an aggregation function to it:

and then it worked:

So, with a few hiccups, I was able to adopt the SSIS Pivot task, but I still prefer writing the PIVOT T-SQL statement manually: more control, more flexibility :-)

Lessons learned:
1) The sourcing data set for the SSIS Pivot task needs to be pre-aggregated.
2) If you don't like the Pivot output columns (derived from the Pivot Key and Pivot Value), you can always rename them in the Advanced Editor.
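By contrast, the T-SQL PIVOT operator carries its own aggregate, so no separate pre-aggregation step is needed; a hedged sketch against AdventureWorksDW2014 (verify the column names against your schema):

```sql
-- Yearly sales per territory, pivoted by calendar year
SELECT SalesTerritoryKey, [2011], [2012], [2013]
FROM (
    SELECT d.CalendarYear, f.SalesTerritoryKey, f.SalesAmount
    FROM dbo.FactInternetSales AS f
    JOIN dbo.DimDate AS d ON d.DateKey = f.OrderDateKey
) AS src
PIVOT (
    SUM(SalesAmount) FOR CalendarYear IN ([2011], [2012], [2013])
) AS pvt
ORDER BY SalesTerritoryKey;
```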

Working with SSIS Pivot task didn't destroy my attitude but made me stronger :-)

Happy data adventures!

Thursday, August 28, 2014

SSRS report: Multi-value parameter going after Data-driven subscription

It is always good to enrich SSRS reports with multi-value parameters; users get more flexibility to combine different reporting scenarios without running the report separately for each parameter value. On the other hand, developers can hone their skills embedding multi-value parameters into report datasets; referencing rows, labels and values from those parameters in custom controls within reports is another competence that can be quickly acquired.

Alright, the report is developed and deployed. Users requested report subscriptions and now enjoy the luxury of scheduled result delivery in different formats. Report Manager in Native mode and SharePoint in SSRS Integrated mode read report parameter metadata pretty well, and allow setting new values for multi-value parameters or keeping the defaults.

However, when you’re dealing with a data-driven subscription, it’s not that easy to pass data query values to multi-value parameters. You have a choice:
1) Be happy with the report's default values and change nothing in the subscription
2) Still be happy and create a regular subscription, setting parameter values manually
3) Programmatically define parameter values using the CreateDataDrivenSubscription method of the ReportingService web service
4) Or be crazy enough to build a set of XML parameter tags in the Parameters column and update the dbo.Subscriptions table of your Report Server instance.

But still, none of this lets you link a data element from the data-driven subscription query to your multi-value parameter; how would you pass several values in one column of a single record? And I think you’ve already guessed one of the right answers: a delimited string of values!

Here is how I made it work for a project where my client wanted to trigger report subscriptions from their web UI, and each SSRS report execution could have different multi-value parameter values.

1) I've created two multi-value parameters in my SSRS reports:
        a. ReportParameter: made it Hidden; Available Values – None; Default Values – two or three specified values (the exact number is not really important at this point)
        b. ReportParameterActual: kept it Visible; Available Values – a query from a dataset; Default Values – specified with the following expression:
=Split(@ReportParameter, ",")
2) Once the report was deployed, I created a data-driven subscription where my data query looked similar to this: SELECT ReportParameter = 'Value1,Value2,Value3' FROM MyTable
3) And then I mapped ReportParameter from the report with the ReportParameter column from this query.

This was the whole solution. ReportParameterActual was the visible parameter users could interact with, setting different values themselves outside the subscription definition. ReportParameter was the actual interface and gateway for default values (which, with the data-driven subscription, became a dynamic control over the set of values supplied to the multi-value ReportParameterActual in different subscription scenarios). And Split(@ReportParameter, ",") was the whole trick :-)
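On the dataset side, nothing special is needed to consume the multi-value parameter; for a text query, SSRS expands it into a comma-separated list for the IN clause. A sketch with illustrative table and column names:

```sql
SELECT GeographyKey, City, EnglishCountryRegionName
FROM dbo.DimGeography
WHERE EnglishCountryRegionName IN (@ReportParameterActual)
```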

Update (2014-09-03): Richard D'Angelo asked me to share the RDL with the solution. I couldn't upload the actual client report, but I've made a sample one based on the [AdventureWorksDW2014].[dbo].[DimGeography] table. You can get the sample through this link - MultiValueParameterReport.rdl.

Happy data adventures!

Friday, July 25, 2014

Power View report migration from SharePoint 2010 to 2013

With the latest changes to Microsoft Power BI and the ability to create Power View reports within an Excel file (starting with Excel 2013), perhaps there will be no big need to migrate old stand-alone Power View reports based on Power Pivot data models created in Excel 2010. You can just recreate your Power View analytics in Excel 2013 and off you go: users can see and interact with the new reports in either the desktop or Office 365 versions of Excel, with no need to worry about the SharePoint deployment platform.

However, if you really want to bring your old Power View reports from SharePoint 2010 to SharePoint 2013, it is still possible. While working on this task I couldn't find any automated tools that would migrate both the Power Pivot model file and its related Power View report, so all the techniques described in this post are pure file upgrading and manipulation (if you find such a tool, I would really appreciate hearing about it).

Basically, in the old SharePoint 2010 and Excel 2010 world you would create a Power View report from two artifacts: (1) a Power Pivot data model in an Excel 2010 file, and (2) a Power View report based on that model.

The first thing I did was upgrade the Power Pivot data model from Excel 2010 to Excel 2013, by simply opening the old data model file in Excel 2013 and consenting to upgrade the data model (here is the official guidance: Upgrade Power Pivot Data Models to Excel 2013). The upgrade took less than a minute in my case; it may take longer depending on the size of your model. I then deployed the upgraded Power Pivot data model file to SharePoint 2013.

I could have done the same with the Power View .rdlx file by just copying it to the new SharePoint location; however, the report would still be connected to the old Power Pivot model file, and this needed to be fixed.

1) I changed the file extension of my Power View .rdlx report to .zip and opened it as an archive:

2) Then I went to the [Reports] folder, where my Power View report definition resided in the [report.rdl] file:

3) Then I edited the data source connection string and pointed it to the upgraded Power Pivot data model file that had already been saved to the new SharePoint 2013 instance:

4) Then I saved the [report.rdl] file back into the open .zip archive, changed the .zip extension back to .rdlx and saved the report in my new SharePoint 2013 Power View gallery.
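Steps 1 and 4 are plain file manipulation and can also be scripted; a PowerShell sketch with placeholder paths:

```powershell
Rename-Item "C:\Temp\MyReport.rdlx" "MyReport.zip"    # step 1: expose the report as an archive
# ... edit Reports\report.rdl inside the archive (step 3) ...
Rename-Item "C:\Temp\MyReport.zip" "MyReport.rdlx"    # step 4: back to a Power View report
```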

There were a few gotchas I needed to adjust in the upgraded Power Pivot data model after this. During the model upgrade, all my custom column renamings and sort orders were lost; this broke some of the relationships within the data model, and as a result my adjusted Power View report didn't show all the charts as they were in the old SharePoint 2010 environment. But with a few adjustments to the data model, the new Power View report looked exactly like the original, with the only difference that it now lived in the SharePoint 2013 environment.

It worked, and it worked well for me :-) 

Happy data adventures!

Tuesday, July 22, 2014

SSRS report subscription: Fire Event from SSIS package

My previous two posts about SSRS reports deployed to a SharePoint site along with their subscriptions didn't stop me from exploring SSRS report subscriptions further, regardless of the deployment environment, native or SharePoint mode.

After exploring different aspects of the subject (Subscriptions and Delivery), you'll become familiar with setting up delivery types, extensions and other metadata for your subscriptions (a data-driven subscription brings even more flexibility :-). The only thing that puzzled me was how to test each of the subscriptions: schedule, schedule and nothing but the schedule; there isn't even a UI option to initiate or trigger a subscription.

However, Reporting Services has a way to create an event that will trigger an SSRS report subscription no matter what its schedule definition says. You can simply call the report server stored procedure dbo.AddEvent with two parameters: @EventType='TimedSubscription' and @EventData=@Subscription_ID (the primary key from the dbo.Subscriptions table). Whatever action is defined in your subscription will then be performed!
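A sketch of that call, run against the ReportServer catalog database (the description filter is illustrative):

```sql
DECLARE @Subscription_ID UNIQUEIDENTIFIER;

-- Pick the subscription to fire
SELECT TOP (1) @Subscription_ID = SubscriptionID
FROM dbo.Subscriptions
WHERE [Description] LIKE '%My report subscription%';

EXEC dbo.AddEvent
     @EventType = 'TimedSubscription',
     @EventData = @Subscription_ID;
```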

There is another way to create the 'TimedSubscription' event on a report server: the FireEvent method of the ReportingServices web service, which you can call from anywhere. So let's be crazy enough to test this method from an SSIS Script Component that triggers SSRS report subscriptions on a SharePoint-mode report server. Here is how it worked for me:

1) I created a proxy class from the ReportingServices web service (the file location and SharePoint site will be different on your side):
C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\Bin\wsdl.exe /out:ReportService2010.cs

2) Then I compiled ReportService2010.cs into DLL:
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\csc.exe /target:library ReportService2010.cs

3) Signed this DLL file (adding a strong name to the library)

Details for this step are taken from here:

   a) Generate a KeyFile  
       sn -k keyPair.snk

   b) Get the MSIL for the assembly   
       ildasm ReportService2010.dll /out:ReportService2010.il

   c) Build a new assembly from the MSIL output and your KeyFile 
       C:\Windows\Microsoft.NET\Framework\v2.0.50727\ilasm.exe ReportService2010.il /dll /key=keyPair.snk

   d) Install the DLL in to the GAC 
       gacutil -i ReportService2010.dll

4) Created a Script Component in my SSIS Data Flow task

5) Diverted the source column with the Subscription ID data to this component

6) Added a reference to the ReportingService DLL in the script component

7) Placed code to execute the FireEvent method of the ReportingService web service
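The code in the Script Component boiled down to something like this (a sketch; the URL is a placeholder, and the exact FireEvent signature should be checked against your generated proxy — as far as I recall, the ReportService2010 endpoint also takes a site URL argument in SharePoint mode):

```csharp
// Proxy class generated from ReportService2010.asmx earlier
var rs = new ReportingService2010();
rs.Url = "http://mysharepoint/_vti_bin/ReportServer/ReportService2010.asmx";
rs.Credentials = System.Net.CredentialCache.DefaultCredentials;

// Row.SubscriptionID is the diverted input column with the subscription GUID
rs.FireEvent("TimedSubscription", Row.SubscriptionID, "http://mysharepoint");
```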

and executed my SSIS package! :-)

In a few moments I could see the results: emails were sent out and report files had been created, and I didn't even have to wait for the schedules.

I hope those steps were detailed enough; otherwise, let me know if you need more details.

Happy data adventures!

Tuesday, July 8, 2014

SSRS reports in SharePoint mode, Saga #2: PowerShell generates email subscriptions

Life is full of opportunities; it only takes noticing and using them. About three months ago I wrote a post about deploying reports with a PowerShell script (PowerShell, help me to publish my SSRS reports to SharePoint!), and at my new project I proposed applying the very same technique for deploying reports to different environments. There was only one new thing: each SSRS report at this new project had several data-driven subscriptions (emails, file shares), and recreating them manually would take a big chunk of your lifetime.

So that was my new opportunity, and I didn't want to miss it; I decided to enhance my original PowerShell script even more. Why not? Let's use PowerShell's potential and bring SSRS and SharePoint even closer.

That said, here is the complete PowerShell script (PS_Subscriptions), which:
- deploys SSRS data sources, datasets and reports;
- maps all data sources and datasets for reports where necessary;
- creates report data-driven subscriptions.

Most of my knowledge on automating SSRS subscriptions came from the CreateDataDrivenSubscription method of the ReportingService web service; also, be sure to check the Microsoft knowledge base article that shows some of the corrections applied to the original examples of the CreateDataDrivenSubscription method.

Let's start with the main changes to the original script. I created two new functions that remove old subscriptions and create new ones (the CreateDataDrivenSubscription method doesn't overwrite an existing subscription with the same name; it just creates a same-named subscription with a different subscription ID):

Also, don't forget to specify the necessary delivery extension for your SSRS reports, because there are three of them available (Report Server Email, Report Server DocumentLibrary, Report Server FileShare):

The main steps to create a new SSRS report subscription are:
1) Set delivery extension setting
2) Create the data source for the delivery query with data query and fields
3) Define subscription schedule
4) Set parameter values
5) Execute CreateDataDrivenSubscription method
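Those five steps map onto the web-service call roughly like this; a sketch with illustrative names (the proxy type namespace depends on how New-WebServiceProxy generated it):

```powershell
# $rs is a New-WebServiceProxy proxy for ReportService2010.asmx
$ns = $rs.GetType().Namespace

# 1) Delivery extension settings (email; TO is filled from the delivery query)
$to = New-Object "$ns.ParameterFieldReference"
$to.ParameterName = "TO"; $to.FieldAlias = "Email"
$settings = New-Object "$ns.ExtensionSettings"
$settings.Extension = "Report Server Email"
$settings.ParameterValues = @($to)

# 2) Delivery query with its fields
$dataSet = New-Object "$ns.DataSetDefinition"
$dataSet.Query = New-Object "$ns.QueryDefinition"
$dataSet.Query.CommandType = "Text"
$dataSet.Query.CommandText = "SELECT Email, Region FROM dbo.ReportRecipients"
$field = New-Object "$ns.Field"
$field.Name = "Email"; $field.Alias = "Email"
$dataSet.Fields = @($field)
$plan = New-Object "$ns.DataRetrievalPlan"
$plan.DataSet = $dataSet
# $plan.Item must also be set to the subscription's data source
# (a DataSourceDefinition or DataSourceReference)

# 3) Schedule; the MatchData XML is easiest to copy from an existing subscription
$matchData = "<ScheduleDefinition>...</ScheduleDefinition>"

# 4) Report parameter values taken from query fields
$param = New-Object "$ns.ParameterFieldReference"
$param.ParameterName = "Region"; $param.FieldAlias = "Region"

# 5) Create the subscription
$subscriptionID = $rs.CreateDataDrivenSubscription($reportPath, $settings, $plan,
    "Nightly email subscription", "TimedSubscription", $matchData, @($param))
```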

It's been a great experience applying PowerShell to creating SSRS report subscriptions. I spent a few days writing this code; I'm sure more experienced PowerShell developers could accomplish the task in a few hours. Nevertheless, I learned a few lessons and would like to share them with you.

Lessons learned:

a) If you create a new data source, use DataSourceDefinition; if it's a shared data source, use DataSourceReference.
b) Don't use parameter labels (as they're shown in SharePoint); use the real parameter names:

c) The MatchData schema is poorly documented; look at your ReportServer backend tables for XML tag examples in order to understand schedule creation.

d) Delivery extension parameter values are case sensitive, and they differ between Email and SharePoint file delivery: for example, "IncludeLink" for Email and "RENDER_FORMAT" for a SharePoint file.

e) If you get a weird error about a method not being supported, check all the variables and parameter formats and values, and test your script again.

I've been using the PowerGUI tool to develop and test all of my PowerShell scripts; perhaps you can suggest other PowerShell GUI environments. In the meantime, I continue wishing you happy data adventures!

Wednesday, April 30, 2014

And I wonder, where that column is?

I wouldn't have decided to write this post if I hadn't seen my boss at one of my previous jobs being really persistent and careful in how he wrote his T-SQL code. It started when I heard him pressing the space bar too much; I came to his desk and saw him lining up all of his SELECT statement columns with equal numbers of space characters before the comma delimiter and the actual column names, like this:

Select Column1
     , Column2
     , Column3
     , ColumnN
From TableA

I questioned him about this tedious work, and he simply explained that some text editors display the default tab character differently (sometimes it’s expanded to several space characters and sometimes it’s shown as a single one). In that case our SQL code might look like this:

Select Column1 
, Column2 
     , Column3 
 , ColumnN 
From TableA

It’s still working code, but it takes more time to read and comprehend its scope: different elements joined by various database objects, calculated or concatenated column aliases, etc. That example of orderly space characters from my boss made a lasting impression on me about the need for well-written code that can be easily read by others.

There are many ways to make your T-SQL code more readable, and several free add-ons and plugins exist for SSMS. However, I would like to emphasize the big value of where you manually place your column aliases within your SQL code.

Let's take the [dbo].[vTimeSeries] view from the [AdventureWorksDW2012] SQL Server sample database as an example, more specifically the first section of the SELECT statement with all the columns listed:

and now compare the same code with this:

where I can easily identify the column definitions for TimeIndex, ReportingDate and the other columns. It's basically all about how we like writing column aliases: one way is expression [AS] column_alias, and the other is column_alias = expression.
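A side-by-side sketch of the two styles (AdventureWorksDW tables, with column names as I recall them):

```sql
-- Style 1: expression [AS] column_alias; each alias lands at a different offset
SELECT SUM(f.SalesAmount) AS TotalSales,
       CONVERT(DATE, d.FullDateAlternateKey) AS ReportingDate
FROM dbo.FactInternetSales AS f
JOIN dbo.DimDate AS d ON d.DateKey = f.OrderDateKey
GROUP BY d.FullDateAlternateKey;

-- Style 2: column_alias = expression; every alias sits at the start of its line
SELECT TotalSales    = SUM(f.SalesAmount),
       ReportingDate = CONVERT(DATE, d.FullDateAlternateKey)
FROM dbo.FactInternetSales AS f
JOIN dbo.DimDate AS d ON d.DateKey = f.OrderDateKey
GROUP BY d.FullDateAlternateKey;
```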

I personally like the second approach, even if someone may say it mixes result-set column naming with variable assignment. I just like it because it saves me time locating element names within the code, so I can easily answer the original question of where that column is :-)

Enjoy your day and happy data adventures!

Monday, March 10, 2014

PowerShell, help me to publish my SSRS reports to SharePoint!

Every time I heard the word 'PowerShell' it was like a call to Terra Incognita: you know nothing about it, and the longer you stay away, the more interesting it becomes. You think of people who could write PS scripts to install and set up complicated systems in a single run, and maybe of others who dreamed of starting their car engines with PowerShell (as I heard at one of our local Toronto SQL Server user group meetings).

There are many PowerShell training materials available on the internet, and it only takes time and a little effort to start learning and practicing this technology in every walk of your life (I read that and it made me laugh at how PowerShell has become a panacea for everything :-). Anyway, I kept telling myself for a very long time that someday, somehow, I would get into this and gain at least enough knowledge to start applying it to some of my database-related activities.

So the day came when, at my company, we were deploying a big number of SSRS reports to a SharePoint application site. I know you can deploy reports to SharePoint from Visual Studio, and manually publish RDL files too; the latter skill I mastered with no basic training at all :-) Our SharePoint developers handed me a PowerShell script that did exactly what I was looking for: publishing all the data sources, datasets and reports. However, the mappings between them were all left unset, and you had to manually specify all the datasets and data sources for the already-published reports; and that takes a lot of time!!!

With the help of a few articles on the internet:
PowerShell:Deploying SSRS Reports in Integrated Mode
SSRS SharePoint Foundation 2010 Integration - Deployment Query

and the MSDN resource that lets you explore all the details of the ReportService2010 web service, I was able to understand how PowerShell could help me.

The main issue I had was that the ReportService2006 web service didn't have a method to map datasets to reports, but the next version of the web service does.
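The relevant addition in ReportService2010 is the SetItemReferences method, which lets a script point a report's named dataset references at shared datasets; a sketch (paths and names are illustrative):

```powershell
# $rs is a New-WebServiceProxy proxy for ReportService2010.asmx
$ns = $rs.GetType().Namespace          # namespace of the generated proxy types
$ref = New-Object "$ns.ItemReference"
$ref.Name = "CustomerDataSet"          # dataset name as declared inside the report
$ref.Reference = "http://mysharepoint/BI/Datasets/CustomerDataSet.rsd"
$rs.SetItemReferences("http://mysharepoint/BI/Reports/MyReport.rdl", @($ref))
```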

After my long trials and tribulations to build a single script that would perform all the steps of SSRS report deployment in a single run were over, the main section of my script looked like this:

and complete script can be downloaded from here: DeploySSRS_DataSource_DataSet_Report

You can adjust this script any way you need: set variable values your own way, change the routine logic, adjust output messages, etc. And again, I would like to repeat that I didn't create most parts of this script myself; however, after reading numerous articles and forum threads about PowerShell, I got it running against the 2010 version of the ReportServer web service and successfully tested it in several SharePoint environments.

I hope you'll find this script useful and provide more improvements and modifications.

Enjoy your day and happy data adventures!

PS. Link to my post with the next developments of this script - SSRS reports in SharePoint mode, Saga #2: PowerShell generates email subscriptions.