Power BI for partners

Frequently, partners ask me about getting access to Power BI for longer than the trial period. Until recently, the only answer I could give was that nothing was available. However, there is news! Partners can now get access to Power BI through the Partner benefits portal. This blog post explains it all: http://blogs.technet.com/b/uspartner_ts2team/archive/2014/07/10/power-bi-for-office-365-now-available-to-competency-partners.aspx

Hope this helps!

Power BI pro tip: using Access Online for data entry

With powerful self-service BI tools such as Power BI comes the need for business user data entry: the data does not exist in source systems, needs to be enhanced or enriched before going into the report, or the business user just wants to change the way the data is organized. In those cases (which occur more often than not) we need to give the business user an easy-to-use way to do data entry while keeping it robust: in other words, not a tool in which the user could easily make mistakes and hurt the reporting process. You could use Excel, but you would have to lock it down so no mistakes can be made. SharePoint lists are also a good option if you have fewer than 5,000 rows (the list view threshold in SharePoint Online). If you need to store a lot of data and need a robust solution, Access Services (Access Online) is a great tool for the job, and the best part is that it works perfectly with Power BI.

Perhaps the biggest change in Access 2013 is that it now stores its data in SQL Server databases rather than Access files. In this post I will show you how to build a sample application that reports on KPIs for production plants around the world. The data is entered by the business user using a web form generated by Access, and the dashboard is created using Power BI. So here we go.

The first step is to get the data. For that I created a simple Access 2013 application that I published on my SharePoint Online site. The Access application consists of three dimension tables (KPIs, Periods and Plants) and, of course, the actual facts: the KPI Values. On top of this sits a very basic data entry screen that enables the user to enter new actuals and targets for a KPI for a given plant and period:

I entered some test data and saved the app. Imagine your business user just entering their data in here.

The next step is to get the data out of the SQL database that Access Services stores it in and build a report / dashboard on top of it. For this, go to the Info pane of the File menu in Access and look for the ‘Manage’ button next to Connections:

If you click it you get a big flyout presenting you with a lot of options. You will need to select the following:

-From My location or From Any location. I chose from Any.

-Enable Read Only connections.

See this screenshot:

Now, click on ‘View Read-Only Connection Information’ and leave it open for now; you will need it later.

The next step is to start Excel, go to the Power Query tab and select From Database > From SQL Server Database (and not From Access, since Access 2013 stores its data in SQL Server by default).

Copy and paste the server and database name from the connection information screen in Access and click OK. In the next screen, enter your username and password (again, copy and paste them from the connection information screen in Access). After a while you can select the tables you are interested in and load the data into PowerPivot. I loaded Plants, Periods and Values (I skipped KPIs since it only contains the KPI label):
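
For reference, the query that Power Query generates behind the scenes for such a connection looks roughly like the minimal sketch below. The server name, database name, schema and table name here are placeholders and assumptions; copy the real values from the read-only connection information screen in Access and pick the actual table in the Navigator.

let
    // Placeholders: replace with the server and database shown in the Access
    // read-only connection information screen. Credentials are entered in
    // Power Query's credential prompt, not in the query itself.
    Source = Sql.Database("yourserver.database.windows.net", "db_yourAccessApp"),
    // The schema and table name below are assumptions for this example.
    Plants = Source{[Schema = "Access", Item = "Plants"]}[Data]
in
    Plants

From there the table loads into PowerPivot just like any other SQL Server source.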

The next step is to create the relationships between the tables in PowerPivot, hide some columns and add a KPI definition. I ended up with this model:

Now, with Power View I created the following basic report (I did not take the time to work on the layout; this is just quick and dirty):

 

This concludes this Power BI Pro Tip!

Hybrid BI environments with Power BI whitepaper

A new whitepaper highlighting best practices for building hybrid business intelligence environments with Power BI has been released. I highly recommend reading it: http://blogs.msdn.com/b/powerbi/archive/2014/08/26/best-practices-for-building-hybrid-business-intelligence-environments-with-power-bi.aspx

 

Combining text (csv) files using Power Query – follow-up

I have been getting quite a few responses to my original post on how to combine text (CSV) files using Power Query. One of the frequently asked questions is how to keep the name of the file the data came from in your result set. Said differently: you want the contents of all files in a folder, plus the filename of the originating file, all in one table. I thought it was simply a matter of adding a column, but Nicolas pointed out in a comment that adding a column would create a cross product of all data and all filenames. So I needed to come up with another solution. The solution I present here might not be the best one, but it works.

The magic trick is knowing that the Csv.Document() function exists. Allow me to explain. First of all, I followed the “normal” approach of listing the files in the folder using Power Query, which gives you a table with one row per file. Now, it might be tempting to expand the ‘Content’ column (as I did in the original post). However, since the goal in this scenario is to get the contents as well as keep the originating filename, we need a different approach. What we need to do is add a custom column that equals the following:

Csv.Document([Content])

What this does is open each file’s contents, expecting CSV format; the rows come through in a single column. The next step is to expand that custom column so the contents of every file are displayed next to the file’s name. Then I split on the CSV separator (a semicolon in this case), which left me with three columns. The only things left to do are cleaning up (removing the columns we no longer need) and filtering rows (since my CSV files have headers, the header row from the second CSV is still in my data). The end result is a single table with the data and the originating file names. For your reference, here is my code:

let
    // List all files in the folder; [Path to the files] is a placeholder.
    Source = Folder.Files("[Path to the files]"),
    // Parse the binary content of each file as CSV into a nested table.
    #"Inserted Custom" = Table.AddColumn(Source, "Custom", each Csv.Document([Content])),
    // Expand the nested tables; the rows come through in a single text column.
    #"Expand Custom" = Table.ExpandTableColumn(#"Inserted Custom", "Custom", {"Column1"}, {"Custom.Column1"}),
    // Split on the CSV separator (a semicolon in this case).
    #"Split Column by Delimiter" = Table.SplitColumn(#"Expand Custom", "Custom.Column1", Splitter.SplitTextByDelimiter(";"), {"Custom.Column1.1", "Custom.Column1.2", "Custom.Column1.3"}),
    #"Changed Type" = Table.TransformColumnTypes(#"Split Column by Delimiter", {{"Custom.Column1.1", type text}, {"Custom.Column1.2", type text}, {"Custom.Column1.3", type text}}),
    // Clean up: keep only the file name and the parsed columns.
    #"Removed Columns" = Table.RemoveColumns(#"Changed Type", {"Content", "Extension", "Date accessed", "Date modified", "Date created", "Attributes", "Folder Path"}),
    // Promote the first row to headers; the file name column now carries the
    // first file's name as its header, so rename it to Source.
    #"First Row as Header" = Table.PromoteHeaders(#"Removed Columns"),
    #"Renamed Columns" = Table.RenameColumns(#"First Row as Header", {{"Sales Data 1.csv", "Source"}}),
    // Filter out the header rows that came from the other files.
    #"Filtered Rows" = Table.SelectRows(#"Renamed Columns", each ([Datum] <> "Datum"))
in
    #"Filtered Rows"
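
As a side note: Csv.Document() also accepts an options record, so you can specify the delimiter and the number of columns while parsing and skip the separate split step. Below is a minimal sketch of that variation; the delimiter, column count and column names are assumptions based on the example above, and the header rows from the individual files would still need to be filtered out as in the query above.

let
    Source = Folder.Files("[Path to the files]"),
    // Parse each file with an explicit delimiter and column count, so no split step is needed.
    Parsed = Table.AddColumn(Source, "Data", each Csv.Document([Content], [Delimiter = ";", Columns = 3])),
    Expanded = Table.ExpandTableColumn(Parsed, "Data", {"Column1", "Column2", "Column3"}),
    // Keep only the originating file name and the parsed columns.
    Result = Table.SelectColumns(Expanded, {"Name", "Column1", "Column2", "Column3"})
in
    Result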
  

With this I hope Sunflowers and Nicolas are happy. :)

Macro malware on the rise (again)

I have written about macros and why I consider them a security risk multiple times on this blog (see http://www.dutchdatadude.com/macros-are-dead/ and http://www.dutchdatadude.com/keep-macros-under-control/). Macros are also a frequent topic in my talks.

Now, Virus Bulletin (https://www.virusbtn.com/virusbulletin/archive/2014/07/vb201407-VBA) has observed a rise in macro-related malware, specifically Trojan horses. I encourage you to read the report and see for yourself why macros need to be treated carefully. Maybe, sometime in the future, macros will truly be extinct. I certainly hope so. Before that happens, however, we will need to find a way to bring comparable functionality to end users without the security problems.