Walkthrough: how to connect to Dynamics CRM Online with Power BI

In this walkthrough I will step through the process of connecting to a Dynamics CRM Online instance with Power BI (specifically Power Query).

For this you will need the latest version of Power Query installed. After you have launched Excel, navigate to the Power Query tab and choose From Other Sources → Dynamics CRM Online. You will need to enter the service URL in this window:

The OData service URL takes the following format: https://<your tenant>.crm.dynamics.com/XRMServices/2011/OrganizationData.svc.

Once you have filled in your tenant name, click OK. In the next screen you will be asked for your credentials. Since I use a demo environment I will need to use an organizational account. If your organization uses Dynamics CRM Online in production, chances are you will be authenticated automatically or can use your Windows account.
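Behind the scenes this connection ends up as a small Power Query (“M”) query. Here is a minimal sketch of what it looks like, assuming a hypothetical tenant called contoso (the query the connector actually generates may differ slightly):

    let
        // Connect to the CRM Online OData endpoint; "contoso" is a placeholder tenant name
        Source = OData.Feed("https://contoso.crm.dynamics.com/XRMServices/2011/OrganizationData.svc")
    in
        Source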

The next step is to specify which tables I would like to load:

I chose OpportunitySet, since I wanted to get a list of the opportunities in the system. The opportunities have a modified date which I would like to show as an ‘age’ in days, meaning that I would like to show the number of days that have passed since the opportunity was last modified. I can easily do that using the Power Query editor (select the table and click Edit): select the ModifiedDate column and use Transform → Date → Age to calculate a rather exact age:

After the transformation the column looks like this:

This is awfully exact; I only wanted the age in whole days. To change this, choose Duration → Days:

And now the column reports 60 days.
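For reference, after these two steps the complete query looks roughly like this in M (a sketch of what the editor generates; the step names and exact formulas may differ slightly, and contoso is again a placeholder tenant):

    let
        Source = OData.Feed("https://contoso.crm.dynamics.com/XRMServices/2011/OrganizationData.svc"),
        // Navigate to the OpportunitySet entity set
        Opportunities = Source{[Name = "OpportunitySet", Signature = "table"]}[Data],
        // Transform > Date > Age: replace each date with the duration since that date
        Age = Table.TransformColumns(Opportunities, {{"ModifiedDate", each DateTime.LocalNow() - _, type duration}}),
        // Duration > Days: keep only the whole number of days
        AgeInDays = Table.TransformColumns(Age, {{"ModifiedDate", Duration.Days, Int64.Type}})
    in
        AgeInDays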

 

As you can see, it is very easy to retrieve data from Dynamics CRM Online; and because retrieving the data was so easy, we even had time for a typical ‘age’ or ‘number of days passed since’ type of calculation!

Link to a one-slide PowerPoint showing the latest services in Azure


Use the link below to download a very useful slide that shows you all IaaS and PaaS services within Azure:

Overall services in Azure

Have fun,

Harry

Platform as a Service (PaaS) moving at rocket speed!

As an Enterprise Architect in an organization, life has always been dynamic, to say the least! It is your responsibility to keep up with the latest developments in ICT, both in technology and in architecture. In the old days of on-premises only, that was already a big challenge. But with the Cloud as an integral part of your information systems, it has become even more complex.

But still… the Cloud used to mean moving VMs to Amazon or Microsoft. Architecturally not that complex: identity & access of course, but that’s about it. Then came Platform as a Service (PaaS). That was something completely different! Not moving VMs to the Cloud, but moving complete technical workloads to the Cloud: an ESB in the Cloud, Media Services, federated identity, storage, etc.

This does impact your architecture!

A blazing 78 new PaaS services were introduced within Azure in 2014. So it’s moving at rocket speed! And to be fair: not only at Microsoft; other Cloud vendors are also moving into the PaaS area with new services.

What is the impact for you as an Enterprise Architect?

In your normal day-to-day work you make choices based on software you can purchase and implement in your data center. But now, for every choice you have to make, you should at least ask yourself: do I want to do this myself, or shall I take it as a service from one of the Cloud vendors?

An example: your organization wants to use Cloud services from multiple Cloud vendors, but you want a single sign-on experience for your users. You can buy a federated identity server, research how to connect to each Cloud vendor and then build the connections. But you can also use the Windows Azure Active Directory Federation Service (ADFS) from Microsoft, with over 2,600 Cloud vendors already pre-integrated.

A second example: you have a new web application that you need to deploy. Again, you can buy a few servers, install IIS, SQL Server and the application, and schedule things like backups, patch management, storage, etc. But you can also take a web role to host the web application and an Azure SQL Database to host your data, and let Microsoft worry about backups, the three replicas for DR, patching the servers, etc.

So my message to all you Enterprise Architects out there: examine the PaaS offerings from the Cloud vendors carefully before making expensive buy decisions. My recommendations to check out:

Azure Service Bus, Azure Machine Learning, WAAS, BizTalk Services and Azure SQL Database. In my next blog post I will dig deeper into Azure SQL Database.

Power BI Public Preview now available worldwide

Yesterday, Microsoft announced that the Power BI Public Preview is now available worldwide. Until now only US-based users could access the preview. Not anymore, so you can check out all the great new stuff right now on http://www.powerbi.com.

See http://blogs.msdn.com/b/powerbi/archive/2015/03/16/power-bi-preview-now-available-worldwide.aspx

Happy Power BI-ing!

New Power Query update

Recently a Power Query update was released (see http://blogs.office.com/2015/03/05/3-updates-excel-power-query/). Major updates: performance on load, a Dynamics CRM Online connector and new transformations, most notably advanced date/time calculations. Personally I enjoy the CRM Online connector, but I am most fond of the ‘Age’ transformation; it makes it very easy to do the typical ‘number of days since this order was entered’ type of calculations, since it compares the date in the column with today.
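As a quick illustration, you can also build such a column by hand with a small M query; here is a self-contained sketch (the OrderDate column and its value are made up, and the built-in ‘Age’ transformation generates something equivalent for an existing column):

    let
        // Tiny demo table; "OrderDate" is a made-up column
        Orders = #table({"OrderDate"}, {{#datetime(2015, 1, 15, 0, 0, 0)}}),
        // Whole days elapsed since the order was entered, compared with today
        DaysSince = Table.AddColumn(Orders, "DaysSinceEntered", each Duration.Days(DateTime.LocalNow() - [OrderDate]), Int64.Type)
    in
        DaysSince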

The update to Power Query is available here: http://www.microsoft.com/en-us/download/details.aspx?id=39379&WT.mc_id=Blog_PBI_Announce_DI

Enjoy!

 

In-memory technology in SQL Server

Everybody has noticed the rise of technology using in-memory techniques. SAP is going all in on HANA, and Oracle only started with its in-memory database last year. At Microsoft we started with in-memory analytics in 2012 and added in-memory OLTP in April 2014. The buzz is high, with big marketing events, lots of whitepapers and broad press coverage.

But what about real-life practice?

When I visit my customers (top-50 in the Netherlands) I rarely see in-memory databases being used. So I always ask why they don’t make use of them. This resulted in the following reasons:

  1. I didn’t know I could run my databases in memory.
    The marketing engine could be hitting the wrong people. Lots of database administrators are not up to speed.
  2. We don’t do it because it must be very difficult.
    True – if you use SAP, it’s common knowledge that implementing SAP HANA is not very easy, and you have to rewrite some of your programs. False – if you use Microsoft SQL Server. To start using in-memory, you can switch it on for certain tables (or parts of tables), and it will work without any change to the application.
  3. The power of our servers is high enough. We don’t need the power.
    This is of course a compliment that our SQL Servers run so smoothly (-:

But I still think that by using in-memory technology you can achieve the following:

  • Prevent a hardware refresh. If servers run out of performance, moving to in-memory increases speed again by 5x–10x, so the servers can remain the same.
  • Run more VMs on a host. With in-memory technology fewer cores are needed, because processing in memory is more economical, leaving more cores for new VMs on the same host.
  • Increase processing speed to reduce wait time for your users.

So my advice: Start experimenting with the technology and look for those business cases.

My ask: if you have experience with a practical implementation, please reply and share your real-life experience!

Maiden blog post

Hi all,

Today I joined my colleague Jeroen on this blog site about data.
Where Jeroen is the expert in BI, Big Data & data warehousing, I tend to focus on applications, databases, development & integration – of course with a main focus on the Cloud with Azure.

I will try to keep you posted on all new developments, adding my personal view based on my 25+ years of experience and my day-to-day work at my customers.
So stay tuned for my new posts!

Greets, Harry

Power BI trial for non-US customers

We are currently running a preview of the new Power BI experience in the US. This experience is not yet available to customers outside of the US. If you are outside of the US and want to get a trial of Power BI right now, you can set up a trial account for the “old” experience here. The new experience will also become available outside of the US later.

R Plotting using Azure Machine Learning

Azure Machine Learning is Microsoft’s cloud data mining and machine learning solution. It features a studio that is fully web-based. One of its best features is the integration with R through the ‘Execute R Script’ component. One of the best things about R is its plotting capability, and I recently decided to try to make R plots from the Azure ML Studio. It is amazing how easily this works, and it really brings the power of Azure ML together with the great exploration, plotting and data manipulation capabilities of R.

Here is a very simple sample I made:

I used the Flight Delays sample dataset from Azure ML to make this. In the ML Studio you will need to create a new experiment and drag the ‘Flight Delays Data’ component onto the canvas. The only other component you will need to drop on the canvas is ‘Execute R Script’ (I told you this was a very simple example). Drag a line from the data to the left-most input port of the R script container, like so:

Click on the R script component and edit the R script on the right. Here is my script:

 

This script gets the data from the input port and rbinds it into data.set. Then I execute a very simple plot using the base R plot function to create the plot shown above. The last line of the code is not even necessary, but it was there by default.

After running the experiment the plot can be seen by selecting the right output port of the ‘Execute R Script’ container and selecting ‘Visualize’:

The plot will be at the bottom of this page.

Pretty cool huh? Stay tuned for more as I will continue experimenting with R integration in Azure ML as well as other ML things.

Master Data Services 2014 Add-in for Excel published

Just over a week ago a new version of the Master Data Services Add-in for Excel was released. It is the 2014 release, which also works with SQL Server 2012. The main benefit is up to a 4x performance improvement without any server configuration changes. If you need extra performance, you can gain another 2x by enabling Dynamic Content Compression in IIS on your MDS server.

Here is the download page: http://www.microsoft.com/en-us/download/details.aspx?id=42298.