Automatically building a Microsoft BI machine using PowerShell – preparation: install files using disk (post #2)

This post is #2 in the series to automatically build a Microsoft BI machine using PowerShell – see the start of series.

The first step in our preparation is making the install files available. I see two options for this: using a VHD/disk, or using the Azure File Service. In this post we will walk through how to make the install files available using a VHD/disk in Azure.

The way this works is that you create a new disk on which you store the installer files. After you have created the virtual machine that you would like to configure automatically, you attach that disk so the installer files are available to the machine, and thus to the automatic installer script.

To create a new disk, log in to the Azure portal (either the production or the preview portal) and navigate to Virtual Machines. Select a machine you have created; this could be the one you will be setting up or any other VM, since the disk you will create is re-usable across machines. Once you have a machine selected, click ‘Attach’ and select ‘Attach empty disk’ to create a new empty disk:

Enter a file name for the new disk and set the size in GB you expect to need. There is no need to change the host cache settings here. When done, click the button at the bottom right:
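If you prefer to script this step too, the classic Azure PowerShell cmdlets can create and attach the empty disk for you. A minimal sketch, assuming a hypothetical cloud service ‘MyService’ and VM ‘MyVM’:

    # Attach a new, empty 60 GB data disk to an existing (classic) Azure VM
    Add-AzureAccount   # sign in first
    Get-AzureVM -ServiceName "MyService" -Name "MyVM" |
        Add-AzureDataDisk -CreateNew -DiskSizeInGB 60 -DiskLabel "InstallFiles" -LUN 0 |
        Update-AzureVM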

After the disk has been created, log in to the machine you attached the disk to using Remote Desktop. You can now download the install files and save them to the disk you have just attached. In the VM, start ‘Disk Management’ (for example by right-clicking the Windows button and selecting it from the list). You will see a notification to initialize the disk. Accept the defaults and click OK:

Once the initialization is done we need to create a partition on the disk. In Disk Management, right-click on the new disk and choose ‘New Simple Volume’. Follow the steps of the wizard, taking note of the drive letter assigned. Also make sure to set a volume label and wait for the format to finish.
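If you would rather script the initialization and formatting as well, the Storage cmdlets that ship with Windows Server 2012 R2 can do the same as the Disk Management steps above. A minimal sketch (the volume label ‘Install’ is just an example):

    # Initialize the new raw disk, create one partition and format it as NTFS
    Get-Disk | Where-Object PartitionStyle -Eq 'RAW' |
        Initialize-Disk -PartitionStyle MBR -PassThru |
        New-Partition -AssignDriveLetter -UseMaximumSize |
        Format-Volume -FileSystem NTFS -NewFileSystemLabel "Install" -Confirm:$false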

Once the formatting is done, you can download your install files and store them on the disk. I created a folder in the root of the disk named ‘Resources’, created a sub-folder per software item required, and saved the install files in these folders. The scripts we will create will point to these install files. I also store the scripts on the same install disk.
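The folder structure can be scripted too. A small sketch, assuming the disk received drive letter F: and using example folder names:

    # Create a Resources folder with a sub-folder per software item (example names)
    $root = "F:\Resources"
    "SQLServer2014", "SharePoint2013", "Scripts" |
        ForEach-Object { New-Item -Path (Join-Path $root $_) -ItemType Directory -Force }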

When you are done downloading the files (and, later, creating the scripts), you can detach the disk from the VM: in the portal, select the VM that currently has the install disk attached, choose ‘Detach disk’ and pick the disk to detach. You can then re-attach the disk to another VM.
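The detach and re-attach can also be done from PowerShell with the classic cmdlets. A sketch, again with hypothetical service, VM and disk names, and assuming the disk sits at LUN 0:

    # Detach the install disk from the current VM (the underlying VHD is kept)
    Get-AzureVM -ServiceName "MyService" -Name "SourceVM" |
        Remove-AzureDataDisk -LUN 0 |
        Update-AzureVM

    # Re-attach the existing disk to another VM, referring to it by its disk name
    Get-AzureVM -ServiceName "MyService" -Name "TargetVM" |
        Add-AzureDataDisk -Import -DiskName "InstallFiles" -LUN 0 |
        Update-AzureVM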

This is an OK way to work with the install files. In the next post we will explore an alternative way using Azure File Service.


Automatically building a Microsoft BI machine using PowerShell – Start of Series

I used to spend quite some time building and re-building Microsoft BI demo machines. As you can imagine, this manual process takes a lot of time and effort. Therefore (and also for my own education on PowerShell) I decided to look into automating the whole process. I will explain this in this series of posts.

The goal

In the end, we want to have a virtual machine that is configured as follows: Windows Server 2012 R2 with the Active Directory Domain Controller role. Additionally, SQL Server 2014 is installed and configured, as well as SharePoint 2013. Finally, the BI tools like Power Pivot and Power View are configured.

Ok, but how do we build such a machine?

Here are the steps to take. I always do them in this order, partly because there are some dependencies and partly because it stops me from going insane.

  0. Install Windows (doh). I will skip this step (therefore it is number 0), since I use Azure and a VM in Azure comes with Windows Server pre-installed. I happen to use Windows Server 2012 R2, by the way.
  1. Disable Internet Explorer Enhanced Security Configuration. Although it is a great idea (see http://technet.microsoft.com/en-us/library/dd883248(v=WS.10).aspx for more info on this), it is hard to give a good demo on a machine with this thing on. So the first step is disabling it (see the sketch after this list).
  2. Set up Active Directory; AD is required for the PowerPivot service.
  3. After AD has been set up we need to promote the Domain Controller.
  4. After promotion we configure a very unrestrictive password policy; remember, this is just a demo machine!
  5. Virus protection is important, even for a demo machine; therefore set up System Center Endpoint Protection.
  6. Install SQL Server 2014.
  7. Install SharePoint.
  8. Install PowerPivot Service.
  9. Configure PowerPivot Service.
  10. Configure the last parts of PowerPivot Service.
  11. Configure Master Data Services.
  12. Configure Data Quality Services.
  13. Configure other SharePoint Service Applications.
  14. Activate SharePoint site features.
  15. Add favorites in Internet Explorer to point to MDS and SharePoint site.
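To give a feel for what these step scripts will look like, here is a minimal sketch for step 1, disabling IE Enhanced Security Configuration by flipping the well-known registry values for administrators and users. The GUIDs are the standard IE ESC component IDs on Windows Server 2012 R2; verify them on your build and run this in an elevated session:

    # Disable IE Enhanced Security Configuration for administrators and users
    $adminKey = "HKLM:\SOFTWARE\Microsoft\Active Setup\Installed Components\{A509B1A7-37EF-4b3f-8CFC-4F3A74704073}"
    $userKey  = "HKLM:\SOFTWARE\Microsoft\Active Setup\Installed Components\{A509B1A8-37EF-4b3f-8CFC-4F3A74704073}"
    Set-ItemProperty -Path $adminKey -Name "IsInstalled" -Value 0
    Set-ItemProperty -Path $userKey  -Name "IsInstalled" -Value 0
    # The change takes effect the next time Internet Explorer is started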

In this blog series I will share my PowerShell code to accomplish this. Please note that I am not a developer, so things can probably be done a lot smarter :)

The next step is preparation: the install files.

First look: Project Oxford – powerful face, vision, language and speech APIs

Just announced: Project Oxford (http://www.projectoxford.ai), a project that aims at providing powerful APIs for developers who are looking to add face, speech and language recognition capabilities to their applications. The site of Project Oxford provides a way to interact with the APIs currently available.

In this first look we will focus on the Face APIs, which can be used for face detection (finding faces in photos, determining gender and age), face verification (checking that the person in two photos is the same), similar face searching, face grouping and face identification.

Here is a sample result of the Face Detection API (you can do this yourself too!) using a photo of me:

This is actually quite good: this picture was taken a while ago and I was 29 at the time, so not too far off.
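If you want to call the Face Detection API from code rather than the site, a rough sketch with PowerShell’s Invoke-RestMethod could look like the one below. The endpoint, API version and attribute names are assumptions based on the Project Oxford documentation at the time and may have changed; you also need your own subscription key:

    # Hypothetical call to the Project Oxford face detection endpoint
    # (URL, version and attribute names are assumptions - check the documentation)
    $key  = "<your subscription key>"
    $url  = "https://api.projectoxford.ai/face/v1.0/detect?returnFaceAttributes=age,gender"
    $body = '{ "url": "http://example.com/photo-of-me.jpg" }'
    Invoke-RestMethod -Method Post -Uri $url -Body $body -ContentType "application/json" `
        -Headers @{ "Ocp-Apim-Subscription-Key" = $key }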

The Face Verification API allows you to check if the person in two photos is the same. Here is the result of my test (again, do this yourself!):

Have a look at that: I turn out to be the same person in both photos :)

This is very powerful stuff, and I am looking forward to using this in projects. I will keep you posted on that.

Azure SQL Database almost on par with the on-premises version

Hi blog readers :)

I started the blog with the title: “Azure SQL Database almost on par with the on-premises version”.

And looking at it from a T-SQL perspective, it is.
Our latest version of Azure SQL Database (let’s call it SQL Azure for now) is 95% compatible with SQL 2014.
So not a bad title!

But there is more to it…

There are two perspectives to take in account here:

  1. Microsoft is not only working on T-SQL compatibility, but is working on all fronts to enhance functionality.
    So actually… the engine powering SQL Azure is already beyond SQL 2014 and closer to SQL 2016.
  2. SQL Azure is a PAAS offering in Azure.
    So… you don’t only get the database, but also services around it, like:
    1. Automatic replication of your data 3 times (and 3 more times if geo-redundant)
    2. Point-in-time restore up to 14 days (see the sketch after this list)
    3. Patching and security updates
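To illustrate the point-in-time restore mentioned above, here is a rough sketch using the classic Azure PowerShell cmdlets. Treat the cmdlet and parameter names as assumptions from the module as I remember it, and the server and database names as placeholders; verify against the current documentation before use:

    # Restore a copy of a database as it was two days ago (hypothetical names)
    Start-AzureSqlDatabaseRestore -SourceServerName "myserver" `
        -SourceDatabaseName "mydatabase" `
        -TargetDatabaseName "mydatabase-restored" `
        -PointInTime (Get-Date).AddDays(-2)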

Thinking of SQL Azure as being on par with SQL 2014 is kind of old skool thinking.
Thinking about SQL Azure should be thinking about using a relational database as a service.
And those services will be enhanced over time.

So no more thinking about versions, but thinking about functions.
This changes the paradigm of developing big time!
You will have to develop in a much purer way.
Keeping your application architecture clean and crisp.

For example: no more business logic in database triggers, but logic in web services, with the calls to the database kept as close to plain ANSI SQL as possible.
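To make that concrete, here is a minimal sketch of such a plain call from PowerShell using the standard .NET SqlClient; the server, database, credentials and the Customer table are placeholders:

    # Plain SQL query against an Azure SQL Database (all names are placeholders)
    $connectionString = "Server=tcp:myserver.database.windows.net,1433;Database=mydatabase;" +
        "User ID=myuser@myserver;Password=<password>;Encrypt=True;"
    $connection = New-Object System.Data.SqlClient.SqlConnection $connectionString
    $connection.Open()
    $command = $connection.CreateCommand()
    $command.CommandText = "SELECT CustomerId, Name FROM dbo.Customer WHERE Name LIKE 'A%'"
    $reader = $command.ExecuteReader()
    while ($reader.Read()) { "{0}: {1}" -f $reader["CustomerId"], $reader["Name"] }
    $connection.Close()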
Code examples can be found at: https://msdn.microsoft.com/en-us/library/azure/ee621787.aspx
And to keep up to speed on the latest developments of SQL Azure keep checking the blog: http://azure.microsoft.com/blog/

Happy programming!

Walkthrough: how to connect to Dynamics CRM Online with Power BI

In this walkthrough I will step through the process of connecting to a Dynamics CRM Online instance with Power BI (specifically Power Query).

For this you will need the latest version of Power Query installed. After you launch Excel, navigate to the Power Query tab and choose From Other Sources → Dynamics CRM Online. You will need to enter the service URL in this window:

The OData service URL takes the following format: https://[tenant].crm.dynamics.com/XRMServices/2011/OrganizationData.svc.

Once you have filled out your tenant name, click OK. In the next screen you will be asked for your credentials. Since I use a demo environment, I need to use an organizational account. If your organization uses Dynamics CRM Online in production, chances are you will be automatically authenticated or can use your Windows account.

The next step is to specify which tables I would like to load:

I chose OpportunitySet, since I wanted to get a list of the opportunities in the system. The opportunities have a modified date which I would like to show as an ‘age’ in days; meaning that I would like to show the number of days that have passed since the opportunity was last modified. I can easily do that using the Power Query editor (select the table and click Edit); select the ModifiedDate column and use Transform → Date → Age to calculate a rather exact age:

After the transformation the column looks like this:

This is awfully exact; I only wanted the age in number of days. To change this, choose Duration → Days:

And now the column reports 60 days.
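For those who like to sanity-check the calculation outside Power Query, the same ‘days since modified’ logic is a simple date subtraction; here it is in PowerShell with a made-up sample date:

    # Number of whole days since an opportunity was last modified (sample date)
    $modifiedDate = Get-Date "2015-01-15"
    ((Get-Date) - $modifiedDate).Days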

 

As you can see, it is very easy to retrieve data from Dynamics CRM Online; we even did a typical ‘age’ or ‘number of days passed since’ type of calculation, because retrieving the data was so easy!

Link to a one-slide PowerPoint showing the latest services in Azure


Use the link below to download a very useful slide which shows you all IAAS and PAAS services within Azure:

Overall services in Azure

Have fun,

Harry

Platform as a Service (PAAS) moving at rocket speed!

As an Enterprise Architect in an organization, life has always been dynamic, to say the least! It is your responsibility to keep up with the latest developments in ICT, both in technology and in architecture. In the old days of on-premises only, that was a big challenge. But with the Cloud as an integral part of your information systems it has become even more complex.

But still… the Cloud used to mean moving VMs to Amazon or Microsoft. Architecturally not that complex; identity & access of course, but that’s about it. Then came Platform as a Service (PAAS). That was something completely different! Not moving VMs to the Cloud, but moving complete technical workloads to the Cloud: an ESB in the Cloud, Media Services, federated identity, Storage, and so on.

This does impact your architecture!

A blazing 78 new PAAS services were introduced within Azure in 2014. So it’s moving rocket fast! And to be fair: not only at Microsoft; other Cloud vendors are also moving into the PAAS area with new services.

What is the impact for you as Enterprise Architect?

In your normal day-to-day work you make choices based on software you can purchase and implement in your data center. But now you should at least ask yourself, for every choice you have to make: do I want to do this myself, or shall I take this as a service from one of the Cloud vendors?

An example: your organization wants to use Cloud services from multiple Cloud vendors, but you want a single sign-on experience for your users. You can buy a federated identity server, do research on how to connect to each Cloud vendor, and then build the connections. But you can also use the federation capabilities of Windows Azure Active Directory from Microsoft, with over 2600 Cloud applications already pre-integrated.

A second example: you have a new web application that you need to deploy. Again, you can buy a few servers, install IIS, SQL Server and the application, and schedule things like backup, patch management, storage, and so on. But you can also take a web role to host the web application and an Azure SQL database to host your data, and let Microsoft worry about backups, the 3 replicas for DR, patching the server, and so on.

So my message to all you Enterprise Architects out there: examine the PAAS offerings from the Cloud vendors carefully before making expensive buy decisions. My recommendations to check out:

Azure Service Bus, Azure Machine Learning, WAAS, BizTalk Services and Azure SQL Database. In the next blog post I will dig deeper into Azure SQL Database.

Power BI Public Preview now available worldwide

Yesterday, Microsoft announced that the Power BI Public Preview is now available worldwide. Until now only US-based users could access the preview. Not anymore, so you can check out all the great new stuff right now on http://www.powerbi.com.

See http://blogs.msdn.com/b/powerbi/archive/2015/03/16/power-bi-preview-now-available-worldwide.aspx

Happy Power BI-ing!

New Power Query update

Recently a Power Query update was released (see http://blogs.office.com/2015/03/05/3-updates-excel-power-query/). Major updates: performance on load, a Dynamics CRM Online connector and new transformations, most notably advanced date/time calculations. Personally I enjoy the CRM Online connector, but I am most fond of the ‘Age’ transformation; it makes it very easy to do the typical ‘number of days since this order was entered’ type of calculations, since it compares the date in the column with today.

The update to Power Query is available here: http://www.microsoft.com/en-us/download/details.aspx?id=39379&WT.mc_id=Blog_PBI_Announce_DI

Enjoy!

 

In memory technology in SQL Server

Everybody has noticed the rise of technology using in-memory techniques. SAP is going all in on HANA, Oracle started last year with its in-memory database, and at Microsoft we started in 2012 with in-memory analytics and added in-memory OLTP in April 2014. The buzz is high, with big marketing events, lots of whitepapers and broad press coverage.

So what about real-life practice?

When I visit my customers (the top 50 in The Netherlands) I rarely see in-memory databases being used. So I always ask why they don’t make use of it. This resulted in the following reasons:

  1. I didn’t know I could run my databases in memory.
    The marketing engine could be hitting the wrong people; lots of database administrators are not up to speed.
  2. We don’t do it because it must be very difficult.
    True – if you use SAP, then it is common knowledge that implementing SAP HANA is not easy, and you have to rewrite some of your programs. False – if you use Microsoft SQL Server. To start using in-memory you can switch it on for certain tables (or parts of tables) and it will work without any change to the application (see the sketch after this list).
  3. The power of our servers is high enough. We don’t need the power.
    This is of course a compliment that our SQL Servers run so smoothly (-:
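To show how little is involved on the SQL Server side, here is a minimal sketch of creating a memory-optimized table in SQL Server 2014 from PowerShell via Invoke-Sqlcmd. The instance, database and table names are examples, and the database first needs a filegroup for memory-optimized data:

    # Create a memory-optimized table in SQL Server 2014 (names are examples).
    # The database needs a filegroup that CONTAINS MEMORY_OPTIMIZED_DATA first.
    Import-Module SQLPS -DisableNameChecking
    $createTable = "
        CREATE TABLE dbo.ShoppingCart (
            CartId      INT NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
            CustomerId  INT NOT NULL,
            CreatedDate DATETIME2 NOT NULL
        ) WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);"
    Invoke-Sqlcmd -ServerInstance "localhost" -Database "DemoDB" -Query $createTable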

But still I think that by using in memory technology you can achieve the following:

  • Prevent a hardware refresh. If servers run out of performance, moving to in-memory increases speed again by 5x to 10x, so the servers can remain the same.
  • Run more VMs on a host. With in-memory technology fewer cores are needed because processing in memory is more economical, leaving more cores for new VMs on the same host.
  • Increase processing speed to reduce wait time for your users.

So my advice: Start experimenting with the technology and look for those business cases.

My ask: anyone who has experience with the practical implementation: please reply with your live experience!