SQL Server 2014 In A Jiffy: Credential for Windows Azure Storage

This is another #InAJiffy blog post on SQL Server 2014, particularly if you want to create SQL Server data files on Azure Storage. Below is an example of my hybrid database, called HybridTestA1, where the database instance and its log file are located on-premises but the data file (“HybridTestA1_dat.mdf”) is located on my Azure storage.

Data file stored on Windows Azure Storage Container

 

Credential for Windows Azure Storage

As I was testing out SQL Server 2014 CTP 2 and playing around with a hybrid database (one or more data / log files stored on Azure and the rest stored on-premises), I found that the tutorial on MSDN on creating a Credential was not clear. You need to create a Credential in the on-premises SQL Server 2014 database instance in order to access / create the file on Azure.

The CREATE CREDENTIAL syntax that lets you access storage on Windows Azure is

CREATE CREDENTIAL [<path to the container>]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '<SAS Key>';

As an example, I have a storage account, “mssqlgirldbstorage”, with a URL of https://mssqlgirldbstorage.blob.core.windows.net. In this storage, I have a couple of containers called “data” and “log”.

Below is the CREATE CREDENTIAL statement I executed for the “data” container:

CREATE CREDENTIAL [https://mssqlgirldbstorage.blob.core.windows.net/data]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sr=c&si=sqlpolicy&sig=89oywMCStX4iQrIGYVIl%2Bfh3cMtEm5ZA3fSDxh2w8najd9';

 

The sig value above is just a randomly generated key that I have created, so it won’t work for you. To obtain the SAS Key that you can use as the SECRET value, you can use the Azure Storage Explorer tool. Its documentation (see pages 18 – 20) on how to create a policy and SAS key is very good.

Please note that when you use Azure Storage Explorer, the generated Signature looks like the below. You only want to use the value from “sr=c” to the end of the URI as the SECRET value in the CREATE CREDENTIAL statement.

https://mssqlgirldbstorage.blob.core.windows.net/data?sr=c&si=sqlpolicy&sig=89oywMCStX4iQrIGYVIl%2Bfh3cMtEm5ZA3fSDxh2w8najd9

Blob & Container Security

You can create a credential using SQL Server Management Studio (SSMS) too, but you still need to use Azure Storage Explorer to create the Policy and to generate the SAS Key based on it. In the example below, I am creating a credential to access my “log” container on my Azure storage. Please make sure that you enter “SHARED ACCESS SIGNATURE” as the Identity value in SSMS, as shown here.

Creating Credential using SSMS

Wrap Up

A Credential is needed for creating a hybrid SQL Server 2014 database, where one or more database files are located on Azure while the database itself is created on an on-premises server.
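For reference, below is a minimal sketch of the hybrid CREATE DATABASE statement for the HybridTestA1 example above, assuming the Credential for the container has already been created (the local log path is illustrative):

-- Data file on Azure storage; log file on-premises
CREATE DATABASE HybridTestA1
ON
( NAME = HybridTestA1_dat,
  FILENAME = 'https://mssqlgirldbstorage.blob.core.windows.net/data/HybridTestA1_dat.mdf' )
LOG ON
( NAME = HybridTestA1_log,
  FILENAME = 'C:\SQLData\HybridTestA1_log.ldf' );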

Get familiar with Azure Storage Explorer when you start learning about / exploring your data on Azure. The UI is really good.

Further Reading

Tutorial: SQL Server Data Files in Windows Azure Storage service on MSDN

Azure Storage Explorer on Codeplex

 

HDInsight In A Jiffy: Executing Hadoop Commands

Welcome back to another “In A Jiffy” blog post – a series where we learn something quick at a beginner / intro level. Since HDInsight GA was announced on 28 Oct 2013, the User Interface has been revamped, features have been modified, and the documentation has been slowly updated to reflect the changes. One of your questions might be: where do I execute Hadoop commands?

Where Can I Execute Hadoop Commands?

One of the ways to manage Windows Azure Blob Storage for HDInsight is by using Hadoop commands, as mentioned in the documentation Use Windows Azure Blob storage with HDInsight. In order to do this, you’ll need to enable the remote connection to your HDInsight cluster and connect to it (just like you connect to any server remotely). Once you connect, there is luckily a desktop shortcut called “Hadoop Command Line” that makes it easier to execute Hadoop commands – e.g.

hadoop fs -ls /output/result.txt
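A few other standard hadoop fs commands that come in handy (the file paths below are purely illustrative):

hadoop fs -mkdir /example/input
hadoop fs -copyFromLocal C:\Temp\davinci.txt /example/input/davinci.txt
hadoop fs -cat /output/result.txt
hadoop fs -rm /example/input/davinci.txt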

For my version of the HDInsight cluster, the Hadoop distribution is located in this directory:

C:\apps\dist\hadoop-1.2.0.1.3.1.0-06

So if the desktop shortcut is not available, you can launch Command Prompt on the server via the remote connection, navigate to a path similar to the above, and start using the Hadoop commands.
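For example, assuming the hadoop.cmd launcher lives under the bin subdirectory of that path:

cd C:\apps\dist\hadoop-1.2.0.1.3.1.0-06\bin
hadoop fs -ls /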

That’s it for the “In A Jiffy” part.

Want more? Read on…

 

Configuring Remote Connection to HDInsight Cluster

Below is a set of instructions that elaborates how to set up a remote connection to your HDInsight Cluster and where to execute Hadoop command lines.

Prerequisites

1. HDInsight Cluster created (check Your First HDInsight Cluster–Step by Step if you have not created an HDInsight Cluster yet)

2. Access to Windows Azure Management Portal

The Steps

1. Log in to your Windows Azure Management Portal and go to the HDInsight cluster that you want to execute the Hadoop commands against.

2. Go to the Configuration option of the HDInsight cluster and click on the “Enable Remote” button at the bottom of the screen.

HDInsight Cluster Configuration : Enable Remote

3. A “Configure Remote Desktop” window will be launched, where you can create a new user that can log in via Remote Desktop.

HDInsight: Configure Remote Desktop

Once you enter the details, you’ll see that the “Connect” and “Disable Remote” buttons are disabled while the Remote Desktop access is being configured in the background.

HDInsight: Enabling remote desktop

The background configuration usually takes a couple of minutes (or less). Once it is done, click on the “Connect” button and it will start downloading an .rdp file to connect to the HDInsight Cluster.

HDInsight: Connect / HDInsight: Opening RDP file

4. When prompted, enter the credential to connect remotely.

HDInsight: Log in via RDP

Once connected, you will see “Hadoop Command Line” on the desktop – and voilà, you can make use of your Hadoop skills here.

HDInsight: Hadoop Command Line shortcut / HDInsight: Hadoop Command Line

 

Wrap Up

Hadoop Commands can be executed on an HDInsight Cluster via Remote Connection. You’ll first need to enable the remote connection.

 

Further Reading

Use Windows Azure Blob storage with HDInsight by Windows Azure

Your First HDInsight Cluster–Step by Step by Cindy Gross and Murshed Zaman

Upload data to Blob Storage using Hadoop Command Line by Windows Azure


Power BI In A Jiffy : Featured Reports

This is one of the first posts in the Power BI In A Jiffy series. The series is meant to provide quick updates on Power BI related news. Being on the cloud, sometimes changes happen overnight and new features are added. The Power BI developer team has the ability to release updates in a shorter cycle than most on-premises / boxed applications.

Power BI Preview Featured Reports

The Power BI site for the Power BI Preview has recently been updated with a new section called Featured Reports, as shown below. If you have signed up for the Power BI Preview, you will see the following addition on your Power BI site.

Power BI Site - Featured Reports

 

Excel files that are listed under Documents can be added to the Featured Reports section by clicking on the “” area and choosing the Feature option.

 

Add Feature


If you wish to remove a report from the Featured Reports list, you can simply click on the “” sign of the report that you wish to unfeature, as shown below.

Unfeature

 

Please note that Featured Reports are effective site-wide, meaning that other users / team members will see those reports as featured too. You can feature and unfeature reports that are created by other users. In my example above, I have just featured / unfeatured a report created / uploaded by another team member at LobsterPot Solutions.

 

Wrap Up

Featured Reports help the team to access frequently used / popular reports quickly. Reports / Excel files are promoted to Featured Reports by a member of the team, for the team.

 

Further Reading

Power BI for Office 365

 

Upcoming SQL Server Presentations – Nov to Dec 2013

I am very delighted to announce that I have been selected as a speaker at the following events:

1. SQL Saturday 265 – 16 Nov 2013, Oregon

Understanding Query Modes in Tabular Model

2. Auckland Business Intelligence User Group – 27 Nov 2013

From Impaired to Insightful: Analysis with Geospatial Data

3. Canberra SQL Server User Group – 3 Dec 2013 (to be confirmed)

Understanding Query Modes in Tabular Model

 

I have also been busy organising a number of webinars for PASS DW/BI Virtual Chapter for November, December and January. If you would like to speak or to volunteer for the Virtual Chapter, please let us know at passdwbivc at sqlpass dot org.

PASS DW/BI VC

Hope to catch you in one of my sessions or PASS DW/BI VC sessions!

 

Julie


Annual #SQLFamily Reunion – PASS Summit 2013

A few people asked me, “Wait, you’ve just moved from the US to Australia a month ago and you’re flying back to the US again?”. I generally grinned when I got asked this question. Sometimes you know what you are doing, and no matter how many times people ask you to check your validity (or sanity), you are content with your decision and grin.

Well, because I cannot grin via blog, and I have a few hours before my return flight home to Australia, here is what I thought about my third PASS Summit.

Why PASS Summit 2013?

It is a big deal to me. I have just joined LobsterPot Solutions as a full-time member and, interestingly enough, PASS Summit is the one time in the year that we see each other – which happens to be in another country instead of our own! Yes, we are unique that way and we make a great team. This year, I was excited to see and to hang out with Rob Farley (b | t) and Martin Cairney (t), as work mates, a team and #SQLFamily.

Secondly, I was selected to present “From Impaired to Insightful: Analysis with Geospatial Data”, a topic that I have continued revising over the past year. Plus, I have also volunteered in a Virtual Chapter Mentor role and a PASS DW/BI Virtual Chapter Leader role, so it was an honour to have in-person meetings with the other Virtual Chapter leaders and PASS coordinators, Denise McInerney (t) and Karla Landrum (t). It truly is about Connect, Share and Learn!

Last, but not least, PASS Summit is my ritual reunion with a lot of friends from all over the world, and a chance to make new friends too. I also felt honoured to have Mark Stacey (b | t) sign a copy of my Visual Intelligence book (thank you to Christina Leo (b | t) for giving me a copy). Most importantly – and I cannot begin to say how proud I am of her – I planned my PASS Summit trip with Mickey Stuewe (b | t), my #SQLSister. Mickey has grown so much since I met her last year, as a SQL Professional, speaker, blogger and writer! Yup, my #SQLSister rocks!

VisualIntelligence signed

“To my twitter friend I finally met in person! Mark”

Visual Intelligence: Microsoft Tools and Techniques for Visualizing Data

 

What was your best moment?

Best moment was when I felt welcomed, which was all the time. From attending sessions, speaking at a session / interviews, talking to new friends, talking to old friends, being invited to official and unofficial PASS Summit after parties! I mean, wow! Sometimes by the end of the day, my cheeks would hurt from smiling and grinning all the time, but it felt great!

What was your sad moment?

On Friday at 5:15pm, I attended Oliver Engels’s (t) and Julian Breunung’s brilliant and entertaining session, “R” vs “M”: Languages for Data Exploration, and soon realised that it was all over for the year – no more PASS Summit 2013 sessions.

What was your funniest moment?

It’s hard to pick one, so I’m going to list 3:

Being at Quizbowl with Mickey and Mike Donnelly (b | t), dropping to -700 points and bringing it up as a team to 1100, which put us in 2nd place! Mickey and I really had a great time, losing and winning every single point with laughter.

Wearing the awesome propeller hats as Alumni Summit Buddies with Mickey, and coming up with the names “SQL-Dee” and “SQL-Dum” for ourselves (as opposed to Tweedle-Dee and Tweedle-Dum). Photo courtesy of Mickey Stuewe.


Oh, did I mention I demo-bombed Kasper de Jonge (b | t) at the BI Power Hour? Kasper is a polite Program Manager at Microsoft who has helped me with a couple of Power Pivot bugs/questions this year, and I accidentally demo-bombed his presentation, commenting on his accent. The 300+ attendees were laughing and entertained :)

Where to next?

I am very much looking forward to watching the session recordings. But in the meantime, I have checked out a few of the PASS TV episodes, and will continue watching them in the next couple of weeks.

I have learned that there are other good conferences all year round across Europe/UK. I love travelling – I always have. So “where to next” should hopefully include one of the conferences in Europe/UK, as 54 weeks until another PASS Summit is way too long to wait.


Thank you for reading.

Until the next time we meet, hug and cheer as a #SQLFamily, stay safe and healthy.

Julie

 

Other Related Articles

Prelude to my PASS Summit Summary by Mickey Stuewe

PASS Summit 2013. We do not remember days, we remember moments by Maria Zakourdaev.

PASS Summit – looking back on my first time by Jonathan (@fatherjack)


Power Query for D&B Data Mashup – Step by Step

A couple of weeks ago, I posted an introduction to Power BI for data mashup using D&B data. Generally speaking, before you buy anything, you want to test it out first. The same rule applies to data: you want to make sure that you get the right quality of data, and do some testing on it. The easiest way for me to do this is by using Power Query. There are other ways to test it out of course, but in this article I will focus on using Power Query.

Case Study

I have a list of companies with City, State and Country, which may or may not be correct. The requirement is to get more company details, such as Number of Employees, Annual Sales, the year that the company started, the Legal Status and Phone Number. This is common information that competitors or marketers would like to investigate.

After a few minutes of research at Windows Azure Market Place, I have found a couple of interesting data services provided by D&B:

1. Company Cleanse & Match for Microsoft SQL Server Data Quality Services – I will refer to this as Company Cleanse & Match for the rest of the article.

2. Company Firmographics – Industry and Sales – I will refer to this as Company Firmographics for the rest of the article.

The completed Excel solution is available here: Power Query for DnB Data Mashup.xlsx. You will need Excel 2013 with Power Query installed – check the full System Requirements for Power Query here.

Step by Step

1. Prepare test companies

Let’s first start with the data that we want to test. I have a list of 7 companies in an Excel table called “CompanyInput”, as shown below.

Excel Table: CompanyInput


2. Create a Power Query function for Company Cleanse & Match

The strategy here is to create a Power Query function that represents the call to Company Cleanse & Match, and which can be called by another query.

To do so in the same Excel workbook:

a. Click on Power Query > From Other Sources > From OData Feed from the ribbon.

Create Power Query function for Company Cleanse Match via OData Feed

Note: At the time of testing this, the D&B Company Cleanse & Match service does not work with the Windows Azure MarketPlace option.

b. Enter the Service Root URL as provided on the “Details” tab of the D&B Company Cleanse & Match page, which is:

https://api.datamarket.azure.com/DNB/DQSCompanyMatch/v1/

Specify OData Feed URL of D&B Company Cleanse & Match

Note: This step assumes that your Azure account has a valid subscription to the D&B Company Cleanse & Match service.

c. Clicking OK will display the Query Editor window with the following message:

Power Query dialog box

d. Rename “Query1” to “fnSuggestCompanyDetails”. The new name is now the friendly name that we will use in the other Queries that we will create using Power Query.

Under the Navigator menu, you can see an “fx” symbol next to SuggestCompanyDetails, which is the function exposed by the D&B Company Cleanse & Match service. Clicking on the script symbol on the Formula Bar as shown below will bring up the Query Editor window.

Script Button

So, all the steps that we have done in Step 2 essentially produce the script shown below, which defines “fnSuggestCompanyDetails”.

M query - fnSuggestCompanyDetails

Code:

let
 Source = OData.Feed("https://api.datamarket.azure.com/DNB/DQSCompanyMatch/v1/"),
 SuggestCompanyDetails = Source{[Name="SuggestCompanyDetails"]}[Data]
in
 SuggestCompanyDetails

e. Clicking Done on the Query Editor will create a new worksheet, Sheet2, containing the fnSuggestCompanyDetails definition.

New Worksheet containing fnSuggestCompanyDetails


3. Repeat Step 2 for the Company Firmographics service. The OData Feed URL for this service is:

https://api.datamarket.azure.com/DNB/CompanyFirmographics/v1/

Don’t forget to rename the Query to fnGetFirmographic, so that you have something similar to the below:

fnGetFirmographic

This step should create a new worksheet, called Sheet3, which contains the definition of the new fnGetFirmographic function.

New Sheet created to contain fnGetFirmographic

Code:

let
 Source = OData.Feed("https://api.datamarket.azure.com/DNB/CompanyFirmographics/v1/"),
 ReturnData = Source{[Name="ReturnData"]}[Data]
in
 ReturnData

4. Go to Power Query and click on Fast Combine.

Note: This step is important as it allows you to call other Queries (or the functions that we defined earlier) from another Query within the same workbook.

Enable Fast Combine

Click on Enable on the next dialog box:

Enable Fast Combine

5. Create a new Query based on the CompanyInput table in Excel.

a. Place the cursor on the CompanyInput table, and click on Power Query > From Table.

From Table

Clicking From Table will bring up the Query Editor dialog box as follows, which brings in the data from the CompanyInput table.

Query Editor: CompanyInput

b. Right click on the Min Confidence column, then click on Insert Column > Custom…

Insert Column Custom

c. Enter the function as specified below. The order of the input parameters must match exactly the order laid out by D&B Company Cleanse & Match in Step 2d.

Insert Custom Column - fnSuggestCompanyDetails
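For reference, the formula entered in the Insert Custom Column dialog is the same fnSuggestCompanyDetails call that appears in the final script at the end of this step – the input columns come from the CompanyInput table:

fnSuggestCompanyDetails([Company Name], null, null, null, [State], [City], [Country], null, [Max Returned], [Min Confidence])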

d. Clicking OK on the dialog box will return the Query Editor with a new column added, called “Custom”, with “Table” values. Right click on the Custom column header, and choose to Expand a handful of columns (including DunsNumber, as you will need it as an input parameter for the fnGetFirmographic call later).

Expand Custom column to get DUNS Number

Clicking OK will cause Power Query to call the D&B service to bring back the values for the requested columns, and after a few seconds, the Query Editor dialog box should display more Custom columns as shown below:

Output of Expanding Custom Columns

e. Scroll right across so you can see the Custom.Confidence column, and insert a new Custom column to bring back more data from fnGetFirmographic.

Insert Custom Column - fnGetFirmographic

f. Similar to Step 5d, expand the Custom column to choose a few columns from the Company Firmographics output.

Expand Firmographics Columns

g. Clicking OK from the drop down list above should return the Query Editor with the additional chosen columns as shown below.

Custom Column Details - fnGetFirmographic

h. Clicking Done on the Query Editor will create a new sheet that contains the final output of the data from both D&B Company Cleanse & Match and Company Firmographics.

Final Output

Code:

let
 Source = Excel.CurrentWorkbook(){[Name="CompanyInput"]}[Content],
 InsertedCustom = Table.AddColumn(Source, "Custom", each fnSuggestCompanyDetails([Company Name],null,null,null,[State],[City],[Country],null,[Max Returned],[Min Confidence])),
 #"Expand Custom" = Table.ExpandTableColumn(InsertedCustom, "Custom", {"RecordId", "DunsNumber", "CompanyName", "AddressLine", "ZIP", "State", "City", "Country", "Confidence", "MatchGradeText", "MatchGradeComponentCount", "MatchGradeComponentTypeText", "MatchGradeComponentRating", "MatchGradeComponentScore"}, {"Custom.RecordId", "Custom.DunsNumber", "Custom.CompanyName", "Custom.AddressLine", "Custom.ZIP", "Custom.State", "Custom.City", "Custom.Country", "Custom.Confidence", "Custom.MatchGradeText", "Custom.MatchGradeComponentCount", "Custom.MatchGradeComponentTypeText", "Custom.MatchGradeComponentRating", "Custom.MatchGradeComponentScore"}),
 InsertedCustom1 = Table.AddColumn(#"Expand Custom", "Custom", each fnGetFirmographic([Custom.DunsNumber])),
 #"Expand Custom1" = Table.ExpandTableColumn(InsertedCustom1, "Custom", {"AnnualSalesUSDollars", "EmployeesHere", "EmployeesTotal", "LegalStatus", "CompanyStartYear", "SingleLocation", "OutOfBusinessInd", "LineOfBusiness", "IndustryCodeType1", "IndustryCode1", "IndustryCodeType2", "IndustryCode2", "FormerCompanyName", "TransactionFlag", "TransactionStatus", "TransactionStatusCode"}, {"Custom.AnnualSalesUSDollars", "Custom.EmployeesHere", "Custom.EmployeesTotal", "Custom.LegalStatus", "Custom.CompanyStartYear", "Custom.SingleLocation", "Custom.OutOfBusinessInd", "Custom.LineOfBusiness", "Custom.IndustryCodeType1", "Custom.IndustryCode1", "Custom.IndustryCodeType2", "Custom.IndustryCode2", "Custom.FormerCompanyName", "Custom.TransactionFlag", "Custom.TransactionStatus", "Custom.TransactionStatusCode"})
in
 #"Expand Custom1"


You can also rename columns. To do that, click on Filter & Shape under the Query Settings side pane. For example, instead of Custom.DunsNumber, it would be more user friendly to call it DUNS Number.
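Under the covers, renaming a column simply adds a Table.RenameColumns step to the query script. A minimal sketch of what that step might look like for the DUNS Number example (using the step name from the generated script above):

RenamedColumns = Table.RenameColumns(#"Expand Custom1", {{"Custom.DunsNumber", "DUNS Number"}})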

 

Important Notes

Every time you make changes to the query, Power Query will make calls to the D&B Data Service. Since the D&B services are billed based on the number of transactions per month, it is advisable to make sure that you are using a trial subscription while developing.

The Step by Step section above is accurate as at 1 Sep 2013. There may be some changes to the API which may render the instructions above invalid.

Earlier I mentioned to try it before you buy it. Most companies that sell data should be able to provide you with reasonably good documentation, and should be able to supply you with the latest documentation in case their data service changes. As an example, Company Cleanse & Match for Microsoft SQL Server Data Quality Services provides a Reference Guide, which is helpful in understanding what data you are getting.

Company Cleanse & Match for Microsoft SQL Server Data Quality Services Reference Guide

 

Wrap Up

Data has become a commodity that is sold, massaged and reused in many different ways. One of the ways to purchase data is through Windows Azure Market Place. Power BI helps us to test data easily and to prototype self-service analysis quickly. Using Power Query, you can retrieve data and perform data mashups within Excel. The data sources can be an Excel table and a number of other sources, such as the D&B Data services on Windows Azure Market Place, as shown in the case study above.

No doubt, when you deal with data at a larger scale and with more complexity, it may be better to use SQL Server to do the job, as shown in this example.

 

Further Reading

Company Cleanse & Match for Microsoft SQL Server Data Quality Services - Windows Azure Market Place

Company Firmographics – Industry and Sales - Windows Azure Market Place

Power Query articles by Chris Webb (No doubt that if I have some questions about Power Query, I generally go to Chris Webb’s website first)

Power Query: Creating a Function to Correct and Verify Addresses via External API – Rafael Salas (Thanks to Rafael for the great tutorial on reusing code in Power Query via functions)

Cleansing your Customer Data using Dun & Bradstreet DQS service - MSDN DQS Blog

Power BI for D&B Data Mashup – Introduction - Ms SQL Girl

 

PASS Summit 2013 Check List

PASS Summit 2013

Last year, I wrote a number of tips leading up to PASS Summit 2012. As time is short (and I am also not as familiar with Charlotte), I prefer to write a summarised check list instead.

The list below is dedicated to my PASS Summit First Timers group, and I think it may be useful for Alumni as well. If there is anything that I should add here, I would really appreciate it if you could share it with us.

1. If you are not local, make sure your accommodation and travel arrangements are done. It may also be a good idea to organise travel insurance. The session schedule is out already and so is the Schedule Builder. (If you are in my First Timers group, you would have sent me the top 3 things that you want to achieve at PASS Summit. It is a good idea to use these top 3 things to guide your session selection.)
Based on past experience, it is a good idea to choose an additional backup for every time slot, just in case the preferred session is cancelled, the room is full or the session is not really what you are after. In the past there has been a Guidebook for this too, which will most likely be announced in the week of PASS Summit.

2. Melissa Coates, who is based in Charlotte, has posted a fantastic article on the layout of the Convention Center.

3. Networking and staying connected is an important aspect of PASS Summit (or PASS community in general). So I highly encourage you to check PASS Summit Connect page.

4. The PASS WIT team pours their heart and soul into organising the WIT Luncheon. It is free for everyone to attend. Similar to last year, the discussion is in panel format that allows attendees to voice their thoughts and opinions. Check it out here.

5. There are a few After Hour events organised by SQL Sentry and community members to let you see what Charlotte has to offer. If you are not local, also check out Patrick Keisler’s post on Charlotte.

6. Have you ever heard of SQL Karaoke? It is a lot of fun! Get your singing voice ready or just chillax! You have the option to attend the Tuesday one (thank you SQL Sentry and SIOS) or the Wednesday one (thank you Pragmatic Works); or both.

7. Don’t forget to visit the Community Zone - I would feel very honoured if you could stop by and say hello to me at either of my booth times – Wednesday: 10:15 am to 11:30 am and Friday: 9:00 am to 10:00am.

8. Get your questions answered! If you have some questions that have been tickling your brain – whether Performance Tuning related, SSAS or Professional Development topics – get ‘em ready. PASS Summit gathers MVPs from around the world, from as far as Australia, Europe and Africa, as well as the SQL Server CAT team and the Office 365 / BI team too!

9. Bring a notepad, your Surface RT/Pro, laptop or iPad to take notes. Don’t forget the charger too! For international travellers, don’t forget a plug adapter.

10. Business Cards, with your latest details!

11. If you have been thinking about getting certified, PASS Summit is a great time to do it! I passed my 070-465 exam last year at PASS Summit. You can pass yours too! Did I mention it’s 50% off?

12. Feel like PASS Summit is not long enough? Check out SQL In the City on Monday and SQL Saturday #237 Charlotte on Saturday right after PASS Summit.

13. Connect with other First Timers and Summit Buddies on Twitter with the hash tag #SQLFirstTimers

14. Attend a webcast for First Timers by Rick Heiges on 3 Oct 2013 at 12:30 PM ET. http://sqlblog.com/blogs/rick_heiges/archive/2013/09/26/what-to-expect-at-the-pass-summit-2013.aspx

15. Post your thoughts and suggestions for the First Timers Orientation here before the Summit starts http://feedback.sqlpass.org/forums/223060-first-timers-orientation-at-pass-summit-2013

16. Check the First Timers page regularly for the latest news: http://www.sqlpass.org/summit/2013/connect/firsttimers.aspx

17. Lastly but most importantly, please stay healthy. Drink plenty of water, eat healthily, maintain your exercise regime (check out #SQLRun and #SQLWalk) if possible and do things in moderation. We want you to be able to learn and have fun at your absolute best condition.

See you there!


Power BI for D&B Data Mashup – Introduction

This article offers a gentle introduction to the Dun and Bradstreet (D&B) Data services on Azure Marketplace. Using only Excel 2013 Power BI, you can perform ETL on the D&B data with Power Query, create reports with Power View and generate 3D visualisations with Power Map for storytelling. There will be a second post outlining the steps to perform data mashup using Power Query.

Let’s get started with a couple of the D&B data services available on Azure Marketplace that are used in this series of Power BI posts.

D&B Data on Azure Marketplace

Recently I was invited by D&B to review a number of their Data Services on Azure Marketplace. At the time of writing, there were 10 data services provided by D&B. The two that I find quite interesting are:

Company Cleanse & Match for Microsoft SQL Server Data Quality Services

Despite its name, Company Cleanse & Match for Microsoft SQL Server Data Quality Services is not specific to Microsoft SQL Server Data Quality Services (DQS). For the remainder of this article, I am going to refer to this particular data service as Company Cleanse & Match.

By using Company Cleanse & Match, you are able to retrieve more information, including the DUNS Number, address details, the Confidence level of the match against the input specified and a few other columns. Based on the current specification, you would minimally need to have the following information:

  • Company Name
  • Country
  • Maximum Suggestions (the number of suggestions to be returned by the D&B service)
  • Minimum Confidence (the minimum confidence level for matching)
  • State

In simpler words, this offering is a sophisticated way of searching for companies online, such as local restaurants. After using it for a while, it proves more useful than it first appears, as it also provides the DUNS Number. D&B introduced the DUNS Number, a numbering system, back in 1963 to support D&B’s credit reporting practice. The number is key for most business owners based in the US, for company identification as well as for applying for credit (Source: sba.gov.au).

Company Firmographics – Industry and Sales

Company Firmographics – Industry and Sales, based on the current specification, is more useful if you have the DUNS Numbers of the companies you want to search for. Not only does the service provide the complete address (mailing and presumably physical address), it also provides other information such as the total number of employees, the legal status, the year the company started, the line of business, the Industry Code and more.

Using the data generated by the queries from Company Cleanse & Match and Company Firmographics, you can create an insightful story / visualisation using just Excel.

 

Visualisation Example

In this example, I use both Company Cleanse & Match and Company Firmographics to retrieve the business details of 50 random camera related companies in California. The basis of the visualisation is the Power Query output from the data mashup. The output is in a table format as shown below.

Data Mashup Output from Power Query

Using Power View, I can generate a simple bubble map as follows to visualise the distribution of camera related companies and their annual sales.

Data Visualisation using Power View

 

Using Power Map, I can generate two different visualisations as part of a 3D tour. Below is a snapshot of the geospatial visualisation of camera retailers by Line of Business in my sample data.

Data Mashup visualisation using Power Map (Column bar)

Below is a snapshot of Annual Sales (in ’000s) by Legal Status in the Camera Retailers data.

Data Mashup visualisation using Power Map (Heat map)

And just for fun, I have turned the 3D tour of the two visualisations into a video, exported directly from Power Map.

 

Wrap Up

Using data available from Azure, you can perform data mashups to create more insightful information. The information can then be exposed through the different visualisation tools of your choice that are appropriate for conveying the story. In this article, we discussed two of the D&B Data Services used to get more details about companies, and the final visualisation results of the data mashups, using nothing but Excel 2013.

Please stay tuned for another post, where I will outline the steps to connect to the D&B Data services and perform the data mashup in Power Query.

 

Further Reading

If you are new to Power BI, the following articles are highly recommended:

Power BI for Office 365 Overview and Learning

Power BI Preview Review by Chris Webb

Power Map Preview – Exploring The New Features! by Jason Thomas

More details on Azure Marketplace:

Learn About Windows Azure Marketplace

D&B Data Services on Azure Marketplace

Bye bye GeoFlow, Hello Power Map!

If you have been following Worldwide Partner Conference 2013 (#WPC13) and Microsoft SQL Server blog, you would have heard about all the new features that are coming to Office 365. I am excited about the news as it means that small and medium businesses have not been forgotten. I am a little biased about this as I love technology and from a consultant’s point of view, I get to think about how to leverage these new features as a powerful and competitive strategy for our clients.

One of the most impressive new features in Office 365 is Power BI. Yes, it is “Power” themed. GeoFlow is now called Power Map. The new features are not available yet, but you can register just like I have.

Power BI for Office 365

Power BI related tools are:

  1. Power Pivot
  2. Power View
  3. Power Map (formerly known as GeoFlow)
  4. Power Query (formerly known as Data Explorer)

Other exciting news on Office 365 is a natural language query engine and a Data Management Gateway, as outlined in What powers Power BI in Office 365?.

 

Enjoy the Power BI demo by Amir Netz at WPC 13!

 

Further Reading

What powers Power BI in Office 365? at Office 365 blog

Introducing Power BI for Office 365 at SQL Server blog

Worldwide Partner Conference (#WPC13)


Undo Ghost Pending Changes in Team Foundation Service

I have been using Team Foundation Service (http://tfs.visualstudio.com) for the last 7 months for a couple of my projects. Recently I have had to change laptops twice due to SSD issues on Lenovo T420s. So, I am on my third laptop (ouch!). One of the challenges that I had to go through when migrating from one (crashed) laptop to another was resolving Pending Changes associated with my account’s workspace on the crashed laptop. Despite my best efforts to check in my work fairly frequently, disaster was inevitable and I still had ghost pending changes.

I call the pending changes associated with my workspace on a crashed laptop “ghost pending changes”.

Once I had my new laptop set up, with Team Explorer installed on Visual Studio 2010, I was ready to investigate what the ghost pending changes from my crashed laptop were, recover the files from the crashed laptop, remove the ghost pending changes and check in the recovered files via the new laptop.

In this article, I will be using TF Command-Line Utility Commands, which are also available for the cloud based TFS. This makes them very handy for resolving ghost pending changes.

 

Pre-requisites

Install Visual Studio 2010 and Team Explorer in the new machine.

On the new machine, connect your Visual Studio to the existing team project on cloud TFS. This will create a new workspace in the new machine.


5 Steps to Recover Ghost Pending Changes

 

Below are some quick tips to tidy up loose ends from a workspace (e.g. a crashed laptop) with Pending Changes.

 

1. Backup the changes you have made from the crashed laptop, if possible.

If your hard drive / SSD is still salvageable and you can retrieve the local copy of your TFS folder(s), make a copy of them on another hard drive and store them somewhere. Pronto! Don’t delay it.

 

2. Find the ghost pending changes and investigate the severity of the potentially lost work.

In Source Control Explorer within Visual Studio, you can check folder by folder to see the changes that are not checked in, by all users – although this can be cumbersome if you have multiple collections and folders.

Pending Changes on Visual Studio 2010

Fortunately, there is a handy command that you can use.

tf status /user:<your user name> /recursive > <output file path>

More documentation on the Status command: http://msdn.microsoft.com/en-US/library/9s5ae285(v=vs.100).aspx
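For example, with my user name, the command might look like this (the output path is illustrative):

tf status /user:JKOESMARNO /recursive > C:\Temp\GhostPendingChanges.txt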

 

3. Get the workspace name

Before undoing the pending changes, you will need to know the name of the workspace that you are trying to undo. In my case, I have three workspaces, one for each laptop where I have installed TFS: the first from my first crashed laptop, the second from the latest crashed laptop. So, I want to undo the pending changes from the latter.

Using the following command, I can find out the list of workspaces related to my user name.

tf workspaces /owner:<user name>

tf workspaces /owner:<User Name>

In this example, the workspace whose pending changes I want to undo is “JKOESMARNO-T420”.

More documentation on Workspaces command: http://msdn.microsoft.com/en-us/library/54dkh0y3(v=vs.100).aspx

 

4. Undo the pending changes.

Now that we understand the severity of the files with ghost pending changes, we can undo them. Use the following command to undo one collection at a time:

tf undo /workspace:<Workspace Name> /recursive <Collection Name>

tf undo

More information on the Undo command: http://msdn.microsoft.com/en-us/library/c72skhw4(v=vs.100).aspx
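For example, using the workspace identified in Step 3 (the collection path here is illustrative):

tf undo /workspace:JKOESMARNO-T420 /recursive $/CampaignManagement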

Once you undo the pending changes, make sure you perform “Get Latest Version (Recursive)” on each Collection via Source Control Explorer.

Get Latest Version Recursively

Source Control Explorer will then show the latest check-outs. If there are check-outs by another team member, it will show as follows:

 Source Control Explorer with other user's pending changes

If there are no check-outs by other users, it will look similar to the following:

Source Control Explorer - After Undo Campaign Management

 

5. Re-do the changes and check them in.

If you are lucky and manage to salvage the local changes from the crashed laptop that were not checked in, it may be a good idea to copy them to the new system and check them in.

 

6. (Optional) Remove unwanted old workspaces

Use the following command to delete unwanted old workspaces. You could substitute Step 4 above with this step.

tf workspace /delete:<Workspace Name>
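For example, following the syntax above, removing the crashed laptop’s workspace would look like this:

tf workspace /delete:JKOESMARNO-T420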

Note: If you delete a workspace that contains pending changes, Team Foundation cancels the pending changes as part of the delete process.  Deleting a workspace does not delete the files and folders on the client computer that were in that workspace.

 

Wrap Up

TF Command-Line Utility Commands are powerful and work with Team Foundation Service (a cloud based version control from Microsoft). Using TF commands, we can resolve ghost pending changes before continuing development in a new environment.

 

Further Reading

TF Command-Line Utility Commands

Team Foundation Service - a cloud powered source control management by Microsoft