SQL Server 2014 In A Jiffy: Credential for Windows Azure Storage

This is another #InAJiffy blog post on SQL Server 2014, particularly if you want to create SQL Server data files on Azure Storage. Below is an example of my hybrid database, called HybridTestA1, where the database instance and its log file are located on-premises, but the data file (“HybridTestA1_dat.mdf”) is located on my Azure storage.

Data file stored on Windows Azure Storage Container


Credential for Windows Azure Storage

As I was testing out SQL Server 2014 CTP 2 and playing around with hybrid databases (one or more data / log files stored on Azure and the rest stored on-premises), I found out that the MSDN tutorial on creating a Credential was not clear. You need to create a Credential in the on-premises SQL Server 2014 database instance in order to access / create the file on Azure.

The CREATE CREDENTIAL syntax that lets you access storage on Windows Azure is:

CREATE CREDENTIAL [<path to the container>]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '<SAS Key>'

As an example, I have a storage account, “mssqlgirldbstorage”, with a URL of https://mssqlgirldbstorage.blob.core.windows.net. In this storage, I have a couple of containers called “data” and “log”.

Below is the CREATE CREDENTIAL statement I executed for the “data” container:

CREATE CREDENTIAL [https://mssqlgirldbstorage.blob.core.windows.net/data]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sr=c&si=sqlpolicy&sig=89oywMCStX4iQrIGYVIl%2Bfh3cMtEm5ZA3fSDxh2w8najd9';


The sig value above is just a randomly generated key that I have created, so it won’t work for you. To obtain the SAS key that you can use as the SECRET value, you can use the Azure Storage Explorer tool. The documentation (see pages 18–20) on how to create a policy and SAS key is very good.

Please note that when you use Azure Storage Explorer, the Signature generated looks like the example below. You only want to use the value from “sr=c” to the end of the URI as the SECRET value in the CREATE CREDENTIAL statement.


02. Blob & Container Security

You can create a credential using SQL Server Management Studio (SSMS) too, but you still need to use Azure Storage Explorer to create the Policy and to generate the SAS Key based on the policy. In the example below, I am creating a credential to access my “log” container on my Azure storage. Please make sure that you enter “SHARED ACCESS SIGNATURE” as the Identity value on SSMS as shown here.
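If you prefer T-SQL over SSMS, the statement for the “log” container follows the same pattern as the “data” one. A sketch is below — the SECRET is a placeholder, so substitute the SAS key generated for your own policy:

```sql
CREATE CREDENTIAL [https://mssqlgirldbstorage.blob.core.windows.net/log]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '<SAS Key for the log container>';
```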

Creating Credential using SSMS

Wrap Up

A credential is needed for creating a hybrid SQL Server 2014 database, where one or more database files are located on Azure while the database itself is created on an on-premises server.

Get familiar with Azure Storage Explorer as a starting point when learning about / exploring your data on Azure. The UI is really good.

Further Reading

Tutorial: SQL Server Data Files in Windows Azure Storage service on MSDN

Azure Storage Explorer on Codeplex


HDInsight In A Jiffy: Executing Hadoop Commands

Welcome back to another “In A Jiffy” blog post, a series where we learn something quick at a beginner / intro level. Since HDInsight reached General Availability on 28 Oct 2013, the User Interface has been revamped, features have been modified, and documentation has been slowly updated to reflect the changes. One of your questions might be where to execute Hadoop commands.

Where Can I Execute Hadoop Commands?

One of the ways to manage Windows Azure Blob Storage for HDInsight is by using Hadoop commands, as mentioned in this documentation: Use Windows Azure Blob storage with HDInsight. In order to do this, you’ll need to enable the remote connection to your HDInsight cluster and connect to it (just as you would connect to any server remotely). Once you connect, there is luckily a desktop shortcut called “Hadoop Command Line” that makes it easier to execute Hadoop commands – e.g.

hadoop fs -ls /output/result.txt

For my version of the HDInsight cluster, the Hadoop distribution is located in this directory:


So if the desktop shortcut is not available, you can launch Command Prompt on the server via the remote connection, go to a path similar to the above, and then start using the Hadoop commands.
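As a quick illustration, here are a few common Hadoop file system commands you might run from that prompt (the paths below are hypothetical examples, not paths from my cluster):

```
hadoop fs -ls /
hadoop fs -mkdir /example/data
hadoop fs -copyFromLocal C:\Temp\input.txt /example/data/input.txt
hadoop fs -cat /example/data/input.txt
```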

That’s it for the “In A Jiffy” part.

Want more? Read on…


Configuring Remote Connection to HDInsight Cluster

Below is a set of instructions that elaborates how to set up a remote connection to your HDInsight Cluster and where to execute Hadoop command lines.


Pre-requisites

1. HDInsight Cluster created (check Your First HDInsight Cluster–Step by Step if you have not created an HDInsight Cluster yet)

2. Access to Windows Azure Management Portal

The Steps

1. Login to your Windows Azure Management Portal and go to the HDInsight cluster that you want to execute the Hadoop command against.

2. Go to the Configuration option of the HDInsight cluster and click on the “Enable Remote” button on the bottom of the screen.

HDInsight Cluster Configuration : Enable Remote

3. A “Configure Remote Desktop” window will be launched where you can create a new user that can login via Remote Desktop.

HDInsight: Configure Remote Desktop

Once you enter the details, you’ll see that the “Connect” and “Disable Remote” buttons are disabled while the Remote Desktop access is being configured in the background.

HDInsight: Enabling remote desktop

The background configuration usually takes a couple of minutes (or less). Once it is done, click on the “Connect” button and it will start downloading an RDP file to connect to the HDInsight Cluster.

HDInsight: Connect HDInsight: Opening RDP file

4. When prompted, enter the credentials to connect remotely.

HDInsight: Log in via RDP

Once connected, you will see “Hadoop Command Line” on the desktop – and voilà, you can make use of your Hadoop skills here.

HDInsight: Hadoop Command line shortcut HDInsight: Hadoop command line


Wrap Up

Hadoop Commands can be executed on HDInsight Cluster via Remote Connection. You’ll first need to enable the remote connection.


Further Reading

Use Windows Azure Blob storage with HDInsight by Windows Azure

Your First HDInsight Cluster–Step by Step by Cindy Gross and Murshed Zaman

Upload data to Blob Storage using Hadoop Command Line by Windows Azure



Undo Ghost Pending Changes in Team Foundation Service

I have been using the Team Foundation Service (http://tfs.visualstudio.com) for the last 7 months for a couple of my projects. Recently I have had to change laptops twice due to SSD issues on a Lenovo T420s. So, I am on a third laptop (ouch!). One of the challenges that I had to go through when migrating from one (crashed) laptop to another was resolving the Pending Changes associated with my account workspace on the crashed laptop. Despite my best efforts to check in my work fairly frequently, disaster was inevitable and I still had ghost pending changes.

I call the pending changes associated with my workspace on the crashed laptop “ghost pending changes”.

Once I have my new laptop set up, with Team Explorer installed on Visual Studio 2010, I am ready to investigate what the ghost pending changes from my crashed laptop are, recover the files from the crashed laptop, remove the ghost pending changes, and check in the recovered files via the new laptop.

In this article, I will be using TF Command-Line Utility commands, which also work with the cloud-based TFS. They are very handy for resolving ghost pending changes.



Install Visual Studio 2010 and Team Explorer on the new machine.

On the new machine, connect your Visual Studio to the existing team project on cloud TFS. This will create a new workspace in the new machine.



5 Steps to Recover Ghost Pending Changes


Below are some quick tips to tidy up loose ends from a workspace (e.g. on a crashed laptop) that has pending changes.


1. Backup the changes you have made from the crashed laptop, if possible.

If your hard drive / SSD is still salvageable and you can retrieve the local copy of your TFS folder(s), make a copy of them on another hard drive and store them somewhere safe. Pronto! Don’t delay it.


2. Identify the ghost pending changes and investigate the severity of potentially lost work.

In Source Control Explorer within Visual Studio, you can check folder by folder to see the changes that are not checked in by all users, although this can be cumbersome if you have multiple collections and folders.

Pending Changes on Visual Studio 2010

Fortunately, there is a handy command that you can use.

tf status /user:<your user name> /recursive > <output file path>

More documentation on the Status command: http://msdn.microsoft.com/en-US/library/9s5ae285(v=vs.100).aspx


3. Get the workspace name

Before undoing the pending changes, you will need to know the name of the workspace that you are trying to undo. In my case, I have three workspaces, one for each laptop where I have installed TFS. The first is from my first crashed laptop; the second is from the latest crashed laptop. So, I want to undo the pending changes from the latter.

Using the following command, I can find out the list of workspaces related to my user name.

tf workspaces /owner:<user name>

tf workspaces /owner:<User Name>

In this example, the one that I want to “undo” the pending changes for is “JKOESMARNO-T420”.

More documentation on Workspaces command: http://msdn.microsoft.com/en-us/library/54dkh0y3(v=vs.100).aspx


4. Undo the pending changes.

Now that we understand the severity of the files with ghost pending changes, we can undo the pending changes. Use the following command to undo one collection at a time:

tf undo /workspace:<Workspace Name> /recursive <Collection Name>

tf undo

More information on the Undo command: http://msdn.microsoft.com/en-us/library/c72skhw4(v=vs.100).aspx

Once you undo pending changes, make sure you perform “Get Latest Version (Recursive)” on each collection in Source Control Explorer.

05. Get Latest Version Recursively

Source Control Explorer will show the latest checkouts. If there are checkouts by another team member, they will show as follows:

 Source Control Explorer with other user's pending changes

If there are no checkouts by other users, it will look similar to the following:

Source Control Explorer - After Undo Campaign Management


5. Re-do the changes and check them in.

If you are lucky and manage to salvage the local changes from the crashed laptop that are not checked in, it may be a good idea to copy them to the new system and check them in.


6. (Optional) Remove unwanted old workspaces

Use the following command to delete unwanted old workspaces. You could substitute Step 4 above with this step.

tf workspace /delete:<Workspace Name>

Note: If you delete a workspace that contains pending changes, Team Foundation cancels the pending changes as part of the delete process.  Deleting a workspace does not delete the files and folders on the client computer that were in that workspace.


Wrap Up

TF Command-Line Utility Commands are powerful and work with Team Foundation Service (a cloud based version control from Microsoft). Using TF commands, we can resolve ghost pending changes before continuing development in a new environment.


Further Reading

TF Command-Line Utility Commands

Team Foundation Service - a cloud powered source control management by Microsoft



Upgrading from SQL Server 2012 Evaluation Edition

This article provides step-by-step instructions on how to upgrade a SQL Server 2012 RTM Evaluation edition server to the BI edition. Similar steps can be used to upgrade to any other licensed edition (Developer/Standard/Enterprise).

Long story short, changing from the Evaluation edition to another edition does not require building and migrating to a new server with the intended edition. As the Evaluation edition is a 180-day trial of the Enterprise edition, there can be some issues when changing to a non-Enterprise edition due to incompatible features. This article assumes that the sample SQL Server instance with the Evaluation edition does not use any Enterprise-only features.


Pre-requisites

  1. A SQL Server 2012 RTM instance in Evaluation mode to be upgraded.
  2. A Developer/Standard/BI/Enterprise license key, or the ISO file of the SQL Server 2012 RTM installation that contains the license key.


In the sample below, the SQL Server instance is running in Evaluation mode and is to be upgraded to the BI edition. None of the Enterprise-only features are installed.

1. Run the SQL Server 2012 RTM ISO file.

2. Choose Maintenance > Edition Upgrade.

3. Click Next on Setup Support Rules.

4. Follow the “Upgrade the Edition for SQL Server 2012” wizard, which starts with Setup Support Rules.

5. On the next screen, choose “Enter the product key” and type in the product key of the new license. This is usually pre-filled for the Developer edition, or if a licensed ISO file is used.

6. Accept the License Terms on the next screen.

7. Select the instance to be upgraded to the new edition.

8. The installation will then continue with running Edition Upgrade Rules. This is where feature compatibility is being checked.

9. Click Upgrade on the next screen.

10. Once the upgrade is successful, click Close.

11. Verify that the existing SQL Server instance has been upgraded to the correct edition by checking via SQL Server Management Studio. For more information on how to check the edition, see my blog post here.


Wrap Up

This post provides step-by-step instructions on how to upgrade a SQL Server 2012 RTM instance from the Evaluation edition to a different edition. It is a relatively easy process when only compatible features are used. To check when the Evaluation edition expires, please see my earlier post here.

If you have had some curly experience in upgrading, please feel free to leave a comment. Thank you!


Further Reading

Upgrade to a Different Edition of SQL Server 2012 (Setup) on Technet

Retrieving SQL Server 2012 Evaluation Period Expiry Date



Retrieving SQL Server 2012 Evaluation Period Expiry Date

SQL Server 2012 RTM Evaluation edition was made available on March 7, 2012. Some of us may have installed it on that date or later for evaluation; some of us may still be using RC0 or even an earlier version of SQL Server 2012. If you currently have SQL Server 2012 installed but have not purchased and installed a license key, the evaluation period may have expired or will expire very soon. To avoid nasty surprises, it is a good idea to plan ahead and make a note of the expiry date of the evaluation period.


SQL Server 2012 Evaluation Period has expired


There are a number of ways to retrieve the SQL Server 2012 Evaluation period expiry date. Unlike previous versions, in SQL Server 2012 the evaluation expiry date is no longer listed in the About dialog box of SQL Server Management Studio.


SQL Server 2012 SSMS About Dialog Box

SQL Server Management Studio Dialog Box does not show Evaluation Expiry Date


The Evaluation period is 180 days, which we can calculate from the installation date. The three options for determining the Evaluation expiry date are:

  • Running a simple T-SQL query
  • Inspecting Summary.txt in the installation log directory
  • Inspecting RegEdit configuration


Option 1: Running a simple T-SQL query

Run the following query to retrieve the install and expiry dates on an Evaluation edition SQL Server 2012 instance.

SELECT
	create_date AS 'SQL Server Install Date',
	DATEADD(DD, 180, create_date) AS 'SQL Server Expiry Date'
FROM sys.server_principals
WHERE name = 'NT AUTHORITY\SYSTEM';

The “NT AUTHORITY\SYSTEM” account on the database server is a Local System account and by default is created at the time of installation. Therefore we can rely on its creation date to safely determine the installation date of SQL Server.

See more definition of “NT AUTHORITY\SYSTEM” account here: http://msdn.microsoft.com/en-us/library/ms191543.aspx

Note: to check whether you are running the Evaluation edition, you can do this simply by checking the SQL Server database instance properties via SQL Server Management Studio (SSMS) as shown below.

SQL Server 2012 Database Server Properties

The following query will also return the Product Version, Product Level and Edition.

SELECT
	SERVERPROPERTY('ProductVersion') AS ProductVersion,
	SERVERPROPERTY('ProductLevel') AS ProductLevel,
	SERVERPROPERTY('Edition') AS Edition;


Option 2: Inspecting Summary.txt

When a SQL Server 2012 instance is installed, a Summary.txt file is created. This file is typically located at “C:\Program Files\Microsoft SQL Server\110\Setup Bootstrap\LOG\Summary.txt”.

Summary.txt contains important values pertaining to the instance installation, including the version being installed and when it was installed. The Evaluation edition is valid for 180 days, so with simple arithmetic on the install date we can determine the expiry date, as exhibited below.


In the above example, the install date is 29 April 2012, so the expiry date is 26 October 2012 (29 April 2012 + 180 days).
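The 180-day arithmetic is easy to double-check with a short script (Python used here purely for illustration; the dates are the ones from the example above):

```python
from datetime import date, timedelta

EVALUATION_PERIOD_DAYS = 180

install_date = date(2012, 4, 29)  # install date taken from Summary.txt
expiry_date = install_date + timedelta(days=EVALUATION_PERIOD_DAYS)

print(expiry_date)  # 2012-10-26
```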

Option 3: Inspecting REGEDIT

  1. Open REGEDIT and navigate to the following structure:
  2. Here you will see a long list of folders in form of a sequence of alphanumeric characters (GUID). Ensure the “Products” folder is highlighted/selected.
  3. From the Edit menu, click Find and type in “SQL Server 2012 Database Engine Services”. Clicking “Find Next” will open the [GUID] > InstallProperties folder which should look like this:

InstallProperties of SQL Server 2012 Database Engine Services

The InstallDate lists the date of installation in YYYYMMDD format, i.e. in this example, it is 29 April 2012. The expiry date for this instance is 180 days from the install date, which is 26 October 2012.
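Since InstallDate is stored as a plain YYYYMMDD string, the same calculation can be scripted against the registry value (Python for illustration; the value below is the one from the example):

```python
from datetime import datetime, timedelta

# InstallDate value as read from the registry, in YYYYMMDD format
raw_install_date = "20120429"

install_date = datetime.strptime(raw_install_date, "%Y%m%d").date()
expiry_date = install_date + timedelta(days=180)

print(install_date, expiry_date)  # 2012-04-29 2012-10-26
```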


I hope this post has been useful for you. If there are other ways that you can think of, please don’t hesitate to let me know by leaving a comment.


Other related posts on SQL Server Evaluation Period Expiry Date:

Edit (June 4th, 2012):

On a side note, if you are using the Evaluation edition for development purposes, it would be best to upgrade the instance to the Developer edition of SQL Server 2012. For more information about the Developer edition and other licensing information on SQL Server 2012, please visit: http://www.microsoft.com/sqlserver/en/us/get-sql-server/how-to-buy.aspx

SQL Server 2012, The Good and The Bad


As a SQL Server specialist, hearing new updates on SQL Server always excites me. SQL Server 2012 provides plenty of new features; Columnstore Indexes and AlwaysOn, for example, would have immensely helped my previous projects. The PowerView feature and the integration of the BI capabilities of SQL Server 2012 with Sharepoint 2010 are definitely not something to dismiss easily. The areas below are the new features in SQL Server 2012 that I consider important for my past and upcoming projects, and I will discuss a few of the goods and the bads.


1. SQL Server Data Tools (SSDT)

The Good: This will replace BIDS. SSDT uses the Visual Studio 2010 Shell and has revamped the look and feel of SSIS to some degree.

The Bad: After installing SQL Server 2012 RC0, I am unsure how to add third-party SSIS data flow components. It may mean that if you are using the Balanced Data Distributor SSIS data flow component, it requires some testing first.

Further reading: http://msdn.microsoft.com/en-us/data/tools.aspx


2. ColumnStore Indexes

The Good: This would immensely help those data warehouse databases that are large (or getting larger and larger). Fact tables with millions of records would definitely benefit from this feature. There are some caveats; one in particular is that it only comes in the Enterprise edition of SQL Server 2012.

The Bad: A table with a ColumnStore index cannot be updated in the SQL Server 2012 release. This may not be critical as there is a workaround: http://msdn.microsoft.com/en-us/library/gg492088(v=SQL.110).aspx#Update


3. Rapid Data Analysis through PowerView and PowerPivot

The Good: Business Analysts and Data Analysts who are exploring the data can start with simple, powerful and presentation-ready reports using PowerView and PowerPivot.

The Bad: PowerView is only available with Sharepoint 2010 BI integration – as per the SQL Server 2012 RC0. This may change in the final version.

Further reading: http://msdn.microsoft.com/en-us/library/hh213579(v=sql.110).aspx


4. Tabular Model SSAS

The Good: This feature would help leverage a data warehouse design without the effort of building overly complex cubes, while still delivering powerful and high-performing data analysis and business reporting.

The Bad: Self joins are not permitted, composite keys are not supported, unique lookup columns are required, only a single active relationship is supported, and only one relationship per source column is allowed. DirectQuery mode is not fully compatible with VertiPaq mode, so careful consideration is required before switching all the way to DirectQuery mode.

Further reading: http://msdn.microsoft.com/en-us/library/hh212945(v=sql.110).aspx


5. Licensing / editions

The Good: If you have Software Assurance Agreement with Microsoft, you may not be affected greatly by the costs incurred due to the new pricing model.

The Bad: Careful planning is required to choose the Standard, BI or Enterprise edition appropriately, and it has been reported that the new licensing model will incur more costs in most cases.

Further Reading: http://www.microsoft.com/sqlserver/en/us/future-editions/sql2012-editions.aspx, http://redmondmag.com/articles/2011/11/05/microsoft-unveils-sql-server-2012-licensing-and-pricing.aspx


The above is my take on SQL Server 2012. I strongly urge you to use it and test it out. You may agree or disagree (or come up with different points to consider) with the above. Please share your thoughts by leaving comments.


Last but not least, don’t forget to attend SQL Server 2012 Virtual Launch Event on March 7th: http://www.sqlserverlaunch.com/ww/Home

Important Trace Flag that Every Expert DBA Should Know

I have recently co-presented with Victor Isakov (from SQL Server Solutions) at the Sydney SQL Server User Group on “Important Trace Flag that Every Expert DBA Should Know”. The presentation went quite well and I learnt more things than I could have imagined from co-presenting.

Some of the trace flags we discussed were:


Trace Flag 1204

This trace flag lets SQL Server log detailed deadlock information to the SQL Server Error Log in text format. In SQL Server 2008, this trace flag is only available at the Global level (i.e. it applies to the whole SQL Server instance). In my experience, it is worth turning this trace flag on only for debugging purposes. A wealth of information on deadlocks can be found here: http://msdn.microsoft.com/en-us/library/ms178104.aspx



Trace Flag 1222

Similar to trace flag 1204, this trace flag lets SQL Server log detailed deadlock information to the SQL Server Error Log, in XML format.



Trace Flag 3226

In an environment where database backups are frequently performed, it is a good idea to turn trace flag 3226 on to suppress an entry to the SQL Server Error Log for each backup. This is beneficial as the bigger the log, the harder it is to find other messages. However, you will need to ensure that none of your scripts or systems rely on the backup entry detail in the SQL Server Error Log.


There are more trace flags covered in the presentation. Above are the basic ones that every DBA should know; one way or another they will be very helpful in running a production environment.
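To see which trace flags are currently enabled on an instance, DBCC TRACESTATUS is a quick sanity check (standard syntax; the -1 argument reports globally enabled flags):

```sql
-- List all trace flags enabled globally on the instance
DBCC TRACESTATUS(-1);

-- Check the status of specific trace flags
DBCC TRACESTATUS(1204, 1222, 3226, -1);
```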

The presentation will be run again in a few weeks’ time (13 October 2011) at PASS Summit 2011 in Seattle!  I’m certainly looking forward to co-presenting it again with Victor Isakov (SQL Server Solutions).

See you in Seattle [PASS Summit 2011]



Edit (25 Feb 2012): As requested, below is a sample of a stored procedure that can be executed at database server start-up to enable certain trace flags. Please ensure you understand and test the trace flags before you turn them on, especially in a production environment. Special thanks to Victor Isakov for providing the sample code. This stored procedure was covered at the PASS Summit 2011 presentation and the slide deck is downloadable from here.


USE [master]
GO

CREATE PROC [dbo].[EnableTraceFlags]
-- Author  : Victor Isakov
-- Company : SQL Server Solutions (http://www.sqlserversolutions.com.au)
-- Purpose : Enable global trace flags upon SQL Server startup.
-- Notes   : Need to execute sp_procoption to enable this stored procedure to autoexecute
--           whenever SQL Server instance starts:
--           EXEC sp_procoption 'dbo.EnableTraceFlags', 'startup', 'true'
-- Bugs    : None
-- Version : 1.0
-- History :
-- ========== ==================================================
-- 11/04/2011 Version 1.0 released.
AS
DBCC TRACEON (4199, -1);
-- Enable Query Optimiser fixes (http://support.microsoft.com/kb/974006)
DBCC TRACEON (1222, -1);
-- Write deadlocks to errorlog (BOL)
DBCC TRACEON (3226, -1);
-- Suppress successful backup messages (BOL)
GO

EXEC sp_procoption N'[dbo].[EnableTraceFlags]', 'startup', '1'