VS2012 Ultimate SSIS Start Up Error

If you are using Visual Studio 2012 (VS2012) for Business Intelligence solutions and have SQL Server 2012 SP1 installed, you may run into a VS2012 start-up issue. This happened to me this morning when trying to open a non-BI solution in VS2012.

It came up with a “Microsoft Visual Studio 2012 has stopped working” error message as shown below. Running devenv /safemode did not help either.

VS2012 Crashed At StartUp

Essentially, the main issue was related to:

System.TypeLoadException: Could not load type ‘Microsoft.SqlServer.Dts.Design.VisualStudio2012Utils’

This issue happened on my machine with SQL Server 2012 SP1 with KB2793634 (v11.0.3128) installed.

After tweeting about it, Jeff Rush (@JeffRush) mentioned a link suggesting that SQL Server 2012 SP1 CU4 may need to be applied. Applying SQL Server 2012 SP1 CU4 (and restarting my machine) didn’t do the trick. I did some more research and eventually found out that the problem is only fixed in SQL Server 2012 SP1 CU5.

Applying the patch 465914_intl_x64_zip.exe (part of the hotfix included in SQL Server 2012 SP1 CU5) solved the issue. After the patch, my SQL Server 2012 version is 11.0.3373.
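If you want to confirm that the cumulative update took, a quick build-number check is enough. A minimal sketch (run against the patched instance):

-- After SQL Server 2012 SP1 CU5 the build should report 11.0.3373 (or later).
SELECT
	SERVERPROPERTY('ProductVersion') AS ProductVersion,
	SERVERPROPERTY('ProductLevel') AS ProductLevel;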

Massive thanks to Jeff Rush for the guidance that the issue was probably fixable with a hotfix for SSIS – not sure why I did not think of that. (That’s why Twitter and #SQLFamily rock!)

Further Reading

Fix: The type initializer for ‘Microsoft.DataTransformationServices.Project.SharedIcons’ threw an exception. (Microsoft Visual Studio 2012) by Roel van Lisdonk

System.TypeLoadException: Could not load type ‘Microsoft.SqlServer.Dts.Design.VisualStudio2012Utils’ on SQL Server Forums

Cumulative Update package 5 for SQL Server 2012 SP1 on Microsoft Support

 

 

Editing Published SSIS Package in SQL Server 2012

So, you have inherited a set of SQL Server 2012 Integration Services projects that have been deployed on a server. Your company does not use source control, so you can’t easily see what was last published (oh no!!). Or, you have deployed something on the server but you are not sure if it is the right version. The bottom line is that you want to inspect or edit a package that is already published on the server. Unfortunately, at the moment there is no way of quickly downloading just that single SSIS package.

SQL Server 2012 Integration Services now makes use of the Project Deployment Model. For more information, read it here. Back to the issue at hand: if you want to get a copy of what is in production, there are, in summary, two “easy” ways of doing so.

A. Creating a new project in SQL Server Data Tools (SSDT) and importing the project from the existing catalog.

B. Exporting the project into an .ispac file via SQL Server Management Studio (SSMS).

 

Caveat: Both these options require access to the project on the server.

 

 

Option A: New Project

1. In the New Project dialog box of SSDT, browse to Installed Templates > Business Intelligence > Integration Services on the left navigation pane.

2. Choose the Integration Services Import Project Wizard option, then enter the Name, Location and Solution Name accordingly. Click OK to proceed to the next step.

Integration Services Project Wizard

3. Click Next on the Integration Services Import Project Wizard window

SSDT - Integration Services Project Wizard - Introduction

4. On the Select Source dialog, choose Integration Services Catalog. Then provide the details of the project on the server that you wish to import.

SSDT - Integration Services Project Wizard - Select Source

SSDT - Integration Services Project Wizard - Select Project

 

 

5. Follow the rest of the instructions.

SSDT - Integration Services Project Wizard - Validation

SSDT - Integration Services Project Wizard - ReviewSSDT - Integration Services Project Wizard - Results


6. Once the project is created, you can browse to the package(s) that you wish to inspect. In my example here, I have  4 connection manager files and 9 dtsx files.

SSDT - Integration Services Project Wizard - Project Created

This option is one of the safest and simplest ways of getting an SSIS project from the server. All the components of the project, such as Project Variables and Project Connection Managers, are kept intact.

 

Option B: The shortcut

1. Navigate to the SSIS project on SSMS; i.e.  [Server Name] > Integration Services Catalogs > SSISDB > [Project Folder] > Projects > [Project Name] as shown on the picture below.
SSMS - Export SSIS Project

2. Right click on the [Project Name], then choose Export….

3. Choose the destination folder in the “Save As” dialog to save the .ispac file.

4. Navigate to the folder where the file has been saved and rename the file extension to .zip.

5. Browse the zip file and you should be able to see the .dtsx file(s).  Please note that in this example, I have 4 connection manager files and 9 dtsx files.

Browse zip file that was renamed from ispac

6. Now you can extract the desired .dtsx file(s) from the zip file.

Note: If the server is in Production, you could ask your DBA to export the .ispac file of the project (Steps 1 and 2). Then you can continue with Step 3.
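If the DBA prefers to script the export rather than use the SSMS dialog, the SSIS catalog also exposes the deployed project binary. This is only a rough sketch – the folder and project names are placeholders, and the varbinary result still has to be written out to an .ispac/.zip file (for example with bcp) before it can be browsed:

USE SSISDB
GO
-- Returns the deployed project deployment file (.ispac content) as varbinary(max).
EXEC catalog.get_project
    @folder_name = N'MyFolder',
    @project_name = N'MyProject'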

Special thanks to Josh Fennessy (b | t), for providing Option B.

 

Would it work for SSDT Visual Studio 2012?

Update – 6 March 2013: Both options work for Visual Studio 2012 too. (Read the Microsoft announcement on SQL Server Data Tools – Business Intelligence for Visual Studio 2012). At the time of writing (after a few minutes of exploring SSDT for Visual Studio 2012), everything works the same for Option A. There is a major face-lift with a monochrome look.

 

SSDT - What it looks like in Visual Studio 2012

 

Wrap Up

Integration Services in SQL Server 2012 introduces the Project Deployment Model. This means that accessing a single SSIS package that has been published to the server is no longer trivial. This article describes two ways of accessing the desired package(s), or the project as a whole, from a server. Option A, importing the project from the Integration Services catalog, is the safest way. Option B is a shortcut method that is recommended only for advanced users.

This article does not describe the pros and cons of the new Integration Services Project Deployment Model, as they are covered in the articles listed in the Further Reading section.

 

Further Reading

Can I deploy a single SSIS package from my project to the SSIS Catalog? by Matt Masson

Short Review on .ispac SSIS Deployment by Julie Koesmarno – my thoughts on Matt’s article above.

What’s New (Integration Services) on MSDN

 

 

Upgrading from SQL Server 2012 Evaluation Edition

This article provides step-by-step instructions on how to upgrade a SQL Server 2012 RTM Evaluation edition server to the BI edition. Similar steps can be used to upgrade to any other licensed edition: Developer, Standard or Enterprise.

Long story short, changing from the Evaluation edition to another edition does not require building and migrating to a new server with the intended edition. As the Evaluation edition is a 180-day trial of the Enterprise edition, there can be issues when changing to a non-Enterprise edition due to incompatible features being used. This article assumes that the features used in the sample Evaluation edition instance are not Enterprise-only features.

Prerequisites

  1. A SQL Server 2012 RTM instance in Evaluation mode to be upgraded.
  2. A Developer/Standard/BI/Enterprise License Key or the ISO file of the SQL Server 2012 RTM installation that contains the license key.

Steps

In the sample below, the SQL Server instance is running in Evaluation mode and is to be upgraded to the BI edition. None of the Enterprise-only features are installed.

1. Run the SQL Server 2012 RTM ISO file.

2. Choose Maintenance > Edition Upgrade.

3. Click Next on Setup Support Rules.

4. Follow the Upgrade the Edition for SQL Server 2012 wizard, which starts with Setup Support Rules.

5. On the next screen, choose Enter the product key and type in the product key of the new license. This is usually pre-filled for the Developer edition, or if a licensed ISO file is used.

6. Accept the License Terms on the next screen.

7. Select the instance to be upgraded to the new edition.

8. The installation will then continue by running the Edition Upgrade Rules. This is where feature compatibility is checked.

9. Click Upgrade on the next screen.

10. Once the upgrade is successful, click Close.

11. Verify that the existing SQL Server instance has been upgraded to the correct edition by checking via SQL Server Management Studio. For more information on how to check the edition, see my blog post here.
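A quick way to verify the edition from a query window (the same check is covered in more detail in the post linked above):

-- The Edition property should now report the new edition, e.g. a Business Intelligence edition string.
SELECT SERVERPROPERTY('Edition') AS Edition;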

 

Wrap Up

This post provides step-by-step instructions on how to upgrade a SQL Server 2012 RTM instance from the Evaluation edition to a different edition. It is a relatively easy process when only compatible features are used. To check when the Evaluation edition expires, please see my earlier post here.

If you have had some curly experience in upgrading, please feel free to leave a comment. Thank you!

 

Further Reading

Upgrade to a Different Edition of SQL Server 2012 (Setup) on Technet

Retrieving SQL Server 2012 Evaluation Period Expiry Date

 

 

Short Review on .ispac SSIS Deployment

My last two articles on SSIS have been related to bugs. That’s not because I am looking for its faults all the time. I like using SSIS – I have been since SQL Server 2005, more so in 2008 and even better in 2012. I think SSIS has limitations but also great advantages. For most of the projects that I have worked with, it simplifies quite a lot of things for me.

Matt Masson (twitter | blog) has discussed branching/development strategies and how they fit into the revamped SQL Server 2012 SSIS in a couple of articles. Read them here:

Thoughts on Branching Strategies for SSIS Projects

Can I deploy a single SSIS package from my project to the SSIS Catalog?

 

So here’s my thought on the new .ispac method of deployment in SQL Server 2012, in response to the latter post by Matt.

After using SSIS in SQL Server 2012 for a moderately sized data warehouse ETL for over 6 months, I’ve become more and more appreciative of the .ispac method of deploying. Building and deploying it as if it were a C# project in SSIS SQL Server 2012 makes more and more sense to me.

There have been a few occasions in the past when I thought deploying just a single package would be useful. For example, when there is a need to import simple text files that can be done via one SSIS package on an ad hoc basis. However, the more I think about it, the more I think it’s not a “good” enough reason to do a single package deployment. In fact, if it gets run reasonably frequently or may be reusable, I would create a project called “Ad hoc Import” which consists of a collection of “common” ad hoc SSIS packages. In my experience, I don’t always know at conception how often an SSIS package will be executed. It could be a one-off / throw-away package; or it could start as a one-off and then continue to be executed every so often. More often than not, my “first thought” one-off packages have become reusable (re-runnable) enough that they are worth maintaining as a project and deploying to the Integration Services catalog.

I think having / forcing project deployment just makes us think twice about how we organise our packages / projects in Integration Services Catalogs; which in my opinion, is not a bad thing at all.

Also – yes – I do use version control for all my SSIS projects. I use Team Foundation Server 2010. Although the Integration Services catalog in SQL Server 2012 keeps information about deployed versions, this is not good enough to replace a version control tool, especially in a team development environment.

Most of the issues that I have encountered and submitted on Microsoft Connect in regard to administering and executing deployed SSIS projects have workarounds (albeit risky ones). So I am all for project deployment. By the way, more info on these issues:

http://connect.microsoft.com/SQLServer/feedback/details/751220/ssisdb-package-parameter-error-in-previous-version-project-lsn-carried-forward-to-the-future-deployed-versions. Also see: SSISDB Bug: Package Parameter Mapping

http://connect.microsoft.com/SQLServer/feedback/details/743735/ssisdb-catalog-get-parameter-values-with-null-able-package-name-input-parameter. Also see: Get Parameter Values from SSISDB Catalog: a bug?

I am also keen to hear what others think about a single package deployment and/or their experience with the new project deployment method in SQL Server 2012.

 

 

 

 

SSISDB Bug: Package Parameter Mapping

I have recently posted a new SSISDB bug on Microsoft Connect relating to package parameter mapping. Running an SSIS project where a previously deployed version had an issue with the mapping of a child package parameter carries the issue forward to future deployed versions (even though the parameter has been remapped properly in the newer versions).

The unexpected error message will look like this:

Call Child Package:Error: Error 0x80131500 while executing package from project reference package “Child.dtsx”. There is no appropriate variable in the current environment to map to the parameter “paramChild”..

After a bit of investigation, it seems that the underlying stored procedure looks at the package parameters (mapped or unmapped) of all deployed versions of the project. This means that:

  1. If you accidentally delete an environment variable that was mapped in a previous version, any deployed version of the SSIS project will fail.
  2. If a previous version had an incorrect mapping and failed, the current and future deployed versions will always fail, unless you deploy a change that ensures the previous version runs properly.
SSMS Integration Services Project Versions Dialog Box

Project Versions Dialog Box in SSMS

Workarounds

I have come up with a couple of workarounds, should you face a similar scenario (until this issue is fixed).

Workaround #1

1. Revert back to the old SSIS project version that has the issue, and fix the mapping of the child package parameter to the environment variable.

2. Repeat the above step for all SSIS projects that may have the issue.

3. Revert back to the latest version of the SSIS project. (A scripted way of switching versions is sketched below.)
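Reverting between versions can be done from the Project Versions dialog shown above. As a rough alternative sketch, the catalog also exposes this through a stored procedure; the folder/project names and the version LSN below are placeholders that would come from catalog.object_versions:

USE SSISDB
GO
-- List the deployed versions of the projects to find the version LSN to revert to.
SELECT object_version_lsn, object_name
FROM catalog.object_versions;

-- Restore (revert to) a specific project version; values are placeholders.
EXEC catalog.restore_project
    @folder_name = N'MyFolder',
    @project_name = N'MyProject',
    @object_version_lsn = 12345;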

Caveat: if the environment variable was deleted for a reason, it needs to be kept (or recreated) anyway, both to rectify the issue with the older versions of the project and to retain the history of older versions. This makes administering and managing SSIS project deployment and maintenance harder.

 

Workaround #2

1. Abandon / delete the SSIS project that has the issue.

2. Redeploy the project with the same name, and reconfigure the child package parameter to the environment variable properly.

Drawback: History of the SSIS project versions is not retained in the Integration Services Catalog.

 

Permanently Fixing This Bug

I think this is a real problem that most people will run into. Unless the solution is “don’t use package parameter” – which is not ideal – this problem should be fixed.

How?

Well, first, Microsoft SSIS team should hire me so I can help fixing it! ;) Or, you could just vote for the bug at Microsoft Connect here, and wait until it gets fixed.

In the meantime, I have come up with an alternative solution, to which I have to attach a strong disclaimer: use it at your own risk. I will not be responsible if you deploy this in production. Here are the steps to permanently fix the issue:

1. Create a backup of your SSISDB.internal.get_project_parameter_values by scripting it out from the SSISDB database (a quick way to script it out is sketched after these steps). Save this as a file somewhere safe.

2. Open my version of SSISDB.internal.get_project_parameter_values.

3. Comment out the CREATE PROCEDURE line.

4. Uncomment the ALTER PROCEDURE.

(Steps 3 and 4 are deliberately put in place so you won’t accidentally overwrite the original stored procedure.)
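For step 1, a minimal sketch for scripting out the current definition (save the returned text to a file before altering anything):

USE SSISDB
GO
-- Returns the current T-SQL definition of the internal stored procedure.
SELECT OBJECT_DEFINITION(OBJECT_ID(N'internal.get_project_parameter_values')) AS OriginalDefinition;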

 

Wrap Up

Package parameters that are mapped to environment variables should be used with caution. Deleting an environment variable that is mapped in a previously deployed version and then running the properly mapped parameter of a new version will cause the unexpected error “There is no appropriate variable in the current environment to map to the parameter”. I have described a number of workarounds above; please use them wisely. The fix that I have proposed, altering internal.get_project_parameter_values, is not 100% tested and should only be used in a development environment.

To have this bug fixed properly, please vote at Microsoft Connect site.

Comments and feedback are welcome!

Hybrid Mode in Tabular BI Semantic Model – Part 1

During my presentation on “DirectQuery vs Vertipaq Modes in SSAS Tabular Model” at PASS BI/DW VC and SQL Rally in Dallas this year, I briefly explained the Hybrid Mode in Tabular BI Semantic Model (BISM). I would like to discuss Hybrid Mode further in two parts. Here is the first part, which contains a basic walkthrough of Hybrid Mode in Tabular BISM.

On a side note, DirectQuery vs Vertipaq Modes in SSAS Tabular Model slide deck can be downloaded from here: http://www.mssqlgirl.com/slide-deck-directquery-vs-vertipaq-for-pass-dwbi-vc.html

 

Introduction

The default mode of the SSAS Tabular Model is In-Memory (also known as Vertipaq mode), where data from various types of data sources is processed and loaded into in-memory tabular databases. Any query executed on the tabular database is served from the data in memory (cache). Thanks to its state-of-the-art compression algorithm and multi-threaded query processor, the xVelocity in-memory analytics engine provides fast access to tabular model objects and data. In-Memory mode supports both DAX and MDX queries.

Another mode available for the SSAS Tabular Model is DirectQuery. DirectQuery translates all DAX queries at run time into SQL statements, allowing real-time access to the SQL Server source database. Unlike In-Memory mode, DirectQuery only works with a single SQL Server data source. The main advantages of using DirectQuery are real-time access and scalability. This comes at the price of restrictions on a number of DAX functions and the absence of the Calculated Column feature. Only client tools that issue DAX queries can access a Tabular Model in DirectQuery mode.

Hybrid Mode combines the design aspect of DirectQuery with the client tool flexibility of In-Memory. Essentially, a tabular database model in Hybrid Mode is designed/developed with DirectQuery enabled, and is published with both a DirectQuery access mode and an In-Memory access mode. When published, the metadata is deployed and the data is processed into memory for In-Memory access. Hence, Hybrid Mode also requires the In-Memory processing mechanism and supports both the In-Memory and DirectQuery partition types.

Querying in Hybrid Mode

There are two options for querying a tabular database with Hybrid Mode enabled.

The following diagram shows how Hybrid Mode can serve DAX issuing client tools (PowerView, SSRS) and MDX issuing client tools (Excel, Tableau, SSRS).

SSAS_Tabular_Hybrid Mode_Query Flow

 

When a client tool issues a DAX query via the DirectQuery access mode, the DAX query is passed and then converted into an equivalent SQL query that accesses the source SQL Server database directly. The result returned to the DAX issuing client tool will be straight from the source SQL Server data source.

When executing an MDX query via In-Memory access mode, the query is served from the cache (the tabular database). There is no conversion and the results returned to the MDX issuing client tool will be based on the data that is in the cache.

Note: PowerView is a DAX issuing client tool that can work with both DirectQuery and In-Memory. If the primary/default Query Mode of a hybrid tabular database is DirectQuery, the result is served straight from the source SQL Server data source. If the primary/default mode is In-Memory, the result is served from the cache. More about the two hybrid modes later.

 

Enabling Hybrid Mode – the Basics

1. Design phase

When creating a Tabular Model solution using SSDT, ensure that DirectQueryMode value is set to On. This will ensure that the solution will conform to DirectQuery design features.

Model.bim Properties on SSDT

Enable DirectQueryMode on tabular model via SSDT

2. Deployment phase

Prior to deploying the solution, change the Query Mode to “In-Memory with DirectQuery” or “DirectQuery with In-Memory”. These are the two available Hybrid modes.

Project Properties of Tabular Model in SSDT

Change Query Mode of Tabular Project in SSDT

 

 

Query Mode: In-Memory with DirectQuery

The In-Memory with DirectQuery option means that In-Memory is the primary (or default) connection. However, when needed and if the client tool supports it, the secondary Query Mode, i.e. DirectQuery, can be used instead.

This query mode is ideal for the following scenarios:

  1. Users are mainly using Excel to perform analysis on the tabular model.
  2. The processed data in memory will be used to serve Excel queries.
  3. The processed data in memory will be used to serve PowerView report.
  4. Only occasional real-time queries are required for accessing real-time data, via SSRS for example.

Query Mode: DirectQuery with In-Memory

The DirectQuery with In-Memory option means that DirectQuery is the primary (or default) connection. However, when needed and if the client tool supports it, the secondary Query Mode, i.e. In-Memory, can be used instead.

This query mode is ideal for the following scenarios:

  1. Users are mainly using PowerView (or DAX issuing Client Tool) to perform analysis on the tabular model.
  2. Real-time data is returned by default.
  3. The processed in-memory data only needs to be retrieved occasionally, from Excel for example.

 

Connecting to Hybrid Mode tabular database

Connecting to a tabular database in Hybrid Mode is the same as connecting to either In-Memory or DirectQuery – that is, if you would like to use the primary mode. For example, if a tabular database is published with the “DirectQuery with In-Memory” Query Mode, the default connection via a client tool will always be made through DirectQuery, with no extra steps required. Similarly, if the Query Mode is “In-Memory with DirectQuery”, the default connection will use In-Memory.

When using the “DirectQuery with In-Memory” Query Mode, a client tool can connect to the tabular database using In-Memory mode by specifying it in the connection. Below is an example of connecting via SQL Server Management Studio to the In-Memory part of a tabular database that has been published with “DirectQuery with In-Memory”.

SQL Server Management Studio Additional Connection Parameters

Specify "DirectQueryMode" parameter in SSMS

 

Below is a sample of specifying the DirectQueryMode in Excel:

Excel Connection String Dialog Box

Specifying DirectQueryMode in Excel Connection String Property
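For reference, the extra property entered in the SSMS Additional Connection Parameters tab, or appended to the Excel connection string, takes the form below. This is a rough illustration only – the server and database names are placeholders, and the value assumes the DirectQueryMode property accepts InMemory / DirectQuery / Default:

DirectQueryMode=InMemory

Provider=MSOLAP;Data Source=MyTabularServer;Initial Catalog=MyHybridTabularModel;DirectQueryMode=InMemory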

 

Note: As at the time of writing, the DirectQueryMode connection property cannot be specified for PowerView. So, PowerView will use the primary (default) mode of the Hybrid Mode tabular database.

Wrap Up

Above is a basic walk-through of Hybrid Mode in SSAS Tabular. Hybrid Mode tabular databases come in two Query Modes: “DirectQuery with In-Memory” or “In-Memory with DirectQuery”. The first Query Mode in the name is the “primary” Query Mode of the tabular database. Tabular databases in Hybrid Mode are designed in DirectQuery, but published with either of the two Query Modes. Some client tools provide the option to switch to the secondary Query Mode by specifying the “DirectQueryMode” connection parameter.

Stay tuned for Part 2 of this series for more information on the design tips and important factors to consider when implementing Tabular solution with Hybrid Mode.


Retrieving SQL Server 2012 Evaluation Period Expiry Date

SQL Server 2012 RTM Evaluation edition was made available on March 7, 2012. Some of us may have installed it on that date or later for evaluation; some of us may still be using RC0 or an even earlier version of SQL Server 2012. If you currently have SQL Server 2012 installed but have not purchased and installed a license key, the evaluation period may have expired or will expire very soon. To avoid nasty surprises, it is a good idea to plan ahead and make a note of the expiry date of the evaluation period.

 

SQL Server 2012 Evaluation Period has expired

 

There are a number of ways to retrieve the SQL Server 2012 evaluation period expiry date. In SQL Server 2012, the evaluation expiry date is no longer listed in the About dialog box of SQL Server Management Studio, unlike in previous versions.

 

SQL Server 2012 SSMS About Dialog Box

SQL Server Management Studio Dialog Box does not show Evaluation Expiry Date

 

The Evaluation period is 180 days, which we can calculate from the installation date. The three options for determining the evaluation expiry date are:

  • Running a simple T-SQL query
  • Inspecting Summary.txt in the installation log directory
  • Inspecting RegEdit configuration

 

Option 1: Running a simple T-SQL query

Run the following query to retrieve the installed and expiry date on an Evaluation edition of SQL Server 2012 instance.

-- Derive the install date from the creation date of the NT AUTHORITY\SYSTEM login,
-- which is created when the instance is installed; the expiry is 180 days later.
SELECT
	create_date AS 'SQL Server Install Date',
	DATEADD(DD, 180, create_date) AS 'SQL Server Expiry Date'
FROM sys.server_principals
WHERE name = 'NT AUTHORITY\SYSTEM'

The “NT AUTHORITY\SYSTEM” account on the database server is a Local System account and by default is created at installation time. Therefore we can rely on inspecting its creation date to safely determine the installation date of SQL Server.

See more definition of “NT AUTHORITY\SYSTEM” account here: http://msdn.microsoft.com/en-us/library/ms191543.aspx

Note: to check if you are running Evaluation edition, you can do this simply by checking the SQL Server database instance properties via SQL Server Management Studio (SSMS) as shown below.

SQL Server 2012 Database Server Properties

The following query will also return the Product Version, Product Level and Edition.

SELECT
	SERVERPROPERTY('ProductVersion') AS ProductVersion,
	SERVERPROPERTY('ProductLevel') AS ProductLevel,
	SERVERPROPERTY('Edition') AS Edition;
GO

 

Option 2: Inspecting Summary.txt

When a SQL Server 2012 instance is installed, a Summary.txt file is created. This file is typically located at “C:\Program Files\Microsoft SQL Server\110\Setup Bootstrap\LOG\Summary.txt“.

Summary.txt contains important values pertaining to the instance installation, including the version being installed and when it was installed. The Evaluation edition is valid for 180 days, so with simple arithmetic on the install date we can determine the expiry date, as exhibited below.

Summary.txt

In the above example, the install date is 29 April 2012, so the expiry date is 26 October 2012 (29 April 2012 + 180 days).

Option 3: Inspecting REGEDIT

  1. Open REGEDIT and navigate the following structure:
    HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Installer\UserData\S-1-5-18\Products
  2. Here you will see a long list of folders named as sequences of alphanumeric characters (GUIDs). Ensure the “Products” folder is highlighted/selected.
  3. From the Edit menu, click Find and type in “SQL Server 2012 Database Engine Services”. Clicking “Find Next” will open the [GUID] > InstallProperties folder which should look like this:

InstallProperties of SQL Server 2012 Database Engine Services

The InstallDate value lists the date of installation in YYYYMMDD format; in this example, it is 29 April 2012. The expiry date for this instance is 180 days from the install date, which is 26 October 2012.
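If you prefer to let T-SQL do the arithmetic on the registry value, a small sketch (the InstallDate string below is a placeholder copied from InstallProperties):

-- Convert the YYYYMMDD registry value to a date and add the 180-day evaluation period.
DECLARE @InstallDate CHAR(8) = '20120429';
SELECT DATEADD(DAY, 180, CONVERT(DATE, @InstallDate, 112)) AS 'Evaluation Expiry Date';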

 

I hope this post has been useful for you. If there are other ways that you can think of, please don’t hesitate to let me know by leaving a comment.

 

Other related posts on SQL Server Evaluation Period Expiry Date:

Edit (June 4th, 2012):

On a side note, if you are using the Evaluation edition for development purposes, it would be best to upgrade the instance to the Developer edition of SQL Server 2012. For more information about the Developer edition and other licensing information for SQL Server 2012, please visit: http://www.microsoft.com/sqlserver/en/us/get-sql-server/how-to-buy.aspx

Get Parameter Values from SSISDB Catalog: a bug?

catalog.get_parameter_values

Recently I have been working with a couple of SSIS projects on SQL Server 2012. During this time, I managed to encounter an issue with SSISDB. The issue is quite trivial; however, it can be important for querying/reporting on deployed SSIS packages. In fact, I found it because I wanted to create a deployment script to configure an environment and a SQL Agent job to run the SSIS project referencing the configured environment.

The issue is to do with SSISDB.catalog.get_parameter_values.

As documented in Books Online (http://msdn.microsoft.com/en-us/library/ff878039), to obtain all project parameters and the parameters from all packages when executing catalog.get_parameter_values, the “package_name” input parameter should be set to NULL.

** Excerpt from BOL – catalog.get_parameter_values **

However, when running the catalog.get_parameter_values stored procedure with @package_name set to NULL, it instead returns an error stating “The input parameter cannot be null”.

Msg 27138, Level 16, State 1, Procedure get_parameter_values, Line 18
The input parameter cannot be null. Provide a valid value for the parameter.
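For reference, a sketch of the kind of call that triggers the error. The folder and project names are placeholders, and the parameter names are assumed to mirror those of the modified procedure shown later in this post:

USE [SSISDB]
GO
-- Per BOL, @package_name = NULL should return all project and package parameters,
-- but it raises error 27138 instead.
EXEC [catalog].[get_parameter_values]
    @folder_name = 'AdventureWorks BI',
    @project_name = 'AdventureWorks BI ETL',
    @package_name = NULL,
    @reference_id = NULL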

 

I have logged this as a bug at Microsoft Connect:

https://connect.microsoft.com/SQLServer/feedback/details/743735/ssisdb-catalog-get-parameter-values-with-null-able-package-name-input-parameter#details

 

My own version: catalog.get_any_parameter_values

As a temporary workaround, I have created a modified version of catalog.get_parameter_values that makes it possible to retrieve all project parameters and package parameters in a single stored procedure call. I’m calling the modified version catalog.get_any_parameter_values; it can be deployed to SSISDB and consumed in custom queries. Any built-in reports in SQL Server 2012 that call the catalog.get_parameter_values stored procedure will not be affected.

Please note that catalog.get_any_parameter_values is a draft version. Deploy and use it with care. It is intended as a supplement for administering the Integration Services catalog.

Example of usage:

USE [SSISDB]
GO
EXEC [catalog].[get_any_parameter_values]
    @folder_name = 'AdventureWorks BI',
    @project_name = 'AdventureWorks BI ETL',
    @package_name = NULL,
    @reference_id = NULL

 

The results will be similar to the following:

To deploy the script:

  1. Download catalog.get_any_parameter_values from here.
  2. Run the script against the SSISDB database in a SQL Server 2012 database instance. The catalog.get_any_parameter_values stored procedure will be created and will only work within the target Integration Services catalog. If there are multiple servers running an Integration Services catalog, the script needs to be deployed against the SSISDB database of each server.

Compatibility level: SQL Server 2012 RTM (11.0.2100)

If you have any feedback on my catalog.get_any_parameter_values stored procedure, please let me know. More importantly, please vote for the bug ticket relating to catalog.get_parameter_values at Microsoft Connect. https://connect.microsoft.com/SQLServer/feedback/details/743735/ssisdb-catalog-get-parameter-values-with-null-able-package-name-input-parameter#details

Other Workaround

An alternative workaround to the issue in catalog.get_parameter_values is to call catalog.get_parameter_values once for the project and once per package in the project. I think this is a little inefficient, but it will do the job with little risk to SSISDB.
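A hypothetical sketch of that per-package loop, using the catalog.packages view to enumerate the packages of a project (folder and project names are placeholders, and the built-in procedure’s parameter list is assumed to match the example above):

USE [SSISDB]
GO
DECLARE @folder NVARCHAR(128) = N'AdventureWorks BI';
DECLARE @project NVARCHAR(128) = N'AdventureWorks BI ETL';
DECLARE @package NVARCHAR(260);

DECLARE package_cursor CURSOR LOCAL FAST_FORWARD FOR
	SELECT pkg.[name]
	FROM [catalog].[packages] AS pkg
	INNER JOIN [catalog].[projects] AS prj ON prj.project_id = pkg.project_id
	INNER JOIN [catalog].[folders] AS fld ON fld.folder_id = prj.folder_id
	WHERE fld.[name] = @folder AND prj.[name] = @project;

OPEN package_cursor;
FETCH NEXT FROM package_cursor INTO @package;
WHILE @@FETCH_STATUS = 0
BEGIN
	-- One call (and one result set) per package in the project.
	EXEC [catalog].[get_parameter_values]
		@folder_name = @folder,
		@project_name = @project,
		@package_name = @package,
		@reference_id = NULL;
	FETCH NEXT FROM package_cursor INTO @package;
END
CLOSE package_cursor;
DEALLOCATE package_cursor;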

 

Upgrading SQL Server 2012 RC0 to RTM for Sharepoint 2010

After attending Rod Colledge’s presentation at SQL Saturday 138 in Sydney, where he touched on the steps for setting up SharePoint 2010 Business Intelligence with SQL Server 2012, I decided to be brave and tinker with SharePoint installation and configuration a little bit more.

Recently I had a requirement to upgrade the SQL Server 2012 RC0 installation on a SharePoint server to the SQL Server 2012 RTM version. I decided to do a bit of research, and typing “upgrade sql server 2012 rc0 to rtm sharepoint” into Bing returned a few promising results. One of them was Jen Underwood’s comment posted at http://social.msdn.microsoft.com/Forums/en-US/sqldensetup/thread/cd7287b6-f587-4a87-9676-75ddc9b09800.

The worst case scenario I could think of was that I would need to rebuild the SharePoint server. That’s OK because, luckily, we were still in eval mode and not in live production yet. So I did the upgrade. For a non SharePoint-savvy person, the whole process took me about 1.5 hours, including copying the SQL Server 2012 ISO file across the network.

The good news is, it was successful! (Gee, I never thought that it would be this easy!)

Before we go ahead, I’d like to point out a few things:

  1. At the end of the upgrade, the Sharepoint machine will have 2 installation paths of SQL Server for each instance, i.e. the existing SQL Server 2012 RC0 folder and SQL Server 2012 RTM folder. I am pedantic and try to avoid this as much as possible, but for now, it’s good enough; perhaps we may even be able to clean the folders up later. SQL Server 2012 Directories after Upgrade RC0 to RTM in Sharepoint 2010
  2. The target Sharepoint environment only has SQL Server installations specific to Sharepoint – i.e. the SQL Server instances on the Sharepoint machine are not used by any other applications (if they are, then make sure there are no live connections to the SQL Server instances).
  3. The Sharepoint server is not an Active Directory server.
  4. Test this upgrade method in a Test machine first before rolling out to Production.

 

Let’s get into it now.

[Warning: The below steps should be used as guidelines and should be tested in a Test machine first. If you are using a virtual machine for your Sharepoint server, don't forget to create a backup prior to upgrading]

 

Prerequisites

  1. Administrator privileges on the machine as well as in Sharepoint.
  2. SQL Server 2012 RTM version – and a product key for the Business Intelligence or the Enterprise Edition.

Steps

  1.  Go to Sharepoint Central Administration > Manage Services on Server (under the System Settings heading). Stop the following services: SQL Server Analysis Services, SQL Server PowerPivot System Service, SQL Server Reporting Services Service. Stop Sharepoint 2010 Services related to the SQL Server 2012 items to be upgraded
  2. Run setup.exe from the SQL Server 2012 RTM media and choose Maintenance.
  3. Select the “Edition Upgrade” option (although it only lists SQL Server 2005, SQL Server 2008 and SQL Server 2008 R2 upgrades), which will show you the list of instances that can be upgraded.
  4. Select the “POWERPIVOT” instance to be upgraded. SQLServer2012 Upgrade RC0 to RTM Select POWERPIVOT Instance
  5. Click Next to move on to “Select Features”, which will grey out all the features installed under POWERPIVOT.
  6. Click Next to “Reporting Services Sharepoint Mode”, which presents a message that Continuing with this upgrade may put your SharePoint farm in an inconsistent state. Tick the “Continue with the upgrade.” option. SQL Server 2012 Reporting Services Sharepoint Mode upgrade message
  7. Click Next until you reach “Instance Configuration”, where you will need to give a new name, e.g. POWERPIVOT2100.

    SQL Server 2012 Upgrade RC0 to RTM Instance Configuration

  8. Click Next until you reach “Upgrade Rules”. An error may come up stating that ‘Rule “SQL Server Analysis Services Upgrade Service Functional Check” failed. The current instance of the SQL Server Analysis Services service cannot be upgraded because the Analysis Services service is disabled or not online. Please start the service and then run the upgrade rules check again‘. If this happens, go to SQL Server Configuration Manager and you will see that the Analysis Services service is Stopped. Change its start mode from Disabled to Automatic. (This may be because I stopped the SQL Server Analysis Services service from Sharepoint earlier on.)

    SQL Server 2012 Upgrade Rules Error related to Analysis Services
    SQL Server 2012 SSAS POWERPIVOT Start Mode changed to Automatic

  9. Click OK on the error dialog box and click Re-run on the Upgrade Rules window. It should rerun the upgrade checks and progress on to “Update Progress”.
  10. In the middle of the upgrade of SQL Server Analysis Services Service, a similar error will come up. “The following error has occurred: The service cannot be started, because it is disabled or because it has no enabled devices associated to it“. Again, go to SQL Server Configuration Manager and change the start mode from Disabled to Automatic.
    SQL Server 2012 Upgrade RC0 to RTM Upgrade Progress Error
  11. Click retry and the installation should resume normally to completion.
  12. Once this is done, you can verify the servers installed on the Sharepoint Server by launching SQL Server Management Studio and connecting to different parts of POWERPIVOT SQL Server instances.
    SQL Server 2012 SSMS Instances

After the upgrade, don’t forget to reboot the Sharepoint server.

Post upgrade verifications

  1. Verify that your PowerPivot gallery or Default PowerPivot Service Application is running properly, by going to Sharepoint Central Administration > Manage Service Applications (under Application Management) > Default PowerPivot Service Application
  2. Verify that Microsoft.AnalysisServices.SharePoint.Integration.dll in the Windows Assembly folder is referring to 11.0.2100.60 (right click on the dll, choose Properties and click on the Version tab). More detailed info: http://msdn.microsoft.com/en-us/library/ee210646(SQL.110).aspx#checkver
  3. Verify that the msmdsrv.exe version is 11.0.2100.60 in \Program Files\Microsoft SQL Server\MSAS11_00.PowerPivot\OLAP\bin (right click on the file, choose Properties and click on the Details tab). More detailed info: http://msdn.microsoft.com/en-us/library/ee210646(SQL.110).aspx#checkver

 

And that is it! OK, I bet you can do this in less than 1.5 hours!

Hope you find these instructions useful. Please let me know if you have any suggestions on steps that I may have missed from the above.

 

Paging Function Performance in SQL Server 2012

Late last year, when I discovered the new T-SQL enhancements introduced in SQL Server 2012, particularly the new Paging Function, I thought to myself that this would be very useful in a number of scenarios. Before I go any further, let’s have a look at what the Paging Function is.

Update: For simplicity, the following scenario uses [Production].[TransactionHistoryArchive] from AdventureWorks2012 and assumes that TransactionID is always continuous and incremental, starting with a TransactionID of 1. (I will discuss a scenario where the TransactionID is not continuous in a different post.) Special thanks to Shy Engelberg for highlighting the behaviour differences in the three methods discussed in this post. Please read Shy’s comment at the end of this post for further clarification.

The Paging Function is part of the SELECT syntax, as an extension to the ORDER BY clause. Below is an example of its usage to retrieve a data set with TransactionID from 5001 and for the next 100 rows.

-- Paging Function
-- Retrieve 100 rows starting from TransactionID of 5001
SELECT
	 [TransactionID]
	,[ProductID]
	,[ReferenceOrderID]
	,[ReferenceOrderLineID]
	,[TransactionDate]
	,[TransactionType]
	,[Quantity]
	,[ActualCost]
	,[ModifiedDate]
FROM [Production].[TransactionHistoryArchive]
ORDER BY [TransactionID]
OFFSET 5001 ROWS
FETCH NEXT 100 ROWS ONLY

This function is very intuitive and is applicable for returning a data set a “page” (batch of rows) at a time. The OFFSET value can be expressed as an integer variable, and so can the FETCH NEXT value, which allows easy and configurable data set paging for web/reporting applications.
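As a quick illustration of that point, a minimal sketch with the page number and page size as variables (the variable names and values are placeholders):

-- Page 51 of 100 rows per page; OFFSET and FETCH accept variables and expressions.
DECLARE @PageNumber INT = 51;
DECLARE @PageSize INT = 100;

SELECT
	 [TransactionID]
	,[ProductID]
	,[TransactionDate]
FROM [Production].[TransactionHistoryArchive]
ORDER BY [TransactionID]
OFFSET (@PageNumber - 1) * @PageSize ROWS
FETCH NEXT @PageSize ROWS ONLY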

A few things to note about the Paging Function:

  • The ORDER BY column(s) don’t have to be consecutive, meaning that we can avoid creating a surrogate consecutive integer key just for paging. This helps in a typical query to retrieve a page of “active” transaction records where some rows in the table may have been deleted or “deactivated”, leaving gaps in the IDs.
  • OFFSET and FETCH can only be used in the last query that participates in UNION, EXCEPT or INTERSECT operation.
  • If the column specified in the ORDER BY column(s) is not unique, the order of the output is not always consistent.

Alternative: TOP … Batching

In earlier versions, you could write a query to return the same data set using the TOP keyword. Please note that the SET ROWCOUNT clause for limiting the number of rows returned by a SELECT query will not be supported in a future version of SQL Server – the TOP keyword should be used instead.

-- TOP... Batching
-- Retrieve 100 rows starting from TransactionID of 5001
SELECT TOP 100
	 [TransactionID]
	,[ProductID]
	,[ReferenceOrderID]
	,[ReferenceOrderLineID]
	,[TransactionDate]
	,[TransactionType]
	,[Quantity]
	,[ActualCost]
	,[ModifiedDate]
FROM [Production].[TransactionHistoryArchive]
WHERE [TransactionID] >= 5001
ORDER BY [TransactionID]

Alternative: Direct ID Batching

An alternative available in versions earlier than SQL Server 2012 is shown below. The result will be the same, assuming TransactionID values are always consecutive.

-- Direct ID Batching
-- Retrieve 100 rows starting from TransactionID of 5001
SELECT
	 [TransactionID]
	,[ProductID]
	,[ReferenceOrderID]
	,[ReferenceOrderLineID]
	,[TransactionDate]
	,[TransactionType]
	,[Quantity]
	,[ActualCost]
	,[ModifiedDate]
FROM [Production].[TransactionHistoryArchive]
WHERE [TransactionID] BETWEEN 5001 AND 5100
ORDER BY [TransactionID]

By now, you may realise that the Direct ID batching method may need a surrogate consecutive key to ensure the batch/paging size is consistent, with the exception of the last page.

Execution Plan Comparison

Let’s inspect the execution plan of all three methods.

1. Direct ID Batching

As expected, SQL Server is using a Clustered Index Seek, as per the index filter on the ID between the two values (5001 and 5100). Note that the Actual Number of Rows in the Execution Plan details of the Clustered Index Seek is 100.

Clustered Index Seek on Production.TransactionHistory


Actual Number of Rows = 100 for Clustered Index Seek - Direct ID Batching

2. Top Batching

Similar to the Direct ID Batching, SQL Server is also using a Clustered Index Seek, followed by a TOP operator as the next step. Note that, as with Direct ID Batching, the Clustered Index Seek also returns an Actual Number of Rows of 100.

 

Clustered Index Seek on Production.TransactionHistory


Actual Number of Rows = 100 for Clustered Index Seek - Top ... Batching

 

3. Paging Function

Surprisingly, SQL Server is using Clustered Index Scan and the Actual Number of Rows returned here is 5101!

Clustered Index Seek on Production.TransactionHistory


 

Actual Number of Rows = 5101 for Clustered Index Seek - Paging Function

Performance Comparison

Using the Paging Function on a source table with a large number of records may not be ideal. As you can probably guess, the larger the offset, the larger the Actual Number of Rows returned. Consequently, the Paging Function takes longer and longer as the paging progresses. To demonstrate this, I have a table with over 14.5 million rows where I iterate through the records in batches of 500,000. Each iteration inserts the batched records into a heap table that is empty to start with.
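For reference, a hypothetical sketch of the Paging Function variant of that loop – the source and target table names and columns are placeholders, and the real test repeated the same pattern with the TOP and Direct ID methods:

-- Insert 500,000-row batches from a large source table into an empty heap,
-- paging through the source with OFFSET/FETCH until no rows are returned.
DECLARE @BatchSize INT = 500000;
DECLARE @Offset BIGINT = 0;
DECLARE @RowsInserted INT = 1;

WHILE @RowsInserted > 0
BEGIN
	INSERT INTO [dbo].[TargetHeap] ([ID], [Payload])
	SELECT [ID], [Payload]
	FROM [dbo].[LargeSourceTable]
	ORDER BY [ID]
	OFFSET @Offset ROWS
	FETCH NEXT @BatchSize ROWS ONLY;

	SET @RowsInserted = @@ROWCOUNT;
	SET @Offset = @Offset + @BatchSize;
END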

Below is the chart exhibiting the time taken to insert the records in batches using Direct ID batching vs Top batching vs Paging function methods. To start with, the three techniques take about the same amount of time. After inserting about 1.5 Million rows the Paging Function query takes significantly longer than the other two.

SQL_Batching_Methods_Comparison_Chart

Wrap Up

Although SQL Server 2012’s new Paging Function is intuitive for paging a data set, it comes at a cost for large volumes of data. TOP Batching and Direct ID Batching perform significantly better under similar circumstances, i.e. simple source data with a continuous ID.

The above discussion stems from my recent research for the 24 Hours of PASS Spring 2012 session “High Volume Data Processing Without Driving Your DBAs Crazy!”, which will be held on 21 March 2012 at 16:00 AEST. 24 Hours of PASS is free and contains 24 back-to-back great SQL Server sessions. You can find many different SQL Server topics, from DBA and database development to BI and performance sessions. So, register now!

Disclaimer

Please note that every environment is different and that these tests are conducted on my small laptop and should not be used as an indication on how the above batching methods will perform in your Production environment. Please test and confirm that you understand these techniques before applying to your Production environment.

 Last updated on 25 March 2012 to add further explanation on the scenario used in this post.