SSIS 2008 – Solving DB2 connectivity with manual connection string edits

February 4, 2011

Yesterday at a client I came across a new situation with SSIS 2008. Well, this wasn't really something to do with SSIS itself, and I am sure this problem could manifest with other ETL tools as well.



The task: build a package that connects to an IBM DB/2 server. The DB/2 connection uses a specific username and password, embedded in the package and protected with a package password (i.e. not User Key). Build and test this on a developer workstation, then deploy to the formal DEV box for integration testing.


The problem:

So we built the package and tested it successfully locally on a workstation. Once deployed to the formal DEV server we tried to schedule it via a job. This job was set up to use a specific SQL Server proxy with SSIS-related permissions granted. All standard thus far.

Now we ran the job and found the DB/2 Connection Manager (using the native IBM OLE DB DB2 provider) was throwing an error. Unfortunately I can't remember the exact SSIS error message and code, but it was along the lines of “could not acquire connection” and the specific error string was “Access is Denied”. This threw us off!

This led me to believe that we had not got our SQL proxy permissions correct, or had not deployed the package with sensitive information correctly. So we double-checked this on the developer workstation, re-deploying the package using “Rely on Server Storage for Encryption”. The job ran fine from the developer workstation.

So I got together with a DBA and we ran a DB/2 trace – trying both our job and also a package that had actually been developed on the server over RDP. Strangely, we could not even get the trace to start with our job, yet the package developed on the server was fine: it could read rows from DB/2 and wrote to the trace.

Next we compared connection strings and found the issue!

Our connection string had: Provider=IBMDB2.1

The working package had: Provider=IBMDB2.DB2COPY1


Very strange indeed. It turns out someone had previously had issues on the DEV server with both the Microsoft and IBM DB2 providers installed at the same time. Their package had not worked for some reason, so they removed the Microsoft provider, but in the process also uninstalled a 32-bit IBM DB2 driver and then installed a 64-bit version afterwards. The uninstalls may not have been clean, so the latest install had registered the provider as IBMDB2.DB2COPY1.


The Temporary Solution:

Edit the ConnectionString property of the Connection Manager to match the provider name on the server, then redeploy. Voila! However, this is not a permanent solution. We want all environments to be standard, and not have to change the provider name in the connection string (even if via configuration). It just feels wrong to do this. Normally only the server location, catalog and authentication details are changed via configs.
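Just to illustrate the mechanics of the workaround (this is not SSIS code – a hypothetical little helper, with made-up server and user names), the edit boils down to swapping the Provider= token in the OLE DB connection string while leaving everything else untouched:

```python
def set_provider(conn_str: str, provider: str) -> str:
    """Replace (or add) the Provider= token in an OLE DB connection string."""
    parts = [p for p in conn_str.split(";") if p.strip()]
    rewritten = []
    replaced = False
    for part in parts:
        key, _, _ = part.partition("=")
        if key.strip().lower() == "provider":
            # swap in the provider name registered on the target server
            rewritten.append("Provider=" + provider)
            replaced = True
        else:
            rewritten.append(part.strip())
    if not replaced:
        rewritten.insert(0, "Provider=" + provider)
    return ";".join(rewritten)

original = "Provider=IBMDB2.1;Data Source=DB2PROD;User ID=etl_user"
print(set_provider(original, "IBMDB2.DB2COPY1"))
# → Provider=IBMDB2.DB2COPY1;Data Source=DB2PROD;User ID=etl_user
```

In practice I just made this change by hand in the Connection Manager's properties, but the point is the same: only the provider token differs between the broken and working packages.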

If anybody has experience with getting the provider registration fixed on the server, I would love to hear your suggestions!

VMware Workstation – copying, snapshots, cloning

February 4, 2011

A lot of people these days use Virtual Machines for testing and development due to their flexibility and versatility.

I make heavy use of them to host demos and also to have an instance of various environments at hand. For example, I have the following VMs:

  • Windows XP
  • Windows Server 2003/SQL 2005/SharePoint 2007/PPS07
  • Windows Server 2003 R2/SQL 2008
  • Windows Server 2008 R2/SQL 2008 R2/SharePoint 2010
  • Windows Server 2008 R2/Denali

Sometimes when building an image I like to keep a complete “snapshot” backed up at various points, so I can revert to it if need be. I also want to use the “snapshot” as a base for future work. For example, once I have Windows Server 2008 R2 installed I can store that away, so that next time I have it ready to go to build a different VM.

I say “snapshot” in quotes here because I did not use the Snapshot feature of VMware Workstation. That feature keeps a single physical copy of a VM, with snapshots stored as part of the whole VM definition. Note that many native VMware snapshots on a VM can degrade performance over time, due to the differential nature of the storage after snapshotting. Anyway, I specifically wanted a separate image that I could pick up at any time and work with independently.

So, in order to satisfy my requirement I would simply copy my VM folder to a backup location after each significant point of development. All good? Not really! One day I had some corruption in a VM and needed to revert to a previous copy. I booted it up only to find it was behaving strangely and wasn't the copy I expected it to be. I thought somehow it had become corrupt/confused as well.

After a few days I looked at the VM settings and figured it out. When you copy a VM, it still points to the virtual hard disk at the original location! So my pristine backup was booting off the corrupted virtual disk! There are two ways out of this.

  1. Reconfigure the backup VM to point to the virtual hard drive in its own folder
  2. The preferred method – to avoid this situation in the first place – don't simply copy the VM to back it up. Use the Clone option in Workstation and it will make a proper independent copy. You can also use this to rename a Virtual Machine safely.
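For option 1, the fix amounts to editing the disk entries in the copied VM's .vmx file (the `scsiX:Y.fileName` lines) so they reference the .vmdk files sitting next to it, rather than the absolute path back to the original folder. As a rough sketch of that edit (the folder and disk names here are made up; the VM settings dialog does the same thing more safely):

```python
import re

def localize_vmdk_paths(vmx_text: str) -> str:
    """Rewrite absolute .vmdk paths in .vmx disk entries to bare file names,
    so a copied VM boots from the disks in its own folder."""
    def repl(match):
        key, path = match.group(1), match.group(2)
        # keep only the file name portion of the path
        name = path.replace("\\", "/").rsplit("/", 1)[-1]
        return '%s = "%s"' % (key, name)
    # matches lines like: scsi0:0.fileName = "C:\VMs\Original\disk.vmdk"
    return re.sub(r'(\w+:\d+\.fileName)\s*=\s*"([^"]+\.vmdk)"', repl, vmx_text)

vmx_line = 'scsi0:0.fileName = "C:\\VMs\\Original\\winsrv2008.vmdk"'
print(localize_vmdk_paths(vmx_line))
# → scsi0:0.fileName = "winsrv2008.vmdk"
```

Again, option 2 (Clone) avoids all of this by producing a self-contained copy in the first place.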

This may seem like a trivial and newbie mistake – it probably is. But hey, if it took me years to run into this situation, perhaps this info will help someone else!

Customizing Filter Persistence in PPS2007

November 14, 2010

A couple of days ago I had to get into the PPS 2007 backend stored procs in order to customise the filter behaviour. As we know, PPS remembers the value of a filter selection across user sessions. This can be a good thing or a bad thing depending on your setup and requirements.

In my case I had a number of dashboards built with a filter called “Site”. This filter was populated from an SSAS 2005 cube, filling a single-select list with the facilities my client had in their business. A user would pick a facility from the list and the dashboard would then filter the information based on the selection.

We had leveraged security in SSAS to lock down access to facilities via roles (as is best practice). This worked great, since users would only see facilities in the dropdown if they had been granted access via an SSAS role.

Most users in the business had access to only one facility, so the filter list would contain just a single item. This way, regardless of which dashboard they visited, the “Site” filter would have the same selection.

So far so good.. but here is where things started causing problems. Some users needed access to more than one facility, and again the roles in SSAS reflected this. The issue was that these users were now getting “lost”: they would select “Facility A” on one dashboard, then drill down into a detailed dashboard expecting it to remember the selection from the other dashboard. Some things to note here:

  • The “drilldown” was a hack to get to a completely different dashboard. Sadly, PPS does not support URL links (while the older BSM did!), so I had used a commonly used trick originally highlighted by Nick Barclay
  • PPS does not natively support passing ANY values between dashboard pages

So I started researching how to turn off the filter persistence. Annoyingly, this is not a simple on/off setting somewhere.. not even in a text config file! What is required is to get into your PPSMonitoring database and comment out some code in the stored proc [dbo].[ParameterValuesCreate]. But this only gets me halfway there.. now any filter will always default to the first item in the list.

So I knew I needed to customise this stored proc, and it turned out I needed to learn some XPath in the process. My requirements, summarised:

  • PPS must remember filter values per user ONLY for filters called Site.
  • The selection must persist across any dashboard. So a user looking at “Facility B” will always see this selected no matter which dashboard they change to.

The final solution was:

  • Look at [dbo].[FCObjects] and parse the XML column [SerializedXML] to extract the GUIDs of filters called “Site”. (This means the facility filters must be named “Site” everywhere in the solution for this to work.)
  • Within [SerializedXML] I get a number of ParameterUIElement nodes for each dashboard. Each of these contains a single filter definition, as a descendant node of a dashboard element.
  • I then had to loop through these to extract just the single node pertaining to the Site filter. This is done for each dashboard, so effectively I end up with a list of all GUIDs for any filter called Site across the whole solution.
  • Finally I check whether the filter being passed to the stored proc is actually a Site filter, by seeing if it exists in the list created above. If it is, I update the rows in [dbo].[ParameterValues] which correspond to the GUIDs gathered earlier (i.e. “Site” filter values).
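The GUID-extraction logic in the steps above is easier to see outside of T-SQL. Here is a minimal Python sketch of the same idea, using a simplified stand-in for one row of [SerializedXML] (the real PPS schema is richer than this fragment, and the GUID values are made up):

```python
import xml.etree.ElementTree as ET

# Simplified stand-in for one [SerializedXML] value from [dbo].[FCObjects]
serialized = """
<Dashboard>
  <ParameterElements>
    <ParameterUIElement Guid="6f9619ff-8b86-d011-b42d-00c04fc964ff">
      <Properties><BpmProperty Text="Site" /></Properties>
    </ParameterUIElement>
    <ParameterUIElement Guid="0a0b0c0d-0000-0000-0000-000000000001">
      <Properties><BpmProperty Text="Year" /></Properties>
    </ParameterUIElement>
  </ParameterElements>
</Dashboard>
"""

def site_filter_guids(xml_text, name="Site"):
    """Return the GUIDs of all ParameterUIElement nodes whose
    BpmProperty Text attribute matches the given filter name."""
    root = ET.fromstring(xml_text)
    guids = []
    for el in root.findall(".//ParameterUIElement"):
        # keep only elements representing a filter with the target name
        if any(p.get("Text") == name for p in el.findall(".//BpmProperty")):
            guids.append(el.get("Guid"))
    return guids

print(site_filter_guids(serialized))
# → ['6f9619ff-8b86-d011-b42d-00c04fc964ff']
```

In the stored proc this same filtering is done with the xml type's exist()/query()/value() methods and an XPath predicate on @Text, accumulated across all dashboards in [dbo].[FCObjects].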

So here is the code below.. noting that this was a first hack and there are improvements I can think of already, such as getting rid of the cursor and possibly replacing it with a CTE.. also creating a table of the GUIDs found, so that we don't have to parse [dbo].[FCObjects] every single time a dashboard is viewed or a filter is changed. This version may get slow if there are many items in the PPSMonitoring DB.

USE [PPSMonitoring]
GO
/****** Object: StoredProcedure [dbo].[ParameterValuesCreate] Script Date: 11/12/2010 11:03:15 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO

ALTER PROCEDURE [dbo].[ParameterValuesCreate]
    @Login [NVARCHAR](1800),
    @ParameterUniqueName [NVARCHAR](100),
    @SerializedXml [XML]
AS
BEGIN
    SET NOCOUNT ON

    DECLARE @tErrMsg NVARCHAR(128)
    DECLARE @ReturnCode INT
    DECLARE @TransactionIsOurs INT

    SET @tErrMsg = 'ParameterValuesCreate Failed For Object - '

    -- init
    SET @ReturnCode = 0
    SET @TransactionIsOurs = 0
    IF @@TRANCOUNT = 0
    BEGIN -- only if @@TRANCOUNT is 0, we do BEGIN TRAN
        BEGIN TRANSACTION
        SET @TransactionIsOurs = 1
    END

    -- BEGIN MOD BY BHAVIK - customise filter persistence behaviour
    /* ------------------------------ */
    -- check if this was a "Site" parameter; if so, update all other Site parameters for this user
    -- this way Site will be consistent across any dashboard they visit
    -- ** Note - the filter MUST be called Site in all dashboards for this code to work **

    -- vars
    DECLARE @search NVARCHAR(50)
    DECLARE @NodeList XML
    DECLARE @tempxml XML
    DECLARE @FilterGUIDS TABLE (id NVARCHAR(100))

    SET @search = 'Site'

    -- retrieve the ParameterUIElement nodes from every dashboard containing a filter named "Site"
    -- this brings back ALL ParameterUIElement items for each matching dashboard
    DECLARE nodeloop CURSOR FAST_FORWARD FOR
        SELECT SerializedXml.query('/Dashboard/ParameterElements/ParameterUIElement')
        FROM dbo.FCObjects
        WHERE SerializedXml.exist('/Dashboard/ParameterElements/ParameterUIElement/Properties/BpmProperty[@Text=sql:variable("@search")]') = 1

    -- iterate over the list of ParameterUIElements and keep only the "Site" ones
    OPEN nodeloop
    FETCH NEXT FROM nodeloop INTO @NodeList
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SELECT @tempxml = c.query('.')
        FROM @NodeList.nodes('/ParameterUIElement[Properties/BpmProperty[@Text=sql:variable("@search")]]') t(c)

        INSERT INTO @FilterGUIDS (id)
        SELECT @tempxml.value('(/ParameterUIElement/@Guid)[1]', 'nvarchar(100)')

        FETCH NEXT FROM nodeloop INTO @NodeList
    END
    CLOSE nodeloop
    DEALLOCATE nodeloop
    -- END MOD

    /* ------------------------------ */
    -- If this parameter value already exists (for this login), update. Otherwise, insert.
    IF EXISTS (SELECT [Login]
               FROM [ParameterValues]
               WHERE [Login] = @Login
               AND [ParameterUniqueName] = @ParameterUniqueName)
    BEGIN
        UPDATE [ParameterValues]
        SET [LastUpdated] = GETDATE(),
            [SerializedXml] = @SerializedXml
        WHERE [Login] = @Login
        AND [ParameterUniqueName] = @ParameterUniqueName

        IF @@ERROR <> 0
        BEGIN
            RAISERROR (5580001, 16, 1, @tErrMsg, 7, N'ParameterValues') WITH LOG
            SET @ReturnCode = 1
            GOTO exit_label
        END
    END
    ELSE
    BEGIN
        -- Insert record
        INSERT INTO [ParameterValues] ([Login], [ParameterUniqueName], [LastUpdated], [SerializedXml])
        VALUES (@Login, @ParameterUniqueName, GETDATE(), @SerializedXml)

        IF @@ERROR <> 0
        BEGIN
            RAISERROR (5580002, 16, 1, @tErrMsg, 8, N'ParameterValues') WITH LOG
            SET @ReturnCode = 1
            GOTO exit_label
        END
    END

    /* ------------------------------ */
    -- MOD: if the parameter passed in is a "Site" filter, update all Site params for this user
    IF EXISTS (SELECT id FROM @FilterGUIDS WHERE id = @ParameterUniqueName)
    BEGIN
        UPDATE [ParameterValues]
        SET [SerializedXml] = @SerializedXml,
            [LastUpdated] = GETDATE()
        WHERE [ParameterUniqueName] IN (SELECT id FROM @FilterGUIDS)
        AND [Login] = @Login
    END
    /* ------------------------------ */

exit_label:
    IF @TransactionIsOurs = 1
    BEGIN
        IF @ReturnCode = 0
            COMMIT TRANSACTION
        ELSE
            ROLLBACK TRANSACTION
    END
    RETURN @ReturnCode
END

SQL Server 11 “Denali” CTP1 – Part 1 – SSMS Changes

November 10, 2010

The next version of SQL Server, codename “Denali”, had its CTP1 release today. I've downloaded it and installed it on a test VM, and thought I would update the blog with my experience so far.

Rather than jumping under the covers straight away, I referred to the “What's New” document online as a starting point. This looks to be a little incomplete in places at the moment. I'm going to summarise the bits I found most interesting and relevant to my work, i.e. this is not an exhaustive list of all new features. As I looked at each of the features in the documentation I had a play around with some in SSMS or elsewhere.


  • Right off the bat, SSMS has had a little bit of an overhaul.. you will notice new blue background shading, and windows are more dockable, like in Visual Studio 2010:

  • Something I discovered by mistake is that the dockable items can snap to the side or top of your screen just like any other Windows 7/Server 2008 window.. interesting. This means multi-monitor support is there, as confirmed by Brent Ozar here:

The default font for queries has changed to the fixed-width “Consolas” font. I also found it cool that you can Ctrl + mouse-wheel up or down to zoom, just like Word and other apps. The screenshot shows me zooming to an exaggerated level.. note the zoom level box in the bottom left:

I will update this post as I discover more new things in SSMS.

Getting Data Structure Changes to show up in Dashboard Designer

October 12, 2010

If you are working with Dashboard Designer and SharePoint 2010 and you have made some changes to the backend (I've tested with cubes and tables), they will not automatically appear consistently in DD. For example, if you have a SQL Server table source and add some columns, the new columns will show up when you preview the data, however they won't appear as new “Dimension” items for use in scorecards, for example.

The trick here is to edit the data source and re-pick your database and table/cube. This forces DD to re-evaluate the source and fetch new metadata. Fortunately you won't have to remap your Time Intelligence!

Locating SQL Server Samples

October 6, 2010

When you get the SQL Server samples from CodePlex you may wonder where they all go. This is something I keep coming across, as it is not something one does often. So here is a quick guide to where to find them once installed.

Assuming you installed all the SQL Server samples available, you will have a Start menu shortcut leading to:

C:\Program Files\Microsoft SQL Server\100\Samples

Here you will find the AdventureWorks online/offline reports and other SSIS/SSAS samples.

However, if you install the AdventureWorks OLTP, DW and OLAP databases (usual if you are working with B.I.) you need to find the location of some projects in order to deploy them. For these you need to look in the Tools folder, i.e.

C:\Program Files\Microsoft SQL Server\100\Tools\Samples

I have assumed a default SQL installation directory on drive C.

Deployment settings for SSRS 2008 R2 in SP2010 Integrated mode

September 25, 2010

If you have problems deploying reports, data sources etc. to SharePoint 2010, and are certain that permissions etc. are all set up correctly, it may help to check your project deployment settings. If these are not correct you can get the SSRS login prompt coming up indefinitely when you try to deploy, regardless of the correct user/pass being supplied.

When deploying to SharePoint integrated report servers (and this applies to WSS/MOSS07 as well) you need to explicitly specify the full path to the SharePoint site AND the other deployment folders. As an example, I have a sample BI site running under http://2010VM/sites/BI and a report library Reports sitting under there. I want to deploy the AdventureWorks samples there, so here is a screenshot of basic settings that should work:
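In text form, the project deployment properties for this example would look something like the below. The site and library names come from my example above, so adjust to your own; the dataset and report part folders only exist in R2 projects, and the exact folder names (Data Sources etc.) are just the ones I used, not requirements:

```
TargetServerURL:        http://2010VM/sites/BI
TargetReportFolder:     http://2010VM/sites/BI/Reports
TargetDataSourceFolder: http://2010VM/sites/BI/Reports/Data Sources
TargetDatasetFolder:    http://2010VM/sites/BI/Reports/Datasets
TargetReportPartFolder: http://2010VM/sites/BI/Reports/Report Parts
OverwriteDataSources:   True
```

The key point is that in SharePoint integrated mode every one of these is a full URL into the site, not a relative folder name as it would be in native mode.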