SSIS 2008 – Solving DB2 connectivity with manual connection string edits

February 4, 2011

Yesterday at a client I came across a new situation with SSIS 2008. Well, this wasn't really something to do with SSIS itself, and I am sure this problem could manifest with other ETL tools as well.

 

Background:

We needed to build a package that connects to an IBM DB2 server. The DB2 connection uses a specific username and password, embedded in the package and protected with a package password (i.e. not a user key). We built and tested this on a developer workstation, then deployed to the formal DEV box for integration testing.

 

The problem:

So we built the package and tested it successfully locally on a workstation. Once deployed to the formal DEV server we tried to schedule it via a job. This job was set up to use a specific SQL Server proxy with the SSIS-related permissions granted. All standard thus far.

Now we ran the job and found the DB2 Connection Manager (using the native IBM OLE DB DB2 provider) was throwing an error. Unfortunately I can't remember the exact SSIS error message and code, but it was along the lines of “could not acquire connection” and the specific error string was “Access is Denied”. This threw us off!

This led me to believe that we had not got our SQL proxy permissions right, or had not deployed the package with its sensitive information correctly. So we double-checked this on the developer workstation, re-deploying the package using “Rely on Server Storage for Encryption”. The job still ran fine for the developer workstation.

So I got together with a DBA and we ran a DB2 trace – trying both our job and also a package that had actually been developed on the server over RDP. Strangely, we could not even get the trace to start with our job, yet the package developed on the server was fine and could read rows from DB2 and write to the trace.

Next we compared connection strings – and found the issue!

Our connection string had: Provider=IBMDB2.1

The working package had: Provider=IBMDB2.DB2COPY1

 

Very strange indeed. It turns out someone had had issues on the DEV server when both the Microsoft and IBM DB2 providers were installed at the same time. Their package had not worked for some reason, so they removed the Microsoft provider, but in the process also uninstalled the 32-bit IBM DB2 driver and then installed a 64-bit version afterwards. The uninstalls may not have been clean, so the latest install had registered the provider as IBMDB2.DB2COPY1.

 

The Temporary Solution:

Edit the ConnectionString property of the Connection Manager to match the provider string on the server, then redeploy. Voilà! However, this is not a permanent solution. We want all environments to be standard and not have to change the provider name in the connection string (even if via configuration); it just feels wrong to do this. Normally only the server location, catalog and authentication details are changed via configs.
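For illustration, the change amounts to nothing more than swapping the provider token in the Connection Manager's ConnectionString property (the data source and user names below are made up):

```
Before (workstation):  Provider=IBMDB2.1;Data Source=SAMPLEDB;User ID=db2user;Password=********;
After (DEV server):    Provider=IBMDB2.DB2COPY1;Data Source=SAMPLEDB;User ID=db2user;Password=********;
```

Everything else in the connection string stays the same; only the provider ProgID differs between the two machines.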

If anybody has some experience on how to get the provider fixed on the server I would love to hear the suggestions!


Dynamically Processing SSAS Objects with AMO in SSIS

December 13, 2009

I've been meaning to write about my experience with this for some time, and now that I'm finally blogging once again, let's not waste any more time!

One of the drawbacks of the “Analysis Services Processing” task in SSIS in SQL 2005 is that it does not honour dynamic connections, i.e. those set at runtime via configurations, variables and/or expressions. Debugging the package will show that the runtime ConnectionString is indeed set, however the design-time setting is used! At this time I'm not sure if this applies to SQL 2008 as well.

The whole point of having dynamic connections is to allow a single package to be used in multiple environments (DEV, TEST, PROD etc.) by using configurations to set connections. This caused me much frustration, and it took a few hours of debugging and tracing my SSAS server to figure out what was going on.

I started hunting for a better solution and came across a few posts describing the use of a Script Task and Analysis Management Objects (AMO).

This technique works wonderfully and has the added advantage of catering for newly added dimensions automatically if you simply loop over all dimensions in the database. If you don't see the AMO namespace, make sure you are patched up to the latest Service Pack. Also be sure to consider whether you want to do a full process or not, as this is usually not the default setting; use the ProcessType.ProcessFull parameter to deal with this.
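As a sketch of the technique (the server address, database name and method name below are assumptions for illustration – in a real package you would feed the connection details in from package variables set via configurations), the Script Task body looks something like this:

```vb
' Requires a reference to Microsoft.AnalysisServices.dll (AMO) in the
' Script Task project.
Imports Microsoft.AnalysisServices

Public Sub ProcessAllDimensions()
    Dim server As New Server()
    ' Connection string and database name are placeholders.
    server.Connect("Data Source=localhost")
    Try
        Dim db As Database = server.Databases.FindByName("MyOlapDatabase")

        ' Looping over the collection means newly added dimensions are
        ' picked up automatically, with no package changes needed.
        For Each d As Dimension In db.Dimensions
            d.Process(ProcessType.ProcessFull)
        Next

        ' Process the cubes afterwards so the measure groups are rebuilt too.
        For Each c As Cube In db.Cubes
            c.Process(ProcessType.ProcessFull)
        Next
    Finally
        server.Disconnect()
    End Try
End Sub
```

Because the connection is opened in code, whatever value your configuration writes into the variable at runtime is the one actually used – which is the whole point.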

 

EDIT: If you get the “Type Microsoft.AnalysisServices.Server is not defined” error, check out this post, which guides you through adding a reference to the Script Task project.


SQL Service Account permission considerations

December 13, 2009

Recently I've had to do a lot of work concerning the setup and configuration of service accounts for SQL 2005 and 2008. An interesting issue surfaced when trying to get SSIS to connect to an Oracle server using the Microsoft OLE DB provider for Oracle (MSDAORA.1).

Our DEV and TEST environments were working fine, but I kept getting the “ORA-12170: TNS: Connect timeout occurred” error in the PROD environment. The error indicated that the Oracle client was successfully installed and that SSIS was able to talk to it, but that something else was wrong. After a few days spent checking logs, firewalls, re-installing the Oracle clients etc., we stumbled upon a breakthrough: when an administrator account was used to execute the package in PROD, it worked fine!

The service account we had set up as a credential/proxy to run the job step had all the right permissions as far as SQL Server was concerned; however, the issue proved to be that the service account had no rights on the machine itself. Adding the account as a user on the machine running the SSIS package solved the issue.
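For reference, granting the account local user rights can be done through Computer Management, or from an elevated command prompt on the SSIS machine (the domain and account name below are placeholders):

```
net localgroup Users MYDOMAIN\svc_ssis_proxy /add
```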


SSIS 2005 – Could not find part of the path

December 7, 2009

This error drove me and a colleague up the wall recently. We were trying to do a simple file copy in SSIS 2005 from a network location to a folder on the machine running the package. We were loading the source and destination at runtime from a table and setting the appropriate SSIS properties via expressions.

We kept getting the “Could not find part of the path” error complaining about our UNC source. Eventually we realised the destination path was not valid, as a folder in the path did not exist. The error was misleading: we spent a lot of time checking permissions etc. on the source path, but not the destination – where the problem actually was!
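A simple way to guard against this is to validate (or create) the destination folder in a Script Task before the file operation runs. A sketch, assuming the runtime destination path is held in a package variable called User::DestinationFolder (the variable name is invented):

```vb
' SSIS 2005 Script Task: ensure the destination folder exists before copying.
' User::DestinationFolder must be listed in the task's ReadOnlyVariables.
Dim destFolder As String = CStr(Dts.Variables("User::DestinationFolder").Value)

If Not System.IO.Directory.Exists(destFolder) Then
    ' CreateDirectory builds the full path, including any missing parent folders.
    System.IO.Directory.CreateDirectory(destFolder)
End If

Dts.TaskResult = Dts.Results.Success
```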


Case Sensitivity in SSIS Lookups

September 15, 2009

Sooner or later you will come across an instance where lookups in SSIS 2005/2008 fail due to attempted matches on values which do not have exactly the same case. This is because the lookup transform itself is case-sensitive.

A possible solution is to use partial caching, which forces the comparison to be made at the database level (assuming your collation is case-insensitive). This option will usually come with a performance hit, however.

The best plan is to use the UPPER function in your T-SQL (or a Character Map transform) on your incoming values, and to force uppercase in the lookup query itself so that you are guaranteed no misses. Jamie Thomson covers this in further detail on his old blog.
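As an illustration (the table and column names are invented), the lookup's reference query would uppercase the join column to match an already-uppercased input column:

```sql
-- Reference query for the Lookup transform: uppercase the business key
-- so it matches the UPPER()'d / Character Mapped incoming column exactly.
SELECT CustomerKey,
       UPPER(CustomerCode) AS CustomerCode
FROM   dbo.DimCustomer
```

With both sides forced to the same case, the in-memory (case-sensitive) comparison can no longer miss on casing alone.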


Using MDX queries in SSIS without OpenQuery

September 6, 2009

Recently I needed to use data from an SSAS 2005 cube as a source within an SSIS data flow. I thought this was going to be pretty simple, but I only managed to do it nicely after a bit of searching around.

First I tried the obvious way, i.e. creating a new OLE DB connection by right-clicking in the connections pane and choosing the “Native OLE DB\Microsoft OLE DB Provider for Analysis Services 9.0” provider. In the data flow I used an OLE DB source and typed in my MDX query. Preview worked fine, but executing the package kept giving me an error (80004005). Darren Gosbell covers this briefly here, and also looks at the linked-server-with-OpenQuery method, which I have used elsewhere.

While the linked server solution does work, it certainly isn't elegant and does require server settings to be changed. This is not always feasible in reality, or at least not without due process being followed in production environments. Luckily I came across another solution. Instead of using the OLE DB connection, use an ADO.NET connection with the “.Net Providers for OleDb\Microsoft OLE DB Provider for Analysis Services 9.0” provider, and configure the connection to your cube. From there use a DataReader source and configure it as per normal, substituting your MDX query for the T-SQL. The slight downside is that all fields are returned as DT_NTEXT; you will need to convert to DT_WSTR first and then perform a second conversion to the required types. Still better than OpenQuery!
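As an illustration, the DataReader source simply takes MDX where T-SQL would normally go (the cube, hierarchy and measure names here are hypothetical):

```
SELECT { [Measures].[Sales Amount] } ON COLUMNS,
       [Product].[Category].MEMBERS ON ROWS
FROM   [MyCube]
```

Downstream, a Data Conversion transform handles the DT_NTEXT → DT_WSTR step, followed by a second conversion (e.g. DT_WSTR → DT_NUMERIC) for the measure columns.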