Corrupt SSAS databases – a partial solution

February 9, 2011

Today (for the second time in my experience) I had a VM go corrupt and lost the normal operation of my SSAS 2008 R2 instance. I noticed that only one database was showing up in SSMS, but even that one wasn't working, i.e. you could not query or browse the cube due to error dialogs popping up.

Strangely, the other SSAS databases were still queryable (i.e. dashboards worked) – these were the ones NOT showing up in SSMS.

I ran a SQL Profiler trace and noticed the following when trying to start the SSAS service:

Errors in the metadata manager. The dimension with ID of 'Dim Customer', Name of 'Customer' referenced by the 'Customer Clusters ~MC' cube, does not exist. Errors in the metadata manager. An error occurred when loading the Customer Clusters ~MC cube, from the file, '\\?\C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\Data\Adventure Works DW 2008R2.0.db\Customer Clusters ~MC.3.cub.xml'.

I looked at my data folder (as per the error above) and it seemed like a number of the XML files SSAS uses to store object definitions were no longer named correctly – the “~MC” part didn't seem right.

Luckily for me it was the AdventureWorks database that was corrupt, and I needed access to the other databases, which hold my demo work. So I did the following:

  1. Stop the SSAS service
  2. Browse to the Data folder
  3. Delete the following:
  • File: Adventure Works DW 2008R2.8.db.xml
  • Folder: Adventure Works DW 2008R2.0.db

This effectively removes the corrupt AdventureWorks data files. I then restarted the SSAS service and was able to work as per usual.
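
For reference, the same steps could be scripted. Here is a rough PowerShell sketch, assuming the default-instance service name MSSQLServerOLAPService and the data path from the error above – adjust both to suit your instance:

# Rough sketch of the manual recovery steps above.
# Assumes a default SSAS instance; adjust the path and service name to suit.
$dataDir = "C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\Data"

Stop-Service "MSSQLServerOLAPService"

# Remove the corrupt database's metadata file and its data folder
Remove-Item (Join-Path $dataDir "Adventure Works DW 2008R2.8.db.xml")
Remove-Item (Join-Path $dataDir "Adventure Works DW 2008R2.0.db") -Recurse

Start-Service "MSSQLServerOLAPService"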

So this is not the ideal outcome (i.e. the corrupt cube is lost), but it does allow your SSAS instance to return to normal behaviour when one database is causing issues and you are unable to manage it with SSMS.


Adding SSAS roles via PowerShell

September 6, 2010

I needed to delve into PowerShell again recently as a means to automate the creation of a long list of roles and the addition of AD groups to them.

Since there were so many, I thought it would be easier to keep the list of roles in a text file and just parse that to create all the roles automatically. The script posted here is a first pass that was used during development, and it does the following:

  • Iterate through a text file containing a list of roles.
  • For each line found:
    • Check if the role exists and delete it if so (the script caters for an initial role creation, so the delete was mostly for testing).
    • Create a new role object from the current line in the source text.
    • Add the role to the SSAS database.
    • Add a database permission for the role.
    • Add cube permissions for the role to all cubes in the database.
    • Associate an AD group and an allowed set with the role – my example uses [Organisation].[Cost Centre Code], where the member names are the same as the role names. Another assumption is that the AD group has the same name as the role. (One could easily extend this script to parse a CSV for more complex requirements.)
    • Log any errors found when adding new objects to the database.


I got the basic code from Darren’s post and made a bunch of changes to suit my requirements. So here it is…

# Script to create roles in EIS cube databases from a text file

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null

$logFile = "PATH TO LOG FILE"
$sw = New-Object System.IO.StreamWriter($logFile)

$svr = New-Object Microsoft.AnalysisServices.Server
$svr.Connect("localhost")

$dbname = "YOUR_SSAS_DB_NAME"
$db = $svr.Databases.Item($dbname)

# Print the database name
"Adding Roles to Database: " + $db.Name

# Read the list of role names, one per line
$roleLines = Get-Content "PATH TO TEXT FILE CONTAINING LIST OF ROLE NAMES"

# Repeat for every line in the file
foreach ($roleName in $roleLines)
{
    "Processing Role for: " + $roleName
    $groupName = "DOMAIN_NAME\" + $roleName

    # Check if the role exists and delete it if so (mainly for testing;
    # a deleted role is recreated on the next run of the script)
    $roleToDelete = $db.Roles.FindByName($roleName)

    if ($roleToDelete)
    {
        "Existing role found... deleting!"

        $error.Clear()
        $roleToDelete.Drop("AlterOrDeleteDependents")

        if ($error[0])
        {
            $sw.WriteLine($roleName + ": Error deleting existing : " + $error[0])
        }
    }
    else
    {
        # Create the role and add the matching AD group as a member
        [Microsoft.AnalysisServices.Role] $roleToCreate = New-Object Microsoft.AnalysisServices.Role($roleName)
        $roleToCreate.Members.Add($groupName) | Out-Null

        # Add the role to the database
        "... '" + $roleToCreate.Name + "' adding to database"
        $db.Roles.Add($roleToCreate)

        $error.Clear()
        $roleToCreate.Update()

        if ($error[0])
        {
            $sw.WriteLine($roleName + ": Error Adding to DB: " + $error[0])
        }
        else
        {
            # Grant the role Read access at the database level
            "... '" + $roleToCreate.Name + "' updating database permissions"
            $dbPerm = $db.DatabasePermissions.Add($roleToCreate.ID)
            $dbPerm.Read = [Microsoft.AnalysisServices.ReadAccess]::Allowed

            $error.Clear()
            $dbPerm.Update()

            if ($error[0])
            {
                $sw.WriteLine($roleName + ": Error updating DB Perms: " + $error[0])
            }
            else
            {
                # Grant the role Read access on every cube in the database
                foreach ($cube in $db.Cubes)
                {
                    "... '" + $roleToCreate.Name + "' adding to permissions in : '" + $cube.Name + "' cube"
                    $cubePerm = $cube.CubePermissions.Add($roleToCreate.ID)
                    $cubePerm.Read = [Microsoft.AnalysisServices.ReadAccess]::Allowed

                    $error.Clear()
                    $cubePerm.Update()

                    if ($error[0])
                    {
                        $sw.WriteLine($roleName + ": Error updating Cube Perms: " + $error[0])
                    }
                }

                if (!$error[0])
                {
                    # Create the dimension permission and restrict the role to its
                    # own cost centre member via an allowed set
                    "... '" + $roleToCreate.Name + "' adding to Org Structure Dim permissions"
                    $dim = $db.Dimensions.FindByName("Organisation")
                    $att = $dim.Attributes.FindByName("Cost Centre Code")

                    $dimPerm = $dim.DimensionPermissions.Add($roleToCreate.ID)
                    $dimPerm.Read = [Microsoft.AnalysisServices.ReadAccess]::Allowed

                    "... '" + $roleToCreate.Name + "' adding cost centre restriction"
                    $attPerm = $dimPerm.AttributePermissions.Add($att.ID)
                    $attPerm.AllowedSet = "{[Organisation].[Cost Centre Code].&[" + $roleName + "]}"

                    $error.Clear()
                    $dimPerm.Update()

                    if ($error[0])
                    {
                        $sw.WriteLine($roleName + ": Error updating Dimension Perms: " + $error[0])
                    }
                }
            }
        }
    }
}

# Cleanup
$sw.Close()
$svr.Disconnect()
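
For completeness, the input file is simply one role name per line. Since the roles here are named after cost centre codes, a hypothetical input file (the codes below are made up) might look like:

CC1001
CC1002
CC1003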


Dynamically Processing SSAS Objects with AMO in SSIS

December 13, 2009

I've been meaning to write about my experience with this for some time, and now that I'm finally blogging once again, let's not waste any more time!

One of the drawbacks of the “Analysis Services Processing” task in SSIS in SQL 2005 is that it does not honour dynamic connections, i.e. those set at runtime via configurations, variables and/or expressions. Debugging the package will show that the runtime ConnectionString is indeed set; however, the design-time setting is used! At this time I'm not sure whether this applies to SQL 2008 as well.

The whole point of having dynamic connections is to allow a single package to be used in multiple environments (DEV, TEST, PROD, etc.) by using configurations to set connections. This caused me much frustration, and it took a few hours of debugging and tracing my SSAS server to figure out what was going on.

I started hunting for a better solution and came across the following posts describing the use of a Script Task and Analysis Management Objects (AMO).

This technique works wonderfully and has the added advantage of catering for newly added dimensions automatically if you simply loop over all the dimensions in the database. If you don't see the AMO namespace, make sure you are patched up to the latest service pack. Also be sure to consider whether you want a full process or not, as this is usually not the default setting; use the ProcessType.ProcessFull parameter to deal with this.

EDIT: If you get the “Type Microsoft.AnalysisServices.Server is not defined” error, check out this post, which guides you through adding a reference to the Script Task project.
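
As a rough illustration of the AMO calls involved (the posts above use them inside an SSIS Script Task), here is a minimal PowerShell sketch that fully processes every dimension and then every cube in a database; the server and database names are placeholders:

# Minimal AMO sketch: ProcessFull every dimension, then every cube.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null

$svr = New-Object Microsoft.AnalysisServices.Server
$svr.Connect("localhost")   # in SSIS this would come from the dynamic connection

$db = $svr.Databases.Item("YOUR_SSAS_DB_NAME")

# Looping over all dimensions picks up newly added ones automatically
foreach ($dim in $db.Dimensions)
{
    $dim.Process([Microsoft.AnalysisServices.ProcessType]::ProcessFull)
}

foreach ($cube in $db.Cubes)
{
    $cube.Process([Microsoft.AnalysisServices.ProcessType]::ProcessFull)
}

$svr.Disconnect()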


Using MDX queries in SSIS without OpenQuery

September 6, 2009

Recently I needed to use data from an SSAS 2005 cube as a source within an SSIS data flow. I thought this was going to be pretty simple, but I only managed to do it nicely after a bit of searching around.

First I tried the obvious way, i.e. creating a new OLE DB connection by right-clicking in the connections pane and choosing the “Native OLEDB\Microsoft OLE DB Provider for Analysis Services 9.0” provider. In the data flow I used an OLE DB source and typed in my MDX query. Preview worked fine, but executing the package kept giving me an error (80004005). Darren Gosbell covers this briefly here, and also looks at the linked server with OpenQuery method, which I have used elsewhere.

While the linked server solution does work, it certainly isn't elegant, and it requires server settings to be changed. This is not always feasible in reality – or at least not without due process being followed in production environments. Luckily I came across another solution. Instead of using the OLE DB connection, use an ADO.NET connection with the “.Net Providers for OleDb\Microsoft OLE DB Provider for Analysis Services 9.0” provider and configure the connection to your cube. From there, use a DataReader source and configure it as per normal, substituting your MDX query for the T-SQL. The slight downside is that all fields are returned as DT_NTEXT, so you will need to convert to DT_WSTR first and then perform a second conversion to the required types. Still better than OpenQuery!
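
To show what the DataReader source is doing under the covers, here is a rough PowerShell sketch that runs an MDX query through the same OLE DB provider via ADO.NET; the connection string, cube and query are made-up placeholders (MSOLAP.3 is the 9.0 provider):

# Hypothetical example: run MDX over ADO.NET via the Analysis Services
# OLE DB provider (MSOLAP.3 = the 9.0 version referenced above).
$connStr = "Provider=MSOLAP.3;Data Source=localhost;Initial Catalog=YOUR_SSAS_DB"
$conn = New-Object System.Data.OleDb.OleDbConnection($connStr)
$conn.Open()

$mdx = "SELECT [Measures].DefaultMember ON COLUMNS FROM [YOUR_CUBE]"
$cmd = New-Object System.Data.OleDb.OleDbCommand($mdx, $conn)
$reader = $cmd.ExecuteReader()

# In an SSIS DataReader source these columns would surface as DT_NTEXT
while ($reader.Read())
{
    for ($i = 0; $i -lt $reader.FieldCount; $i++)
    {
        Write-Output ($reader.GetName($i) + " = " + $reader.GetValue($i))
    }
}

$reader.Close()
$conn.Close()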