Missing “Report Library” template in SharePoint 2010

February 23, 2011

When working with the BI features of SharePoint 2010 you may come across a situation where you have configured SSRS integration and then want to add a Report Library. If you go to your Libraries and hit Create, you get the template selection modal dialog (this requires Silverlight to be installed).

Here, under the Data category, you would expect to see the “Report Library” template along with the other BI templates. If you don't, it probably means that you don't have the Enterprise features enabled for the Site or Site Collection. Information on how to solve this issue is hard to find on the web, so here is what you need to do to get it to appear.

  1. Go to Central Administration
  2. Click Site Actions -> Site Settings
  3. Under Site Collection Administration, select “Site Collection Features”
  4. Activate “SharePoint Server Enterprise Site Collection Features”

This should fix your problem and enable the Report Library and other templates, which you can now add to the site. You may need to edit the library to allow the Report content types (effectively allowing the .rdl extension).
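If you prefer to script the activation rather than click through Site Settings, here is a minimal sketch using the SharePoint server object model. The site URL is a placeholder, and looking the feature up by the internal name “PremiumSite” (the name commonly associated with the Enterprise site collection features) is an assumption – verify that name on your own farm first.

using System;
using System.Linq;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

class ActivateEnterpriseFeatures
{
    static void Main()
    {
        // Placeholder URL – point this at your own site collection.
        using (SPSite site = new SPSite("http://yourserver/sites/bi"))
        {
            // "PremiumSite" is assumed to be the internal name of
            // "SharePoint Server Enterprise Site Collection Features".
            SPFeatureDefinition definition = SPFarm.Local.FeatureDefinitions
                .FirstOrDefault(d => d.DisplayName == "PremiumSite");

            // Activate only if it exists and is not already active on the site collection.
            if (definition != null && site.Features[definition.Id] == null)
            {
                site.Features.Add(definition.Id);
            }
        }
    }
}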


Corrupt SSAS databases – a partial solution

February 9, 2011

Today (for the second time in my experience) I had a VM go corrupt and lost the normal operation of my SSAS 2008 R2 instance. I noticed that only one database was showing up in SSMS, but it wasn't working, i.e. you could not query/browse the cube due to error dialogs popping up.

Strangely, the other SSAS databases were queryable (i.e. dashboards worked) – these were the ones NOT showing up in SSMS.

I ran a SQL Profiler trace and noticed the following when trying to start the SSAS service:

Errors in the metadata manager. The dimension with ID of ‘Dim Customer’, Name of ‘Customer’ referenced by the ‘Customer Clusters ~MC’ cube, does not exist. Errors in the metadata manager. An error occurred when loading the Customer Clusters ~MC cube, from the file, ‘\\?\C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\Data\Adventure Works DW 2008R2.0.db\Customer Clusters ~MC.3.cub.xml’.

 

I looked at my data folder (as per the error above) and it seemed like somehow a number of the XML files SSAS uses to store definitions of objects were no longer named correctly… the “~MC” didn't seem right.

Luckily for me it was the AdventureWorks database that was corrupt. I needed access to the other databases, which have my demo work on them. So I did the following:

  1. Stop the SSAS service
  2. Browse to the Data folder
  3. Delete the following:
  • File: Adventure Works DW 2008R2.8.db.xml
  • Folder: Adventure Works DW 2008R2.0.db

This effectively removes the corrupt AdventureWorks data files. Then I restarted the SSAS service and was able to work as usual.
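If you find yourself needing to do this more than once, the manual steps can be scripted. The sketch below assumes a default SSAS instance (service name MSSQLServerOLAPService) and uses the data folder and file names from my case – adjust both before running anything like this, and remember the deleted database is gone.

using System.IO;
using System.ServiceProcess;

class RemoveCorruptSsasDatabase
{
    static void Main()
    {
        // Default-instance service name; named instances are MSOLAP$<InstanceName>.
        var ssas = new ServiceController("MSSQLServerOLAPService");

        // 1. Stop the SSAS service.
        ssas.Stop();
        ssas.WaitForStatus(ServiceControllerStatus.Stopped);

        // 2. The Data folder, as per the path in the error message above.
        string dataFolder = @"C:\Program Files\Microsoft SQL Server\MSAS10_50.MSSQLSERVER\OLAP\Data";

        // 3. Delete the corrupt database's definition file and folder.
        File.Delete(Path.Combine(dataFolder, "Adventure Works DW 2008R2.8.db.xml"));
        Directory.Delete(Path.Combine(dataFolder, "Adventure Works DW 2008R2.0.db"), true);

        // 4. Restart the service.
        ssas.Start();
        ssas.WaitForStatus(ServiceControllerStatus.Running);
    }
}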

So this is not the ideal outcome (i.e. the cube is lost) but it does allow your SSAS instance to return to normal behaviour if one database is causing issues and you are unable to manage it with SSMS.


Incorrect format for date params (SSRS Integrated mode or using Report Viewer web part)

February 5, 2011

Yesterday I had a client ring me up saying some reports I deployed for testing were not functioning. The issue was that an input parameter of Date type was not accepting the date they chose – if they used the calendar picker. They got the error: “The value provided for the report parameter ‘Date’ is not valid for its type.”

I am in Australia and we use the non-US (and more generally used) dd/mm/yyyy format for dates. Closer inspection showed that after the users picked the date it was actually filling in the parameter using the incorrect (i.e. US) format: mm/dd/yyyy. We checked the locale settings in the report definition and on the server itself and all was well: en-AU.
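To see why the picker's value blows up, here is a quick illustration of the underlying .NET behaviour (nothing SSRS-specific): the same dd/mm/yyyy string parses fine under en-AU but throws under en-US, which is effectively the error the users were getting.

using System;
using System.Globalization;

class DateCultureDemo
{
    static void Main()
    {
        string picked = "23/02/2011"; // what an Australian user selects in the calendar picker

        // Parses as 23 February 2011 under the Australian culture.
        DateTime ok = DateTime.Parse(picked, new CultureInfo("en-AU"));
        Console.WriteLine(ok.ToString("d MMMM yyyy"));

        // Under a US culture the same string is read as month 23 and throws.
        try
        {
            DateTime.Parse(picked, new CultureInfo("en-US"));
        }
        catch (FormatException)
        {
            Console.WriteLine("Not valid for its type, in effect.");
        }
    }
}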

The fix for this was found via Google – another Aussie seems to have run across this problem.

Essentially you need to edit the ReportViewer.aspx file and add the culture setting. The file is here by default:

C:\Program Files\Microsoft SQL Server\MSSQL.3\Reporting Services\ReportServer\Pages

At the top the page directive needs a change, as shown:

<%@ Page Language="C#" AutoEventWireup="true" Inherits="Microsoft.ReportingServices.WebServer.ReportViewerPage" Culture="en-AU" %>

Check out the comments on the link for some alternate solutions too… it seems this fix didn't work for everyone.


A PowerPivot success story

February 4, 2011

I've always been impressed with and realised the (forgive the pun) “power” of PowerPivot. However, recently I've been charged with leading my organisation down a path of Microsoft BI across the enterprise – 1000+ employees.

So now I've really forced myself to join the PowerPivot revolution in a more hands-on fashion, seeing it as an ideal tool to give to our managers while we build the formal warehouse. I've used it to prototype chargeables and utilisation tracking across our timesheets and I'm happy with the result so far. Now that I've had more experience with it and am much more comfortable with it, I had a great success story today at a client.

Background

The client called me in to advise on some best practices around Microsoft BI and help them address some issues around their attempt at building a cube. I also ran them through SharePoint 2010/Excel Services/PerformancePoint (which they LOVED)! But that's beside the point.

The current situation saw them loading masses of data straight from production servers into classic Excel pivot tables (1GB+ workbooks), refreshing them daily and then carrying them around – very sales-force oriented. They also have a number of power users who are very comfortable with Excel but were hampered by its classic pivot limitations. They also manage to bring down their prod DB every now and then. Not good for an ERP (or any system really)!

 

Prototype Solution:

So I fixed up their cube to make it work (though it was far from best practice – based on a single denormalised table!) and then I pulled those rows into PowerPivot – which they had never seen before.

Asked them to take a look at the row count – 1.7 million.

Asked them to save the file and take a look at the size – 6.65MB! “How does it do that?” was the question. Enter an explanation of VertiPaq 🙂

Once they understood the flexibility of multiple data sources and custom measures, plus the compression factor, they went nuts! Add to that the web-based delivery via SharePoint – sold! They now have a lot to play with and think about, and they are much closer to making a decision to implement SharePoint 2010. All this in 3 hours!

I think the best thing Microsoft did was make the PowerPivot add-on free. Now that's value 🙂


SSIS 2008 – Solving DB2 connectivity with manual ConnectionString edits

February 4, 2011

Yesterday at a client I came across a new situation with SSIS 2008. Well, this wasn't really something to do with SSIS itself, and I am sure this problem could manifest with other ETL tools as well.

 

Background:

We needed to build a package that connects to an IBM DB/2 server. The DB/2 connection uses a specific username and password, embedded in the package and protected with a package password (i.e. not a user key). We built and tested this on a developer workstation, then deployed to the formal DEV box for integration testing.

 

The problem:

So we built the package and tested it successfully locally on a workstation. Once deployed to the formal DEV server, we tried to schedule it via a job. This job was set up to use a specific SQL Server proxy with SSIS-related permissions granted. All standard thus far.

Now we ran the job and found the DB/2 Connection Manager (using the native IBM OLE DB DB2 provider) was throwing an error. Unfortunately I can't remember the exact SSIS error message and code, but it was along the lines of “could not acquire connection” and the specific error string was “Access is Denied”. This threw us off!

This led me to believe that we had not got our SQL proxy permissions correct or had not deployed the package with sensitive information correctly. So we double-checked this on the developer workstation, re-deploying the package using “Rely on server storage for encryption”. The job still ran fine from the developer workstation.

So I got together with a DBA and we ran a DB/2 trace, trying both our job and also a package that had actually been developed on the server over RDP. Strangely, we could not even get the trace to start with our job, yet the package developed on the server was fine – it could read rows from DB/2 and wrote to the trace.

Next we compared connection strings and found the issue!

Our connection string had: Provider=IBMDB2.1

The working package had: Provider=IBMDB2.DB2COPY1

 

Very strange indeed. It turns out someone had had some issues on the DEV server with both the Microsoft and IBM DB2 providers installed at one time. Their package had not worked for some reason, so they removed the Microsoft provider, but in the process also uninstalled the IBM 32-bit DB2 driver and then installed a 64-bit version afterwards. The uninstalls may not have been clean, so the latest install had registered the provider as IBMDB2.DB2COPY1.

 

The Temporary Solution:

Edit the ConnectionString property of the Connection Manager to match the provider string on the server, then redeploy. Voila! However, this is not a permanent solution. We want to have all environments standard and not have to change the provider name in the connection string (even if via configuration). It just feels wrong to do this. Normally only the server location, catalog and authentication details are changed via configs.
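For what it's worth, the edit can also be scripted rather than done by hand in BIDS. A rough sketch using the SSIS runtime API is below – the package path and connection manager name are placeholders, not the real ones from this project.

using Microsoft.SqlServer.Dts.Runtime;

class FixDb2Provider
{
    static void Main()
    {
        var app = new Application();

        // Placeholder package path and connection manager name.
        string packagePath = @"C:\Packages\LoadFromDb2.dtsx";
        Package package = app.LoadPackage(packagePath, null);
        ConnectionManager db2 = package.Connections["DB2 Source"];

        // Swap only the Provider token so it matches what is registered on the target server.
        db2.ConnectionString = db2.ConnectionString
            .Replace("Provider=IBMDB2.1", "Provider=IBMDB2.DB2COPY1");

        app.SaveToXml(packagePath, package, null);
    }
}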

If anybody has some experience on how to get the provider fixed on the server, I would love to hear your suggestions!


VMware Workstation – copying, snapshots, cloning

February 4, 2011

A lot of people these days use Virtual Machines for testing and development due to their flexibility and versatility.

I make heavy use of them to host demos and also to have an instance of various environments at hand. For example, I have the following VMs:

  • Windows XP
  • Windows Server 2003/SQL 2005/SharePoint 2007/PPS07
  • Windows Server 2003 R2/SQL 2008
  • Windows Server 2008 R2/SQL 2008 R2/SharePoint 2010
  • Windows Server 2008R2/Denali

Sometimes when building an image I like to keep a complete “snapshot” backed up at various points so I can revert to it if need be. I also want to use the “snapshot” as a base for future work. For example, once I have Windows Server 2008 R2 installed I can store that away so that next time I have it ready to go as the base for a different VM.

I say “snapshot” in quotes here because I did not use the Snapshot feature of VMware Workstation. That feature lets you keep a single physical copy of a VM with snapshots stored as part of the whole VM definition. Note that many native VMware snapshots on a VM can degrade performance over time due to the differential nature of the storage after snapshotting. Anyway, I specifically wanted a separate image that I can pick up at any time and work with independently.

So, in order to satisfy my requirement I would simply copy my VM folder to a backup location after each significant point of development. All good? Not really! One day I had some corruption in a VM and needed to revert to a previous copy. I booted it up only to find it was behaving strangely and wasn't the copy I expected it to be. I thought somehow it had gotten corrupt/confused as well.

After a few days I looked at the VM settings and figured it out. When you copy a VM it still points to the virtual hard disk at the original location! So my pristine backup was booting off the corrupted virtual disk! There are two ways out of this.

  1. Reconfigure the backup VM to point to the virtual hard drive in its own folder (a quick way to spot a mis-pointed copy is sketched after this list)
  2. The preferred method – to avoid this situation in the first place – don't simply copy the VM to back it up. Use the Clone option in Workstation and it will make a proper independent copy. You can also use this to rename a Virtual Machine safely.
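If you already have plain copies lying around, a quick way to spot a mis-pointed one is to look at the disk entries in its .vmx file: an absolute path that points outside the VM's own folder is the red flag. A rough sketch (the .vmx path is a placeholder):

using System;
using System.IO;

class CheckVmxDiskPaths
{
    static void Main()
    {
        // Placeholder – point this at the copied VM's .vmx file.
        string vmxPath = @"D:\VM Backups\Win2008R2-SQL2008R2\Win2008R2-SQL2008R2.vmx";
        string vmFolder = Path.GetDirectoryName(vmxPath);

        foreach (string line in File.ReadAllLines(vmxPath))
        {
            // Disk entries look like: scsi0:0.fileName = "SomeDisk.vmdk"
            if (!line.Contains(".fileName")) continue;

            string value = line.Split('=')[1].Trim().Trim('"');

            // A rooted path outside the VM's own folder means the copy is still
            // using the original VM's virtual disk.
            if (Path.IsPathRooted(value) &&
                !value.StartsWith(vmFolder, StringComparison.OrdinalIgnoreCase))
            {
                Console.WriteLine("Suspicious disk reference: " + line.Trim());
            }
        }
    }
}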

This may seem like a trivial and newbie mistake – it probably is. But hey, if it took me years to run into this situation, perhaps this info will help someone else!