SQL Server Analysis Services (SSAS) multidimensional cube: How to create a role playing dimension?


Summary:

I was recently asked by an SSAS developer how to set up a role-playing dimension, so I did a brief demo to show how to do it. In this post, I am documenting the steps to help you get started.

What is a role-playing dimension?

What if you want to see sales data by Order Date as well as by Shipped Date? You can use one date dimension more than once to see sales data by both order date and shipped date. In this case, the date dimension becomes a role-playing dimension.

Steps:

1. Open SQL Server Data Tools and the SSAS project that you are working on.

2. Go to Data Source Views in the Solution Explorer.

3. Make sure that relationships are specified between the dimension (e.g. the date dimension) and the fact table (e.g. FactSales). Example: if you have OrderDateSK and ShippedDateSK in your fact table, then establish a relationship between each of those columns and the primary key of the date dimension (a T-SQL sketch of these relationships appears near the end of this post).

[Image: SSAS Data Source View, role-playing dimension]

Note: If you do not have the relationships specified in the Data Source View, there is an alternative: go to the Dimension Usage tab of the cube and set up the role-playing dimensions there. Since this post is targeted at getting you started, I am not covering those steps.

4. Now go to the Dimensions section in the Solution Explorer and add a dimension that uses the existing data source view from step #3.

5. Switch to Cubes.

Now, if you don't have a cube created yet, you can use the Cube Wizard; as long as you have the relationships specified in the DSV, it should create the role-playing dimension for you.

Alternatively, you can add the dimension to an existing cube. In the cube designer, in the lower-left section, you'll see the Dimensions pane.

[Image: Add Dimensions, SSAS cube]

Select the role-playing dimension that you created in step #4 from the list and click OK. That creates the role-playing dimension for you! So all you had to do was make sure the relationships are created between the fact table and the (role-playing) dimension table, and SSAS does the rest of the work for you.

[Image: SSAS role-playing dimension]

 

Note: For the demo I didn't rename the dimension attributes, but you should do all that good stuff too when you are building the cube for your users!
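
For reference, here is a rough T-SQL sketch of the relationships that step #3 relies on. Whether you enforce them as actual foreign keys or only define them as logical relationships in the DSV, the idea is the same: both fact columns point at the same date dimension key. The table and column names other than OrderDateSK and ShippedDateSK are placeholders.

[code language="sql"]
-- Both fact columns reference the same date dimension key, which is
-- what lets the date dimension play two roles (Order Date, Shipped Date)
ALTER TABLE dbo.FactSales
 ADD CONSTRAINT FK_FactSales_OrderDate
 FOREIGN KEY (OrderDateSK) REFERENCES dbo.DimDate (DateSK);

ALTER TABLE dbo.FactSales
 ADD CONSTRAINT FK_FactSales_ShippedDate
 FOREIGN KEY (ShippedDateSK) REFERENCES dbo.DimDate (DateSK);
[/code]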

Conclusion:

In this post, you got started on how to add a role-playing dimension to an SSAS cube.

SSAS multidimensional cube: How to solve the duplicate attribute key error when the attribute is not a key?


Problem:

In an SSAS multidimensional cube, while processing the cube, you get an error/warning for a non-key "Price" column that says:

Errors in the OLAP storage engine: A duplicate attribute key has been found when processing: Table: ‘Table Name’, Column: ‘Price’, Value: ’29’. The attribute is ‘Price’.

You see, if it complained about duplicates in a key column, that would make sense, but it's complaining about a duplicate attribute key for an attribute that is not a key. How do you solve this?

Solution:

Mostly this happens because the view/table that you are using in your data source view has blanks/NULLs.

So, in your source, use the ISNULL/COALESCE T-SQL functions before bringing the data into the cube, or, if it makes sense for you, filter out the blank data.
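
For example, here's a minimal sketch of what that could look like in the source view. The table name is a placeholder; only the Price column comes from the error message above.

[code language="sql"]
-- Replace NULL prices with a default value before the cube reads the data
SELECT
 ISNULL([Price], 0) AS [Price] -- or COALESCE([Price], 0)
 -- , other columns ...
FROM dbo.YourSourceTable;

-- ...or, if it makes more sense for your model, filter the blanks out:
-- SELECT [Price] FROM dbo.YourSourceTable WHERE [Price] IS NOT NULL;
[/code]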

I hope that helps.

SQL Server Reporting Services: How to Solve Divide by Zero Errors?


Problem:

You thought you wrote an IIF condition to handle "divide by zero" errors in SSRS, but you still get the runtime error?

Example: my expression is:

[code language=”sql”]
=IIF(Fields!denominator.Value=0,0,
CINT(Fields!numerator.Value/Fields!denominator.Value))
[/code]

And I still get the error:
[Image: Divide by zero #Error in SSRS]

Note: if you are not dividing integers and you allow the data to be converted to double, it would show up in SSRS as "Infinity" instead of "#Error".

Solution:

Let's understand why it does this.

Even though you are using an IIF condition, the VB IIF function evaluates both the true and the false parts of the expression before it renders, and that's why the "false" part of my expression still gets evaluated, which results in #Error.

Ok, armed with that knowledge, let’s solve the problem.

So here's a modified version of the expression, have a look. We added one more IIF inside the "false" part of the parent IIF so that the denominator passed to the division is never zero:

[code language=”sql”]
=IIF(Fields!denominator.Value=0,0,
CINT(Fields!numerator.Value
/IIF(Fields!denominator.Value<>0,Fields!denominator.Value,1)))
[/code]

That should solve the problem:
[Image: Divide by zero fix in SSRS]

There's also an alternative, especially if you have a lot of expressions that do this: you can write custom code and call it SSRSDIVIDE, or you can come up with a better name! Here's a post that talks about how to do that: http://salvoz.com/blog/2011/11/25/ssrs-checking-for-divide-by-zero-using-custom-code/

Conclusion:
In this post, we saw how to solve the divide by zero errors in SSRS.

#sqlpass #msbi online event: “The Accidental Report Designer: Data Visualization Best Practices in #SSRS”


PASS Business Analytics VC presents “The Accidental Report Designer: Data Visualization Best Practices in SSRS” by Meagan Longoria. Here are the meeting details:

Date & Time: Thursday, June 19, 2014, 12:00 Eastern Daylight Time

RSVP URL:

http://bit.ly/PASSBAVC061914

Session Abstract:
Whether you are a DBA, a developer, or an analyst, there is a good chance that you will have to create reports for coworkers, company executives, or clients. As with any UI design, careful consideration should be given to your data visualization design to ensure you are effectively communicating the intended message and providing a good user experience. While the principles are applicable across reporting platforms and tools, this session will contain demos and examples implemented in Reporting Services using SQL Server Data Tools. Learn how to make information (not just data) the focus of your report and provide your audience with something better than just shiny!

Session Level:
Intermediate

Speaker BIO:
Meagan Longoria is a BI consultant with Valorem Consulting in Kansas City, Missouri. She has over 6 years of experience with the SQL Server BI stack and enjoys sharing her knowledge and experiences at SQL Saturdays. She is also one of the coordinators for SQL Saturday in Kansas City.
Contact URL: http://datasavvy.wordpress.com

See you there!

Paras Doshi
Chapter Co-Leader, PASS BA VC

SSAS Tabular: How to mark a table as a Date Table?


Problem:

For time intelligence in SSAS Tabular models, it’s important to mark a date table. So, how do you do that?

Solution:

Here are the steps:

1. After you have the data imported into a table, select the date table in the model designer.

2. Also, make sure that the necessary relationships are created between the date table and the fact tables.

3. Now, on the menu bar, select Table > Date > Mark as Date Table

[Image: SSAS Tabular, Mark as Date Table]

4. A dialog box will ask you for a unique identifier for the date table; this should be a date column that has unique values in the date table that you imported.

Also, make sure that the unique identifier column does not have blank values, because the model designer will not allow you to select it as your unique identifier otherwise. You may have to go back to your table properties and add logic to filter out the blank values, or make sure that any row with a blank value gets a value assigned to it.
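
If your date column does have blanks, a minimal sketch of the kind of filter or default you could apply in the source query is below. The table and column names are placeholders.

[code language="sql"]
-- Option 1: filter out the rows with a blank date
SELECT *
FROM dbo.DimDate
WHERE [Date] IS NOT NULL;

-- Option 2: substitute a default date for the blanks
-- SELECT ISNULL([Date], '1900-01-01') AS [Date], ... FROM dbo.DimDate;
[/code]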

After you do that, you’re all set to go! I hope this helps.

SQL Server Integration Services: How to solve the "The value violated the integrity constraints for the column" error?


Problem:

You are working on an SSIS package to load a table from a source system and you get the error "The value violated the integrity constraints for the column". How do you solve it?

Solution:

One of the things the error message should also tell you is the column name. What you want to do is check the table definition of the destination table for integrity constraints like NOT NULL or PRIMARY KEY. Once you have that information, go back to your source and figure out whether it's trying to load NULL values into a column that has a NOT NULL constraint, or maybe the ETL logic is trying to insert duplicate values into the column that has the primary key constraint.
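
A couple of quick diagnostic queries you can run against the source are sketched below; the table and column names are placeholders.

[code language="sql"]
-- Check for NULLs in a column that is NOT NULL at the destination
SELECT COUNT(*) AS NullRows
FROM dbo.SourceTable
WHERE SomeColumn IS NULL;

-- Check for duplicates in the column covered by the primary key
SELECT KeyColumn, COUNT(*) AS RowsPerKey
FROM dbo.SourceTable
GROUP BY KeyColumn
HAVING COUNT(*) > 1;
[/code]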

Also, don't alter the destination table to accept NULLs or remove the integrity constraints. You want to put the logic in your ETL or fix the data quality at the source. You can use T-SQL functions like ISNULL or COALESCE to handle NULL values while querying the source systems.

Conclusion:
In this post, we saw how to solve the “The value violated the integrity constraints for the column” error in SSIS.

SQL Server Analysis Services MDX: Current Fiscal Period vs Previous Year Same Fiscal Period Query


Problem:

Comparing the current fiscal period against the same fiscal period of the previous year is a very common requirement. Here's a sample query.

Solution:

Please use this query as a starting point for current fiscal period vs previous year same fiscal period. I used the Contoso cube to come up with the query. The key part is the PARALLELPERIOD function.

[code language="SQL"]

-- Calculated measure: Sales Amount for the same fiscal month of the previous year
-- (PARALLELPERIOD lags 12 members at the Fiscal Month level from the current member)
with member [Measures].[Sales Amount Previous Year Fiscal Month] as
SUM(ParallelPeriod([Date].[Fiscal Month].[Fiscal Month], 12, [Date].[Fiscal Month].CurrentMember), [Measures].[Sales Amount])

select
{[Measures].[Sales Amount], [Measures].[Sales Amount Previous Year Fiscal Month]} on 0,
NON EMPTY {[Date].[Fiscal Month].[Fiscal Month]} on 1
from
[Sales]

where
[Date].[Fiscal Year].&[2008]

[/code]

[Image: MDX current fiscal period vs previous year same fiscal period]

Conclusion:

In this post, I shared an MDX query that you can use for current fiscal period vs previous year same fiscal period comparisons.

SQL Server Integration Services: How to write a package that does set-based updates?


Problem:

If you have a sizable number of rows that need to be updated in SSIS, you don't want to issue row-by-row update commands because that won't be efficient. When a good number of rows need updating, you can use set-based updates instead. It's a common design pattern for loading dimensions in a data warehouse.

Find the steps below:

Solution:

There are two main steps to achieve this:

1) Populate the “update” table with rows that have been changed. Note that a new table needs to be created.

2) Run the SQL command to do a SET based update

[Image: SSIS set-based updates, Integration Services]

Let's see each step in detail:

1)  Populate the “update” table with rows that have changed.

For this step, first make sure that you have a table that can hold the rows that need to be updated.
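
As a rough sketch, such an update/staging table just mirrors the dimension's surrogate key and the columns you want to update. The data types below are placeholders; the table and column names match the sample query later in this post.

[code language="sql"]
-- Staging table that holds the changed rows
CREATE TABLE dbo.Destination_update
(
 Destination_sk INT NOT NULL, -- surrogate key used to join back to dbo.DimDestination
 [Column1] NVARCHAR(50) NULL, -- placeholder data types; match your dimension
 [Column2] NVARCHAR(50) NULL
);
[/code]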

Then create a Data Flow that takes the source data, looks up which rows have changed, and puts them into the update/staging table:

[Image: SSIS Data Flow, populate the table with rows that changed]

Note: I've used a small table for demo purposes, but you wouldn't use this method unless you have a fair number of rows to update, because as you can see it adds the overhead of putting the data into the update table first.
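
For reference, if your source data is already available on the same SQL Server instance, a set-based alternative to the Data Flow lookup for populating the staging table could look like the sketch below. The source table name (dbo.StgDestination) and the join key are placeholders; in practice you would join on your business key.

[code language="sql"]
-- Sketch: stage only the rows whose tracked columns differ from the dimension
INSERT INTO dbo.Destination_update (Destination_sk, [Column1], [Column2])
SELECT Dim.Destination_sk, Src.[Column1], Src.[Column2]
FROM dbo.StgDestination Src
INNER JOIN dbo.DimDestination Dim
 ON Dim.Destination_sk = Src.Destination_sk -- placeholder join; use your business key
WHERE ISNULL(Dim.[Column1], N'') <> ISNULL(Src.[Column1], N'')
 OR ISNULL(Dim.[Column2], N'') <> ISNULL(Src.[Column2], N'');
[/code]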

2) Run the SQL command to do a SET based update

Here’s the sample query:

[code language="sql"]
-- Run the set-based update: join the dimension to the staging table
-- and copy over the changed column values
UPDATE Dim
SET
 [Column1] = Upd.[Column1],
 [Column2] = Upd.[Column2]
 -- ,[Column3] = Upd.[Column3]
 -- ...
FROM dbo.DimDestination Dim
INNER JOIN dbo.Destination_update Upd
 ON Dim.Destination_sk = Upd.Destination_sk;

-- Empty the staging table so it's ready for the next load
TRUNCATE TABLE dbo.Destination_update;
[/code]

Conclusion:

In this post, we saw how to write a package in SSIS that does SET based updates.

Analyzing user survey data using Text Mining techniques for a Startup


Business Summary:

A startup needed a way to extract insight from 80,000 words (120 pages) of user survey data.

Technical Summary:

I used text mining techniques to deliver the analysis in 1,100 words (1.5 pages), condensed from 80,000 words (120 pages). That's roughly 72x smaller than the original size.

The analysis gave insight into a key competitive advantage that they had against established players in the market.

[Image: Text mining analytics]


SQL Server Analysis Services: How to import cube metadata into SQL Server Data Tools?


Problem:

Someone on your team deployed a cube and gave you access to it. Now, as an IT developer on the team, you want to know the calculation that was used for some of the measures. How do you get it? You could email the contact person, right? Sure, but let me show you how you can decrease the email back-and-forth using the following steps.

Note: this is applicable to both the multidimensional model and the tabular model.

Solution:

1. Open SSDT

2. File > New > Project

3. Installed Templates > Business Intelligence > Analysis Services > Import from Server

[Image: SQL Server Analysis Services, Import from Server template]

4. You'll get a dialog box; enter the server name and the database with the cube/model metadata that you want to import.

[Image: Import Microsoft Business Intelligence Analysis Services cube]

5. In the Solution Explorer, you should be able to see the cube metadata! Now you can go look at the measure calculation that you wanted to see, or maybe the KPI rule, or anything else in the cube that you needed to know.

Conclusion:

In this post, I described how you can import a SQL Server Analysis Services cube/model from the server.