I was talking with a database administrator about the different categories that SQL commands fall into, and I thought it would be great to document them here. So here you go:
| ACRONYM | DESCRIPTION | SQL COMMANDS |
|---------|-------------|--------------|
| DML | Data Manipulation Language: SQL statements that affect records in a table. | SELECT, INSERT, UPDATE, DELETE |
| DDL | Data Definition Language: SQL statements that create or alter a table's structure. | CREATE, ALTER, DROP |
| DCL | Data Control Language: SQL statements that control the level of access that users have on database objects. | GRANT, REVOKE |
| TCL | Transaction Control Language: SQL statements that help you maintain the integrity of data by allowing control over transactions. | COMMIT, ROLLBACK |
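To make these concrete, here's a quick sketch with one statement from each category; the Employees table and the ReportingUser login are hypothetical, so substitute your own objects:

[code language="SQL"]
-- DML: affects records in a table
UPDATE Employees SET Salary = Salary * 1.05 WHERE Department = 'Sales';

-- DDL: creates/alters the table structure
ALTER TABLE Employees ADD HireDate DATE;

-- DCL: controls the level of access on database objects
GRANT SELECT ON Employees TO ReportingUser;

-- TCL: controls the transaction around your statements
BEGIN TRANSACTION;
DELETE FROM Employees WHERE EmployeeID = 42;
ROLLBACK; -- undo the delete instead of committing it
[/code]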
BONUS (Advanced) QUESTION:
Is the TRUNCATE command DDL or DML? Please use the comment section!
I spend a lot of time writing SQL code, and as a reader of this blog, you might be in the same boat. So any productivity gains we can get here go a long way. On that note, here's a quick productivity tip: learn to comment and uncomment multiple lines of SQL code using keyboard shortcuts.
If you are using SQL Server Management Studio, it's Ctrl+K followed by Ctrl+C for commenting and Ctrl+K followed by Ctrl+U for uncommenting.
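For example, select the lines below in SSMS and press Ctrl+K, Ctrl+C; each line gets prefixed with "--" as shown, and Ctrl+K, Ctrl+U strips the prefix back off (dbo.Orders is a made-up table):

[code language="SQL"]
--SELECT OrderID, OrderDate
--FROM dbo.Orders
--WHERE OrderDate >= '2013-01-01';
[/code]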
If you are using some other data management tool, I am sure you can find the equivalent shortcuts in its Help section or by googling around.
Either way, these shortcuts go a long way in making you more productive! What is your favorite productivity tip?
Analyzing customer complaints is crucial for customer service & sales teams. It helps them increase customer loyalty and fix quality issues. To that end, here's a mockup:
Note: drill-down reports are not shown, details are hidden to maintain confidentiality, and the numbers are made up.
The need: a daily report delivered to the sales team's inbox that shows the team's Bookings vs. Quota for the current and next month.
Brief Description:
The ability to see Bookings vs. Quota in near real-time is key to effectively managing the performance of any sales team. Before the project, analysts had to put this report together manually, and since it took more than a day to assemble, they couldn't afford to run it daily; instead, they delivered it to the sales team on a bi-weekly or monthly basis. After the project, the process was automated and the sales team received the report by email every day, which let them see Bookings vs. Quota in near real-time. As the famous saying (Peter Drucker) goes, "if you can't measure it, you can't improve it". In this case, the report helped the team measure their actual numbers against their goals, helping them improve their sales numbers, which directly impacts the top line!
Tools used: SharePoint report subscription, SQL Server Analysis Services, SQL Server Integration Services, SQL Server Reporting Services & Excel.
Mockup:
Note: drill-down reports are not shown and the numbers are made up.
Profitability equals revenue minus costs, so a supply chain executive is mostly focused on optimizing cost elements to drive profitability. Here's a mockup of a dashboard created for an executive to help him keep an eye on overall health while making sure he gets alerted on key cost categories.
The dashboard was created using a profitability data set and also had drill-down capabilities to analyze the numbers for cost buckets like raw materials, manufacturing & logistics.
SSAS actions are powerful! Among other things, you can use actions to open web pages, launch SQL Server Reporting Services reports, and customize drill-through reports. In this post, you will see a common requirement from users: navigating from the cube to a corporate intranet site, where the URL usually needs to be dynamic.
For example, a user wants to open the Order Entry page hosted on the corporate intranet site using the Order ID from the SSAS cube.
Here’s how you can set it up:
1. Open SSAS Cube in SQL Server Data Tools:
2. Navigate to Actions tab:
3. Here you’ll see three types of action that you can configure
a. Standard (this have five subtypes including the URL action)
b. Drill Through
c. report action
4. For the purpose of this blog post, let's focus on the standard action.
5. Once you click "New Action", it will ask you to configure the action:
a. Name: Enter the desired name here
b. Target Type: in this case, Order ID is an attribute member, but you will have to choose the appropriate target type for your scenario
c. Target Object: In this case, it’s something like [Order].[Order ID] – in your case, you’ll have to choose an appropriate target object
d. Type: URL in this case (also, don't forget to check Books Online to see what the other types can do)
e. Action Expression: if the URL is driven by a parameter, the format of the Action Expression would go something like the sketch below.
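Here's a minimal sketch: the host, page name and query-string parameter are hypothetical, and the member expression assumes the [Order].[Order ID] attribute from this example:

[code language="SQL"]
// hypothetical intranet page; substitute your own URL and parameter name
"http://intranet/OrderEntry.aspx?OrderID=" + [Order].[Order ID].CurrentMember.Member_Caption
[/code]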
f. Additional Properties: I like to set the Caption to clearly indicate to the user that they are opening, for example, the "Order Form for Order ID 123999". You can do that by setting the Caption property. The format goes like this:
[code language="SQL"] "Open Order Entry page for Order ID: " + [Order].[Order ID].CurrentMember.Member_Caption [/code]
Also, set the "Caption Is MDX" property to True if you are using the above format.
That's about it. Don't forget to test it (after deploying the cube) using Excel or another end-user tool of your choice. In the PivotTable, put the Order ID attribute on Row/Column labels > right-click any member of the Order ID attribute > Additional Actions > the caption with the dynamic Order ID should show up here for users to click and navigate to the specified URL:
How do you create a measure that averages a field from the fact table? You can't find "average" in the "usage" property while trying to create a new measure:
Solution:
Before I show you the solution, I want you to know that this is a Level 100 solution to get you started. Depending on the complexity of your cube, the calculated measure that you are about to create may or may not perform well; if it does not perform well, you might have to dig a little deeper, and here's one blog post to get you started: URL
OK, back to the topic! Here are the steps.
SCENARIO: you need the average of Sales Amount.
1. Create a SUM OF SALES AMOUNT measure
Steps: Open the cube > Cube Structure > right-click on the Measure Group > New Measure > Usage: "Sum" > Source Table: pick your fact table; in this case, let's say it's Fact Sales > Source Column: in this case, let's say it's SALES AMOUNT
2. Create a COUNT OF SALES measure (important: row count vs. non-empty count is not the developer's choice; a business user needs to define that)
Steps: Open the cube > Cube Structure > right-click on the Measure Group > New Measure > Usage: count of rows OR count of non-empty values (again, not the developer's choice; a business user needs to define this) > Source Table: pick your fact table; in this case, let's say it's Fact Sales > Source Column: in this case, let's say it's SALES AMOUNT
3. Create a calculated measure that equals SUM OF SALES AMOUNT / COUNT OF SALES
3a. Switch to Calculations section > create a new calculated member:
3b. Complete Name, Format String & Associated Measure Group. For the Expression, use the following; please treat it as a starting point for your measure:
[code language="SQL"] IIF([Measures].[COUNT OF SALES] = 0, 0, [Measures].[SUM OF SALES AMOUNT] / [Measures].[COUNT OF SALES]) [/code]
4. Before you test it, if you don't need the SUM OF SALES AMOUNT and COUNT OF SALES measures, then don't forget to hide them!
Conclusion:
In this post, you saw how to define a measure with average aggregation in SSAS.
Convert the following source data into the schema shown below:
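To sketch the shapes involved (the column names match the query in the solution below; the values are made up for illustration):

[code language="SQL"]
-- Source (one row per product/test):
-- Product_ID | Test_Desc | Test_Val
--     1      |   Test1   |   10.0
--     1      |   Test2   |   20.0
--     2      |   Test1   |   15.0

-- Destination (one column per test):
-- Product_ID | Test1 | Test2 | Test3 | Test4 | Test5
--     1      | 10.0  | 20.0  | NULL  | NULL  | NULL
--     2      | 15.0  | NULL  | NULL  | NULL  | NULL
[/code]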
Solution:
Here’s the code that uses PIVOT function to get to the solution, please use this as a starting point.
Note the use of aggregation function avg – this will depend on the requirement. In the example, the Test_value need to be average if more than one tests were performed.
[code language="SQL"]
-- Source data
SELECT [Product_ID], [Test_Desc], [Test_Val]
FROM [dbo].[Address]
GO

-- Destination data using the PIVOT function
SELECT *
FROM [dbo].[Address]
PIVOT (AVG(Test_Val) FOR Test_Desc IN (Test1, Test2, Test3, Test4, Test5)) AS Tests
[/code]
As part of Business Intelligence projects, we spend a significant amount of time extracting, transforming and loading data from source systems. So it's always helpful to know as much as you can about the data sources: NULLs, keys and statistics, among other things. One of the things I like to do when the data is unknown is to verify the candidate keys, making sure the key used can uniquely identify the rows in the data. It's really helpful to do this upfront because it avoids a lot of duplicate-value errors later in your projects.
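As an aside, if you want a quick manual sanity check before setting up the profiler, a GROUP BY/HAVING query does the trick; the table and key columns below are hypothetical stand-ins:

[code language="SQL"]
-- Any rows returned mean the proposed key does NOT uniquely identify rows.
-- dbo.SourceTable, KeyCol1 and KeyCol2 are placeholders; substitute your own.
SELECT KeyCol1, KeyCol2, COUNT(*) AS DuplicateCount
FROM dbo.SourceTable
GROUP BY KeyCol1, KeyCol2
HAVING COUNT(*) > 1;
[/code]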
So here’s a quick tutorial on how you can check the candidate key profile using data profiling task in SSIS, You need to perform two main tasks: 1. Generate the xml file using the Data profiling task in SSIS 2. View the content of the xml file using the Data Profile Viewer Tool or using the Open Profile Viewer option in the Data Profiling task editor in SSIS.
Here are the steps:
1a. Open SQL Server Data Tools (Visual Studio/BIDS) and create an SSIS project.
1b. Bring a Data Profiling Task onto the Control Flow.
1c. Open the Data Profiling Task editor and configure the destination file that the task uses to create the XML output. You can either create a new connection or use an existing one. If you use an existing connection, make sure you set the OverwriteDestination property to True if you want the file to be overwritten at the destination.
1d. Click Quick Profile to configure the data source for the Data Profiling Task.
1e. In the Quick Profile form, you'll need to select the connection and the table/view, and also specify what you need to compute. For the candidate key profile, make sure the Candidate Key Profile box is checked.
1f. Run the task, and an XML file should be placed at the destination you specified in step 1c.
Now it's time to view what the profiler captured.
2a. You can open the Data Profile Viewer by searching for its name from the Start menu.
2b. Once it opens up, click Open and browse to the XML file generated by the Data Profiling Task.
2c. Once the file opens up, you can see the candidate key profiles.
2d. Alternatively, you can open the Data Profile Viewer from the Data Profiling Task editor in SSIS: go to the editor > Open Profile Viewer:
Conclusion: In this post, you saw how to profile data using the Data Profiling Task in SSIS.