Tuesday, November 15, 2011

Errors in the back-end database access module. Nested table keys in a SHAPE query must be...

While working with SQL Server Data Mining, I got the error below when I tried to insert data into the mining structure.

Errors in the back-end database access module. Nested table keys in a SHAPE query must be sorted in the same order as the parent table. The nested query may be missing an ORDER BY clause.

But when I checked the query, everything looked correct. I created an SSIS package to check whether there was an issue with the data, and found that the above error message was actually caused by inaccurate data.
When I ran the query with the sorting column trimmed, it executed successfully. Because of trailing spaces in the sorting column, Analysis Services considered the rows to be out of order and raised the above error message.
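A minimal sketch of the fix, assuming a hypothetical nested-table query sorted on a key column named CustomerKey: trimming the column in both the SELECT and the ORDER BY keeps the sort order consistent with the parent table.

```sql
-- Hypothetical nested-table query for the mining structure.
-- RTRIM removes the trailing spaces that made the sort order
-- appear inconsistent with the parent table.
SELECT RTRIM(CustomerKey) AS CustomerKey, ProductName
FROM dbo.CustomerPurchases
ORDER BY RTRIM(CustomerKey)
```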


Thursday, October 27, 2011

Connecting to Pervasive SQL using SSIS

A client had a requirement to connect to a Pervasive SQL database system and retrieve data from it.

I was able to meet that requirement using SSIS. First, install the OLEDB driver for Pervasive SQL. Then create an SSIS package, add an OLEDB Source to the Data Flow Task and create a new OLEDB connection.
Select the provider as Native OLEDB\Pervasive PSQL OLEDB Provider.
Then you have to provide the server or file name and the location, but there is some confusion about those values: for this provider you have to provide the server that contains the Pervasive SQL database as the Location, and the Pervasive SQL database itself as the Server or file name value. You also have to provide the Initial Catalog value. The figure is shown below.


Then click on All and, under Pooling, you will see OLEDB Services. Set the value to Disable All as shown in the figure below.


Otherwise you will get an error, as shown in the figure below, when trying to preview data.


Then select that connection in the OLEDB Source and use a SQL command to retrieve data. I tried to retrieve the list of tables instead, but was unable to do so due to a memory exception.
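The table name below is hypothetical; the point is simply to use the SQL command data access mode rather than the table list:

```sql
-- Hypothetical query for the OLEDB Source "SQL command" mode;
-- replace Billing with an actual table in the Pervasive database.
SELECT *
FROM Billing
```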

Monday, October 3, 2011

Hiding Report in the SQL Report Server Tile View...

Sometimes we may want to hide a report on the report server while still allowing drill-down to it from another report. For that we can use the Hide in tile view property.

To do that, move the mouse pointer over the report name in the report server folder and you will get a down arrow. Click on it and select Manage. On the Manage page, tick Hide in tile view and click the Apply button.

Then go to the report server folder and you will not see that report anymore, but you can still drill down to it from another report.

If you want to show that report again, click on Details View; all the reports, including hidden ones, are visible there. Then follow the same steps as before to get to the Manage page, and there remove the tick from the Hide in tile view property.

Wednesday, September 28, 2011

Advanced MDX Ranking

There was a situation where I wanted to get a rank value considering three dimension levels. In an earlier post (MDX Rank), I showed how to get the rank value for two dimension levels.
This post shows the MDX query to get the rank value for three dimension levels.

Let's consider a scenario where we want to get the rank value of a department considering the stores of a particular region. The MDX query below gives the required rank value.

WITH
MEMBER [Measures].[RankSales] AS
RANK(([Store—Region_Store].CurrentMember,[Store—Region].CurrentMember,[Item—Department].CurrentMember),
ORDER(({[Store—Region_Store].[Store].MEMBERS*[Store—Region].CurrentMember*[Item—Department].CurrentMember}),
[Measures].[Sales],
BDESC))

SELECT
{[Measures].[RankSales]} ON 0,
{[Store—Store].[Store].MEMBERS*[Item—Department].[Department].MEMBERS} ON 1
FROM
[Sales]
WHERE {[Store—Region].[All Store—Region].[Region1]}*{[Item—Department_Sub].[All Item—Department_Sub].[Department1]}

Likewise, we can expand the query to any number of dimension levels. Hope this helps you...

Non-Windows user credentials were supplied for a non-http connection to Analysis Services...

While trying to view an SSRS report on the report server from outside the server, I got the error message below.

Non-Windows user credentials were supplied for a non-http connection to Analysis Services. The Microsoft SQL Server Analysis Services data extension requires Windows Integrated Security unless connecting via http or https.

This happens because, when the data source used by the report is accessed from outside, it cannot connect to the cube due to a user permission issue.

  • To provide access for non-domain users to the report, we supply the credentials of an authorized user for the data source. 
  • Go to the Data Sources folder on the report server and click on the relevant data source. 
  • You will get a set of properties, among them the Connect using: property. 
  • There, select the Credentials stored securely in the report server option and provide a valid user name and password. 
  • But if you click the Apply button after providing only the user name and password, you will get the above error message when you try to view the report. 
  • To solve it, also tick Use as Windows credentials when connecting to the data source, just below the Password field. 
  • Then test the connection and click Apply.


Then view the report, and you will be able to see it...

Friday, September 16, 2011

Case Sensitivity with SSRS Grouping ?

When I created an SSRS report that retrieves SSAS cube data, I found that dimension members with the same name but different case were missing. For example, if the Item dimension has two members named item1 and Item1, only one will be shown.
The reason is that grouping does not support case sensitivity by default.

To have the case sensitivity in the report I had to do two changes. One is in the cube level and the other one is in the report data set level.

At the dimension level you have to enable case sensitivity. To do that, open the dimension in the dimension editor using BIDS, then select the dimension and view its properties. At the top of the Properties window you will find the Collation property. Click on the ellipsis button and tick the Case Sensitive value. Then process the dimension and then the cube.

Even after this change, you will not get the expected result in the report. To get it, double-click the dataset in the Report Data tab, go to the Options section and select True from the Case sensitivity: drop-down list. Click OK and preview the report; you should now get both the item1 and Item1 records.

Tuesday, September 13, 2011

MDX Rank

Normally we use the MDX Rank function to get a rank value considering just one dimension level. But I faced a situation where the rank value needed to consider two dimension levels; for example, ranking the Sales value of the stores that belong to a region.
I was unable to find a direct solution, but after some research I was able to build the MDX query.
The query is given below.

WITH
MEMBER [Measures].[Rank] AS
RANK(([Store—Region].CurrentMember,[Store—Region_Store].CurrentMember),
ORDER(({[Store—Region_Store].[Store].MEMBERS*[Store—Region].CurrentMember}),
[Measures].[Sales],
BDESC))
SELECT
{[Measures].[Rank], [Measures].[Sales]} ON 0,
{[Store—Region_Store].[Store].MEMBERS} ON 1
FROM
[Sales]
WHERE {[Store—Region].[All Store—Region].[Region1]}

If we give the region value as Region1, this query ranks only the stores that belong to the Region1 region. If Region1 contains 5 stores, the highest rank value you will get is 5.
Hope this helps you as well...

Monday, August 29, 2011

Adding two Filter conditions in MDX WHERE statement

I wanted to add two conditions with different dimensionality to an MDX WHERE statement. After trying several syntaxes, I found the way.

SELECT {[Measures].[Sales]} ON COLUMNS, {[Store—Store].[Store].ALLMEMBERS} ON ROWS
FROM [SalesCube] WHERE {[Customer].[All Customer].[CustomerName]} * {[Store—Region_Store].[All Store—Region_Store].[RegionName]}

If you write the WHERE statement incorrectly, you will get an error message such as:
The function expects a tuple set expression for the 1 argument. A string or numeric expression was used.

Wednesday, August 17, 2011

MDX to get the parent level summation for a Measure with LastNonEmpty as aggregation

I recently faced a situation related to inventory calculations. We keep an average inventory value at the Item level; its parent level is the Category level. The aggregate function for the average inventory measure is LastNonEmpty. In the report we have to show the sum of the average inventory measure grouped by the Category level.
The issue is that if we take only the Category dimension hierarchy and the average inventory measure, the value is incorrect, because the aggregate function of the measure is LastNonEmpty, not Sum.
We can resolve the issue by adding the Item dimension hierarchy to the report dataset and summing the average inventory values in the report using expressions.
But this increases report execution time when a large number of items exist in the cube, so the report takes longer to load.

The best solution is to add a calculated member to the cube that stores the summed value of the items belonging to the category.
Just add a calculated member with the following expression:

Sum(([Item—Category].currentmember.children), [Measures].[AverageInventory])

Here, [Item—Category] is the hierarchy in the cube.
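As a sketch, the calculated member could be defined in the cube's MDX script like this (the member name is hypothetical):

```mdx
-- Hypothetical member name; sums the item-level LastNonEmpty values
-- up to the current category
CREATE MEMBER CURRENTCUBE.[Measures].[CategoryInventorySum]
AS Sum([Item—Category].CurrentMember.Children, [Measures].[AverageInventory]),
VISIBLE = 1;
```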

This resolves the issue while keeping report performance high.

Friday, August 5, 2011

Filter SSAS Cube Data for a Date Range

When we retrieve cube data for SSRS reports, we may want to filter the data for a particular date range. In the example below I filter the Sales data of the stores for a month range.

SELECT [Measures].[Sales] ON COLUMNS,
[Stores].[Store].members on ROWS from [Sales]
WHERE ([Date—Year_Month_Day].[All Date—Year_Month_Day].[" & Parameters!FromYear.Value & "].[" & Parameters!FromMonth.Value & "]:[Date—Year_Month_Day].[All Date—Year_Month_Day].[" & Parameters!ToYear.Value & "].[" & Parameters!ToMonth.Value & "])


That's all. It takes the From and To year and month values from those parameters and filters the Sales measure value. In this example I used a user-defined hierarchy ([Date—Year_Month_Day]) of the Date dimension.
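For context, in an SSRS dataset the parameter fragments above live inside a single VB expression string (entered as one expression in the dataset's expression editor); a sketch:

```vb
="SELECT [Measures].[Sales] ON COLUMNS, [Stores].[Store].MEMBERS ON ROWS FROM [Sales] WHERE ([Date—Year_Month_Day].[All Date—Year_Month_Day].[" & Parameters!FromYear.Value & "].[" & Parameters!FromMonth.Value & "]:[Date—Year_Month_Day].[All Date—Year_Month_Day].[" & Parameters!ToYear.Value & "].[" & Parameters!ToMonth.Value & "])"
```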

Thursday, August 4, 2011

Two sets defined in the function have different dimensionality

When I tried to write a custom MDX query to retrieve cube data, I got an error message saying "Two sets defined in the function have different dimensionality".

The reason is that we have to keep the same order of measures and dimensions in the tuples; otherwise we get the above error message.
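A minimal sketch of the problem, using hypothetical member names on a cube like the ones above: when the tuples in a set list their hierarchies in different orders, the sets being combined have different dimensionality.

```mdx
-- Fails: the two tuples list the hierarchies in different orders
SELECT {([Store—Store].[Store].[Store1], [Measures].[Sales]),
        ([Measures].[Sales], [Store—Store].[Store].[Store2])} ON 0
FROM [Sales]

-- Works: both tuples use the same hierarchy order
SELECT {([Store—Store].[Store].[Store1], [Measures].[Sales]),
        ([Store—Store].[Store].[Store2], [Measures].[Sales])} ON 0
FROM [Sales]
```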

Wednesday, July 27, 2011

Securing SSIS Packages in the File System

A common concern is how to secure the data and design of SSIS ETL packages deployed to the file system. If there are no security restrictions, users can open the SSIS packages and modify them if they wish, and they can read sensitive information stored in those packages.

To secure SSIS packages, we can use the package's ProtectionLevel property, which encrypts the package definition information in the package's XML file.
We have to set this property while editing the package in BIDS.

There are different protection level options that we can use according to our security requirement.
  • DontSaveSensitive
    - Sensitive information, such as connection passwords, is not saved in the package
  • EncryptAllWithPassword
    - The entire package is encrypted with a password set in the PackagePassword property
  • EncryptAllWithUserKey
    - The entire package is encrypted based on the current user and computer
  • EncryptSensitiveWithPassword
    - Sensitive information in the package is encrypted based on a package password, set in the PackagePassword property
  • EncryptSensitiveWithUserKey
    - Sensitive information, such as connection passwords, is encrypted based on the current user and computer
There is one more protection level, ServerStorage, which can be used only when a package is stored in a SQL Server database.

Friday, July 22, 2011

Faulting application OSQL.EXE...

For the last couple of days I had an issue with a SQL Server installed on a Windows Server 2008 computer.
We have a trigger that tracks new inserts to a table; when it fires, it executes a few stored procedures as background processes using OLE Automation.
But sometimes, even though the trigger fired, those stored procedures did not execute.
When I checked the Windows Log in Server Manager, I found the message below.

Faulting application OSQL.EXE, version 2009.100.1600.1, time stamp 0x4bb67996, faulting module kernel32.dll, version 6.0.6001.18000, time stamp 0x4791adec, exception code 0xc0000142, fault offset 0x00000000000b1188, process id 0x2274, application start time 0x01cc48354094868e.

I tried to find a solution by browsing the internet, but couldn't find one.
After trying several workarounds, I found a solution to the issue.

The Solution is:
  • Open the Server Manager.
  • Expand Configuration section, then click on Services.
  • Then restart the Windows Management Instrumentation service.
  • After that restart the relevant SQL Server service.
I was able to solve the issue by restarting those two services, and I hope this helps solve some similar issues as well...

Tuesday, July 12, 2011

Changing Backup location for a SSAS Cube

When we take a backup of a SQL Server Analysis Services cube, only the default backup location is shown and we are unable to change it.
But we can add paths to the default location through the AllowedBrowsingFolders server property. You can add multiple paths using the pipe symbol as the delimiter.

To add a new path to the AllowedBrowsingFolders server property, connect to the Analysis Services instance, then right-click on the instance and select Properties. A window will open, but if the Show Advanced (All) Properties check box is not selected, you will not see the AllowedBrowsingFolders property. Select that check box and you will get a window as below.


Then you can add paths as required, delimited with the pipe symbol ("|"), for example C:\OLAPBackup|D:\CubeBackup.
Those paths must be local paths.

Sunday, July 3, 2011

I cannot view the Object Explorer in SSMS...

When I tried to view the Object Explorer in SQL Server Management Studio, I couldn't get it. I was able to get it back by resetting the window layout.
To do that, select Reset Window Layout under the Window menu.

Monday, March 28, 2011

Decomposition Tree in PerformancePoint Server 2010

The Decomposition Tree is a new feature in PerformancePoint 2010, and it is a very useful, smart feature for BI applications.

But a few tricks need to be followed to get the Decomposition Tree working in a scorecard.
  • You need to create the KPIs from PerformancePoint. If you import KPIs from an Analysis Services cube, the Decomposition Tree will not work.
  • Values need to be provided from a measure. If a value is provided from an MDX expression, the Decomposition Tree will not work.
  • Therefore, if you need calculations for a KPI, create a calculated member containing the required calculation and set the parent hierarchy of the calculated member to Measures.
  • You can then use that calculated member as a measure to create the required KPI, and the Decomposition Tree will be available for it.
  • The Calculation property for the Actual and the Target should be Data value.
  • To enable the Decomposition Tree feature in the scorecard, Silverlight must also be installed on the client machine.
  • Also, when I added the created KPIs to a blank scorecard from PerformancePoint, the Decomposition Tree did not work. But when I created the scorecard by importing KPIs from Analysis Services and modified the KPIs as above, the Decomposition Tree worked fine.

Wednesday, March 23, 2011

The URL is not available, does not reference a SharePoint site, or you do not have permission to connect...

When I tried to connect to a SharePoint URL in the Dashboard Designer options, it prompted a message saying "The URL is not available, does not reference a SharePoint site, or you do not have permission to connect".




When I checked the Event Log under Administrative Tools -> Event Viewer -> Windows Logs -> Application, the message says:
"Cannot open database "WSS_Content_GuidOrGivenName" requested by the login. The login failed.
Login failed for user 'domain\username'."

The reason is that the service account under which the PerformancePoint Service runs does not have permissions on the SQL Server content database of the SharePoint web application.

To fix this issue, add the domain user used to run the PerformancePoint services as the database owner of the content database of the web application (WSS_Content_GuidOrGivenName).
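A T-SQL sketch of that grant, assuming a hypothetical service account domain\ppsservice (replace with the account that actually runs the PerformancePoint service) and the sp_addrolemember procedure available on SQL Server 2008:

```sql
USE [WSS_Content_GuidOrGivenName];  -- the content database of the web application
GO
-- Hypothetical account name; use the PerformancePoint service account
CREATE USER [domain\ppsservice] FOR LOGIN [domain\ppsservice];
EXEC sp_addrolemember N'db_owner', N'domain\ppsservice';
GO
```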

After that you will not get that message...

Wednesday, March 2, 2011

Problem Encountered in SSIS Script Component...

While using an SSIS Script Component, the Data Flow Task stayed yellow for a few minutes and finally raised an exception from the Script Component:

at Microsoft.SqlServer.Dts.Runtime.Wrapper.IDTSVariableDispenser100.GetVariables(IDTSVariables100& ppVariables) at Microsoft.SqlServer.Dts.Pipeline.ScriptComponent.LockReadWriteVariables(String readWriteVariables) at Microsoft.SqlServer.Dts.Pipeline.ScriptComponentHost.PostExecute()

I found the reason for this issue at the link below:
http://www.sqldev.org/sql-server-integration-services/memory-error-in-ssis-transformation-script-component-4422.shtml

It explains the reason for the above exception, and the solution, as follows:

"The read-write lock is "granted" in (or before) the PreExecute phase of the Data Flow, and only released AFTER the PostExecute phase. All components' PreExecutes are run, THEN the data flow runs, then ALL the components' PostExecutes are run. SSIS does not run the PostExecute of one component in the flow as soon as the last row moves through it. SSIS does not guarantee the execution order of PostExecute calls to the components in the data flow. In effect, it can't - because it can't guarantee that component A will be completely finished before component B starts, and adding any such guarantee would completely muck up any optimization/parallelism it could attempt.

Therefore - you can not have two Scripts - or even two Row Count transforms (or a combination of the two) in a single Data Flow that write to the same variable. Because in each of those cases, SSIS "reserves" the variable from the time the flow starts until it ends.

In order to account for that, you'll need two (or more) variables - one for each "case" you want to report on."

Tuesday, February 22, 2011

Stored Procedure to Populate Date Dimension Data

In this post I present a stored procedure that inserts data into date dimension tables.
I have used four tables for the date structure: DimYear, DimQuarter, DimMonth and DimDay.

The script used to generate the dimension tables is given below.

CREATE TABLE dbo.DimYear
(
YearKey smallint IDENTITY(1,1) NOT NULL,
[Year] smallint NOT NULL,
CONSTRAINT PK_DimYear PRIMARY KEY (YearKey)

)
GO


CREATE TABLE dbo.DimQuarter
(
QuarterKey smallint IDENTITY(1,1) NOT NULL,
YearKey smallint NOT NULL,
[Quarter] smallint NOT NULL,
QuarterName varchar(20) NOT NULL,
CONSTRAINT PK_DimQuarter PRIMARY KEY (QuarterKey),
CONSTRAINT FK_DimQuarter_DimYear FOREIGN KEY (YearKey)
REFERENCES DimYear (YearKey)

)
GO


CREATE TABLE dbo.DimMonth
(
MonthKey smallint IDENTITY(1,1) NOT NULL,
QuarterKey smallint NOT NULL,
MonthNumber smallint NOT NULL,
[MonthName] varchar(20)NOT NULL,
[Month] varchar(20) NOT NULL,
CONSTRAINT PK_DimMonth PRIMARY KEY (MonthKey),
CONSTRAINT FK_DimMonth_DimQuarter FOREIGN KEY (QuarterKey)
REFERENCES DimQuarter (QuarterKey)

)
GO


CREATE TABLE dbo.DimDay
(
DayKey smallint IDENTITY(1,1) NOT NULL,
MonthKey smallint NOT NULL,
[Day] smallint NOT NULL,
[DayName] varchar(20) NOT NULL,
WeekNoForMonth smallint NOT NULL,
WeekNoForYear smallint NOT NULL,
CONSTRAINT PK_DimDay PRIMARY KEY (DayKey),
CONSTRAINT FK_DimDay_DimMonth FOREIGN KEY (MonthKey)
REFERENCES DimMonth (MonthKey)
)
GO



A stored procedure fills those dimension tables. Its script is given below.



IF OBJECT_ID('dbo.PopulateDateDimension') IS NOT NULL
DROP PROC dbo.PopulateDateDimension
GO

CREATE PROCEDURE dbo.PopulateDateDimension @Year smallint
AS
BEGIN

DECLARE @StartMonth tinyint --set the current month value
DECLARE @EndMonth tinyint --set the last month for a quarter
DECLARE @YearKey smallint --YearKey value in the DimYear table
DECLARE @Quarter smallint --Quarter value
DECLARE @LastQuarter smallint --Last Quarter value and it is equal to 4
DECLARE @QuarterKey smallint --QuarterKey value in the DimQuarter table
DECLARE @StartDay tinyint --current prossesing date
DECLARE @EndDay tinyint --last date for the month
DECLARE @MonthKey smallint --MonthKey value in the DimMonth table
DECLARE @MonthName varchar(50) --Month Name in the YYYY-Month format
DECLARE @WeekNoForMonth smallint --Value of the processing week for the month
DECLARE @WeekNoForYear smallint --Value of the processing week for the year
DECLARE @WeekDayName varchar(20) --Weekday string (Sunday, Monday,...)

SET @StartMonth = 1
SET @EndMonth = 3
SET @Quarter = 1
SET @LastQuarter=4
SET @StartDay=1
SET @EndDay=31


BEGIN TRY
BEGIN TRANSACTION

IF NOT EXISTS (SELECT * FROM dbo.DimYear WHERE [Year] = @Year) --check whether the processing year is already filled or not
BEGIN

SET @WeekNoForYear=1 -- Starts the week value for the year

INSERT INTO dbo.DimYear (Year) VALUES (@Year)
SELECT @YearKey = SCOPE_IDENTITY() --Get the last inserted YearKey


IF NOT EXISTS(SELECT * FROM dbo.DimQuarter WHERE [Quarter] = @Quarter AND YearKey = @YearKey) --Check whether the quarter value for the given year is exist or not
BEGIN
WHILE @Quarter<=@LastQuarter --check the all the quarter values are inserted or not
BEGIN
INSERT INTO dbo.DimQuarter
(YearKey, Quarter, QuarterName)
VALUES
(@YearKey,@Quarter, CONVERT(varchar(4), @Year) + ' - ' + 'Q'+CONVERT(varchar(4),@Quarter))
SELECT @QuarterKey = SCOPE_IDENTITY() --Get the last inserted QuarterKey



WHILE @StartMonth <= @EndMonth --check all the months are inserted or not
BEGIN



IF NOT EXISTS(SELECT * FROM dbo.DimMonth WHERE MonthNumber = @StartMonth AND QuarterKey = @QuarterKey) --check the processing Month record is inserted or not
INSERT INTO dbo.DimMonth
(QuarterKey,[Month], MonthName, MonthNumber)
VALUES
(@QuarterKey, DATENAME(mm, CONVERT(datetime, CONVERT(varchar(2), @StartMonth) + '/1/2000', 101)),CONVERT(varchar(4), @Year) + ' - ' + DATENAME(mm, CONVERT(datetime, CONVERT(varchar(2), @StartMonth) + '/1/2000', 101)), @StartMonth)

SELECT @MonthKey=SCOPE_IDENTITY() --Get the last inserted MonthKey

SET @StartDay=1

--Set the number of days for the months

IF (@StartMonth=1 OR @StartMonth=3 OR @StartMonth=5 OR @StartMonth=7 OR @StartMonth=8 OR @StartMonth=10 OR @StartMonth=12)
SET @EndDay=31
ELSE IF (@StartMonth=4 OR @StartMonth=6 OR @StartMonth=9 OR @StartMonth=11)
SET @EndDay=30
ELSE
SET @EndDay=28 --February; note that leap years are not handled

SET @WeekNoForMonth=1 --set the week no for the month
WHILE @StartDay<=@EndDay --Check all the days are inserted for the Month
BEGIN
SET @WeekDayName=CONVERT(varchar(20),DATENAME(WEEKDAY,CONVERT(datetime,CONVERT(varchar(4),@Year)+'-'+CONVERT(varchar(4),@StartMonth)+'-'+CONVERT(varchar(4),@StartDay)))) --Get the processing Day string (Sunday, Monday,...)

IF NOT EXISTS (SELECT * FROM dbo.DimDay WHERE [Day]=@StartDay AND MonthKey=@MonthKey) --Check whether the processing Day is already inserted
BEGIN
INSERT INTO dbo.DimDay
(MonthKey,[Day],[DayName],WeekNoForMonth,WeekNoForYear)
VALUES
(@MonthKey,@StartDay,CONVERT(varchar(4),@Year)+'/'+CONVERT(varchar(2),@StartMonth)+'/'+CONVERT(varchar(2),@StartDay),@WeekNoForMonth,@WeekNoForYear)
END

SET @StartDay=@StartDay+1 --Increment the processing day value
IF (@WeekDayName='Sunday') --Check whether the week is completed (here the week is considered to start on Monday)
BEGIN
SET @WeekNoForMonth=@WeekNoForMonth+1 --If the week is completed, increment both week numbers
SET @WeekNoForYear=@WeekNoForYear+1
END

END


SET @StartMonth = @StartMonth + 1 --Increment the Month Value

END
SET @EndMonth=@EndMonth+3 --Increment the end month value for a Quarter
SET @Quarter=@Quarter+1 --Increment the Quarter value
END
END
END
COMMIT TRANSACTION
END TRY
BEGIN CATCH

DECLARE @ErrMsg varchar(1000)
IF @@TRANCOUNT > 0
BEGIN

SELECT @ErrMsg = ERROR_MESSAGE()
ROLLBACK TRANSACTION
PRINT @ErrMsg
END
END CATCH
END
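A short usage sketch: call the procedure once per year. For 2011 (not a leap year, so the fixed 28-day February above is harmless) this should fill 12 month rows and 365 day rows.

```sql
-- Populate the dimension tables for one year
EXEC dbo.PopulateDateDimension @Year = 2011;

-- Quick sanity checks
SELECT COUNT(*) AS MonthRows FROM dbo.DimMonth;  -- expect 12
SELECT COUNT(*) AS DayRows FROM dbo.DimDay;      -- expect 365 for 2011
```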

Tuesday, February 8, 2011

Failed to create the configuration database in SharePoint 2010 Configuration Wizard

When I was trying to run the configuration wizard in SharePoint 2010, I got an error message saying
"Failed to create the configuration database
An exception of type System.InvalidOperationException was thrown. Additional exception information: This implementation is not part of the windows platform FIPS validated cryptographic algorithms"


The reason is that FIPS is enabled in the system registry.
There are three registry keys related to FIPS:

1. HKLM\SYSTEM\ControlSet001\Control\LSA\FipsAlgorithm
2. HKLM\SYSTEM\ControlSet002\Control\LSA\FipsAlgorithm
3. HKLM\SYSTEM\CurrentControlSet\Control\LSA\FipsAlgorithm

Set the value of all three keys to "0" to disable FIPS.

Then re-run the configuration wizard after deleting the SharePoint-related tables created in the SQL Server database.

That worked for me, and the configuration completed successfully.