SQL Database Mail is not working...

I have configured Database Mail in SQL Server 2016, but when I sent a test e-mail, I did not receive any email.
I had followed the exact same steps I used to configure it on a different server, and everything, including the credentials, was fine.

Then I used the query below to check its status:
SELECT * FROM msdb..sysmail_mailitems WHERE send_request_date > DATEADD(DAY, -1, GETDATE())

According to that, sent_status = 0, which means the request is new or not yet sent.
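If needed, the Database Mail event log can also be checked for any logged errors (a standard msdb table, queried here just as an additional check):

SELECT log_date, event_type, description
FROM msdb.dbo.sysmail_event_log
WHERE log_date > DATEADD(DAY, -1, GETDATE())
ORDER BY log_date DESC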
Then I found the DatabaseMail application file in the path below and double-clicked it:
\Program Files\Microsoft SQL Server\MSSQL13.MSSQLSERVER\MSSQL\Binn


Launching it returned an error message, so I checked the installed Windows features and found that .NET Framework 3.5 was not installed yet.

Once .NET Framework 3.5 was installed, Database Mail started working fine.

Converting string value with "dd-MMM-yy hh.mm.ss.nnnnnn a" format to DateTime in SSIS

I had a requirement to convert a string value containing a date-time in the format "dd-MMM-yy hh.mm.ss.nnnnnn a" to DateTime.
I initially tried to convert it including the millisecond value, but it threw an error.
Since I do not need the milliseconds for my requirement, I converted the value by removing the milliseconds part, and that worked.

Below is the Expression used in the Derived Column component

ISNULL(DATA_VALUE) || LTRIM(RTRIM(DATA_VALUE)) == ""
? NULL(DT_DBTIMESTAMP)
: (DT_DBTIMESTAMP)(
    (DT_WSTR,10)((DT_DBDATE)SUBSTRING(DATA_VALUE,1,9)) + (DT_WSTR,1)" " +
    REPLACE(
        SUBSTRING(DATA_VALUE,27,2) == "AM"
        ? (DT_WSTR,12)SUBSTRING(DATA_VALUE,11,8)
        : (DT_WSTR,2)((DT_I4)SUBSTRING(DATA_VALUE,11,2) + 12) + (DT_WSTR,10)SUBSTRING(DATA_VALUE,13,6)
    ,".",":"))
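To make the SUBSTRING positions easier to follow, here is how the expression slices a hypothetical value such as 01-JAN-19 02.30.45.123456 PM:

SUBSTRING(DATA_VALUE,1,9)   -> 01-JAN-19   (date part, cast to DT_DBDATE)
SUBSTRING(DATA_VALUE,11,8)  -> 02.30.45    (time part, used as-is when AM)
SUBSTRING(DATA_VALUE,11,2)  -> 02          (hour, 12 is added when PM)
SUBSTRING(DATA_VALUE,13,6)  -> .30.45      (minutes and seconds)
SUBSTRING(DATA_VALUE,27,2)  -> PM          (AM/PM indicator)

After REPLACE swaps the dots for colons, the resulting value is 2019-01-01 14:30:45.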

Hope this will be helpful.

Power BI Schedule Refresh error...

There was an error thrown in a Power BI scheduled refresh, as below:

[0] -1055784932: Credentials are required to connect to the SQL source. (Source at xxxx;xxxxx.). The exception was raised by the IDbCommand interface.
[1] -1055784932: The command has been canceled.. The exception was raised by the IDbCommand interface.
[2] -1055784932: The command has been canceled.. The exception was raised by the IDbCommand interface.
[3] -1055784932: The command has been canceled.. The exception was raised by the IDbCommand interface.
[4] -1055784932: The command has been canceled.. The exception was raised by the IDbCommand interface.
[5] -1055784932: The command has been canceled.. The exception was raised by the IDbCommand interface.
[6] -1055784932: The command has been canceled.. The exception was raised by the IDbCommand interface.
[7] -1055784932: The command has been canceled.. The exception was raised by the IDbCommand interface.


I checked the credentials of the data source and everything was fine.
I referred to the link below:
https://community.powerbi.com/t5/Report-Server/Error-when-Refreshing-SQL-Data/td-p/317866/page/3

As mentioned in the above link,
"For each data source, check the name of the server and the database, it seems to be case sensitive when deployed to the server, but not when using the Power BI Desktop."

I had the same scenario, and in my case it was the database name.
I changed all the database names to exactly the same name, matching the case, and deployed the report.
After that, the scheduled refresh worked fine.

Unable to load the data model for your power bi report...

There was a Power BI report which had been working fine, and suddenly it started throwing the error below:
"Unable to load the data model for your Power BI report".


There was actually no change applied on the server.
As suggested in some posts, I tried the two workarounds below, and both of them worked.

1. Download the PBIX file and upload it back again. I actually uploaded it with a different name, and the report worked fine.

2. Then I renamed the existing PBIX report, and after that it worked fine as well. I then renamed it back to the original file name, and the report kept working fine after that too.

Reference:
https://community.powerbi.com/t5/Report-Server/Unable-to-load-the-model-for-published-report/td-p/439131

Hope this will be helpful.

How the transformed data is written to an output file with column headers in U-SQL...

While working with the U-SQL language, I noticed that there are a few ways of writing data to an output file.
Let's assume the SalesDetails.csv file in the Azure Data Lake Store includes the Count and the Unit Price of each item. We want to calculate the Total Sales value and save it to the TotalSales.csv file, which will also be stored in the Azure Data Lake Store.

DECLARE @inputFilePath string = "/InputFiles/SalesDetails.csv";
DECLARE @outputFilePath string = "/OutputFiles/TotalSales.csv";

@salesDetails =
    EXTRACT ItemId int,
            Item string,
            UnitPrice decimal,
            Count int
    FROM @inputFilePath
    USING Extractors.Csv(skipFirstNRows:1);

To get the Total Sales value, we have to multiply the Unit Price by the Count in the @salesDetails rowset. But if the @salesDetails rowset still contains the column header row, it will cause an issue during the multiplication. Therefore, skipFirstNRows:1 is used to skip the column headers.

The transformation below calculates the Total Sales value using the @salesDetails rowset:

@totalSales =
    SELECT ItemId AS ItemId,
           Item AS Item,
           UnitPrice AS UnitPrice,
           Count AS Count,
           UnitPrice * Count AS TotalSales
    FROM @salesDetails;

In the above, an alias is added for each column since the column names are needed for the output file.
The last step is to write the transformed data to the output file with column headers.

OUTPUT @totalSales
TO @outputFilePath
USING Outputters.Csv(outputHeader:true);

outputHeader:true is used to write the column headers to the output file. However, if you use only the statement above, the header values will be quoted.
To avoid that, the statement below can be used.

OUTPUT @totalSales
TO @outputFilePath
USING Outputters.Csv(outputHeader:true,quoting:false);
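For example, with only outputHeader:true the header row is written with quotes around each column name, while with quoting:false it is written as plain text:

"ItemId","Item","UnitPrice","Count","TotalSales"
ItemId,Item,UnitPrice,Count,TotalSales

Note that quoting:false applies to the whole file, so string values in the data rows will not be quoted either.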

Hope this will be helpful...



Stop running SSIS Script Task after some time...

There is an SSIS Script Task which is used to run another program. But due to errors in that program, the Script Task sometimes stays in the running state and never stops.
Since I know the maximum duration it takes for the other program to complete, I wanted to stop the Script Task if it exceeds that maximum duration.

For that, I followed the approach below:
- A Timer is added to the Script Task to track the duration of the execution
- The timer interval is set to the maximum duration
- When the timer elapses, there should be a way to stop the running process of the Script Task
- For that, the ThreadPool is used to execute the process and a CancellationTokenSource is used to cancel the running process

The code inside the Script Task is as below:


// Requires: using System.Threading; and using System.Timers;
System.Timers.Timer timer = null;

// Create the token source used to cancel the running process.
CancellationTokenSource cts = new CancellationTokenSource();

bool IsExecuted = false;
bool IsStopped = false;

public void Main()
{
    try
    {
        // Maximum duration (in milliseconds) read from the package parameter.
        double maximumDuration = Convert.ToDouble(Dts.Variables["$Package::MaximunDuration"].Value.ToString());

        timer = new System.Timers.Timer(maximumDuration);
        timer.Elapsed += new System.Timers.ElapsedEventHandler(timer_Elapsed);
        timer.Enabled = true;

        // Run the process on a ThreadPool thread and wait until it either
        // completes or is stopped by the timer.
        ThreadPool.QueueUserWorkItem(new WaitCallback(ExecuteProcess), cts.Token);
        while (IsExecuted == false && IsStopped == false)
        {
            Thread.Sleep(2500);
        }
    }
    catch (Exception ex)
    {
        Dts.TaskResult = (int)ScriptResults.Failure;
    }
}

void ExecuteProcess(object obj)
{
    // obj is the CancellationToken passed from Main; the long-running code
    // should observe it so that cancellation actually stops the work.
    // **** Code for the ExecuteProcess ****

    IsExecuted = true;
    Dts.TaskResult = (int)ScriptResults.Success;
}

void timer_Elapsed(object sender, System.Timers.ElapsedEventArgs e)
{
    // Maximum duration exceeded: fail the task and cancel the running process.
    Dts.TaskResult = (int)ScriptResults.Failure;
    IsStopped = true;
    timer.Stop();
    timer.Dispose();
    cts.Cancel();
    Thread.Sleep(2500);
    cts.Dispose();
}

Hope this will be helpful...

Automating Excel Report Refresh...

The requirement is to refresh an Excel file in the SQL Report Server by running the Macro in the Excel file.
In this post, I'll summarize the approach and the issues faced, and in another post, I'll explain the final implementation.

The key areas of the implementation are:
1. Download the Excel file from Report Server
2. Run the Macro to refresh the Excel file and Save the file
3. Upload the Excel file to Report Server

For the implementation, SQL Server Integration Services is used.
To summarize the approach followed for each step:
1. Download the Excel file using the DownloadFile method of WebClient (a small sketch is shown below)
2. Run the Macro using Excel COM objects
3. Upload the Excel file to the Report Server using the Report Server Web Service

For the 1st and 3rd steps, I could not find any resources that directly cover Excel files in the Report Server, but I was able to achieve them by applying the relevant property values for the Excel files.
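To give an idea of the 1st step, below is a minimal sketch of the download using WebClient.DownloadFile inside a Script Task. The URL format here is an assumption and depends on how the Excel file is stored on your Report Server; the shared folder path matches the one used later in this post.

// Requires: using System.Net;
public void DownloadExcelReport()
{
    // Hypothetical URL and path - adjust to your Report Server and shared folder.
    string sourceUrl = "http://ReportServerName/ReportServer?/Reports/ExcelReport.xlsm";
    string localPath = @"\\VMExcel\ExcelFiles\ExcelReport.xlsm";

    using (WebClient client = new WebClient())
    {
        // Connect with the credentials of the account running the package.
        client.UseDefaultCredentials = true;
        client.DownloadFile(sourceUrl, localPath);
    }
}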
To run the Macro (2nd step), I initially used the Microsoft.Office.Interop.Excel namespace in C#, inside an SSIS Script Task. But I faced an issue when focusing on the production deployment, because the Interop libraries require Excel to be installed on the machine running the code. That means the production server where the package is deployed would need Excel installed.
But it is not recommended to install Excel on production servers for security reasons, therefore I had to find an alternative approach.

Due to that, the initial approach was changed a little.
Since Excel needs to be installed on the machine where the Macro runs, we configured a separate VM with Excel installed. This approach also reduces the security risk that may arise from the configuration changes, which will be described later.

Then, to run the Macro, if I use the Microsoft.Office.Interop.Excel namespace in C# inside an SSIS Script Task, I would have to deploy the package on the VM where Excel is installed.
But that would add an additional cost, since we would have to install SSIS on the VM.

Because of that, I chose to run the Macro inside the Excel file with a PowerShell command, since a PowerShell command can be run on a remote machine. To download the Excel files to the VM, a shared folder was created as well.
The PowerShell command is executed from the database server against the configured VM using PowerShell Remoting. For this, PowerShell Remoting needs to be enabled on the VM.

PowerShell Remoting can be enabled on the VM using the PowerShell command below:
Enable-PSRemoting -Force

Reference: https://www.howtogeek.com/117192/how-to-run-powershell-commands-on-remote-computers/
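As an optional sanity check (assuming the VM name VMExcel used below), you can verify from the database server that remoting on the VM is reachable:

Test-WSMan -ComputerName VMExcel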

A PowerShell command on a remote computer can be executed using Invoke-Command, and you have to provide the relevant credentials as well.
The provided user needs Administrator privileges on the remote computer to run the PowerShell command remotely.
There are ways to skip this requirement, but they are not recommended for security reasons.
The complete PowerShell command for the Excel Macro execution is as below:

# Credentials used to connect to the remote VM.
# Hard-coding the password is only for illustration; consider Get-Credential instead.
$Username = 'EnterUserName'
$Password = 'EnterPassword'
$pass = ConvertTo-SecureString -AsPlainText $Password -Force
$Cred = New-Object System.Management.Automation.PSCredential -ArgumentList $Username, $pass

Invoke-Command -ComputerName VMExcel -ScriptBlock {
    # Start Excel on the VM, open the workbook from the shared folder,
    # run the Macro, then save and close everything.
    $excel = New-Object -ComObject Excel.Application
    $filePath = "\\VMExcel\ExcelFiles\ExcelReport.xlsm"
    $workbook = $excel.Workbooks.Open($filePath)
    $worksheet = $workbook.Worksheets.Item(1)
    $excel.Run("ExcelReport.xlsm!MacroName")
    $workbook.Save()
    $workbook.Close()
    $excel.Quit()
} -Credential $Cred

But when I tried to execute the above command from the database server using PowerShell, I got the error message below:

Retrieving the COM class factory for component with CLSID {00024500-0000-0000-C000-000000000046} failed due to the
following error: 80080005 Server execution failed (Exception from HRESULT: 0x80080005 (CO_E_SERVER_EXEC_FAILURE))

If you google this error, you will find that most people responded saying that automating the Excel refresh cannot be achieved this way, due to the way Microsoft Office is designed.

Reference:
https://social.technet.microsoft.com/Forums/lync/en-US/4ea691e1-fb3e-41dc-887a-0e90f7d1c38d/accessing-excel-com-interop-dll-inside-a-new-cmdlet-code-resulting-in-exception-esp-when-cmdlet?forum=winserverpowershell

But I found some posts describing workarounds for the above error.
Based on those, I applied the changes below.

1. Configuration changes to allow access to the COM objects
  • Open the 32-bit Component Services on the VM by running mmc comexp.msc /32 from the Run box
  • Expand down to My Computer (Component Services -> Computers -> My Computer), right-click it and select Properties
  • Under the COM Security tab, in the Access Permissions section, click Edit Default
  • Make sure the running user or the VM Administrators group has Local and Remote Access
  • Do the same for the Launch and Activation Permissions
  • Click OK to apply the changes

2. Change the Identity of the Microsoft Excel Application from The launching user to The interactive user
  • Expand DCOM Config in the Component Services console opened before
  • Go to Microsoft Excel Application, right-click it and select Properties
  • Go to the Identity tab and select The interactive user
  • Click OK to apply the changes
Reference: http://www.timwappat.info/post/2013/09/03/Solution-to-Excel-Automation-VBNETC-error-running-as-a-service-0x800A03EC-MicrosoftOfficeInteropExcelWorkbooksOpen

The last thing is to enable Auto Logon for the user who runs the PowerShell command on the VM.
Auto Logon needs to be enabled because, with the Excel application's Identity set to The interactive user, that user must be logged in to the VM for the PowerShell command to execute successfully.

To enable Auto Logon on the VM, I applied the steps in the link below:
https://superuser.com/questions/28647/how-do-i-enable-automatic-logon-in-windows-7-when-im-on-a-domain


  • Type regedit in Start Search and hit Enter
  • Navigate to the following registry key:
    HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon
  • Double-click the AutoAdminLogon entry, type 1 in the Value Data box, and then click OK.
  • If there is no AutoAdminLogon entry, create a new String Value (REG_SZ) with AutoAdminLogon as the value name and type 1 in the Value Data box, and then click OK.
  • Then configure Windows to remember the password used for logging in.
  • In the run box, type control userpasswords2 
  • Ensure your domain username is in the list, if not, add it. 
  • Untick (or tick and untick): Users must enter a user name and password to use this computer.
  • Make sure your username is selected. Click Apply.
  • At this point, Windows should prompt for the password that will be used.
  • Then go back to HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon 
  • Ensure the following String Values are set, if not, set them:
    DefaultUserName: Your domain username (without the domain prefix)
    DefaultDomainName: Your domain
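If you prefer to script these registry values instead of using regedit, the same settings can be applied from an elevated PowerShell session on the VM. This is only a sketch; the user name and domain are placeholders, and the password still needs to be stored via control userpasswords2 as described above.

$winlogon = 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon'

# Enable auto logon and set the account it should use (placeholder values).
Set-ItemProperty -Path $winlogon -Name AutoAdminLogon -Value '1'
Set-ItemProperty -Path $winlogon -Name DefaultUserName -Value 'YourUserName'
Set-ItemProperty -Path $winlogon -Name DefaultDomainName -Value 'YourDomain'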

That is it.

In my next post, I'll write about the SSIS implementation for the Excel Report refresh.
