SQL Server Database Mail is not working...

I have configured Database Mail in SQL Server 2016.
But when I sent a test e-mail, I did not receive it.
I had followed exactly the same steps I used to configure it on a different server, and everything, including the credentials, was fine.

Then I used the query below to check the status of the mail items:
SELECT * FROM msdb..sysmail_mailitems WHERE send_request_date > DATEADD(DAY, -1, GETDATE())

According to the result, sent_status = 0, which means the request is new or not yet sent.
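
To make the status easier to read, the documented sent_status values (0 = unsent, 1 = sent, 2 = failed, 3 = retrying) can be decoded in the query itself; a minimal sketch:

SELECT mailitem_id, send_request_date, sent_status,
       CASE sent_status
            WHEN 0 THEN 'unsent'
            WHEN 1 THEN 'sent'
            WHEN 2 THEN 'failed'
            WHEN 3 THEN 'retrying'
       END AS sent_status_description
FROM msdb..sysmail_mailitems
WHERE send_request_date > DATEADD(DAY, -1, GETDATE());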
Then I located the DatabaseMail application file in the path below and double-clicked it:
\Program Files\Microsoft SQL Server\MSSQL13.MSSQLSERVER\MSSQL\Binn


Then I got an error message.

I checked the installed Windows features and found that .NET Framework 3.5 was not installed.

Once it was installed, Database Mail started working fine.

Converting a string value in "dd-MMM-yy hh.mm.ss.nnnnnn a" format to DateTime in SSIS

I had a requirement to convert a string value containing a date-time value in the format "dd-MMM-yy hh.mm.ss.nnnnnn a" to DateTime.
I initially tried to convert it with the fractional-seconds value included, but it threw an error.
For my requirement I do not need the fractional seconds, so I converted the value after removing that part, and it worked.

Below is the expression used in the Derived Column component:

ISNULL(DATA_VALUE) || LTRIM(RTRIM(DATA_VALUE)) == "" ? NULL(DT_DBTIMESTAMP) : (DT_DBTIMESTAMP)((DT_WSTR,10)((DT_DBDATE)SUBSTRING(DATA_VALUE,1,9)) + (DT_WSTR,1)" " + REPLACE(SUBSTRING(DATA_VALUE,27,2) == "AM" ? (DT_WSTR,12)SUBSTRING(DATA_VALUE,11,8) : (DT_WSTR,2)((DT_I4)SUBSTRING(DATA_VALUE,11,2) + 12) + (DT_WSTR,10)SUBSTRING(DATA_VALUE,13,6),".",":"))
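
For reference, the same conversion is straightforward in C# (for example, inside a Script Component), since the source format maps roughly to "dd-MMM-yy hh.mm.ss.ffffff tt" in .NET format terms. A minimal sketch; the sample value and variable names here are illustrative only:

// using System;
// using System.Globalization;
string raw = "01-JAN-21 02.30.45.123456 PM";   // assumed sample value
// Remove the "." plus the 6-digit fractional seconds after hh.mm.ss
string trimmed = raw.Remove(18, 7);            // "01-JAN-21 02.30.45 PM"
DateTime value = DateTime.ParseExact(trimmed, "dd-MMM-yy hh.mm.ss tt", CultureInfo.InvariantCulture);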

Hope this will be helpful.

Power BI Schedule Refresh error...

The following error was thrown in the Power BI scheduled refresh:

[0] -1055784932: Credentials are required to connect to the SQL source. (Source at xxxx;xxxxx.). The exception was raised by the IDbCommand interface.
[1] -1055784932: The command has been canceled.. The exception was raised by the IDbCommand interface.
[2] -1055784932: The command has been canceled.. The exception was raised by the IDbCommand interface.
[3] -1055784932: The command has been canceled.. The exception was raised by the IDbCommand interface.
[4] -1055784932: The command has been canceled.. The exception was raised by the IDbCommand interface.
[5] -1055784932: The command has been canceled.. The exception was raised by the IDbCommand interface.
[6] -1055784932: The command has been canceled.. The exception was raised by the IDbCommand interface.
[7] -1055784932: The command has been canceled.. The exception was raised by the IDbCommand interface.


I checked the credentials of the data source and everything was fine.
I referred to the link below:
https://community.powerbi.com/t5/Report-Server/Error-when-Refreshing-SQL-Data/td-p/317866/page/3

As mentioned in the above link,
"For each data source, check the name of the server and the database, it seems to be case sensitive when deployed to the server, but not when using the Power BI Desktop."

I had the same scenario; in my case it was the database name.
I changed all the database names to exactly the same casing and redeployed the report.
After that, the scheduled refresh worked fine.

Unable to load the data model for your Power BI report...

There was a Power BI report which had been working fine, when it suddenly threw the error below:
"Unable to load the data model for your Power BI report".


There was no change applied on the server.
As suggested in some posts, I tried the scenarios below, and both of them worked.

1. Download the PBIX file and upload it back again. I uploaded it with a different name and the report worked fine.

2. Rename the existing PBIX report. After that it worked fine as well; I then renamed it back to the original file name and the report continued to work.

Reference:
https://community.powerbi.com/t5/Report-Server/Unable-to-load-the-model-for-published-report/td-p/439131

Hope this will be helpful.

How transformed data is written to an output file with column headers in U-SQL...

While working with the U-SQL language, I noticed that there are a few ways of writing data to an output file.
Let's assume the SalesDetails.csv file in the Azure Data Lake Store includes the Count and the Unit Price of an item. We want to calculate the Total Sales value and save it to the TotalSales.csv file, which will also be stored in the Azure Data Lake Store.

DECLARE @inputFilePath string = "/InputFiles/SalesDetails.csv";
DECLARE @outputFilePath string = "/OutputFiles/TotalSales.csv";

@salesDetails =
    EXTRACT ItemId int,
            Item string,
            UnitPrice decimal,
            Count int
    FROM @inputFilePath
    USING Extractors.Csv(skipFirstNRows:1);

To get the Total Sales value, we have to multiply the Unit Price by the Count in the @salesDetails rowset. But if the @salesDetails rowset contains the column-header row, it will cause an issue when multiplying. Therefore, skipFirstNRows:1 is used to skip the column headers.

The transformation below calculates the Total Sales value using the @salesDetails rowset:

@totalSales =
    SELECT ItemId AS ItemId,
           Item AS Item,
           UnitPrice AS UnitPrice,
           Count AS Count,
           UnitPrice * Count AS TotalSales
    FROM @salesDetails;

In the above, an alias is added for each column since the column names are needed for the output file.
The last step is to write the transformed data to the output file with column headers:

OUTPUT @totalSales
TO @outputFilePath
USING Outputters.Csv(outputHeader:true);

outputHeader:true writes the column headers to the output file. However, if you use only the statement above, the header values will be quoted.
To avoid that, the statement below can be used:

OUTPUT @totalSales
TO @outputFilePath
USING Outputters.Csv(outputHeader:true,quoting:false);
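
To illustrate the difference (using the column names above), with the default quoting the header row in TotalSales.csv is written as:

"ItemId","Item","UnitPrice","Count","TotalSales"

whereas with quoting:false it becomes:

ItemId,Item,UnitPrice,Count,TotalSales

Note that quoting:false applies to the data rows as well, so it is only safe when the values themselves contain no commas.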

Hope this will be helpful...



Stopping a running SSIS Script Task after some time...

There is an SSIS Script Task that is used to run another program. Due to errors in that program, the Script Task sometimes stays in the running state and never stops.
Since I know the maximum duration the other program takes to complete, I wanted to stop the Script Task if it exceeds that duration.

For that, I followed the approach below:
- A Timer is added in the Script Task to track the duration of the execution
- The timer interval is set to the maximum duration
- When the timer elapses, there should be a way to stop the running process of the Script Task
- For that, ThreadPool is used to execute the process and CancellationTokenSource is used to cancel the running process

The code inside the Script Task is as below:


// Requires: using System.Threading;

System.Timers.Timer timer = null;

// Token source used to signal cancellation to the worker.
CancellationTokenSource cts = new CancellationTokenSource();

bool IsExecuted = false;
bool IsStopped = false;

public void Main()
{
    try
    {
        // Maximum duration (in milliseconds) read from the package parameter.
        double maximumDuration = Convert.ToDouble(Dts.Variables["$Package::MaximunDuration"].Value.ToString());

        timer = new System.Timers.Timer(maximumDuration);
        timer.Elapsed += new System.Timers.ElapsedEventHandler(timer_Elapsed);
        timer.Enabled = true;

        // Run the long-running process on a thread-pool thread.
        ThreadPool.QueueUserWorkItem(new WaitCallback(ExecuteProcess), cts.Token);

        // Wait until the process completes or the timer stops it.
        while (IsExecuted == false && IsStopped == false)
        {
            Thread.Sleep(2500);
        }
    }
    catch (Exception ex)
    {
        Dts.TaskResult = (int)ScriptResults.Failure;
    }
}

void ExecuteProcess(object obj)
{
    // ****Code for the ExecuteProcess goes here****
    // Note: for the cancellation to take effect, this code should observe
    // the CancellationToken passed in via obj.

    IsExecuted = true;
    Dts.TaskResult = (int)ScriptResults.Success;
}

void timer_Elapsed(object sender, System.Timers.ElapsedEventArgs e)
{
    // Maximum duration exceeded: fail the task and cancel the running process.
    Dts.TaskResult = (int)ScriptResults.Failure;
    IsStopped = true;
    timer.Stop();
    timer.Dispose();
    cts.Cancel();
    Thread.Sleep(2500);
    cts.Dispose();
}

Hope this will be helpful...

Automating Excel Report Refresh...

The requirement is to refresh an Excel file on the SQL Report Server by running the macro in the Excel file.
In this post, I'll summarize the approach and the issues faced; in another post, I'll explain the final implementation.

The key steps of the implementation are:
1. Download the Excel file from Report Server
2. Run the Macro to refresh the Excel file and Save the file
3. Upload the Excel file to Report Server

For the implementation, SQL Server Integration Services is used.
To summarize the approach followed for each step:
1. Download the Excel file using the DownloadFile method in WebClient.
2. Run the Macro using Excel COM Objects
3. Upload the Excel file to Report Server using Report Server Web Service

For the 1st and 3rd steps, I could not find any resource specifically for Excel files in Report Server, but I was able to achieve them by applying the relevant property values for the Excel files.
To run the macro (2nd step), I initially used the Microsoft.Office.Interop.Excel namespace in C#, inside an SSIS Script Task. But I faced an issue when planning the production deployment: the Interop libraries require Excel to be installed on the computer running them, which means the production server where the package is deployed would need Excel installed.
Installing Excel on production servers is not recommended for security reasons, so I had to find an alternative approach.

Due to that, the initial approach was changed a little.
Since Excel needs to be installed on the computer where the macro runs, we configured a separate VM with Excel installed. This approach also reduces any security risk arising from the configuration changes described later.

To run the macro with the Microsoft.Office.Interop.Excel namespace in C# inside an SSIS Script Task, I would have to deploy the package on the VM where Excel is installed.
But that would add extra cost, since SSIS would have to be installed on the VM.

Because of that, I chose to run the macro inside the Excel file using a PowerShell command, since PowerShell commands can be run on a remote machine. A shared folder was also created to download the Excel files to the VM.
The PowerShell command is executed from the database server against the configured VM using PowerShell Remoting, so PowerShell Remoting needs to be enabled on the VM.

PowerShell Remoting can be enabled on the VM using the PowerShell command below:
Enable-PSRemoting -Force
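
To confirm from the database server that remoting is reachable, you can run, for example (VMExcel being the VM name used in the command further below):

Test-WSMan VMExcel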

Reference: https://www.howtogeek.com/117192/how-to-run-powershell-commands-on-remote-computers/

A PowerShell command can be executed on the remote computer using Invoke-Command, and you have to provide the relevant credentials as well.
The provided user needs to have Administrator privileges on the remote computer to run the PowerShell command remotely.
There are ways to bypass this requirement, but they are not recommended for security reasons.
The complete PowerShell command for the Excel macro execution is as below:

$Username = 'EnterUserName'
$Password = 'EnterPassword'
$pass = ConvertTo-SecureString -AsPlainText $Password -Force
$Cred = New-Object System.Management.Automation.PSCredential -ArgumentList $Username,$pass
Invoke-Command -ComputerName VMExcel -ScriptBlock {
    $excel = New-Object -ComObject Excel.Application
    $filePath = "\\VMExcel\ExcelFiles\ExcelReport.xlsm"
    $workbook = $excel.Workbooks.Open($filePath)
    $worksheet = $workbook.Worksheets.Item(1)
    $excel.Run("ExcelReport.xlsm!MacroName")
    $workbook.Save()
    $workbook.Close()
    $excel.Quit()
} -Credential $Cred

But when I tried to execute the above command from the database server using PowerShell, I got the error message below:

Retrieving the COM class factory for component with CLSID {00024500-0000-0000-C000-000000000046} failed due to the
following error: 80080005 Server execution failed (Exception from HRESULT: 0x80080005 (CO_E_SERVER_EXEC_FAILURE))

If you Google this error, you will find that most people responded saying automating an Excel refresh cannot be achieved, due to the way Microsoft Office is designed.

Reference:
https://social.technet.microsoft.com/Forums/lync/en-US/4ea691e1-fb3e-41dc-887a-0e90f7d1c38d/accessing-excel-com-interop-dll-inside-a-new-cmdlet-code-resulting-in-exception-esp-when-cmdlet?forum=winserverpowershell

But I found some posts describing workarounds for the above error.
Based on those, I applied the changes below.

1. Configuration changes to allow COM object access:
  • Open the 32-bit Component Services console on the VM by running mmc comexp.msc /32 from the Run box
  • Expand to My Computer (Component Services -> Computers -> My Computer), right-click it and select Properties
  • Under the COM Security tab, in the Access Permissions section, click Edit Default
  • Make sure the running user or the VM Administrators group has Local and Remote Access
  • Do the same for the Launch and Activation Permissions
  • Click OK to apply the changes

2. Change the identity of the Microsoft Excel Application from The launching user to The interactive user:
  • Expand DCOM Config in the Component Services console opened before
  • Right-click Microsoft Excel Application and select Properties
  • Go to the Identity tab and select The interactive user
  • Click OK to apply the changes
Reference: http://www.timwappat.info/post/2013/09/03/Solution-to-Excel-Automation-VBNETC-error-running-as-a-service-0x800A03EC-MicrosoftOfficeInteropExcelWorkbooksOpen

Then the last thing is to enable Auto Logon on the VM for the user who runs the PowerShell command.
Auto Logon is needed because, with the Excel application's identity set to The interactive user, that user must be logged in to the VM for the PowerShell command to execute successfully.

To enable Auto Logon on the VM, I applied the steps in the link below; a scripted alternative for the registry steps is sketched after the list.
https://superuser.com/questions/28647/how-do-i-enable-automatic-logon-in-windows-7-when-im-on-a-domain


  • Type regedit in Start Search and hit Enter
  • Navigate to the following registry key:
    HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon
  • Double-click the AutoAdminLogon entry, type 1 in the Value Data box, and then click OK.
  • If there is no AutoAdminLogon entry, create a new String Value (REG_SZ) with AutoAdminLogon as the value name and type 1 in the Value Data box, and then click OK.
  • Then configure Windows to remember the password used for logging in.
  • In the run box, type control userpasswords2 
  • Ensure your domain username is in the list, if not, add it. 
  • Untick (or tick and untick): Users must enter a user name and password to use this computer.
  • Make sure your username is selected. Click Apply.
  • At this point, Windows should prompt for the password that will be used.
  • Then go back to HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon 
  • Ensure the following String Values are set, if not, set them:
    DefaultUserName: Your domain username (without the domain prefix)
    DefaultDomainName: Your domain
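
If you prefer to script the registry part of these changes, a minimal PowerShell sketch is shown below (run on the VM; the user name and domain values are placeholders). Note that this covers only the registry values; the password is best stored via the control userpasswords2 step above, since the alternative DefaultPassword registry value keeps it in plain text.

$winlogon = 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon'
# Enable automatic logon (creates the value if it does not exist)
Set-ItemProperty -Path $winlogon -Name AutoAdminLogon -Value '1'
# Default user and domain used for the automatic logon (placeholders)
Set-ItemProperty -Path $winlogon -Name DefaultUserName -Value 'YourUserName'
Set-ItemProperty -Path $winlogon -Name DefaultDomainName -Value 'YourDomain'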

That is it.

In my next post, I'll write about the SSIS implementation for the Excel Report refresh.

Getting Error Description and Error Column Name in SSIS 2016...

If you check the error output path of an SSIS Data Flow component, it contains only the ErrorCode and the ErrorColumn (which is actually the ID value of the column). By looking at those two values alone, we cannot get a clear idea of the error.

I checked how to get the error description and the error column name and found the detailed post below:
https://www.mssqltips.com/sqlservertip/4066/retrieve-error-column-in-ssis-2016/

But that post was written before SQL Server 2016 was released, and because of that there are some differences in the code.

In this post I provide the code, along with some enhancements, for getting the error description and the error column name in SSIS 2016.

To get the error description and the error column name for the error output path, you have to add a Script Component.
A sample implementation is as below, where we:
- Union all the error records
- Get the Error Description and Error Column Name using a Script Component
- Write the error records with those details to a CSV file


In the Script Component, ErrorCode and ErrorColumn are selected as the input columns.


As output columns, we need to add ErrorDescription and ErrorColumnName; the relevant values are assigned to them in the code.


Then, to add the required code, click on Edit Script.


The added code is as below:

public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    // Resolve the error description from the error code.
    Row.ErrorDescription = this.ComponentMetaData.GetErrorDescription(Row.ErrorCode).Trim();

    // IDTSComponentMetaData130 (SSIS 2016) exposes GetIdentificationStringByID.
    var componentMetaData130 = this.ComponentMetaData as IDTSComponentMetaData130;

    string errorColumnName = "";

    if (componentMetaData130 != null)
    {
        if (Row.ErrorColumn > 0)
        {
            errorColumnName = componentMetaData130.GetIdentificationStringByID(Row.ErrorColumn);
        }
        else
        {
            errorColumnName = "NA";
        }
    }

    if (errorColumnName != "NA")
    {
        // Keep only the column name from the full identification string,
        // e.g. "OLE DB Destination.Inputs[...].Columns[COLUMN_NAME]".
        string searchStr = "Columns[";
        int startIndex = errorColumnName.IndexOf(searchStr) + searchStr.Length;
        errorColumnName = errorColumnName.Substring(startIndex).TrimEnd(']');
    }

    Row.ErrorColumnName = errorColumnName;
}



If you just get the column name using the GetIdentificationStringByID method, the output will be as below:
OLE DB Destination.Inputs[OLE DB Destination Input].Columns[COLUMN_NAME]

Whoever manages the data does not need to know which SSIS component produced the error; they are only interested in the column name and the error itself.
That is why only the column name is extracted, using the section below from the above code (TrimEnd(']') also removes the trailing bracket that a plain Substring would leave behind):

if (errorColumnName != "NA")
{
    string searchStr = "Columns[";
    int startIndex = errorColumnName.IndexOf(searchStr) + searchStr.Length;
    errorColumnName = errorColumnName.Substring(startIndex).TrimEnd(']');
}

After that, the error records are written to a CSV file, which can be shared with the responsible parties.



Email Alerts are not received...

In one of our applications, email alerts are sent using the sp_send_dbmail stored procedure. The alerts had been working properly, but we suddenly noticed that they were no longer received.

To troubleshoot the issue, I first checked the msdb.dbo.sysmail_event_log table using the query below:

SELECT log_id, event_type, log_date, description, process_id, mailitem_id, account_id, last_mod_date, last_mod_user
FROM msdb.dbo.sysmail_event_log
ORDER BY log_date DESC

If there is any error, you should see records with an event_type value of error.
In my case there were such records, and the description value for all of them was as below:

The mail could not be sent to the recipients because of the mail server failure. (Sending Mail using Account 3 (2018-02-01T10:41:27). Exception Message: Cannot send mails to mail server. (The SMTP server requires a secure connection or the client was not authenticated. The server response was: Authentication required).  )

As per the error description, it should be a sender account issue.
The reason was that the sender's password had been changed but not updated in the Database Mail configuration. Once I updated the password in the Database Mail configuration, email alerts started working properly.
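
When tracking down which account and login are affected, the Database Mail configuration can be inspected with a query such as the sketch below (joining the documented msdb tables for accounts and their SMTP servers):

SELECT a.name AS account_name, s.servername, s.port, s.username, s.enable_ssl
FROM msdb.dbo.sysmail_account a
JOIN msdb.dbo.sysmail_server s ON s.account_id = a.account_id;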

Moving a file from one folder to another in Azure Blob Storage using SSIS

In this post I'll explain how we can move a file from one folder in an Azure Blob Storage container to another folder in the same container, deleting the original file.

Let's assume the blob container name is BlobContainer, the folder names are Folder1 and Folder2, and the file name is BlobFile.csv.
The file is moved from one folder to the other based on the file URL, using HttpClient in C#.

You can generate the file URL as below:
BlobStorageURL=https://blobstorage.blob.core.windows.net
(blobstorage is the account name)

Then the existing file URL is:
https://blobstorage.blob.core.windows.net/BlobContainer/Folder1/BlobFile.csv

and the URL for the moved file is:
https://blobstorage.blob.core.windows.net/BlobContainer/Folder2/BlobFile.csv

As I mentioned in my previous post, you can get the name of the file available in Azure Blob Storage using a Foreach Loop Container, and it is possible to set these file URL values based on that file name.

In addition to the above, we need to know the Shared Access Signature value for each folder.
If you do not have those values, you can get them as below.
To get the value for Folder1, browse to Folder1 using Microsoft Azure Storage Explorer and right-click on it.
Then select Get Shared Access Signature.


A Shared Access Signature window is then displayed.



In that window, the first thing is to make sure you have set the Expiry time appropriately.
Then tick Write and Delete as well, since they are needed for our requirement.
After that, click Create.



In the window that follows, click the Copy button under Query string to get the Shared Access Signature value.
It is best to keep that value in an SSIS project parameter; then, whenever needed, you will be able to change it easily.

Follow the same steps for Folder2 as well.

As a best practice, I stored the Shared Access Signature values in two project parameters, SASFolder1 and SASFolder2.
The blob file URLs are stored in package variables named OriginalBlobFileURL and MoveBlobFileURL.

To move the file, I used a Script Task, with the two project parameters and the two package variables above added in the Variables section of the Script Task Editor.



Click on Edit Script and add the code below:

// Requires: using System.Net; using System.Net.Http;
// (add a reference to System.Net.Http in the script project if needed)

string sasFolder1 = Dts.Variables["$Project::SASFolder1"].Value.ToString();
string sasFolder2 = Dts.Variables["$Project::SASFolder2"].Value.ToString();
string originalBlobFileURL = Dts.Variables["User::OriginalBlobFileURL"].Value.ToString();
string moveBlobFileURL = Dts.Variables["User::MoveBlobFileURL"].Value.ToString();

try
{
    HttpClient client = new HttpClient();
    client.DefaultRequestHeaders.Add("x-ms-copy-source", originalBlobFileURL + sasFolder1);

    // Copy the file from Folder1 to Folder2.
    // (Main() is not async, so the async calls are awaited synchronously.)
    HttpResponseMessage response = client.PutAsync(moveBlobFileURL + sasFolder2, null).GetAwaiter().GetResult();
    if (response.StatusCode == HttpStatusCode.Accepted || response.StatusCode == HttpStatusCode.Created)
    {
        client.DefaultRequestHeaders.Clear();

        // Delete the original file in Folder1.
        HttpResponseMessage result = client.DeleteAsync(originalBlobFileURL + sasFolder1).GetAwaiter().GetResult();
        if (result.StatusCode == HttpStatusCode.Accepted || result.StatusCode == HttpStatusCode.Created)
        {
            Dts.TaskResult = (int)ScriptResults.Success;
            return;
        }
    }
    Dts.TaskResult = (int)ScriptResults.Failure;
}
catch (Exception ex)
{
    Dts.TaskResult = (int)ScriptResults.Failure;
}




Build the code and save it.
Click the OK button in the Script Task Editor.
That is all...







Configuring SSIS Foreach Loop Container for Azure Blob Storage

Just as we iterate through files in the file system, we can iterate through files in Azure Blob Storage using the SSIS Foreach Loop Container.
In the Blob Storage structure, there are containers, and inside those containers you will find folders.

The project parameters below are added to store that configuration information:
BlobContainer - Stores the Blob Container name
BlobFolder - Stores the Folder name
ConnStr_AzureStorage - Stores the connection string value for the Azure Blob Storage
(Ex: DefaultEndpointsProtocol=https;AccountName=blobstorageaccountname;AccountKey=accountkey;EndpointSuffix=core.windows.net)

In addition to those, a package variable named BlobFileName is added to store the blob file name.

To create a connection to the Azure Blob Storage, right-click on the Connection Managers section in the package and select New Connection.

In the Add SSIS Connection Manager window, select the AzureStorage type and click Add.


Then you will get the Azure Storage Connection Manager Editor window.


In that window, select Use Azure Account and provide the relevant Storage account name and Account key values.
Then test the connection using the Test Connection button.
If it is successful, click the OK button; otherwise, check the provided Storage account name and Account key values.
To set the connection string value from the ConnStr_AzureStorage project parameter, select the Azure connection and go to the Properties tab.
There, under the Expressions section, map the ConnStr_AzureStorage project parameter to the ConnectionString property.

Once that is done, to configure the Foreach Loop Container, drag and drop a Foreach Loop Container onto the package Control Flow and double-click on it.
In the Collection section, click on the down arrow under Enumerator and select Foreach Azure Blob Enumerator.



Then select the previously added Azure connection for the Azure storage connection manager. To provide the relevant values for the Blob container name and Blob directory, click on the ... button in front of Expressions.



Set the BlobContainer and BlobPath property values using the BlobContainer and BlobFolder project parameters.



To assign the blob file name to the BlobFileName variable, go to the Variable Mappings section and set the variable.



Now you have configured the Foreach Loop Container for Azure Blob Storage, and you can perform the required operations inside it using the file name available in the BlobFileName variable, as in the sketch below.
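
For example, a Script Task placed inside the loop could read the current file name from the variable. A minimal sketch, assuming BlobFileName has been added as a read-only variable in the Script Task Editor:

// Read the current blob file name set by the Foreach Azure Blob Enumerator.
string blobFileName = Dts.Variables["User::BlobFileName"].Value.ToString();

// Example: log it so the file processed in each iteration is visible.
bool fireAgain = true;
Dts.Events.FireInformation(0, "ForeachBlob", "Processing file: " + blobFileName, string.Empty, 0, ref fireAgain);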


