Wednesday 24 April 2013

Creating temp tables in SSIS seems like a straightforward process using the Execute SQL Task, but there are a couple of properties that must be changed for it to work. In this tip, we'll walk through creating a simple temp table in SSIS.

Creating a Sample SSIS Package

First, I will drag an Execute SQL Task into my Design view and rename it Create Temp Table. Next, I will right-click the task, choose Edit, and select my Connection and SQLStatement. The SQL statement used in this example is below:
IF OBJECT_ID('tempdb..##tmpTeams') IS NOT NULL
    DROP TABLE ##tmpTeams

CREATE TABLE ##tmpTeams
(
    Team VARCHAR(255),
    Mascot VARCHAR(255),
    State VARCHAR(2)
)

INSERT INTO ##tmpTeams VALUES
    ('Auburn', 'Tigers', 'AL'),
    ('Alabama', 'Crimson Tide', 'AL'),
    ('Mississippi', 'Rebels', 'MS'),
    ('Louisiana State', 'Tigers', 'LA'),
    ('Mississippi State', 'Bulldogs', 'MS'),
    ('Arkansas', 'Razorbacks', 'AR'),
    ('Texas A&M', 'Aggies', 'TX'),
    ('Georgia', 'Bulldogs', 'GA'),
    ('Tennessee', 'Volunteers', 'TN'),
    ('Vanderbilt', 'Commodores', 'TN'),
    ('Florida', 'Gators', 'FL'),
    ('South Carolina', 'Gamecocks', 'SC'),
    ('Missouri', 'Tigers', 'MO')
Next, I will drag a Data Flow Task into my Design view, rename it Query, and connect the two tasks with a precedence constraint.
For the Data Flow Task we are going to query the temp table and export the results to a database table. Right-click the Data Flow Task and choose Edit, then drag an OLE DB Source and an OLE DB Destination into the Design view.
To avoid errors when configuring the OLE DB Source, we need to create the temp table first using SSMS. In SSMS, run the following statement to create a global temp table:
CREATE TABLE ##tmpTeams
(
    Team VARCHAR(255),
    Mascot VARCHAR(255),
    State VARCHAR(2)
)
Once the table has been created, let's go back into our SSIS package. Right-click the OLE DB Source and choose Edit. Choose your data source and select SQL command from the Data access mode dropdown. In the SQL command text box we will need to enter our SQL statement.
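The statement from the original screenshot isn't reproduced here, but a query along these lines (returning just the Alabama teams, to match the AlabamaTeams destination table used in the next step) fits this example:

SELECT Team, Mascot, State
FROM ##tmpTeams
WHERE State = 'AL'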
Hit OK on the OLE DB Source window, then right-click the OLE DB Destination and choose Edit. Choose your data source and select "Table or view - fast load" from the Data access mode dropdown. I already created a table called AlabamaTeams that I will use to display the data, and I will use this name in the dropdown for Name of the table or the view.
Once finished configuring the OLE DB Destination, hit OK. We are now ready to execute the package. Hit F5 to begin, or choose Debug, Start Debugging from the menu bar.
After executing the package, an error will occur on the Query task.
If we look at the Progress tab, we can see why this error occurs: the temp table created by the "Create Temp Table" task is dropped as soon as that task's connection closes, so when the "Query" task looks for the table it can't find it.

Fixing the Issue

To fix this issue, we will need to change a few properties on the tasks and the connection manager. The first is a connection manager property: if you right-click the OLE DB Connection Manager and choose Properties, you will see a property called RetainSameConnection. This must be set to "True". RetainSameConnection keeps a single connection open for the lifetime of the package, rather than opening and closing a connection for each task, so the temp table survives from one task to the next.
The second property we must change is on the tasks themselves. If you right-click each task (Create Temp Table and Query) and choose Properties, you will see a property called DelayValidation. This must be set to "True". DelayValidation postpones validation until the task actually executes, so the package no longer fails because the temp table doesn't exist at validation time.
Once we have set both properties to "True", execute the package again. The package should be successful this time.
If we switch back over to SSMS and query the table, AlabamaTeams, we should get a result set.
Note: I'm using global temp tables (##tmpTeams) instead of local temp tables (#tmpTeams) because when I create the table using SSMS it uses a different session (SPID), so when I try to configure my OLE DB Source it cannot find the local temp table and an error occurs.
With that said, only global temp tables should be used for this approach. Also, make sure the names of your global temp tables do not collide with global temp tables created by other processes.

Next Steps

  • I would assume that when creating a temp table inside an SSIS package, other tasks in the package will use the temp table, so I suggest adding another task that drops the temp tables at the end of the package; a sketch follows.
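A guarded version of that drop, matching the pattern already used in the Create Temp Table task, won't fail if the table is already gone:

IF OBJECT_ID('tempdb..##tmpTeams') IS NOT NULL
    DROP TABLE ##tmpTeams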

Tuesday 9 April 2013

Rolling up multiple rows into a single row and column for SQL Server data



Problem

I need a way to roll up multiple rows into one row and one column. I know I can roll up multiple rows into one row using PIVOT, but I need all of the data concatenated into a single column in a single row. In this tip we look at a simple approach to accomplish this.

Solution

To illustrate what is needed, here is a sample of data in a table:
SEC_NAME       USR_NAME
-------------  --------
ENTERTAINMENT  ANDERSON
ENTERTAINMENT  CHARLES
ENTERTAINMENT  DANNY
ENTERTAINMENT  LUCAS
CLOTHES        KEITH
CLOTHES        STEFAN
CLOTHES        EDUARD
CLOTHES        BRAD
This is one example of what we want the end result to look like:
SEC_NAME       SECTORS/USERS
-------------  -------------------------------
ENTERTAINMENT  ANDERSON; CHARLES; DANNY; LUCAS
CLOTHES        KEITH; STEFAN; EDUARD; BRAD
The solution proposed in this tip uses two SQL Server features to achieve the expected results: the STUFF() function and the FOR XML clause.

Preparing Sample Data

Before we begin, we'll create some tables and sample data which the following script will do for us.
CREATE TABLE SALES_SECTORS(
SEC_ID INT,
SEC_NAME VARCHAR(30))
GO
CREATE TABLE USRS(
USR_ID INT,
USR_NAME VARCHAR(30),
SEC_ID INT
)
GO
CREATE TABLE ADV_CAMPAIGN(
ADV_ID INT,
ADV_NAME VARCHAR(30)
)
GO
CREATE TABLE USR_ADV_CAMPAIGN(
USR_ID INT,
ADV_ID INT
)
GO
CREATE TABLE SEC_ADV_CAMPAIGN(
SEC_ID INT,
ADV_ID INT
)
GO
INSERT INTO SALES_SECTORS( SEC_ID, SEC_NAME ) VALUES ( 1, 'ENTERTAINMENT' )
INSERT INTO SALES_SECTORS( SEC_ID, SEC_NAME ) VALUES ( 2, 'CLOTHES' )
GO
INSERT INTO USRS( USR_ID, USR_NAME, SEC_ID ) VALUES ( 1, 'ANDERSON', 1 )
INSERT INTO USRS( USR_ID, USR_NAME, SEC_ID ) VALUES ( 2, 'CHARLES', 1 )
INSERT INTO USRS( USR_ID, USR_NAME, SEC_ID ) VALUES ( 3, 'DANNY', 1 )
INSERT INTO USRS( USR_ID, USR_NAME, SEC_ID ) VALUES ( 4, 'LUCAS', 1 )
INSERT INTO USRS( USR_ID, USR_NAME, SEC_ID ) VALUES ( 5, 'KEITH', 2 )
INSERT INTO USRS( USR_ID, USR_NAME, SEC_ID ) VALUES ( 6, 'STEFAN', 2 )
INSERT INTO USRS( USR_ID, USR_NAME, SEC_ID ) VALUES ( 7, 'EDUARD', 2 )
INSERT INTO USRS( USR_ID, USR_NAME, SEC_ID ) VALUES ( 8, 'BRAD', 2 )
GO
INSERT INTO ADV_CAMPAIGN( ADV_ID, ADV_NAME ) VALUES ( 1, 'SONY ENTERTAINMENT' )
INSERT INTO ADV_CAMPAIGN( ADV_ID, ADV_NAME ) VALUES ( 2, 'BEATS SOUNDS' )
INSERT INTO ADV_CAMPAIGN( ADV_ID, ADV_NAME ) VALUES ( 3, 'BOOSE' )
INSERT INTO ADV_CAMPAIGN( ADV_ID, ADV_NAME ) VALUES ( 4, 'POLO RALPH LAUREN' )
INSERT INTO ADV_CAMPAIGN( ADV_ID, ADV_NAME ) VALUES ( 5, 'LACOSTE' )
GO
INSERT INTO USR_ADV_CAMPAIGN( USR_ID, ADV_ID ) VALUES ( 1, 1 )
INSERT INTO USR_ADV_CAMPAIGN( USR_ID, ADV_ID ) VALUES ( 1, 2 )
INSERT INTO USR_ADV_CAMPAIGN( USR_ID, ADV_ID ) VALUES ( 2, 2 )
INSERT INTO USR_ADV_CAMPAIGN( USR_ID, ADV_ID ) VALUES ( 2, 3 )
INSERT INTO USR_ADV_CAMPAIGN( USR_ID, ADV_ID ) VALUES ( 3, 3 )
INSERT INTO USR_ADV_CAMPAIGN( USR_ID, ADV_ID ) VALUES ( 4, 2 )
INSERT INTO USR_ADV_CAMPAIGN( USR_ID, ADV_ID ) VALUES ( 5, 4 )
INSERT INTO USR_ADV_CAMPAIGN( USR_ID, ADV_ID ) VALUES ( 6, 5 )
INSERT INTO USR_ADV_CAMPAIGN( USR_ID, ADV_ID ) VALUES ( 7, 4 )
INSERT INTO USR_ADV_CAMPAIGN( USR_ID, ADV_ID ) VALUES ( 8, 5 )
GO
INSERT INTO SEC_ADV_CAMPAIGN( SEC_ID, ADV_ID ) VALUES ( 1, 1 )
INSERT INTO SEC_ADV_CAMPAIGN( SEC_ID, ADV_ID ) VALUES ( 1, 2 )
INSERT INTO SEC_ADV_CAMPAIGN( SEC_ID, ADV_ID ) VALUES ( 1, 3 )
INSERT INTO SEC_ADV_CAMPAIGN( SEC_ID, ADV_ID ) VALUES ( 2, 4 )
INSERT INTO SEC_ADV_CAMPAIGN( SEC_ID, ADV_ID ) VALUES ( 2, 5 )
GO

STUFF() Function

Before going to the examples, we need to understand how these two features work. The STUFF() function inserts one string into another at a given position, deleting a specified number of characters first. With this we can insert, replace or remove one or more characters.
The syntax is STUFF(character_expression, start, length, replaceWith_expression):
  • character_expression: the string to be manipulated
  • start: the position at which to start
  • length: the number of characters to delete
  • replaceWith_expression: the characters to insert
Here is an example of how to use the STUFF function. For our example we have a string that looks like this:
;KEITH;STEFAN;EDUARD;BRAD
We want to remove the first ; from the list so we end up with this output:
KEITH;STEFAN;EDUARD;BRAD
To do this we can use the STUFF command as follows to replace the first ; in the string with an empty string.
SELECT STUFF(';KEITH;STEFAN;EDUARD;BRAD', 1, 1, '')
And this returns this output:
KEITH;STEFAN;EDUARD;BRAD
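STUFF can also replace characters in the middle of a string. For example, this deletes three characters starting at position 2 and inserts '123' in their place:

SELECT STUFF('ABCDEF', 2, 3, '123')

which returns:

A123EF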

FOR XML Clause

The FOR XML clause returns the results of a SQL query as XML. FOR XML has four modes: RAW, AUTO, EXPLICIT and PATH. We will use the PATH mode, which generates an element for each row returned.
If we use a regular query such as the following, it returns one row per user, like the sample data shown above.
SELECT 
   SS.SEC_NAME, 
   US.USR_NAME
FROM SALES_SECTORS SS
INNER JOIN USRS US ON US.SEC_ID = SS.SEC_ID
ORDER BY 1, 2

If we take this a step further, we can use the FOR XML PATH option to return the results as an XML string which will put all of the data into one row and one column.
SELECT
   SS.SEC_NAME,
   US.USR_NAME
FROM SALES_SECTORS SS
INNER JOIN USRS US ON US.SEC_ID = SS.SEC_ID
ORDER BY 1, 2
FOR XML PATH('')
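The result is a single row and column containing an XML string similar to this (abridged):

<SEC_NAME>CLOTHES</SEC_NAME><USR_NAME>BRAD</USR_NAME><SEC_NAME>CLOTHES</SEC_NAME><USR_NAME>EDUARD</USR_NAME>...<SEC_NAME>ENTERTAINMENT</SEC_NAME><USR_NAME>ANDERSON</USR_NAME>...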

Putting It All Together

Example 1

Now that we see what each of these features does, we can put them together to get our final result.
The example query below uses a subquery that returns XML data for USR_NAME from table USRS, joined to the outer query by SEC_ID from table SALES_SECTORS. For each row from the inner query we concatenate a "; " and then the actual value, so all of the rows end up concatenated into one column. We group by SEC_NAME to show all USERS within each SECTOR.
SELECT 
   SS.SEC_NAME, 
   (SELECT '; ' + US.USR_NAME 
    FROM USRS US
    WHERE US.SEC_ID = SS.SEC_ID
    FOR XML PATH('')) [SECTORS/USERS]
FROM SALES_SECTORS SS
GROUP BY SS.SEC_ID, SS.SEC_NAME
ORDER BY 1
Below is the output for this query. We can see the leading "; " in the SECTORS/USERS column, which we don't want:
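SEC_NAME       SECTORS/USERS
-------------  ---------------------------------
CLOTHES        ; KEITH; STEFAN; EDUARD; BRAD
ENTERTAINMENT  ; ANDERSON; CHARLES; DANNY; LUCAS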
In this modified example, we are now using the STUFF function to remove the leading ; in the string.
SELECT 
   SS.SEC_NAME,
   STUFF((SELECT '; ' + US.USR_NAME 
          FROM USRS US
          WHERE US.SEC_ID = SS.SEC_ID
          FOR XML PATH('')), 1, 1, '') [SECTORS/USERS]
FROM SALES_SECTORS SS
GROUP BY SS.SEC_ID, SS.SEC_NAME
ORDER BY 1
And we get this resultset:
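SEC_NAME       SECTORS/USERS
-------------  -------------------------------
CLOTHES        KEITH; STEFAN; EDUARD; BRAD
ENTERTAINMENT  ANDERSON; CHARLES; DANNY; LUCAS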
If we also want to order the SECTORS/USERS data we can modify the query as follows:
SELECT 
   SS.SEC_NAME,
   STUFF((SELECT '; ' + US.USR_NAME 
          FROM USRS US
          WHERE US.SEC_ID = SS.SEC_ID
          ORDER BY USR_NAME
          FOR XML PATH('')), 1, 1, '') [SECTORS/USERS]
FROM SALES_SECTORS SS
GROUP BY SS.SEC_ID, SS.SEC_NAME
ORDER BY 1
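Now the users are sorted alphabetically within each sector:

SEC_NAME       SECTORS/USERS
-------------  -------------------------------
CLOTHES        BRAD; EDUARD; KEITH; STEFAN
ENTERTAINMENT  ANDERSON; CHARLES; DANNY; LUCAS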

Example 2

If we want this all to be in one column we can change the query a little as follows:
SELECT 
   SS.SEC_NAME + ': ' +
   STUFF((SELECT '; ' + US.USR_NAME 
          FROM USRS US
          WHERE US.SEC_ID = SS.SEC_ID
          FOR XML PATH('')), 1, 1, '') [SECTORS/USERS]
FROM SALES_SECTORS SS
GROUP BY SS.SEC_ID, SS.SEC_NAME
ORDER BY 1
And this gives us this result:
SECTORS/USERS
----------------------------------------------
CLOTHES: KEITH; STEFAN; EDUARD; BRAD
ENTERTAINMENT: ANDERSON; CHARLES; DANNY; LUCAS

Example 3

This example takes it a step further where we have multiple subqueries to give us data based on USERS within CAMPAIGNS within SECTORS.
SELECT 
    SS.SEC_ID, 
    SS.SEC_NAME,
    STUFF((SELECT '; ' + AC.ADV_NAME + ' (' + 
             STUFF((SELECT ',' + US.USR_NAME
                    FROM USR_ADV_CAMPAIGN UAC
                    INNER JOIN USRS US 
                          ON US.USR_ID = UAC.USR_ID
                    WHERE UAC.ADV_ID = SAC.ADV_ID
                    FOR XML PATH('')), 1, 1, '') + ')'
           FROM ADV_CAMPAIGN AC
           INNER JOIN SEC_ADV_CAMPAIGN SAC 
                 ON SAC.ADV_ID = AC.ADV_ID AND SAC.SEC_ID = SS.SEC_ID 
           ORDER BY AC.ADV_NAME
           FOR XML PATH('')), 1, 1, '') [CAMPAIGNS/USERS PER SECTOR]
FROM SALES_SECTORS SS
GROUP BY 
     SS.SEC_ID, 
     SS.SEC_NAME

The output rolls the users up into their campaigns, and the campaigns up into their sectors:

SEC_ID  SEC_NAME       CAMPAIGNS/USERS PER SECTOR
------  -------------  ------------------------------------------------------------------------------------------
1       ENTERTAINMENT  BEATS SOUNDS (ANDERSON,CHARLES,LUCAS); BOOSE (CHARLES,DANNY); SONY ENTERTAINMENT (ANDERSON)
2       CLOTHES        LACOSTE (STEFAN,BRAD); POLO RALPH LAUREN (KEITH,EDUARD)

Conclusion

There are always several options to complete a task within SQL Server, and we should take the time to explore the capabilities offered by the database before developing large and complex code. I hope this is one more example showing that there are sometimes easier approaches available than you might think.

Next Steps

  • Take this further: start with simple queries and then build up the complexity of the code.
  • Explore the commands used in this tip further to see what other things you might be able to do.
  • More details about the commands used above can be found on MSDN.

Monday 8 April 2013

Consolidating Error Logs from multiple SQL Servers using SSIS

Problem

SQL Server records informational messages, warning messages, and errors in its error log files and in the operating system's application log. As database administrators (DBAs), it is our responsibility to review the information in the error log files on a daily basis for any errors that have occurred in our SQL Server environment. However, manually reviewing the log files daily is not an easy task in a busy SQL Server environment, and it can quickly consume a large part of your day, so most DBAs tend to skip it when they are busy with other production problems. In this tip, I'll show you how to consolidate error log entries from multiple servers into a central location and then report on them from there.

Solution

Before we talk about our custom solution for consolidating SQL Server error logs, I will first explain the purpose of the SQL Server error log and show different ways of viewing it.

Purpose of SQL Server ErrorLog

SQL Server keeps separate logs for Database Mail, the database engine (the SQL Server error log) and SQL Server Agent. SQL Server maintains a current log and archived logs; by default, it keeps six archive files. Only logins that are members of the sysadmin or securityadmin fixed server roles are able to view the SQL Server logs.

Viewing SQL Server Logs Using SQL Server Management Studio

To open the Log File Viewer in Management Studio:
  1. Launch SQL Server Management Studio and then connect to a SQL Server instance.
  2. In Object Explorer, expand Management, SQL Server Logs.
  3. Right-click the current log or an archive log and choose View SQL Server Log.
To open the Log File Viewer from the SQL Server Agent Error Logs:
  1. Expand SQL Server Agent, Error Logs.
  2. Right-click the current log or an archive log and choose View Agent Log.

Log File Viewer

The Select logs pane in the Log File Viewer lets you choose from:
  • SQL Server
  • SQL Server Agent
  • Database Mail
  • Windows logs
You can view the Windows Application and System logs, and with sufficient permissions the Security log too. Log entry details display in the lower pane when you select a log entry. Most of the messages in the SQL Server logs are informational only and require no corrective action.

Viewing SQL Server Logs Using T-SQL

You can use sp_readerrorlog, an undocumented stored procedure, to view SQL Server error logs. It accepts the log file number as a parameter (0 is the current log). For example, to view the current log file, execute sp_readerrorlog as follows:
EXEC [sp_readerrorlog] 0
GO
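Although undocumented, sp_readerrorlog is commonly described as accepting up to three more parameters: the log to read (1 = SQL Server error log, 2 = SQL Server Agent log) and one or two strings to filter on. For example, to search the current error log for messages containing "memory":

-- Search the current SQL Server error log (log type 1) for messages containing 'memory'
EXEC [sp_readerrorlog] 0, 1, N'memory'
GO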


Solution for Consolidating SQL Server Logs

Now that we understand the purpose of SQL Server logs, let's begin with our solution. This solution is built using SSIS, and will help you consolidate and merge error logs from multiple SQL Servers.
For the purpose of this article, create a database called ErrorLogMonitor on the SQL Server instance where you want to consolidate the error logs.
Execute the following T-SQL script to create this database (adjust the file paths to suit your environment):
CREATE DATABASE [ErrorLogMonitor]
 ON  PRIMARY 
( NAME = N'ErrorLogMonitor'
, FILENAME = N'\ErrorLogMonitor.mdf' 
, SIZE = 4160KB , MAXSIZE = UNLIMITED, FILEGROWTH = 1024KB )
 LOG ON 
( NAME = N'ErrorLogMonitor_log'
, FILENAME = N'\ErrorLogMonitor_log.ldf' 
, SIZE = 1040KB , MAXSIZE = 2048GB , FILEGROWTH = 10%)
GO
Next, create the following two tables inside this database:
  • SSISServersList: stores the name (and optional port) of each SQL Server whose error log you want to read.
  • ErrorLog: stores the consolidated error log data for all of the SQL Servers you are monitoring.
To create the tables, execute the following:
USE [ErrorLogMonitor]
GO
CREATE TABLE [dbo].[SSISServersList](
 [ServerName] [nvarchar](128) NOT NULL,
 [IsTest] [bit] NULL,
 [Active] [bit] NULL,
 [Port] [int] NULL) ON [PRIMARY]
GO
CREATE TABLE [dbo].[ErrorLog](
 [Server] [nvarchar](128) NOT NULL,
 [Date] [datetime] NULL,
 [ProcessInfo] [varchar](50) NULL,
 [Message] [nvarchar](4000) NULL,
 [AuditDate] [smalldatetime] NOT NULL,
 [LoadID] [int] NULL
) ON [PRIMARY]
GO
Finally, we are ready to design our SSIS package. To do that, launch SQL Server Data Tools and create a new Integration Services project; the project is created with an empty package. Rename this package to ConsolidateErrorLog_Package.dtsx.
Follow these steps to configure the package.

1: Defining Package Variables

Add the following two variables for the SSIS package:
  • Variable 1: SQLServer_Connection
    • Scope: ConsolidateErrorLog_Package
    • Data Type: String
    • Value: <Your SQL Server instance name> (specify the instance name where the ErrorLogMonitor database is located).
  • Variable 2: SQLServer_ResultSet
    • Scope: ConsolidateErrorLog_Package
    • Data Type: Object
    • Value: System.Object
To do that, right-click anywhere on the empty panel on the Control Flow tab, select Variables, and then click the Add Variable icon.

2: Defining Package Connections

First, create a new ADO.NET connection to the ErrorLogMonitor database on your SQL Server. To do this, right-click in the Connection Managers area and choose New ADO.NET Connection, then configure the connection to point to the ErrorLogMonitor database. Rename the connection to ADO_ErrorLogMonitor.
Next, create a dynamic OLE DB connection; this connection will let us connect to each of the SQL Server instances in turn. Its server name comes from the SQLServer_Connection variable, which will be populated from the SSISServersList table created earlier. Rename the connection to Dynamic_ErrorLogMonitor.
Now right-click the Dynamic_ErrorLogMonitor connection and choose Properties. First change the initial catalog to master, then click the (…) button next to Expressions and specify the following expression for the ServerName property:
@[User::SQLServer_Connection]


3: Defining Package Tasks

3.1: Configuring “Truncate ErrorLog table” – Execute SQL Task

Add an “Execute SQL Task” to the control flow. Double-click the task and configure properties in the “General” page of the task as follows:
  • Rename “Execute SQL Task” to “Truncate ErrorLog table”.
  • Set “ResultSet” property to “None”.
  • Set “Connection Type” property to “ADO.NET”.
  • Set “Connection” property to “ADO_ErrorLogMonitor”.
  • Set “BypassPrepare” to “True”.
  • Set “SQLStatement” property to the following:
TRUNCATE TABLE [dbo].[ErrorLog]


3.2: Configuring “Get Server Names” – Execute SQL Task

Add an “Execute SQL Task” to the control flow. Double-click the task and configure properties in the “General” page of the task as follows:
  • Rename “Execute SQL Task” to “Get Server Names”.
  • Set “ResultSet” property to “Full result set”.
  • Set “Connection Type” property to “ADO.NET”.
  • Set “Connection” property to “ADO_ErrorLogMonitor”.
  • Set “BypassPrepare” to “False”.
  • Set “SQLStatement” to the following:
SELECT [ServerName] 
  + CASE ISNULL(CAST([Port] AS [varchar](10)), '')
   WHEN ''
    THEN ''
   ELSE ',' + CAST([Port] AS [varchar](10))
    END AS [ServerName]
FROM [dbo].[SSISServersList]
WHERE [Active] = 1
This returns one row per active server, appending the port when one is specified (for example, a row with ServerName SQLPROD01 and Port 1433 comes back as SQLPROD01,1433; the server name here is just an illustration).

Click ResultSet in the left pane and then click Add to add the "User::SQLServer_ResultSet" variable. Change the Result Name value to 0, and then click OK to save the configuration for the task.
Then in the SSIS package, connect the “Truncate ErrorLog table” task to the “Get Server Names” task.

3.3: Configuring – ForEach loop container

Drag and drop a ForEach loop container to the package control flow. Double-click the ForEach loop container, and then click Collection in the left pane. Configure the properties of this window as follows:
  • Change “Enumerator” property to “Foreach ADO Enumerator”.
  • Set “Enumeration Mode” to “Rows in the first table”.
  • Set “ADO object source variable” to the “User::SQLServer_ResultSet” package variable.
Now click “Variable Mappings” and map the “User::SQLServer_Connection” package variable to the collection value.
Then in the SSIS package, connect the “Get Server Names” task with the “ForEach loop” container.

3.4: Configuring “Get SQL Server ErrorLog” – Execute SQL Task

Add an “Execute SQL Task” inside the ForEach loop container. Double-click the task and configure properties in the “General” page of the task as follows:
  • Rename “Execute SQL Task” to “Get SQL Server ErrorLog”.
  • Set “ResultSet” property to “None”.
  • Set “Connection Type” property to “OLE DB”.
  • Set “Connection” property to “Dynamic_ErrorLogMonitor”.
  • Set “BypassPrepare” to “True”.
  • Set “SQLStatement” property with the error log collection script below:
SET NOCOUNT ON;
SET DATEFORMAT MDY; 
DECLARE @LoadID   [int]
    ,@FileNumber  [tinyint]
    ,@ErrorLogSize [bigint]
    ,@ExtractDate [datetime]
    ,@SQLServer  [nvarchar](128)
IF NOT EXISTS (SELECT * FROM [tempdb].[sys].[objects] 
    WHERE [name] = 'ERROR_LOG' AND [type] IN (N'U'))
BEGIN
 CREATE TABLE [tempdb].[dbo].[ERROR_LOG]
  ([Server]  [nvarchar](128) NULL
  ,[LogDate]  [datetime] NULL
  ,[ProcessInfo]  [varchar](50) NULL
  ,[Text]   [nvarchar](4000) NULL)
END
ELSE 
BEGIN
 TRUNCATE TABLE [tempdb].[dbo].[ERROR_LOG]
END
IF OBJECT_ID('tempdb..#ErrorLogs') IS NOT NULL
    DROP TABLE #ErrorLogs
CREATE TABLE #ErrorLogs 
 ([ArchiveNumber]  [int]
 ,[DateLastChanged]  [datetime] NOT NULL
 ,[LogFileSizeInBytes]  [bigint])
-- Reading the error logs files details
INSERT INTO #ErrorLogs 
EXEC [master]..[xp_enumerrorlogs]
-- Assigning values to variables
SELECT @FileNumber = [ArchiveNumber]
   ,@ErrorLogSize = [LogFileSizeInBytes]
FROM #ErrorLogs
WHERE [ArchiveNumber] = 0
-- Set extract date to one day ago
SET @ExtractDate = DATEADD(dd, -1, CURRENT_TIMESTAMP)
IF (@ErrorLogSize <= 536870912) -- 512 MB
BEGIN
 -- Loading the latest errorlog information inside table
 INSERT INTO [tempdb].[dbo].[ERROR_LOG]
      ([LogDate]
      ,[ProcessInfo]
      ,[Text])
 EXEC [sp_readerrorlog] 0
END 
ELSE 
BEGIN
 INSERT INTO [tempdb].[dbo].[ERROR_LOG]
   ([LogDate]
   ,[ProcessInfo]
   ,[Text])
 VALUES (CURRENT_TIMESTAMP
     ,'Server'
     ,'ErrorLog is too big to load.')
END
DELETE FROM [tempdb].[dbo].[ERROR_LOG]
WHERE [LogDate] < @ExtractDate
DELETE FROM [tempdb].[dbo].[ERROR_LOG]
WHERE [ProcessInfo] = 'Backup'
DELETE FROM [tempdb].[dbo].[ERROR_LOG]
WHERE [ProcessInfo] = 'Logon'
UPDATE [tempdb].[dbo].[ERROR_LOG]
SET  [Server] = CAST(SERVERPROPERTY('ServerName') AS [nvarchar](128))
SET NOCOUNT OFF;
GO


3.5: Configuring “Process ErrorLog” – Data Flow Task

Add a “Data Flow Task” inside the ForEach loop container and rename it to “Process ErrorLog”. Connect the “Get SQL Server ErrorLog” task to the “Data Flow Task”.
Now, double-click the “Data Flow Task”, and drag the “OLE DB Source” and “ADO NET Destination” to the “Data Flow Task”. Next, connect the “OLE DB Source” to the “ADO NET Destination”. Then double-click “OLE DB Source” and specify the settings as follows:
  • Set “OLE DB Connection Manager” to “Dynamic_ErrorLogMonitor”.
  • Change “Data access mode” to “SQL command”.
  • Specify the following query in “SQL command text”:
SELECT [Server]
      ,[LogDate]
      ,[ProcessInfo]
      ,[Text]
      ,CURRENT_TIMESTAMP AS [AuditDate]
FROM [tempdb].[dbo].[ERROR_LOG]
Once done, click OK to save the settings and return to Data Flow.
Finally, double-click “ADO NET Destination” and configure it as follows:
  • Set “Connection manager” to “ADO_ErrorLogMonitor”.
  • Select the “dbo.ErrorLog” table in our “ErrorLogMonitor” database as the destination for the data.
  • Then map the source columns to the destination columns.

Click OK to save settings.

Package Review

All done. Our package is fully configured and ready to execute. The control flow now consists of the “Truncate ErrorLog table” task, followed by the “Get Server Names” task, followed by the ForEach loop container that holds the “Get SQL Server ErrorLog” task and the “Process ErrorLog” data flow. The data flow itself simply moves rows from the OLE DB Source to the ADO NET Destination.

Testing

To test the package, simply add a few servers to the SSISServersList table and then execute the package using the play button in SQL Server Data Tools.
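For example (the server names below are hypothetical; substitute your own, and supply a port only when the instance listens on a non-default one):

INSERT INTO [dbo].[SSISServersList] ([ServerName], [IsTest], [Active], [Port])
VALUES (N'SQLPROD01', 0, 1, NULL)  -- production, default port
INSERT INTO [dbo].[SSISServersList] ([ServerName], [IsTest], [Active], [Port])
VALUES (N'SQLTEST01', 1, 1, 1433)  -- test, explicit port
GO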
Once the package has executed successfully, query the dbo.ErrorLog table inside our ErrorLogMonitor database to retrieve the consolidated error logs from all of your SQL Servers.
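A simple query along these lines will do:

SELECT [Server], [Date], [ProcessInfo], [Message], [AuditDate]
FROM [dbo].[ErrorLog]
ORDER BY [Server], [Date]
GO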

Next Steps

Thank you for reading. Please share your comments and questions; I'm always ready to help.







Using the Data Profiling Task in SQL Server Integration Services

Problem

Have you ever had to write a bunch of T-SQL to do some data analysis on the table data in your database? If you have, you'll know that this can become a fairly time-consuming and tedious task. SQL Server 2012 Integration Services has a feature called the Data Profiling task that can make this analysis much easier and faster (this feature is also available in SQL Server 2008). This task can help you find potential issues with your existing data, and can also help you become familiar with the data in a database that you have just started managing.

Solution

To use the Data Profiling task in SQL Server 2012 we first need to create a new Integration Services project in SQL Server Data Tools (formerly Business Intelligence Development Studio). Once you've created and opened the project, you'll see an empty package design tab.
Now that we have a package open, we need to add a connection manager for the database we want to run the profile against. To do this, right-click "Connection Managers" in the Solution Explorer and select "New Connection Manager". Then select "ADO.NET" and click "Add...", then click "New..." and fill in the connection dialog box. Once everything is filled in, I like to click "Test Connection" just to make sure it's configured correctly. After this succeeds, click "OK" to complete the setup.
Next we can add the Data Profiling task to the package by dragging it from the SSIS Toolbox onto the Control Flow panel.
We can start to configure the task by double-clicking it in the Control Flow panel. A dialog box opens to the General tab, where we can set up our profile output file connection. Click the Destination property, select "<New File connection...>", and complete the File Connection Manager dialog box with a path for the output file. Once complete, click "OK".
The General tab is now complete.
From this panel we have two different methods we can use to configure what we are going to profile. First, we can click "Quick Profile", which presents a form we can use to configure one or more tables to be profiled. For this example I am just going to select a single table and check all of the profiles to be computed; a description of each profile type can be found in the SQL Server documentation.
After clicking "OK" on the Quick Profile form we are taken to the Profile Requests panel, which is the other method of configuring this task and could have been selected when the editor first opened. All of the Profile Types we selected on the form are configured here. Under the Data subsection of the Request Properties section we can see that we also have the ability to profile one or more individual columns within the table. As an example, if we were looking for a candidate key we might exclude some columns from the profile that we know for sure would not be part of the key.
Let's add one more table/column by clicking on the next empty space in the "Profile Type" column. After selecting the profile type, we can fill in the "Request Properties" section by selecting a connection manager, a table/view, and columns; in this case we are just going to select a single column. After clicking "OK" to complete the configuration, we are taken back to the Control Flow tab of our package.
The configuration of our Data Profiling task is now complete and we can run it by selecting "Start Debugging" from the Debug menu or by using the F5 shortcut. Once it has completed successfully, select "Stop Debugging" from the Debug menu (or use the Shift+F5 shortcut) so you can view the output.

The output file is just a regular XML file, and Microsoft provides two options for launching the Data Profile Viewer, which displays this XML file in a nicely formatted way. The first is to double-click the Data Profiling task on the Control Flow tab, which opens the General tab as it did when you were configuring the task; from here we can click "Open Profile Viewer". Alternatively, we can start the Data Profile Viewer as a stand-alone application; it can be found under the Start Menu -> Microsoft SQL Server 2012 -> Integration Services -> Data Profile Viewer. Once there, simply open the output file that was created by the Data Profiling task. Either way, you will get a window displaying the contents of the XML file.

Next Steps