Importing SharePoint ULS and IIS Logs into SQL

By David Jansen

Overview

SharePoint ULS and IIS logs are valuable for diagnosing and resolving SharePoint issues, so copies of these logs should be moved off production servers on a regular basis to ensure they remain available in the event of a catastrophic failure.  Microsoft Support will ask for these log files if you require their assistance, so you should have a plan in place to save this data to a secure location.

Various tools exist to parse these log files, but one can use Microsoft's Log Parser command-line utility to write the log files to SQL Server tables.  From there the log data can easily be analyzed and used as the data source for Reporting Services reports.  Having the log data in a single queryable data structure offers a clear advantage over picking through thousands of log files when your site is down and the pressure is on to find the root issue.

Here is an example of a Log Parser command to import ULS log data with a date greater than 10/1/2009 into a SQL table named 'ULSAudit':

"c:\Program Files (x86)\Log Parser 2.2\logparser.exe" -i:tsv -iCodepage:-1 "SELECT Timestamp, Process, TID, Area, Category, EventID, Level, Message, Correlation INTO ULSAudit FROM C:\ulslogs\*.log WHERE date > ‘2009-10-01’” -o:SQL -oConnString: "Driver={SQL Server Native Client 10.0};server=localhost\SQLEXPRESS;Database=ULS;Trusted_Connection=yes;" -iCheckPoint:C:\ulslogs\myCheckPoint.lpc

The first parameter (-i) in the command above specifies the format of the input file.  ULS logs use the TSV (tab- or space-separated values) format.  Through trial and error I discovered that setting the -iCodepage parameter to -1 appears to be necessary to successfully execute a SELECT query against TSV files.  Following the first two parameters is the very T-SQL-like SELECT query.  The output parameter (-o) specifies SQL since we are writing to a SQL Server table, and the connection string (-oConnString) follows.  The connection string can use a Trusted Connection (Windows Authentication), or you may specify a SQL Server login and password.  The -iCheckPoint parameter specifies the name and location of the file that Log Parser will use to keep track of which records were previously imported.  This ensures that duplicate data is not imported on subsequent runs.
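If you prefer to create the destination table yourself (Log Parser's SQL output format can also create it for you if you add -createTable:ON), a minimal sketch of the ULSAudit table might look like the following.  The column types are assumptions and can be adjusted to suit your data:

CREATE TABLE [dbo].[ULSAudit] (
    [Timestamp]   NVARCHAR(50)  NULL,  -- ULS writes the timestamp as text; convert later if needed
    [Process]     NVARCHAR(255) NULL,
    [TID]         NVARCHAR(50)  NULL,
    [Area]        NVARCHAR(255) NULL,
    [Category]    NVARCHAR(255) NULL,
    [EventID]     NVARCHAR(50)  NULL,
    [Level]       NVARCHAR(50)  NULL,
    [Message]     NVARCHAR(MAX) NULL,
    [Correlation] NVARCHAR(255) NULL   -- SharePoint correlation ID (a GUID)
);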

Depending on how often your ULS logs are cycled, a batch job should be scheduled to capture log entries before the log file they are stored in is reused.  For example, if your installation of SharePoint is configured to use 20 ULS log files, each covering 10 minutes, the full set is overwritten after 200 minutes, so you would need to run Log Parser at least every 3 hours to be sure you are capturing all log entries.  Log entries can be written to a remote SQL Server or a local instance.  If you would like to minimize or eliminate the network traffic involved in writing to a remote server during business hours, you can write ULS log data to a local SQL Server Express instance and then upload the data to a central server on a nightly basis.  The impact of running an instance of SQL Server Express on your SharePoint front ends can be minimized by starting the SQL instance only when it is needed and stopping it when the ULS data import completes.  I also recommend setting the SQL Server minimum and maximum memory values to 512 MB, which is sufficient for this purpose.
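To set those memory limits on the local SQL Express instance, a one-time script along these lines will do (a minimal sketch; the 512 MB values simply match the recommendation above):

-- min/max server memory are advanced options, so expose them first
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'min server memory (MB)', 512;
EXEC sp_configure 'max server memory (MB)', 512;
RECONFIGURE;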

Here is a sample batch file that starts SQL Express, imports new ULS log entries into a SQL table, calls a SQL script to transfer the SQL data to the central server, and then shuts down SQL Express:

REM Start SQL service
net start "SQL Server (SQLEXPRESS)"

REM Wait 10 seconds to be sure SQL is up and running
CHOICE /C:x /T:10 /D x > NUL

REM Import ULS logs into SQL
"c:\Program Files (x86)\Log Parser 2.2\logparser.exe" -i:tsv -iCodepage:-1 "SELECT Timestamp, Process, TID, Area, Category, EventID, Level, Message, Correlation INTO ULSAudit FROM C:\ulslogs\EMP*.log WHERE date > ‘2009-10-01’”  -o:SQL -oConnString: "Driver={SQL Server Native Client 10.0};server=localhost\SQLEXPRESS;Database=ULS;Trusted_Connection=yes;" -iCheckPoint:C:\ulslogs\myCheckPoint.lpc

REM Transfer rows to central SQL server and truncate local table
sqlcmd -S localhost\SQLEXPRESS -i C:\ulslogs\sql.sql -o c:\ulslogs\Results.txt -e

REM Stop SQL service
net stop "SQL Server (SQLEXPRESS)"

Here is the SQL script referenced above:

USE [ULS]
-- Copy this front end's rows to its table on the central server (Central01 is a linked server)
INSERT INTO [Central01].[ULS].[dbo].[ULSAudit_Wfe-03] SELECT * FROM [ULSAudit]
-- Record that this front end checked in
INSERT INTO [Central01].[ULS].[dbo].[ULSUploadLog](WFE) VALUES('Wfe-03')
-- Clear the local staging table
TRUNCATE TABLE [ULSAudit]
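The four-part names above assume that a linked server named Central01 has already been created on the web front end.  A minimal sketch of that one-time setup, assuming Windows authentication and that the account running the upload has rights on the central server, might look like this:

-- Register the central SQL Server as a linked server named Central01
EXEC sp_addlinkedserver @server = N'Central01', @srvproduct = N'SQL Server';
-- Connect to the remote server using the caller's own Windows credentials
EXEC sp_addlinkedsrvlogin @rmtsrvname = N'Central01', @useself = N'True';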

Besides moving the rows to the central server, the script updates a table (ULSUploadLog) on the central server with a timestamp so that you can monitor for failed uploads.  A scheduled SQL script can query this table each night and fire off an e-mail alert if one or more servers fail to check in.
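As a rough sketch of that nightly check, assuming ULSUploadLog has a datetime column (called UploadTime here) that defaults to the time of the insert, and that Database Mail is configured with a profile named DBAProfile (both names are placeholders):

-- Alert if any front end has not uploaded ULS data in the last 24 hours
IF EXISTS (
    SELECT WFE
    FROM [ULS].[dbo].[ULSUploadLog]
    GROUP BY WFE
    HAVING MAX(UploadTime) < DATEADD(DAY, -1, GETDATE())
)
BEGIN
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'DBAProfile',          -- placeholder Database Mail profile
        @recipients   = 'dba@example.com',     -- replace with your alert address
        @subject      = 'ULS upload missing',
        @body         = 'One or more web front ends have not uploaded ULS log data in the last 24 hours.';
END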

In addition to ULS logs, Log Parser is equally adept at parsing IIS and HTTPERR log data and writing that data to SQL.  These types of logs are cycled much less frequently than ULS logs, so importing the data on a daily or weekly basis is an option.

Here is an example of a Log Parser command to import IIS logs:

"c:\Program Files (x86)\Log Parser 2.2\logparser.exe" -i:IISW3C "SELECT TO_STRING(date, 'yyyy-MM-dd'), TO_STRING(time, 'hh:mm:ss'), s-sitename, s-ip, cs-method, cs-uri-stem, cs-uri-query, s-port, cs-username, c-ip, cs(User-Agent), sc-status, sc-substatus, sc-win32-status INTO IISAudit FROM C:\WINDOWS\system32\LogFiles\ex*.log WHERE date > '2009-10-01'" -o:SQL -oConnString: "Driver={SQL Server Native Client 10.0};server=localhost\SQLEXPRESS;Database=ULS;Trusted_Connection=yes;" -iCheckPoint:C:\ulslogs\IISCheckPoint.lpc -Recurse:-1

The -Recurse parameter specifies the level of subfolder recursion performed by Log Parser.  A value of -1 specifies unlimited recursion, so in this case Log Parser will parse all files beginning with the letters 'ex' and having a suffix of .log in C:\WINDOWS\system32\LogFiles\ and all subfolders below it.  To disable recursion, set the -Recurse parameter to 0; to search to a depth of 5 folders, set it to 5.  Also worth noting is the use of the TO_STRING function in the SELECT query to convert the IIS timestamp to string representations of the date and time of the log entry.

And here is a sample command to import HTTPERR log data:

"c:\Program Files (x86)\Log Parser 2.2\logparser.exe" -i:HTTPERR "SELECT TO_STRING(date, 'yyyy-MM-dd'), TO_STRING(time, 'hh:mm:ss'), c-ip, c-port, s-ip, s-port, cs-version, cs-method, cs-uri, sc-status, s-siteid, s-reason, s-queuename INTO HTTPERRAudit FROM C:\WINDOWS\system32\LogFiles\HTTPERR\httper*.log WHERE date > '2009-10-01'" -o:SQL -oConnString: "Driver={SQL Server Native Client 10.0};server=localhost\SQLEXPRESS;Database=ULS;Trusted_Connection=yes;" -iCheckPoint:C:\ulslogs\HTTPERRCheckPoint.lpc

HTTPERR log data is differentiated from IIS log data by file name convention.  Be sure to use separate checkpoint files for each type of log file being parsed.
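Once the log data is in SQL Server, ordinary T-SQL takes over.  For example, pulling every ULS entry tied to the correlation ID a user reports from an error page might look like this (the GUID below is just a placeholder):

SELECT [Timestamp], [Process], [Area], [Category], [Level], [Message]
FROM [ULS].[dbo].[ULSAudit]
WHERE [Correlation] = 'b66db71a-3257-4470-adf9-5c01dc59ecb3'  -- placeholder correlation ID
ORDER BY [Timestamp];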

 
