Exporting SharePoint usage log files into a database using LogParser

So basically we have lots of SharePoint usage log files generated by our SharePoint 2007 site and we would like to make sense of them. For that we're thinking of reading the log files and dumping them into a database with the appropriate columns. I was going to build an SSIS package to read all the text files and extract the data when I came across LogParser. Is there a way to use LogParser to dump data into a SQL Server database, or is the SSIS way better? Or is there another, better way to use the SharePoint usage logs?

Lucan answered 8/10, 2008 at 6:24 Comment(1)
I've edited my question. Basically SharePoint usage logs are different from IIS log files. – Lucan

This is the script we use to load IIS log files into a SQL Server database:

LogParser "SELECT * INTO <TABLENAME> FROM <LogFileName>" -o:SQL -server:<servername> -database:<databasename> -driver:"SQL Server" -username:sa -password:xxxxx -createTable:ON

The <TABLENAME>, <LogFileName>, <servername>, <databasename> and sa password need to be changed to match your environment.
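
If the files are standard IIS W3C logs, naming the input format explicitly saves LogParser from guessing. A filled-in sketch (the table, database and file names here are made up, so adjust them):

LogParser "SELECT * INTO IISLogRaw FROM ex*.log" -i:IISW3C -o:SQL -server:localhost -database:WebLogs -driver:"SQL Server" -username:sa -password:xxxxx -createTable:ON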

In my experience LogParser works really well for loading data from IIS logs into SQL Server, so a mixed approach is best:

  • Load the raw data from the IIS logs into SQL Server using LogParser
  • Use SSIS to extract and transform the data from the temporary table holding the raw rows into the final table you'll use for reporting (a minimal sketch of this step follows the list).
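
For the second bullet, the staging-to-reporting move can be a plain INSERT...SELECT, whether you run it from an SSIS Execute SQL task or by hand. A minimal T-SQL sketch, assuming LogParser created the staging table IISLogRaw with the usual W3C column names and that the reporting table is called IISLogReport (both names are hypothetical):

-- Copy the fields needed for reporting out of the raw staging table
INSERT INTO dbo.IISLogReport (RequestDate, ClientIP, Uri, HttpStatus)
SELECT [date], [c-ip], [cs-uri-stem], [sc-status]
FROM dbo.IISLogRaw;

-- Clear the staging table for the next load
TRUNCATE TABLE dbo.IISLogRaw;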
Maya answered 8/10, 2008 at 8:05 Comment(0)

You'll have to write a custom input-format plugin for LogParser (a COM class implementing ILogParserInputContext). Here is what I did:

using System;
using System.IO;
using System.Runtime.InteropServices;
using System.Text;

[Guid("1CC338B9-4F5F-4bf2-86AE-55C865CF7159")]
[ComVisible(true)] // LogParser instantiates the plugin through COM, so the class must be COM-visible
public class SPUsageLogParserPlugin : ILogParserInputContext
{
    private FileStream stream = null;
    private BinaryReader br = null;
    private object[] currentEntry = null;
    public SPUsageLogParserPlugin() { }

    #region LogParser

    // Fixed-size headers in the binary usage log: one per file, one per entry
    protected const int GENERAL_HEADER_LENGTH = 300;
    protected const int ENTRY_HEADER_LENGTH = 50;
    protected string[] columns = {"TimeStamp",
                                  "SiteGUID",
                                  "SiteUrl",
                                  "WebUrl",
                                  "Document",
                                  "User",
                                  "QueryString",
                                  "Referral",
                                  "UserAgent",
                                  "Command"};

    // Reads a null-terminated string from the binary log
    protected string ReadString(BinaryReader br)
    {
        StringBuilder buffer = new StringBuilder();
        char c = br.ReadChar();
        while (c != 0) {
            buffer.Append(c);
            c = br.ReadChar();
        }
        return buffer.ToString();
    }

    #endregion

    #region ILogParserInputContext Members

    enum FieldType
    {
        Integer = 1,
        Real = 2,
        String = 3,
        Timestamp = 4
    }

    public void OpenInput(string from)
    {
        stream = File.OpenRead(from);
        br = new BinaryReader(stream);
        br.ReadBytes(GENERAL_HEADER_LENGTH); // skip the file-level header
    }

    public int GetFieldCount()
    {
        return columns.Length;
    }

    public string GetFieldName(int index)
    {
        return columns[index];
    }

    public int GetFieldType(int index)
    {
        if (index == 0) {
            // TimeStamp
            return (int)FieldType.Timestamp;
        } else {
            // Other fields
            return (int)FieldType.String;
        }
    }

    public bool ReadRecord()
    {
        if (stream.Position < stream.Length) {
            br.ReadBytes(ENTRY_HEADER_LENGTH); // Entry Header

            string webappguid = ReadString(br);

            // The log stores the time of day only; the date comes from the daily log folder
            DateTime timestamp = DateTime.ParseExact(ReadString(br), "HH:mm:ss", null);
            string siteUrl = ReadString(br);
            string webUrl = ReadString(br);
            string document = ReadString(br);
            string user = ReadString(br);
            string query = ReadString(br);
            string referral = ReadString(br);
            string userAgent = ReadString(br);
            string guid = ReadString(br); // read to advance the stream; not one of the output columns
            string command = ReadString(br);

            currentEntry = new object[] { timestamp, webappguid, siteUrl, webUrl, document, user, query, referral, userAgent, command };
            return true;
        } else {
            currentEntry = new object[] { };
            return false;
        }
    }

    public object GetValue(int index)
    {
        return currentEntry[index];
    }

    public void CloseInput(bool abort)
    {
        br.Close();
        stream.Dispose();
        stream = null;
        br = null;
    }

    #endregion
}
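
For completeness: a COM input plugin like this has to be compiled into a class library and registered with regasm before LogParser can use it, after which it is selected with the COM input format. A sketch, assuming the default ProgID (the bare class name, since there is no namespace) and made-up file, server and table names:

regasm /codebase SPUsageLogParserPlugin.dll
LogParser "SELECT * INTO SPUsageLog FROM 00.log" -i:COM -iProgID:SPUsageLogParserPlugin -o:SQL -server:localhost -database:SharePointUsage -driver:"SQL Server" -createTable:ON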
Ophelia answered 8/10, 2008 at 11:58 Comment(0)

If you want more in-depth reporting and have the cash and computing power, you could look at Nintex Reporting. I've seen a demo of it and it's very thorough; however, it needs to run continuously on your system. Looks cool though.

Donnelldonnelly answered 16/10, 2008 at 8:05 Comment(0)

This is the blog post I used to get all the info needed; it is not necessary to go to the length of writing custom code.

In brief, the create-table script:

CREATE TABLE [dbo].[STSlog](
 [application] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [date] [datetime] NULL,
 [time] [datetime] NULL,
 [username] [varchar](255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [computername] [varchar](255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [method] [varchar](16) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [siteURL] [varchar](2048) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [webURL] [varchar](2048) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [docName] [varchar](2048) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [bytes] [int] NULL,
 [queryString] [varchar](2048) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [userAgent] [varchar](255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [referer] [varchar](2048) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
 [bitFlags] [smallint] NULL,
 [status] [smallint] NULL,
 [siteGuid] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL
) ON [PRIMARY]

Two calls load the data for a single day's log file: STSLogParser.exe converts the binary usage log to CSV, and LogParser then loads the CSV into the STSlog table:

"C:\projects\STSLogParser\STSLogParser.exe" 2005-01-01 "c:\projects\STSlog\2005-01-01\00.log"  c:\projects\logparsertmp\stslog.csv
"C:\Program Files\Log Parser 2.2\logparser.exe" "SELECT 'SharePointPortal' as application, TO_DATE(TO_UTCTIME(TO_TIMESTAMP(TO_TIMESTAMP(date, 'yyyy-MM-dd'), TO_TIMESTAMP(time, 'hh:mm:ss')))) AS date, TO_TIME( TO_UTCTIME( TO_TIMESTAMP(TO_TIMESTAMP(date, 'yyyy-MM-dd'), TO_TIMESTAMP(time, 'hh:mm:ss')))), UserName as username, 'SERVERNAME' as computername, 'GET' as method, SiteURL as siteURL, WebURL as webURL, DocName as docName, cBytes as bytes,  QueryString as queryString, UserAgent as userAgent, RefURL as referer, TO_INT(bitFlags) as bitFlags, TO_INT(HttpStatus) as status, TO_STRING(SiteGuid) as siteGuid INTO STSlog FROM c:\projects\logparsertmp\stslog.csv WHERE (username IS NOT NULL) AND (TO_LOWERCASE(username) NOT IN (domain\serviceaccount))" -i:CSV -headerRow:ON -o:SQL -server:localhost -database:SharePoint_SA_IN -clearTable:ON
Namedropper answered 21/10, 2008 at 2:19 Comment(0)

Sorry, I found out that SharePoint usage logs are not the same as IIS logs; the format is different. How can we parse them?

Lucan answered 8/10, 2008 at 8:27 Comment(0)
