Loading Trace Files into a Table

  • Hello everyone,

I have a large number of trace files that I want to load into a table. I have used a statement like the one below in SQL 2000 without problems. Unfortunately, when I ran it in SQL 2005 it maxed out tempdb (over 20 GB). Is there a way to load a large amount of trace data into a SQL 2005 table without going through tempdb?

SELECT * INTO TraceData
FROM ::fn_trace_gettable('D:\TraceInformation.trc', default)

    Thanks,

    Keith

We use that all the time with no serious issues, but then I've never monitored tempdb while running it. Are you breaking up your trace files or creating one gigantic one? You should break up the files. It has a number of advantages: you can load the data before the trace is completed, clean up as you go to reduce disk space, and so on. The function is probably loading into tempdb prior to loading into the table, and if your file is that large...
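As a sketch of loading rollover files one at a time (the file names here are illustrative): passing 1 as the second argument tells fn_trace_gettable to read only the named file instead of following the whole rollover chain, which keeps each load small.

```sql
-- Create the table from the first rollover file, then append the rest.
-- The second argument (1) limits the read to just the named file.
SELECT * INTO TraceData
FROM fn_trace_gettable('D:\TraceInformation_1.trc', 1);

INSERT INTO TraceData
SELECT * FROM fn_trace_gettable('D:\TraceInformation_2.trc', 1);
```

You can repeat the INSERT per file (and delete each source file as you go) rather than pulling the entire trace through one statement.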

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

The trace is broken up into several files that add up to a few GB. I am hoping to find a way to avoid tempdb. I have plenty of log space in the database I am trying to load the files into, but I am limited on tempdb space.

    Thanks,

    Keith

OK, that is odd then. I use this function regularly and it doesn't bump tempdb up to 20 GB (I'd notice). Are there any triggers or anything like that on the table you're inserting the data into?


This is the definition of fn_trace_gettable:

    create function system_function_schema.fn_trace_gettable
    (@filename nvarchar(256),
     @numfiles int = -1)
    returns table as
    return select * from OpenRowset(TrcTable, @filename, @numfiles)

No mention of tempdb anywhere; the OPENROWSET call must be using it.

In my example the TraceData table is created when I run the SQL statement. If I create the TraceData table ahead of time, I get the error below.

Msg 2714, Level 16, State 6, Line 1
There is already an object named 'TraceData' in the database.

    Thanks,

    Keith
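For what it's worth, SELECT ... INTO always creates the target table, which is why a pre-created table raises Msg 2714. A sketch of the alternative, assuming the pre-created TraceData matches the function's column list:

```sql
-- Append to the existing table instead of recreating it.
INSERT INTO TraceData
SELECT * FROM fn_trace_gettable('D:\TraceInformation.trc', default);
```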

Hi,

If you run something like the below:

1. Turn on the XML plan output and run the query:

    set statistics xml on

    SELECT TOP 100 * INTO TraceData
    FROM ::fn_trace_gettable('D:\TraceInformation.trc', default)

    set statistics xml off

2. Click on the result and save it to a file with a .sqlplan extension somewhere on the local machine.

3. Close this session.

4. Drag the file you just saved over one of the open sessions in Management Studio.

5. In the execution plan you can see that the query uses a table spool and the data is stored in a temporary table.

I believe that this temporary table is built in tempdb and not in the database where you run the query.

    Cheers,

    R

  • Excellent.

But that doesn't explain why a couple-of-GB file fills up 20 GB of tempdb... I still think something else must have been going on.


  • ...

    I would do first a check on these files:

    select * from sys.traces

    Watch the maximum size of the files which is in MB I think and if everything is fine (which means not very big ) then it might be a different issue. I am not a dba but I assume it would be some setting issues otherwise.
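A narrowed-down version of that check, using columns from the sys.traces catalog view (max_size is the file size limit in MB; NULL means no limit was set):

```sql
-- List active traces with their file paths and size limits.
SELECT id, path, max_size, max_files, is_rollover, is_default
FROM sys.traces;
```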

    R
