Insert data monthly into table from History Table

  • Hello Everyone,

    Problem: we are having a performance issue with our data. While working on these changes we created a new table that holds the month-end history of the data; when we query this table, the performance is what we need. So we decided to keep this table and insert data into it on a monthly basis. The table has three INT columns and one Date column. We will select only a few columns for the month and then insert them into the table on the last day of that month.

    We need help on how to maintain this data every month. What are the different methods we can use to insert data into this table on a monthly basis without affecting performance?

    Please help with this issue.

    Thank you

    Yatish

    I think you should opt for a batch process: move the data chunk-wise (say, 5,000 records on each execution), and you can schedule the process as a daily-running job.

    You can get the idea here:

    -- Build a control table of the keys to move
    SELECT IDENTITY(INT, 1, 1) AS RowID, table1.Table1PK
    INTO #BatchControl
    FROM table1
    INNER JOIN table3
        ON table1.col1 = table3.col1;

    DECLARE @Batchsize INT;
    SET @Batchsize = 10000;

    -- Loop while there are rows left in the control table
    WHILE EXISTS (SELECT 1 FROM #BatchControl)
    BEGIN
        -- Insert the next batch into the target table
        INSERT INTO table2 (col1, col2, col3)
        SELECT table1.col1, table1.col2, table3.col3
        FROM table1
        INNER JOIN table3
            ON table1.col1 = table3.col1
        INNER JOIN (SELECT TOP (@Batchsize) Table1PK
                    FROM #BatchControl
                    ORDER BY RowID) b
            ON b.Table1PK = table1.Table1PK;

        -- Remove the processed keys from the control table
        DELETE bc
        FROM #BatchControl bc
        INNER JOIN (SELECT TOP (@Batchsize) RowID
                    FROM #BatchControl
                    ORDER BY RowID) b
            ON b.RowID = bc.RowID;
    END

    -------Bhuvnesh----------
    I work only to learn Sql Server...though my company pays me for getting their stuff done;-)

    It kind of depends on how much data we're talking about. If you're speaking of millions and millions of rows each month, you should look into setting up partitioning for your data, probably by month. If you're talking about less data, something like the previous poster's approach could do what you need. If it's even less data, then simply moving it through either T-SQL or SSIS should do what you need. It's hard to be more specific without more specific requirements.
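
    For the millions-of-rows case, a minimal sketch of what monthly partitioning might look like (the table, columns, and boundary dates below are illustrative placeholders, not your actual schema):

    -- Illustrative monthly partitioning sketch; names and boundary dates are examples only
    CREATE PARTITION FUNCTION pfMonthly (DATE)
    AS RANGE RIGHT FOR VALUES ('2010-01-01', '2010-02-01', '2010-03-01');

    CREATE PARTITION SCHEME psMonthly
    AS PARTITION pfMonthly ALL TO ([PRIMARY]);

    CREATE TABLE dbo.MonthEndHistory
    (
        Col1         INT,
        Col2         INT,
        Col3         INT,
        MonthEndDate DATE
    ) ON psMonthly (MonthEndDate);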

    In addition to simply moving the data, you'll need to maintain your indexes. Large inserts can result in fragmentation, so you'll probably want to defragment the indexes after the inserts. Also, large amounts of change to the data can cause sampled statistics to go out of date, so you may need to update your statistics with a full scan (unless the defrag is done by an index rebuild). You'll also want to be careful about how you implement the migration, because you could run into problems with your transaction log if the transactions are overly large.
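
    For example, a rough sketch of that post-load maintenance (the table name is just a placeholder for your own object):

    -- Post-load maintenance sketch; dbo.MonthEndHistory is a placeholder name
    -- A rebuild defragments the indexes and refreshes their statistics with a full scan
    ALTER INDEX ALL ON dbo.MonthEndHistory REBUILD;

    -- Alternatively, reorganize and then update the statistics explicitly
    ALTER INDEX ALL ON dbo.MonthEndHistory REORGANIZE;
    UPDATE STATISTICS dbo.MonthEndHistory WITH FULLSCAN;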

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

  • Hello,

    We need to insert more than 60,000 rows every month. With the current number of users the data is 60,000 rows; it may increase in the future.

    thank you

    Yatish

    We need to insert more than 60,000 rows every month.

    This is not a heavy data migration; you can go with either the batch approach (which I posted above) or plain T-SQL. Do you need to do it on a daily/weekly basis, or once a month?

    -------Bhuvnesh----------
    I work only to learn Sql Server...though my company pays me for getting their stuff done;-)

  • Grant Fritchey (4/26/2010)


    In addition to simply moving the data, you'll need to maintain your indexes. Large inserts can result in fragmentation, so you'll probably want to defragment the indexes after the inserts. Also, large amounts of change to the data can cause sampled statistics to go out of date, so you may need to update your statistics with a full scan (unless the defrag is done by an index rebuild). You'll also want to be careful about how you implement the migration, because you could run into problems with your transaction log if the transactions are overly large.

    I missed this stuff; it's very important after any data migration.

    -------Bhuvnesh----------
    I work only to learn Sql Server...though my company pays me for getting their stuff done;-)

    We need this to be done on the last day of every month.

  • yatish.patil (4/27/2010)


    We need this to be done on the last day of every month.

    Do you have any triggers on the table? If not, then go for a simple INSERT statement (better to try it in a test environment first to check for any blocking). See the sketch below.
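
    Something along these lines could work (the table and column names are placeholders based on your description of three INT columns and one Date column):

    -- Simple monthly load sketch; table and column names are placeholders
    INSERT INTO dbo.MonthEndHistory (Col1, Col2, Col3, MonthEndDate)
    SELECT s.Col1, s.Col2, s.Col3, CONVERT(DATE, GETDATE())
    FROM dbo.SourceTable s
    WHERE s.SomeDate >= DATEADD(MONTH, DATEDIFF(MONTH, 0, GETDATE()), 0)      -- first day of this month
      AND s.SomeDate <  DATEADD(MONTH, DATEDIFF(MONTH, 0, GETDATE()) + 1, 0); -- first day of next month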

    -------Bhuvnesh----------
    I work only to learn Sql Server...though my company pays me for getting their stuff done;-)

  • yatish.patil (4/27/2010)


    We need this to be done on the last day of every month.

    It's only 60,000 rows. That's not a trivial data set, but it's not that large. You should be able to use any of the methods that have been outlined (although I would absolutely not recommend partitioning for such a small set). To make it run at the end of the month, just set a schedule within the SQL Agent.
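
    For instance, a sketch of an Agent job scheduled for the last day of each month (the job name, step command, and start time are placeholders; the frequency options are the standard "monthly relative" settings):

    -- Sketch of a SQL Agent job that runs on the last day of every month
    -- Job name, step command, and start time are placeholders
    USE msdb;

    EXEC dbo.sp_add_job @job_name = N'MonthEndHistoryLoad';

    EXEC dbo.sp_add_jobstep
        @job_name  = N'MonthEndHistoryLoad',
        @step_name = N'Load month-end rows',
        @subsystem = N'TSQL',
        @command   = N'EXEC dbo.usp_LoadMonthEndHistory;';   -- your load procedure

    EXEC dbo.sp_add_jobschedule
        @job_name               = N'MonthEndHistoryLoad',
        @name                   = N'Last day of month',
        @freq_type              = 32,      -- monthly, relative to freq_interval
        @freq_interval          = 8,       -- "day"
        @freq_relative_interval = 16,      -- "last"
        @freq_recurrence_factor = 1,       -- every month
        @active_start_time      = 233000;  -- 11:30 PM

    EXEC dbo.sp_add_jobserver @job_name = N'MonthEndHistoryLoad';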

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

    You can drop the indexes on this table, then insert the data and recreate the indexes. It will reduce the time spent on index updates during the insert.
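
    For example (the index and table names are placeholders):

    -- Sketch of dropping and recreating an index around the monthly load; names are placeholders
    DROP INDEX IX_MonthEndHistory_MonthEndDate ON dbo.MonthEndHistory;

    -- ... run the monthly INSERT here ...

    CREATE INDEX IX_MonthEndHistory_MonthEndDate
        ON dbo.MonthEndHistory (MonthEndDate);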

  • Nitin-900948 (4/28/2010)


    You can drop the indexes on this table, then insert the data and recreate the indexes. It will reduce the time spent on index updates during the insert.

    That's a recommended approach, but here I don't think it is required, as this is a "once a month" story. The approximate record count is 60,000; it won't create any performance issue during insertion.

    -------Bhuvnesh----------
    I work only to learn Sql Server...though my company pays me for getting their stuff done;-)

  • Bhuvnesh (4/28/2010)


    Nitin-900948 (4/28/2010)


    You can drop the indexes on this table, then insert the data and recreate the indexes. It will reduce the time spent on index updates during the insert.

    That's a recommended approach, but here I don't think it is required, as this is a "once a month" story. The approximate record count is 60,000; it won't create any performance issue during insertion.

    I agree. This is an area where testing to determine what works best in the situation will serve you well.

    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
    - Theodore Roosevelt

    Author of:
    SQL Server Execution Plans
    SQL Server Query Performance Tuning

    Thank you everyone for your help. I will try the suggestions given for my problem and will let you all know which solution helped me.

    Thank you again

    Yatish

    Here's another way to look at it.

    Let's say each row of data contains 24 bytes.

    60,000 * 24 / 1024 / 1024 ≈ 1.37 MB... maybe 3 MB if it's really heavily indexed (and even that is probably too much).

    I don't care how slow your server is, but if it can't handle that "much" data then you have a serious problem.

    May I suggest that maybe the slowness here is in querying the data from the source table rather than the insert itself?

    I've done a dozen projects where I had to move tens of thousands of documents on a push-button basis, and it always ran in less than 30 seconds (even with hundreds of thousands or a couple of million rows). The slower part was almost always querying the data from the source, so I don't think we're talking about the right problem here!

    My suggestion is that it will be better to keep the month-end tables in a denormalized form.

    I assume that you will be producing reports from this data or using it for display purposes only.

    Secondly, schedule a job for when there is minimal load on the database server. Drop the indexes on the month-end tables before insertion, insert the data, and recreate the indexes.

    ----------------------------------------------------------------------------------------------------------------------------------------------------------------------
    Sometimes, winning is not an issue but trying.
    You can check my BLOG here
