Table Refresh

  • MSSQL 2000

    I'm not sure where to post this question, so please forgive me if I have posted this in the wrong place.

    I am trying to figure out what the best practice is for refreshing a table (or a handful of tables) from one server to another. For example, I move data from Development / Testing to Production (for deployment), and sometimes from Production back to the Development / Testing server for testing.

    Here's what I have thought of / tried:

    1) I thought of using backup / restore. However, sometimes I just need to refresh the data in a single table or a handful of tables, and I don't want to overwrite all the stored procedures, functions, etc.

    2) I have tried Red Gate's SQL Data Compare. I didn't really like it (going from production to development / testing) because it generates a lot of transaction log, and I was advised not to shrink the log file: http://www.sqlservercentral.com/articles/Administration/64582/

    3) I have tried the manual route:

    i) drop the constraints

    ii) truncate the table(s)

    iii) import by going to Start, All Programs, Microsoft SQL Server 2008, Import and Export Data (32-bit)

    Option 3 never got finished because I ran into so many errors. I'm not exactly sure how to run an import / export against a MSSQL 2000 database using SSMS 2008.
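
    For reference, steps i) and ii) were just T-SQL along these lines (the table and constraint names here are placeholders, not my real schema):

        -- Drop the foreign key(s) that reference the table being refreshed
        ALTER TABLE dbo.OrderDetails DROP CONSTRAINT FK_OrderDetails_Orders;

        -- TRUNCATE requires that no foreign keys reference the table
        TRUNCATE TABLE dbo.Orders;

        -- ... reload the data here (step iii) ...

        -- Re-create the constraint; WITH CHECK validates the reloaded rows
        ALTER TABLE dbo.OrderDetails WITH CHECK
            ADD CONSTRAINT FK_OrderDetails_Orders
            FOREIGN KEY (OrderID) REFERENCES dbo.Orders (OrderID);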

    Plus if I go to Start, All Programs, Microsoft SQL Server 2008, Integration Services, Execute Package Utility, everything in the new window is grayed out.

    So here I am, asking the experts: what is the best approach to refreshing data between different servers?

  • Is this SQL 2000 or 2008? You mention both in your post, so it's not clear.

    If you are refreshing even one table, you're going to generate transaction log entries; there's no way around this. Every change is logged. You can minimize the logging with something like the bulk-logged recovery model, but that may be more complex than you need.
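
    On SQL 2000, switching the recovery model looks like this (DevDB and the backup path are stand-ins for your own names):

        -- Bulk operations (BULK INSERT, bcp, SELECT INTO) are minimally
        -- logged under the bulk-logged recovery model
        ALTER DATABASE DevDB SET RECOVERY BULK_LOGGED;

        -- ... run the load ...

        -- Switch back and take a log backup to keep the log chain intact
        ALTER DATABASE DevDB SET RECOVERY FULL;
        BACKUP LOG DevDB TO DISK = 'D:\Backup\DevDB_log.bak';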

    Are you running the dev server in full or simple recovery mode? If it's in simple, you might just commit your batches in stages; Data Compare will let you do this. Whether you use Data Compare or SSIS, they're about the same in terms of what is happening and the logging. You can use smaller batches and run log backups (or checkpoints) to keep the log from growing too large.
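
    A rough sketch of the batching idea, assuming a linked server named PRODSRV and a hypothetical Customer table (SET ROWCOUNT is the usual way to batch DML on SQL 2000; DevDB is a placeholder too):

        SET ROWCOUNT 10000;              -- cap each INSERT at 10,000 rows

        DECLARE @rows int;
        SET @rows = 1;

        WHILE @rows > 0
        BEGIN
            INSERT INTO dbo.Customer (CustomerID, CustomerName)
            SELECT s.CustomerID, s.CustomerName
            FROM PRODSRV.ProdDB.dbo.Customer s
            WHERE NOT EXISTS (SELECT 1 FROM dbo.Customer d
                              WHERE d.CustomerID = s.CustomerID);

            SET @rows = @@ROWCOUNT;

            -- Full recovery: a log backup frees log space between batches.
            -- In simple recovery, a CHECKPOINT here is enough instead.
            BACKUP LOG DevDB TO DISK = 'D:\Backup\DevDB_log.bak';
        END

        SET ROWCOUNT 0;                  -- reset to unlimited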

    There isn't a big issue with shrinking the log file, but if you are using the space, there isn't much sense in shrinking it over and over. Shrink issues come into play more with data files than log files.

  • Thanks for the reply.

    It's an MSSQL 2000 database; my client is SSMS 2008. Sorry for the confusion.

    Dev is in full recovery mode.

    Edit: In Oracle, there's the Data Pump utility; if I remember correctly, I was able to load 1 million records into a table in about 30 minutes. I have been running a refresh in SQL Data Compare for an hour and a half for 100k changes, and it's only at 60%. I can't remember the hardware specifics for the Oracle box, but I believe both are Win2k3 32-bit with 4 GB of RAM.

    Is there a different / quicker way of doing a table refresh in MSSQL 2000?

  • You should examine why you might be having a slow insert.

    - Is the log growing now? If so, that might be the issue: new log space has to be zeroed out before data can be written to it. In that case, you should pre-size the log file (and the data file); there's a sketch of that after this list. Shrinking log or data files between refreshes might appear to give you space back, but if you use that space again on the next refresh, you haven't really recovered anything and you're forcing extra work. I don't think it's Data Compare, but I can't be sure. You might open a ticket with Red Gate and see if there is an issue.

    - SSIS can load 1 TB in 30 minutes, but that still depends on the log files and the storage subsystem being able to receive that much data. (http://msdn.microsoft.com/en-us/library/dd537533%28v=sql.100%29.aspx)

    - The bulk insert task can load data quickly, again depending on how you have things configured; the recovery model can affect this, as can the storage subsystem. (There's a BULK INSERT sketch after this list.)

    - Do you have lots of indexes on the table? In ETL situations, that can make a difference. You might be faster dropping all the indexes and re-creating them after the load (sketch below). Again, you'd have to test this, but it can help.
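
    On the pre-sizing point: grow the log once, up front, rather than letting it auto-grow in small increments during the load. Something like this, where DevDB and the logical file name DevDB_Log are assumptions (sp_helpfile shows the real logical names):

        ALTER DATABASE DevDB
        MODIFY FILE (NAME = DevDB_Log, SIZE = 4096MB);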
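
    And a minimal BULK INSERT sketch; the table, file path, and terminators are assumptions and would need to match however you export the data. TABLOCK (with bulk-logged or simple recovery) enables minimal logging, and BATCHSIZE commits in chunks so the log can be cleared between batches:

        BULK INSERT dbo.Customer
        FROM 'D:\Export\Customer.dat'
        WITH (
            TABLOCK,
            BATCHSIZE = 10000,
            FIELDTERMINATOR = '\t',
            ROWTERMINATOR = '\n'
        );

    The matching export on the source server could be character-mode bcp, e.g. "bcp ProdDB.dbo.Customer out D:\Export\Customer.dat -c -T -S PRODSRV".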
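
    The index trick, with hypothetical names again:

        -- Drop nonclustered indexes before the load...
        DROP INDEX Customer.IX_Customer_Name;

        -- ... bulk load here ...

        -- ...then rebuild them once, which can be cheaper than
        -- maintaining them row by row during the load
        CREATE NONCLUSTERED INDEX IX_Customer_Name
            ON dbo.Customer (CustomerName);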

    There are probably other tricks you can try. I'm not an ETL expert, especially at high loads and edge cases, but these are things I've seen/done that have helped in the past.

    Disclosure: I work for Red Gate
