October 21, 2005 at 1:18 pm
Hello,
We have an issue with our application. It is a COM process that
converts data into SQL, validates it, and loads it into a good-sized database (about 250 tables, of which we actively use about 50%). When we insert a large payload of raw data (about 400K), the log files grow to about 5 gigabytes and we also see significant performance degradation (possibly due to the memory being allocated to the logs).
The database is about 60MB in size and the transaction log file is initially about 100MB.
Can anyone suggest anything, or refer me to known issues with SQL Server 2000 (SP2 with the MDAC 2.8 hotfix) related to this problem?
Our settings on the database are as follows:
Full Recovery model
Auto-update statistics
Auto-create statistics
Torn page detection
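For reference, the current log usage can be checked with the standard commands below (OurAppDB is just a stand-in for our actual database name):

-- Reports log file size and percent of the log in use for every database
DBCC SQLPERF(LOGSPACE)

-- Lists the logical file names and current sizes for our database
USE OurAppDB
EXEC sp_helpfile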
October 21, 2005 at 1:24 pm
Do you do any backups? If so, shrink the log file using DBCC SHRINKFILE,
or
after the COM process finishes, shrink the log file manually, either through Enterprise Manager or with DBCC SHRINKFILE.
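Something along these lines (a rough sketch; OurAppDB and OurAppDB_Log are placeholder names, and the backup path is only an example):

-- Back up the log first so the inactive portion can be reused
BACKUP LOG OurAppDB TO DISK = 'D:\Backups\OurAppDB_log.trn'

-- Then shrink the log file back down to roughly 100 MB
DBCC SHRINKFILE (OurAppDB_Log, 100)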
Hope this helps
thanks
Deepak
October 21, 2005 at 1:37 pm
Hi, and thanks for the quick reply. We do take backups and we have used DBCC SHRINKFILE. In case it is useful information, the issue only started occurring after our first big data load, and it has continued ever since.
I have noted your reply and will see if we can retrofit it into our code, though that may be problematic as well. In the interim, if you or anyone else on the board has other comments or notes, please feel free to share them.
October 21, 2005 at 1:44 pm
Is this a bulk insert?
If it is, you can get minimal transaction logging by changing the database recovery model to bulk-logged.
Once you are done with the COM process, change the option back to what you had.
You will trade off certain advantages (such as point-in-time recovery for that period) by switching to bulk-logged.
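A rough sketch of that switch (OurAppDB is a placeholder name; run your load between the two ALTER statements):

-- Minimally log bulk operations during the big load
ALTER DATABASE OurAppDB SET RECOVERY BULK_LOGGED

-- ... run the COM process / bulk insert here ...

-- Switch back to full recovery and take a log backup so log backups continue cleanly
ALTER DATABASE OurAppDB SET RECOVERY FULL
BACKUP LOG OurAppDB TO DISK = 'D:\Backups\OurAppDB_log.trn'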