Viewing 15 posts - 2,401 through 2,415 (of 2,427 total)
It can be surprising how often a tape backup needs a second tape, but this tape never gets mounted. The backup then hangs until someone notices it, often...
December 23, 2003 at 5:13 am
Knowing the memory usage for a database is not particularly useful.
I have worked on DB2 for z/OS, where information of this type can be obtained. The really useful information...
December 16, 2003 at 6:56 am
If you have custom DTS tasks, the backup and restore will affect the package definition, but will of course not include the custom DLL files. Restoring an old MSDB...
December 16, 2003 at 6:17 am
Are you using a Job Output File (seen in the Advanced tab on the job step)? We had a problem where the proxy account could not open this file, even...
December 5, 2003 at 4:44 am
Why is the plan bad? Is it using parallelism? Have you tried turning this off?
Also, if you have set SQL memory to the maximum on the box, you...
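To make "turning this off" concrete, here is a hedged sketch of the two usual ways to limit parallelism in SQL Server; the table and column names are hypothetical:

```sql
-- Per-query: force a serial plan with a hint (dbo.Orders is hypothetical)
SELECT CustomerID, SUM(Amount)
FROM dbo.Orders
GROUP BY CustomerID
OPTION (MAXDOP 1);

-- Server-wide: cap the degree of parallelism
-- ('max degree of parallelism' is an advanced option, so
-- 'show advanced options' must be enabled first)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max degree of parallelism', 1;
RECONFIGURE;
```

The hint is usually preferable for diagnosing a single bad plan, since the server-wide setting affects every query on the box.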
December 4, 2003 at 7:30 am
Your problem may be related to adjacent-key locking.
When you do an insert or delete, SQL locks the adjacent key as well as the record you are processing. This is...
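A minimal sketch of how you might observe these key-range locks yourself; the table name is hypothetical, and sp_lock output varies by version:

```sql
-- Under SERIALIZABLE isolation, SQL Server takes key-range locks that
-- cover the adjacent key, not just the row being changed
SET TRANSACTION ISOLATION LEVEL SERIALIZABLE;
BEGIN TRAN;

DELETE FROM dbo.Orders WHERE OrderID = 42;  -- dbo.Orders is hypothetical

-- Inspect the locks held by this session; look for KEY locks
-- in range modes (e.g. RangeX-X)
EXEC sp_lock @@SPID;

ROLLBACK;
```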
October 24, 2003 at 4:30 am
My understanding of the theory is:
1) You need to read what BOL says about this
2) You need to be running W2K or above & SQL2K
3) You need to talk to...
October 24, 2003 at 4:11 am
The main problem with increasing transaction log backup frequency to cope with peak loads is that you then impose a high frequency of log backups even when load is low.
We backup...
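For reference, a scheduled log backup step is just a BACKUP LOG statement; the database name and path below are hypothetical:

```sql
-- Hypothetical database and backup path
BACKUP LOG MyDatabase
TO DISK = 'E:\Backups\MyDatabase_log.bak'
WITH NOINIT;  -- append, so one device can hold several log backups
```

Whatever frequency you choose, the statement itself is cheap; the cost is the number of files you must manage and restore in sequence.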
October 16, 2003 at 3:41 am
You can also give your user DDLADMIN authority in TempDB. This will allow a user with only Bulkadmin authority to BCP (etc) into a temporary table.
You then have the...
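A sketch of the grant described above, assuming a user named 'LoadUser' (hypothetical) who already has only bulkadmin rights:

```sql
-- Give the user access to tempdb and DDL rights there, so a
-- bulkadmin-only login can create and BCP into temporary tables
-- ('LoadUser' is a hypothetical user name)
USE tempdb;
EXEC sp_grantdbaccess 'LoadUser';
EXEC sp_addrolemember 'db_ddladmin', 'LoadUser';
```

Note that tempdb is rebuilt at every server restart, so a grant like this must be reapplied, for example from a startup stored procedure.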
October 10, 2003 at 2:50 am
Looks like Measure Up points are best.
Working on the principle in the documentation that SQL7 can receive replicated data from SQL2K, but SQL2K cannot receive replicated data from SQL7, then...
October 1, 2003 at 6:09 am
You almost certainly need the DB2 ODBC driver. It is included free on the DB2 install media. You will get better performance if you use a product called 'DB2...
September 19, 2003 at 3:08 am
A slightly OT comment on NTFS compression, at least on NT4...
If you copy a file larger than 2 GB over the network to an NTFS compressed folder, it is VERY...
September 18, 2003 at 2:20 am
I agree that while defrag is running, performance may be bad. This could be a reason not to use auto-defrag products. However, if you run the defrag in...
September 17, 2003 at 7:54 am
I may be missing something... why do you think that Windows defrag will be a problem with RAID arrays?
All information provided is a personal opinion that may not match reality.
September 17, 2003 at 6:55 am