You can also delete the history with a maintenance plan.
There is a pre-defined task just for that, the History Cleanup Task.
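If you prefer T-SQL to a maintenance plan, the same cleanup can be scripted; a minimal sketch, assuming a 30-day retention window (the @oldest_date variable name is mine):
-- Roughly what the History Cleanup Task runs under the covers.
declare @oldest_date datetime
set @oldest_date = dateadd(day, -30, getdate())
exec msdb.dbo.sp_delete_backuphistory @oldest_date = @oldest_date
exec msdb.dbo.sp_purge_jobhistory @oldest_date = @oldest_date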
November 13, 2009 at 9:12 am
50 MB may be a little small. I usually use at least 100 MB, but for a database that I expect to grow fairly often, 500 MB or more...
November 11, 2009 at 11:56 am
Piotr.Rodak (11/10/2009)
...There is no point in filtering out rows that did not change from the update statement in my opinion, as updating all rows will not change rows that are...
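For reference, the kind of filter being debated looks roughly like this (table and column names are invented for illustration):
-- Hypothetical names; the WHERE clause is the filter in question.
update t set
    SomeColumn = s.SomeColumn
from
    dbo.TargetTable t
    inner join dbo.SourceTable s on s.ID = t.ID
where
    t.SomeColumn <> s.SomeColumn -- skip rows that would not change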
November 10, 2009 at 4:10 pm
You can use the following script to shrink the data file.
Shrink DB File by Increment to Target Free Space
http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=80355
After you shrink the data file, you should:
1. Defragment the indexes and update...
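If you only need a one-off shrink rather than the incremental approach in that script, the basic command is below; MyDB and the 1024 MB target are placeholders, and the target should leave some free space for normal growth.
use MyDB
dbcc shrinkfile (MyDB_Data, 1024) -- logical file name, target size in MB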
November 10, 2009 at 2:27 pm
Code-1029433 (11/10/2009)
Duh! That's what I get for overthinking things. Thanks. 😀
Glad to help. At least you didn't post back to complain that my solution was too much work....
November 10, 2009 at 12:45 pm
Create multiple monthly schedules for the job: one that runs on day 5, one that runs on day 12, one that runs on day 19, and one that runs on...
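Each of those schedules can also be attached in T-SQL with msdb.dbo.sp_add_jobschedule; a sketch for the day-5 schedule, where the job name and start time are assumptions:
-- freq_type 16 = monthly, freq_interval = day of the month,
-- freq_recurrence_factor 1 = every month.
exec msdb.dbo.sp_add_jobschedule
    @job_name = N'MyMonthlyJob',
    @name = N'Monthly - day 5',
    @freq_type = 16,
    @freq_interval = 5,
    @freq_recurrence_factor = 1,
    @active_start_time = 020000 -- 02:00:00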
November 10, 2009 at 12:22 pm
There may be jobs where knowing the answers to those questions would be important. I have encountered consulting companies that seem more interested in the “star quality” of their...
November 10, 2009 at 12:13 pm
There is no reason to run a shrink, and that is likely the cause of the large diff backups.
A better daily maintenance routine would be: 1. Update statistics, 2. Integrity check,...
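A minimal T-SQL version of the two steps named above, run in each user database:
exec sp_updatestats -- 1. update statistics in the current database
dbcc checkdb with no_infomsgs -- 2. integrity check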
November 10, 2009 at 9:54 am
Put simply, testing is part of development, so if you don’t unit test, the job is not done.
The ability to properly test is probably the most important skill for a...
November 10, 2009 at 9:30 am
The typical de-normalization process is this:
1. Create database design without knowing or attempting to follow normalization rules.
2. Claim that you de-normalized the design for performance.
November 6, 2009 at 9:00 am
This script gets the file information for every database on a server, and inserts it into a temp table that is queried multiple ways to give various levels of analysis of...
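The script itself is cut off in this preview, but the skeleton of that approach, using sys.master_files (the temp table and column aliases are my own), looks like:
-- Capture file info for every database into a temp table.
select
    db_name(database_id) as database_name,
    name as logical_name,
    physical_name,
    type_desc,
    size*8/1024 as mb_size
into #db_files
from master.sys.master_files
-- Then query it as many ways as you like, e.g. total size per database:
select database_name, sum(mb_size) as total_mb
from #db_files
group by database_name
order by total_mb desc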
November 5, 2009 at 12:42 pm
select
    database_id,
    CAST(name as varchar(66)) logical_name,
    sum(size*8/1024) mb_size
from
    master.sys.master_files
where
    physical_name NOT LIKE '%.ldf'
group by
    database_id,
    name
with rollup
having
    name is not null or
    database_id is null
order by
    case when database_id is null then 1 else 0 end,
    database_id,
    name
November 5, 2009 at 12:35 pm
Read this article to understand the subject:
Detecting and Ending Deadlocks
http://msdn.microsoft.com/en-us/library/ms178104.aspx
However, in many cases, changing the database to read_committed_snapshot will solve your deadlocking problems:
use master
alter database [MyDatabase] set allow_snapshot_isolation ...
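The snippet is cut off above; a complete version of both settings would look like the following (MyDatabase is a placeholder, and the read_committed_snapshot change requires no other active connections in the database):
use master
alter database [MyDatabase] set allow_snapshot_isolation on
alter database [MyDatabase] set read_committed_snapshot on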
November 5, 2009 at 8:03 am
You need to use decimal math instead of integer math.
select
Wrong= (76075/119027)*100 ,
Correct= (76075./119027.)*100.
Results:
Wrong Correct ...
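The reason: 76075 and 119027 are both integers, so 76075/119027 performs integer division and yields 0 before the multiplication ever happens. An equivalent fix with an explicit cast (the precision here is just an example):
select
    Wrong = (76075/119027)*100, -- integer division: 0
    Correct = cast(76075 as decimal(18,6)) / 119027 * 100 -- roughly 63.914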
November 4, 2009 at 3:03 pm
This works OK for me.
select
    a.*
from
    openrowset('SQLOLEDB','SERVER=(local);Trusted_Connection=yes;',
    '
    set fmtonly off;
    exec master.dbo.sp_who
    ') a
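Note that this form of OPENROWSET is an ad hoc distributed query, which is disabled by default on SQL Server 2005 and later; if it errors out, it likely needs to be enabled first:
exec sp_configure 'show advanced options', 1
reconfigure
exec sp_configure 'Ad Hoc Distributed Queries', 1
reconfigure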
November 4, 2009 at 2:27 pm