Jeff Moden (11/25/2009)
November 25, 2009 at 10:19 am
In addition to Sabya's question, have you checked whether the job is running under the context of the SQL Agent service account and with admin rights?
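If it helps, a rough way to check (assuming you have rights on msdb; 'YourJobName' is just a placeholder) is to look at the job owner and the run-as account of each step:
-- Sketch: job owner plus the proxy each step runs under
-- A NULL proxy_id means the step runs under the SQL Agent service account
SELECT j.name AS job_name,
       SUSER_SNAME(j.owner_sid) AS job_owner,
       s.step_name,
       s.subsystem,
       s.proxy_id
FROM msdb.dbo.sysjobs j
JOIN msdb.dbo.sysjobsteps s ON s.job_id = j.job_id
WHERE j.name = 'YourJobName'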
November 25, 2009 at 10:16 am
Task Manager does not always show the correct memory usage of SQL Server.
The results show that SQL Server is using the 55-odd GB you have configured it to use.
Any clue...
November 25, 2009 at 10:10 am
Run this query and post the values it returns.
SELECT *
FROM master.dbo.sysperfinfo
WHERE counter_name LIKE '%Memory%'
  AND [object_name] LIKE '%SQLServer:Memory Manager%'
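If the full output is too noisy, narrowing it to the two key counters is usually enough (same view, just filtered; a sketch):
SELECT counter_name, cntr_value
FROM master.dbo.sysperfinfo
WHERE [object_name] LIKE '%SQLServer:Memory Manager%'
  AND counter_name IN ('Total Server Memory (KB)', 'Target Server Memory (KB)')
Total Server Memory is what SQL Server is currently using; Target Server Memory is what it would like to use.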
November 25, 2009 at 9:49 am
Is your problem solved? What do you mean by "already tried that one"?
Which one did you try already?
No clarity in either of your posts 😉
November 25, 2009 at 9:45 am
Look at Performance Monitor to see how much memory SQL Server is actually utilizing.
November 25, 2009 at 9:38 am
Tom West (11/25/2009)
All servers are Windows Server 2003...
November 25, 2009 at 9:21 am
Before someone else suggests it, let me mention that you can also use the EXCEPT clause.
SELECT town FROM table1
EXCEPT
SELECT town FROM table2
November 25, 2009 at 9:13 am
Can't help much without a description of the tables / column names.
But let me give it a try...
SELECT <column list> FROM table1 WHERE town NOT IN (SELECT DISTINCT town FROM table2)
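One caveat with NOT IN: if table2.town can contain NULLs, the query returns no rows at all. A NOT EXISTS version (a sketch using the same assumed table / column names) avoids that:
SELECT t1.* -- or your column list
FROM table1 t1
WHERE NOT EXISTS (SELECT 1 FROM table2 t2 WHERE t2.town = t1.town)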
November 25, 2009 at 9:00 am
dbit (11/25/2009)
thanks for your replies... and of course, with the restart, it should be during off-hours, correct?
That's obvious, and if it's a production server, inform the application team / owners...
November 25, 2009 at 8:12 am
Use a HAVING clause at the end.
HAVING [Purchase_%] > 50
I did not try it on your query since I don't have the table / column details.
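Just to show the shape of it with made-up names (a Sales table with Customer and Status columns; purely an assumption since I don't have yours), note that the aggregate expression has to be repeated in HAVING because the column alias can't be referenced there:
-- Hypothetical table / columns, only to illustrate GROUP BY ... HAVING
SELECT Customer,
       100.0 * SUM(CASE WHEN Status = 'Purchased' THEN 1 ELSE 0 END) / COUNT(*) AS [Purchase_%]
FROM Sales
GROUP BY Customer
HAVING 100.0 * SUM(CASE WHEN Status = 'Purchased' THEN 1 ELSE 0 END) / COUNT(*) > 50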
November 25, 2009 at 8:01 am
vikas bindra (11/25/2009)
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'ProfName',
    @recipients = '',
    @query = 'SELECT 1,1,1,1,1', -- specify your select query here
...
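For completeness, a typical call of that shape looks something like this (a sketch; the profile name, recipient address, and query are placeholders):
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'ProfName',            -- an existing Database Mail profile
    @recipients   = 'someone@example.com', -- placeholder address
    @subject      = 'Query results',
    @query        = 'SELECT 1,1,1,1,1',    -- specify your select query here
    @attach_query_result_as_file = 1       -- 1 = send the results as an attachment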
November 25, 2009 at 7:57 am
Thanks Jason, Lynn, Grant and GSquared for your welcoming replies.
As Grant advised, I am all set and ready for the ride.
Congrats, Jason, on joining the top 100.
November 25, 2009 at 7:13 am
The explanation is understood; still, capping each data file at 1 GB may result in far too many files.
SQL Server does not create a data file automatically. You need to do a custom...
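For reference, adding a data file is a manual step along these lines (a sketch; database name, logical name, and path are placeholders):
ALTER DATABASE YourDatabase
ADD FILE
(
    NAME = YourDatabase_Data2,                      -- logical file name (placeholder)
    FILENAME = 'D:\SQLData\YourDatabase_Data2.ndf', -- physical path (placeholder)
    SIZE = 1GB,
    FILEGROWTH = 256MB
)
TO FILEGROUP [PRIMARY]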
November 25, 2009 at 5:28 am