July 27, 2006 at 5:14 pm
Hello all,
OK, I am having an issue with Insert Exec: it runs on and on and never ends, but if I run the Exec part by itself it completes in less than a second. Example below:
Create Table #tmpReport2(
JobID varchar(200),
JobServer varchar(200),
JobName varchar(200),
JobEnabled int,
JobDescription varchar(200),
JobStartStep int,
JobCategory varchar(200),
JobOwner varchar(200),
JobNotifyEventlog int,
JobNotifyEmail int,
JobNotifyNetsend int,
JobNotifyPage int,
JobEmailOperator varchar(200),
JobNetsendOperator varchar(200),
JobPageOperator varchar(200),
JobDeleteLevel int,
JobCreateDate varchar(30),
JobModifyDate varchar(30),
JobVersion int,
JobLastRunDate varchar(10),
JobLastRunTime varchar(10),
JobLastOutcome int,
JobNextRunDate varchar(10),
JobNextRunTime varchar(10),
JobNextRunID int,
JobCurrentStatus int,
JobCurrentStep varchar(20),
JobCurrentRetry int,
JobHasStep int,
JobHasSchedule int,
JobHasTarget int,
JobType int)
Insert #tmpReport2
exec OTGETL02.msdb..sp_help_job @job_id = '{EFA62BDF-16DD-4E5C-B4C5-B7953151A349}',
@job_aspect = N'job'
If you run just the Exec part:
exec OTGETL02.msdb..sp_help_job @job_id = '{EFA62BDF-16DD-4E5C-B4C5-B7953151A349}',
@job_aspect = N'job'
It runs quickly. I am very puzzled.
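For what it's worth, one workaround people suggest for Insert Exec against a linked server is to wrap the remote call in OPENQUERY, so it runs as a pass-through query instead of being promoted to a distributed (MSDTC) transaction. A minimal sketch, reusing the linked server and job_id from above (on some versions the pass-through string may also need a SET FMTONLY OFF prefix for metadata discovery to work):

```sql
-- Sketch: run sp_help_job as a pass-through query via OPENQUERY,
-- avoiding the distributed transaction that Insert Exec triggers.
-- OPENQUERY requires a string literal, so the parameters are embedded.
Insert #tmpReport2
Select *
From OPENQUERY(OTGETL02,
    'exec msdb.dbo.sp_help_job
        @job_id = ''{EFA62BDF-16DD-4E5C-B4C5-B7953151A349}'',
        @job_aspect = N''job''')
```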
Thank you
July 27, 2006 at 6:37 pm
July 28, 2006 at 9:05 am
Thanks, Sergiy, for the reply. Yes, it is a distributed transaction, but does it have to take forever to run? I am trying to gather the status of all jobs from every server we have here for a report I have to produce each day. Instead of going through each server and manually checking the status of the jobs, I was hoping to automate this process. We have about 40 SQL Server boxes.
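Since the goal is just job status for a daily report, another option worth considering is skipping sp_help_job entirely and querying the msdb job tables directly with a plain four-part-name SELECT, which involves no Insert Exec and no distributed transaction. A sketch, assuming the OTGETL02 linked server from the earlier post (column list trimmed for illustration):

```sql
-- Sketch: read job status straight from msdb system tables over the
-- linked server; a plain SELECT avoids the Insert Exec problem.
Select  j.name             as JobName,
        j.enabled          as JobEnabled,
        s.last_run_date    as JobLastRunDate,
        s.last_run_time    as JobLastRunTime,
        s.last_run_outcome as JobLastOutcome
From OTGETL02.msdb.dbo.sysjobs j
Join OTGETL02.msdb.dbo.sysjobservers s
  on s.job_id = j.job_id
```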
July 31, 2006 at 9:28 am
Anyone else have any ideas that may help me with this? It has to be possible, I would think...
Thanks
July 31, 2006 at 11:58 am
A little off-the-wall but I've run into similar problems with other things... the way I got around it was to run this type of thing on each server with an output to a text file in a common area. Then, I'd do a bulk insert for the files in the common area... nasty fast. A little extra work but well worth it.
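A rough sketch of that approach (all the file paths, server names, and the staging table here are made up for illustration):

```sql
-- Sketch of the export-then-bulk-load approach:
-- 1) On each server, a scheduled CmdExec job step exports job status,
--    e.g. with bcp to a shared folder:
--      bcp "select @@servername, name, enabled from msdb.dbo.sysjobs"
--          queryout \\fileserver\jobstatus\SERVER01.txt -c -t"|" -T
-- 2) On the reporting server, bulk load each file into a staging table:
Create Table #JobStatus(
    ServerName varchar(200),
    JobName    varchar(200),
    JobEnabled int)

Bulk Insert #JobStatus
From '\\fileserver\jobstatus\SERVER01.txt'
With (FieldTerminator = '|', RowTerminator = '\n')
```

Repeat the Bulk Insert once per file (or loop over the folder contents); since every load is local, none of it touches a linked server.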
--Jeff Moden
Change is inevitable... Change for the better is not.
July 31, 2006 at 12:00 pm
Thanks, Jeff. I guess I am going to have to go that route too. It is not like this on all servers, just a couple of them, and they are also linked. So who knows. Thanks again...
July 31, 2006 at 4:55 pm
If it's not slow on all the servers, then you may be having some router or hub problems. For example, if the port and card the server is on are set up to auto-negotiate the speed, it will always choose the worst case. If the duplex is incorrectly set (I forget if it should be full or half), you get big problems with speed.
Make a list of the "good" and "bad" servers and have your network guys take a peek...
--Jeff Moden
Change is inevitable... Change for the better is not.
July 31, 2006 at 5:00 pm
Thanks again. I did go and make a DTS package on each server to put the results into a single Access database for now, and that works rather quickly. But I think I will go ahead and see if the network team can look into this.
Kam