SQLServerCentral Forums

"primary key" in Table Variable
Posted Thursday, May 3, 2007 3:45 PM
Grasshopper
Forget my original question...
Make believe that this has nothing to do with cursors, stored procs or anything else. I just want to know if anyone has ever seen "Primary Key" assigned to a column in a table variable crash a stored proc.

That's it. That's the entire question.

I don't mean to sound ungrateful, but many people have posted answers to a question I am not asking. I am not attempting to speed up a stored proc, and I am not attempting to alter my code. The code I have works fine. I am just asking whether anyone else has ever seen this behavior in SQL 2000.

Thanks
Post #363159
Posted Thursday, May 3, 2007 3:50 PM
SSC-Insane
Never seen this.
Post #363161
Posted Thursday, May 3, 2007 3:57 PM
Hall of Fame

No.

I was just curious whether it was version, service pack, or hotfix related, since it only occurred on 1 of 3 SQL Servers - hence the request for version information, looking for a difference.





Regards
Rudy Komacsar
Senior Database Administrator

"Ave Caesar! - Morituri te salutamus."
Post #363164
Posted Thursday, May 3, 2007 6:32 PM
SSCrazy

Everyone is trying to help, it is just difficult to give a real answer without being able to reproduce the issue.  If you have lost the source code for this, I think you are basically finished in your quest to find a reason.

As far as everyone trying to help you with optimization and your process itself, think of it like this.  If you took your car in to get your brakes replaced and the mechanic did not tell you that you had a bad tie rod end, wouldn't you be pissed when your steering failed a couple of days later?

Please don't get frustrated and fault people for trying to help.

Post #363192
Posted Wednesday, September 26, 2007 6:56 AM
Valued Member
A late response to the original post:

Assigning a primary key to a table variable can be beneficial or harmful; depending on how it is used, the results may vary.



I have previous experience where a primary key on a table variable column was beneficial. I used it for counting orders activated/assigned/created/completed/cancelled between time intervals.



Brief Summary:



I used a "Tally" or "Numbers" table variable (from a SQL cookbook). I also created a "Temp" table variable to store the calculated time intervals, for example every 5 minutes for the last 7 days, and updated the "Temp" table variable with data gathered from our "Orders" table. In my case, I created an index on the "Temp" table variable by declaring a primary key on "StartTime" and "EndTime". For a result set of 2,880 rows, the query took less than 1 second. For testing purposes, I removed the primary key declarations and initially had to cancel the query after 5 seconds; without the primary key, the code took roughly 7-10 seconds to return the 2,880 rows.





My Sample Code:

DECLARE @EndDate DATETIME
DECLARE @NumDaysAgo INT
DECLARE @MinsApart INT

SET @EndDate = '09/01/2007'
SET @NumDaysAgo = 10
SET @MinsApart = 5

DECLARE @Temp TABLE
(
    ID INT IDENTITY,
    StartTime DATETIME,
    EndTime DATETIME,
    OrdersCreated INT,
    OrdersActivated INT,
    OrdersAssigned INT,
    OrdersCancelled INT,
    OrdersCompleted INT,
    PRIMARY KEY (StartTime ASC, EndTime ASC)
)

DECLARE @Numbers TABLE
(
    ID INT PRIMARY KEY
)

DECLARE @Helper TABLE
(
    ID INT
)

-- Seed @Helper with the digits 1 through 10
INSERT INTO @Helper
SELECT ID FROM
(         (SELECT 1 AS ID)
    UNION (SELECT 2 AS ID)
    UNION (SELECT 3 AS ID)
    UNION (SELECT 4 AS ID)
    UNION (SELECT 5 AS ID)
    UNION (SELECT 6 AS ID)
    UNION (SELECT 7 AS ID)
    UNION (SELECT 8 AS ID)
    UNION (SELECT 9 AS ID)
    UNION (SELECT 10 AS ID)
) AS A

-- Cross join @Helper with itself to build a Numbers table (1 to 10,000)
INSERT INTO @Numbers
SELECT (1000*(Thousands.ID-1)) + (100*(Hundreds.ID-1)) + (10*(Tens.ID-1)) + (Ones.ID)
FROM @Helper Thousands, @Helper Hundreds, @Helper Tens, @Helper Ones
ORDER BY Thousands.ID ASC, Hundreds.ID ASC, Tens.ID ASC, Ones.ID

-- Implicit-conversion trick: moves @EndDate back (@NumDaysAgo - 1) days
SET @EndDate = DATEDIFF(d, @NumDaysAgo-1, @EndDate)

INSERT INTO @Temp (StartTime, EndTime)
SELECT DATEADD(mi, @MinsApart*(n.ID-1), @EndDate), DATEADD(mi, @MinsApart*n.ID, @EndDate)
FROM @Numbers n
WHERE n.ID < CONVERT(INT, (@NumDaysAgo*24.0*60/@MinsApart) + 1)

-- Note: for the purpose of this posting, the "modified" code below is not the original.
-- The original contains 4 individual/distinct update sets for each type of count.

UPDATE t
SET OrdersCreated = Helper.HelpCount
FROM @Temp t
INNER JOIN
(
    SELECT t2.ID, COUNT(o.OrderID) AS HelpCount
    FROM @Temp t2, dbo.[Order] o
    WHERE o.Created BETWEEN t2.StartTime AND t2.EndTime
    GROUP BY t2.ID
) AS Helper
ON t.ID = Helper.ID

-- ...
-- Update other counts
-- ...

SELECT StartTime, EndTime
    , ISNULL(OrdersCreated, 0)   AS Created
    , ISNULL(OrdersActivated, 0) AS Activated
    , ISNULL(OrdersAssigned, 0)  AS Assigned
    , ISNULL(OrdersCancelled, 0) AS Cancelled
    , ISNULL(OrdersCompleted, 0) AS Completed
FROM @Temp

In the case of the original post, since each row from the table variable is being processed one at a time, the primary key will not be beneficial in improving performance.



However, I would suggest checking the size of tempdb in the Production environment to see whether enough space was allocated at the time the primary key was present in the table variable. This is just an assumption, but tempdb may not have been configured with the appropriate space allocation. Production environments usually have a larger tempdb, primarily because there is more activity than in Development and/or Beta environments.



Regarding the performance of the stored procedure, you may want to rework the WHILE loop into a set-based approach, which would be beneficial.
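The kind of rewrite suggested above can be sketched roughly as follows. Since the original procedure was never posted, the loop body here is hypothetical; the point is only the shape of the change, from a row-by-row WHILE loop to a single set-based statement:

```sql
-- Hypothetical sketch (the original procedure was not posted).
-- Row-by-row: walk a table variable one ID at a time...
DECLARE @i INT, @Max INT, @Total INT
SET @Total = 0
SELECT @i = 1, @Max = MAX(ID) FROM @Temp

WHILE @i <= @Max
BEGIN
    SELECT @Total = @Total + ISNULL(OrdersCreated, 0)
    FROM @Temp
    WHERE ID = @i
    SET @i = @i + 1
END

-- ...versus set-based: one statement, one pass over the data.
SELECT @Total = SUM(ISNULL(OrdersCreated, 0))
FROM @Temp
```

The set-based form lets the optimiser pick a single plan for the whole operation instead of compiling and seeking once per row.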





Regards,
Wameng Vang


MCTS
Post #402989
Posted Saturday, September 29, 2007 9:39 PM
SSC-Dedicated
Paul (5/3/2007)
Forget my original question...
Make believe that this has nothing to do with cursors, stored procs or anything else. I just want to know if anyone has ever seen "Primary Key" assigned to a column in a table variable crash a stored proc.

That's it. That's the entire question.

I don't mean to sound ungrateful, but many people have posted answers to a question I am not asking. I am not attempting to speed up a stored proc, and I am not attempting to alter my code. The code I have works fine. I am just asking whether anyone else has ever seen this behavior in SQL 2000.

Thanks

Yes, I've seen it... many times... it usually happens when someone updates the primary key column of a table using a join with that same table and some parallelism takes place. It usually only occurs on production boxes because they are much more likely to spawn the right kind of parallelism.

And that's why we wanted you to post the code... so we can check.
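For reference, the failure pattern described above looks roughly like this. This is a hypothetical sketch only (the table and column names are invented, since the original poster's code was never shown), not a statement that this exact code crashes:

```sql
-- Hypothetical sketch of the reported pattern: updating the primary key
-- column of a table variable through a join with that same table.
-- Per the report above, on SQL Server 2000 this shape of statement,
-- combined with a parallel plan, has been seen to fail.
DECLARE @t TABLE (ID INT PRIMARY KEY, ParentID INT, Val INT)

UPDATE a
SET a.ID = b.ParentID        -- updating the PK column...
FROM @t a
INNER JOIN @t b              -- ...via a join with the same table
    ON a.ParentID = b.ID
```

Whether a parallel plan is chosen depends on hardware and load, which is consistent with the observation that the problem tends to show up only on production boxes.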


--Jeff Moden
RBAR is pronounced "ree-bar" and is a "Modenism" for "Row-By-Agonizing-Row".

First step towards the paradigm shift of writing Set Based code:
Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column."

(play on words) "Just because you CAN do something in T-SQL, doesn't mean you SHOULDN'T." --22 Aug 2013

Helpful Links:
How to post code problems
How to post performance problems
Post #404572
Posted Monday, November 10, 2008 4:48 AM
Forum Newbie
Hi All,

I also ran into the exact same problem (SQL Server 2000).

I have a report which is timing out in all the environments (Offshore, Dev and QA).
The ASP report consists of a query which is using a table variable.

If I run the report WITH a primary key on the table variable, it takes 5 minutes.
In the execution plan, the cost of populating the table variable is 16%.
In the main query, the cost of the clustered index scan on the table variable is 38% (even though the table variable has only one row).


If I run the query WITHOUT a primary key on the table variable, it takes just 3 seconds.
In the execution plan, the cost of populating the table variable is 0.1%.
In the main query, the cost of the table scan on the table variable is 0%.

This behavior is consistent across the Dev, QA, and Prod environments. It is very strange!

Satya

Post #599750
Posted Monday, November 10, 2008 6:05 AM
SSC-Dedicated
That sounds more like problems with a table variable. Why aren't you using a temp table, instead?

--Jeff Moden
Post #599796
Posted Monday, November 10, 2008 6:09 AM
SSC-Forever
Satya Rao T (11/10/2008)

In the execution plan, the cost of table variable population is 0.1%
In the main query, cost of table scan on table variable is 0%


The cost of operations against table variables cannot be considered accurate. Accurate costing requires that the optimiser knows how many rows will be affected and the optimiser cannot know that with table variables because they have no statistics.
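A quick way to see this is to compare estimated versus actual row counts in the execution plans. This is a sketch under stated assumptions: it relies on `master.dbo.spt_values` merely as a convenient row source, and the specific estimates are typical behaviour, not guaranteed numbers:

```sql
-- Table variables carry no statistics, so the optimiser typically
-- assumes very few rows (often 1) regardless of the actual count.
DECLARE @tv TABLE (ID INT PRIMARY KEY)
INSERT INTO @tv
SELECT number FROM master.dbo.spt_values
WHERE type = 'P' AND number BETWEEN 1 AND 200

SELECT * FROM @tv    -- check the plan: estimated rows far below 200

-- A temp table gets real statistics, so the estimate tracks reality.
CREATE TABLE #tt (ID INT PRIMARY KEY)
INSERT INTO #tt
SELECT number FROM master.dbo.spt_values
WHERE type = 'P' AND number BETWEEN 1 AND 200

SELECT * FROM #tt    -- check the plan: estimated rows close to 200

DROP TABLE #tt
```

Since plan operator costs are derived from those row estimates, the percentage costs reported against table variables in the earlier posts cannot be trusted.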



Gail Shaw
Microsoft Certified Master: SQL Server 2008, MVP
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

We walk in the dark places no others will enter
We stand on the bridge and no one may pass

Post #599799
Posted Monday, November 10, 2008 9:25 PM
SSC-Dedicated
Jeff Moden (11/10/2008)
That sounds more like problems with a table variable. Why aren't you using a temp table, instead?


Heh... sorry... I forgot... you're happy with the code you have.

To answer your question... yes, I've seen the problem. When I've seen it, it has usually been in conjunction with table variables, and I don't know why.


--Jeff Moden
Post #600369