Temp tables vs "permanent" temp table vs User Table Type


PiMané
Hi,

We have an SP that receives an XML parameter with loads of data (sometimes over 1,000 records).
The XML data is used in the "main" SP and in other SPs called from it, and is joined with other tables.
What's the best way to store the XML?
1. A #temp table, knowing it will always be created and dropped and (I believe) won't take advantage of statistics (it can have a PK and indexes to improve the joins);
2. A permanent "temp" table, that is, a table created like all the others but whose data is inserted and deleted just for processing purposes, keyed by a SessionId. This can take advantage of statistics, and the table isn't created and dropped every time the SP is called;
3. A user-defined table type passed as a parameter (it can have a PK to improve joins).
If the software were written in C#, which can pass a user-defined table type as a parameter, I'd probably go with that, but since it's VB6 it doesn't support them, so the data has to be passed as one big XML chunk.
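For concreteness, option 1 might look like the sketch below. The element and column names (Orders, Order, ArticleId, Qty) are illustrative assumptions, not the real schema. One point in its favor: a #temp table created in the main SP is visible to the SPs it calls, so they can join against it without re-shredding the XML.

```sql
-- Sketch of option 1 (illustrative element/column names, assumed schema).
DECLARE @xmlData XML = N'<Orders><Order><ArticleId>1</ArticleId><Qty>5</Qty></Order></Orders>';

CREATE TABLE #OrderLines
(
    ArticleId INT NOT NULL PRIMARY KEY,  -- PK to help the later joins
    Qty       INT NOT NULL
);

-- Shred the XML once into the temp table.
INSERT INTO #OrderLines (ArticleId, Qty)
SELECT x.n.value('(ArticleId)[1]', 'INT'),
       x.n.value('(Qty)[1]', 'INT')
FROM @xmlData.nodes('/Orders/Order') AS x(n);

-- #OrderLines is now visible to any SP called from this scope.
```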

Thanks,
Pedro



If you need to work better, try working less...
Dennis Post
What's the best way to store the XML?

An XML data type

http://msdn.microsoft.com/en-us/library/ms187339%28v=sql.120%29.aspx



For better, quicker answers on T-SQL questions, read Jeff Moden's suggestions.

"Million-to-one chances crop up nine times out of ten." ― Terry Pratchett, Mort
PiMané
Dennis Post (9/4/2013)
What's the best way to store the XML?

An XML data type

http://msdn.microsoft.com/en-us/library/ms187339%28v=sql.120%29.aspx


The XML parameter is already an XML data type.
To join the XML with other tables and use its data, it has to be converted to a table, with the nodes() method or the sp_xml_preparedocument SP. Since it's used more than once, it has to be stored in a table to be faster.
It's the type of table that I'm asking about...



Grant Fritchey
#temp tables have statistics. It's table variables that do not have statistics.

Here's a question. Will more than one user be running this query? If so, using a permanent table will require you to also have a mechanism to separate out each person's data so that you're not stepping on each other.

In general, when doing this kind of work, assuming the secondary processing needs statistics (meaning, you filter on the data after loading it) then I would use temporary tables. If you don't need statistics (no filtering of ANY kind including JOIN operations), then I would use table variables. But then, if you don't need to do secondary processing, I'd just use XQUERY to access the XML directly.
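As a rough sketch of that last point, direct XQuery access with no intermediate table could look like this (the element names Orders/Order/ArticleId/Qty are illustrative, not from the actual schema):

```sql
-- Direct XQuery access, no intermediate table (illustrative names).
-- Assumes @xmlData is the XML parameter of the SP.
SELECT x.n.value('(ArticleId)[1]', 'INT') AS ArticleId,
       x.n.value('(Qty)[1]', 'INT')       AS Qty
FROM @xmlData.nodes('/Orders/Order[Qty > 0]') AS x(n);  -- filter inside the path expression
```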

----------------------------------------------------
The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood...
Theodore Roosevelt

The Scary DBA
Author of: SQL Server Query Performance Tuning and SQL Server Execution Plans
Product Evangelist for Red Gate Software
PiMané
Grant Fritchey (9/4/2013)

Here's a question. Will more than one user be running this query? If so, using a permanent table will require you to also have a mechanism to separate out each person's data so that you're not stepping on each other.

That's why I mentioned adding a SessionId column, to filter per user of the SP. The SP can be running 10 or more times simultaneously, and over 400 times a day (it's the SP that adds orders and recalculates stock).

Grant Fritchey (9/4/2013)

In general, when doing this kind of work, assuming the secondary processing needs statistics (meaning, you filter on the data after loading it) then I would use temporary tables. If you don't need statistics (no filtering of ANY kind including JOIN operations), then I would use table variables. But then, if you don't need to do secondary processing, I'd just use XQUERY to access the XML directly.

The data is used more than once, in the "main" SP and in the other SPs as well; that's why we're using temp tables. Isn't it faster to store the data in a table rather than calling .nodes() over and over again, or sp_xml_preparedocument? In the past I ran a test comparing nodes() and sp_xml_preparedocument: there's no big difference in performance (time), but if the XML uses attributes and there are over 50 or so, nodes() is a lot slower.
The data is also filtered and used to update some other tables' data.
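For reference, the sp_xml_preparedocument route works roughly like this (attribute-centric XML assumed; the names are illustrative, not the real schema):

```sql
-- OPENXML route (illustrative attribute names, assumed XML shape).
-- Assumes @xmlData is the XML parameter of the SP.
DECLARE @hDoc INT;
EXEC sp_xml_preparedocument @hDoc OUTPUT, @xmlData;

SELECT ArticleId, Qty
FROM OPENXML(@hDoc, '/Orders/Order', 1)          -- 1 = attribute-centric mapping
     WITH (ArticleId INT '@ArticleId',
           Qty       INT '@Qty');

-- The parsed document is held in server memory until released.
EXEC sp_xml_removedocument @hDoc;
```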

Thanks,
Pedro



Grant Fritchey
Loading & querying XML is expensive, especially with regards to memory. But, if you're already loading it, then it's not that much more expensive to use it twice. However, if you are processing through filtering, you may simply be better off going to a temporary table. Your overhead is focused into tempdb rather than into a table that you'll have to index carefully to avoid blocking issues. You'll still have statistics available for the filtering. You don't have to write extra code to clean up the data when you're done with it (and again, looking at blocking & resource issues around a single, permanent table).

PiMané
Grant Fritchey (9/4/2013)
Loading & querying XML is expensive, especially with regards to memory. But, if you're already loading it, then it's not that much more expensive to use it twice. However, if you are processing through filtering, you may simply be better off going to a temporary table. Your overhead is focused into tempdb rather than into a table that you'll have to index carefully to avoid blocking issues. You'll still have statistics available for the filtering. You don't have to write extra code to clean up the data when you're done with it (and again, looking at blocking & resource issues around a single, permanent table).


Thanks Grant,

The filtering we do happens when joining with other tables, for example the articles table, to get some extra information about each article.
Since there can be over 1,000 rows, I think the statistics and indexes can be useful.

Thanks,
Pedro



PiMané
Just one more thing, probably a "stupid" question but here it goes anyway ;-)
Is a temp table built with SELECT ... INTO #temp FROM ..., followed by CREATE CLUSTERED INDEX ... ON #temp (id) and CREATE INDEX ... ON #temp (...), slower than CREATE TABLE #temp (...) followed by INSERT INTO #temp SELECT ... FROM ...?
And are the statistics the same with both methods?

Thanks,
Pedro



Grant Fritchey
PiMané (9/4/2013)
Just one more thing, probably a "stupid" question but here it goes anyway ;-)
Is a temp table built with SELECT ... INTO #temp FROM ..., followed by CREATE CLUSTERED INDEX ... ON #temp (id) and CREATE INDEX ... ON #temp (...), slower than CREATE TABLE #temp (...) followed by INSERT INTO #temp SELECT ... FROM ...?
And are the statistics the same with both methods?

Thanks,
Pedro


Two general options:

1) The table and indexes are created, then the data is loaded
2) The data goes in, then the indexes go on

The second choice is likely to have better statistics. Likely, not definitely (as in 100%). Creating an index performs a full scan for its statistics, whereas when you add data to existing indexes you're subject to the auto-update process and, by default, sampled updates.
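The two options can be sketched as follows (dbo.SomeTable stands in for the real source; column names are illustrative):

```sql
-- Option 1: create the table and index first, then load.
CREATE TABLE #t1 (id INT NOT NULL, val INT NOT NULL);
CREATE CLUSTERED INDEX cx_t1 ON #t1 (id);
INSERT INTO #t1 (id, val)
SELECT id, val FROM dbo.SomeTable;         -- stats maintained by auto-update (sampled)

-- Option 2: load first, then index.
SELECT id, val
INTO #t2
FROM dbo.SomeTable;
CREATE CLUSTERED INDEX cx_t2 ON #t2 (id);  -- the index build scans every row,
                                           -- producing full-scan statistics
```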

PiMané
PiMané
Hall of Fame
Hall of Fame (3K reputation)Hall of Fame (3K reputation)Hall of Fame (3K reputation)Hall of Fame (3K reputation)Hall of Fame (3K reputation)Hall of Fame (3K reputation)Hall of Fame (3K reputation)Hall of Fame (3K reputation)

Group: General Forum Members
Points: 3002 Visits: 1351
Thanks for the enlightenment :-)

Pedro


