March 30, 2010 at 10:56 am
I notice table variables are recommended to store up to 2 to 3 MB of data.
Does this also apply to CTEs?
Thanks,
Jeffrey
March 30, 2010 at 10:59 am
CTEs and table variables are totally different.
A table variable, like a temp table, is a table in TempDB, kept in memory if possible, spilt to disk if not.
A CTE, despite its name, is not a table. It's essentially a named subquery. The rows in a CTE are not (necessarily) stored anywhere, just like when you use a subquery within a normal query.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
March 30, 2010 at 11:50 am
keepintouch2b (3/30/2010)
I notice table variables are recommended to store up to 2 to 3 MB of data. Does this also apply to CTEs?
There's no such recommendation.
A (non-recursive) CTE is exactly equivalent to a derived table in the FROM clause - it is not a 'real' table.
March 30, 2010 at 10:32 pm
Thanks for clarifying the tempdb/table variable/CTE differences.
Based on what I read, table variables and CTEs use memory (RAM).
If a stored proc uses either one of them, then the developer must think of the
100% RAM utilization consequences of storing 1 million records in a CTE or table variable. On the other hand, if you store 1 million records in a temp table, RAM utilization is not an issue.
If all stored procs use CTEs/table variables throughout the entire database, the RAM usage equivalent of 1 million records can also accumulate, with hundreds of users concurrently calling different CTEs/table variables each storing 10,000 records.
Is this 100% RAM utilization caused by CTEs/table variables just a scary thought, or a real possible scenario that Microsoft or DBAs have not thought of yet?
Thanks for your time and expertise.
Jeffrey
March 30, 2010 at 10:58 pm
keepintouch2b (3/30/2010)
Thanks for clarifying the tempdb/table variable/CTE differences. Based on what I read, table variables and CTEs use memory (RAM).
If a stored proc uses either one of them, then the developer must think of the
100% RAM utilization consequences of storing 1 million records in a CTE or table variable. On the other hand, if you store 1 million records in a temp table, RAM utilization is not an issue.
If all stored procs use CTEs/table variables throughout the entire database, the RAM usage equivalent of 1 million records can also accumulate, with hundreds of users concurrently calling different CTEs/table variables each storing 10,000 records.
Is this 100% RAM utilization caused by CTEs/table variables just a scary thought, or a real possible scenario that Microsoft or DBAs have not thought of yet?
Thanks for your time and expertise.
Jeffrey
Temp tables and table variables will both be stored in RAM if possible and then spilled to disk if not. Your generalization that one is completely in RAM and the other is completely on disk is inaccurate.
As for the memory impact of using a CTE, I have used CTEs frequently (even recursively) with millions of records and have never seen even a blip of memory pressure. You would have to test each method and tune the query appropriately.
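To make that concrete, here is a minimal sketch (the row count and column names are illustrative) of a recursive CTE producing a million rows. The rows stream through the query plan rather than being materialised as a table first:

```sql
-- Illustrative sketch: a recursive CTE generating 1,000,000 rows.
-- The CTE is just a named query; its rows are streamed, not stored.
WITH Numbers (n) AS (
    SELECT 1
    UNION ALL
    SELECT n + 1 FROM Numbers WHERE n < 1000000
)
SELECT COUNT(*) AS RowsProduced
FROM Numbers
OPTION (MAXRECURSION 0); -- lift the default 100-level recursion limit
```

Because nothing is persisted per row, the memory footprint is nothing like "rows x row size"; actual behaviour still depends on the plan, so test on your own workload.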
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events
March 30, 2010 at 11:38 pm
Jason,
Thanks for sharing your experience that 1 million records in a CTE is not a problem.
That is for 1 user, right? Assuming 8 KB per row, 1 million rows is about 8 gigabytes, which is OK for a 16-gigabyte server.
I think the row limit is your server's RAM capacity divided by 8 KB per row.
Assuming a CTE is 100% RAM usage, then 2 or more users would cause 100% RAM utilization: 2 users x 1 million rows x 8 KB per row is about 16 gigabytes.
Thanks for confirming that a table variable uses RAM if possible, else disk, as well.
Are you saying a temp table uses RAM? I thought it is stored on disk in TempDB?
Or are you saying that when there is a SELECT from any CTE/table variable/temp table, RAM is used no matter what?
Thanks,
Jeffrey
March 31, 2010 at 1:32 am
keepintouch2b (3/30/2010)
Assuming a CTE is 100% RAM usage, then 2 or more users would cause 100% RAM utilization: 2 users x 1 million rows x 8 KB per row is about 16 gigabytes.
That's an incorrect assumption. CTEs are not tables. They do not (necessarily) store results. They behave the same as subqueries and are not necessarily materialised at all.
Thanks for confirming that a table variable uses RAM if possible, else disk, as well.
Are you saying a temp table uses RAM? I thought it is stored on disk in TempDB?
Yup. The disk/memory behaviour of temp tables and table variables is the same: kept in memory if possible, spilt to disk if not. They are both allocated space in TempDB, and they both have entries in the TempDB system tables.
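You can see this for yourself with a quick sketch (object names are illustrative): both objects get an entry in TempDB's system catalog.

```sql
-- Illustrative sketch: both a temp table and a table variable
-- appear in TempDB's system tables.
CREATE TABLE #TempDemo (i INT);
DECLARE @TableVar TABLE (i INT);

SELECT name, create_date
FROM tempdb.sys.tables
WHERE create_date > DATEADD(MINUTE, -1, GETDATE());
-- #TempDemo appears under a name starting with '#TempDemo';
-- the table variable appears under a short system-generated name.

DROP TABLE #TempDemo;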
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
March 31, 2010 at 2:54 am
A (non-recursive) CTE is exactly equivalent to a derived table in the FROM clause - it is not a 'real' table.
Perhaps an example will help. The following two queries are logically exactly the same:
-- Using a common table expression (CTE)
WITH Tables AS (
    SELECT name
    FROM sys.tables
)
SELECT *
FROM Tables;

-- Using a derived table (table subquery)
SELECT *
FROM (
    SELECT name
    FROM sys.tables
) AS Tables;
March 31, 2010 at 2:59 am
Are you planning to do something that requires loading one million rows into a temporary object of one kind or another?
You should probably forget about the distinction between RAM and disk for a minute. There is precious little difference in the way RAM and disk are used for any type of table - permanent, temporary, or variable.
As always, the more you tell us about the reason for your question, the better advice you will get.
March 31, 2010 at 8:54 am
Jason,
The purpose is to modularize all my big stored procedures into smaller functions or stored procedures.
Each stored procedure has many #temp tables.
The question is which is best suited for passing a table as input and output to a function/stored proc.
Since CTEs and table variables may be memory hogs, I opt to use
CREATE TABLE #temp (definition)
INSERT INTO #temp
EXEC smallerStoreProc
but maintaining the #temp definition is a hassle if there are many stored procs.
If there are no RAM over-utilization issues, CTEs and table variables would both be great alternatives to #temp tables.
The suggested 2-3 MB memory limit on a table variable is stated in this article:
http://searchsqlserver.techtarget.com/tip/0,289483,sid87_gci1267047_mem1,00.html
"Beware, though, when many users start using table variables -- large amounts of RAM are used because all temporary tables are stored and processed directly in memory. Table variables should hold no more than 2 Megs to 3 Megs of data each (depending on user load and system memory)."
Thanks for the prompt response,
Jeffrey
March 31, 2010 at 9:02 am
keepintouch2b (3/31/2010)
"Beware, though, when many users start using table variables -- large amounts of RAM are used because all temporary tables are stored and processed directly in memory. Table variables should hold no more than 2 Megs to 3 Megs of data each (depending on user load and system memory)."
Don't believe everything you read on the Internet!
As far as passing table-like objects to stored procedures is concerned, have a read of this:
How To Share Data Between Procedures
That's regarded as one of the definitive articles on that subject. Please do take the time to read it.
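For example, one of the patterns covered there (on SQL Server 2008 and later) is a table-valued parameter, which avoids maintaining a matching #temp definition in every caller; a minimal sketch with illustrative names:

```sql
-- Illustrative sketch: passing a table into a stored procedure
-- via a table-valued parameter (SQL Server 2008+).
CREATE TYPE dbo.IntList AS TABLE (Value INT NOT NULL);
GO
CREATE PROCEDURE dbo.SumList
    @List dbo.IntList READONLY  -- TVPs must be declared READONLY
AS
    SELECT SUM(Value) AS Total FROM @List;
GO
DECLARE @L dbo.IntList;
INSERT @L (Value) VALUES (1), (2), (3);
EXEC dbo.SumList @List = @L;
```

The type definition lives in one place, so callers no longer each carry a copy of the table layout.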
March 31, 2010 at 9:12 am
keepintouch2b (3/31/2010)
Since CTEs and table variables may be memory hogs, I opt to use
CREATE TABLE #temp (definition)
INSERT INTO #temp
EXEC smallerStoreProc
but maintaining the #temp definition is a hassle if there are many stored procs.
If there are no RAM over-utilization issues, CTEs and table variables would both be great alternatives to #temp tables.
Except that you cannot insert into a CTE. It is not a table. Think of it as a temporary view, a temporary named select statement. Nothing more.
http://searchsqlserver.techtarget.com/tip/0,289483,sid87_gci1267047_mem1,00.html
"Beware, though, when many users start using table variables -- large amounts of RAM are used because all temporary tables are stored and processed directly in memory. Table variables should hold no more than 2 Megs to 3 Megs of data each (depending on user load and system memory)."
Total and complete garbage.
http://support.microsoft.com/kb/305977
It's listed as a SQL 2000 article, but it's mostly true for 2005 and 2008 as well.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
March 31, 2010 at 9:30 am
GilaMonster (3/31/2010)
Except that you cannot insert into a CTE.
Someone's going to post this...may as well be me 🙂
DECLARE @T TABLE (A INTEGER NOT NULL);

WITH CTE AS (
    SELECT A FROM @T
)
INSERT CTE
VALUES (1);
Total and complete garbage.
An excellent, and concise summary.
March 31, 2010 at 9:40 am
Paul White NZ (3/31/2010)
GilaMonster (3/31/2010)
Except that you cannot insert into a CTE.
Someone's going to post this...may as well be me 🙂
DECLARE @T TABLE (A INTEGER NOT NULL);
WITH CTE AS (SELECT A FROM @T)
INSERT CTE VALUES (1);
I would argue that you're inserting into a table variable through a CTE. The CTE is not the thing storing the data.
The point I was trying to make is that a CTE doesn't store rows the way a temp table or table variable does.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
March 31, 2010 at 9:48 am
GilaMonster (3/31/2010)
I would argue that you're inserting into a table variable through a CTE. The CTE is not the thing storing the data. The point I was trying to make is that a CTE doesn't store rows the way a temp table or table variable does.
Agreed 🙂
Just a bit of fun.