Ok... I realize that you said "quick and dirty", but I keep looking at this article and wondering... why?
Why would someone need to generate a gigabyte-sized test table with virtually no useful test data in it? The only thing I can think of (maybe) is to use it for testing backups.
And, that's a lot of RBAR in that code... it can be done without any RBAR using the following...
--===== If the test table already exists, drop it so we can rebuild it
IF OBJECT_ID('dbo.BigTable','U') IS NOT NULL
DROP TABLE dbo.BigTable
--===== Define a handful of dynamic SQL variables
DECLARE @SQL1 VARCHAR(8000),
        @SQL2 VARCHAR(8000),
        @SQL3 VARCHAR(8000)
--===== Define the "fixed" portion of the SELECT list
SELECT @SQL1 = '
SELECT TOP 265000
IDENTITY(INT,1,1) AS RowID,
GETDATE() AS EntryDate,
' + CHAR(9)
--===== Define the 220 GUID columns we want to build.
-- This uses the variable overlay technique of concatenation.
SELECT @SQL2 = COALESCE(@SQL2+','+CHAR(10)+ CHAR(9),'') + 'NEWID() AS Val' + CAST(Number AS VARCHAR(10))
FROM Master.dbo.spt_Values --Could use a Tally or Numbers table here
WHERE Type = 'P'
AND Number BETWEEN 1 AND 220
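For anyone who hasn't run across the variable overlay technique before, here's a minimal standalone sketch (not part of the table-building code above) that concatenates a short list the same way... COALESCE returns the empty string on the first row so you don't get a leading comma:

```sql
--===== Minimal sketch of variable overlay concatenation
DECLARE @List VARCHAR(8000)
 SELECT @List = COALESCE(@List + ',','') + CAST(Number AS VARCHAR(10))
   FROM Master.dbo.spt_Values
  WHERE Type = 'P'
    AND Number BETWEEN 1 AND 5
  PRINT @List --Typically displays 1,2,3,4,5 (order isn't guaranteed without an ORDER BY)
```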
--===== Define the "Into" and the cross join that will spawn the rows.
-- This uses the same cross-join technique useful in building Tally and Numbers tables
SELECT @SQL3 = '
   INTO dbo.BigTable
   FROM Master.dbo.SysColumns sc1 WITH (NOLOCK)
  CROSS JOIN Master.dbo.SysColumns sc2 WITH (NOLOCK)'
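If the cross join looks like magic, the arithmetic is simple... a CROSS JOIN of two tables with m and n rows returns m*n rows, so even a modest system table joined to itself spawns more than enough rows for the TOP to trim. You can see how many rows are available on your box like this:

```sql
--===== How many rows can the cross join spawn?
 SELECT COUNT(*) AS RowsAvailable
   FROM Master.dbo.SysColumns sc1
  CROSS JOIN Master.dbo.SysColumns sc2
```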
--===== Show what the final code looks like, just for fun...
-- PRINT @SQL1+@SQL2+@SQL3
--===== ... and then run it. (Takes about a minute on my ancient but useful box)
EXEC (@SQL1+@SQL2+@SQL3)
RBAR is pronounced "ree-bar" and is a "Modenism" for "Row By Agonizing Row".
First step towards the paradigm shift of writing Set Based code: Stop thinking about what you want to do to a row... think, instead, of what you want to do to a column.
If you think it's expensive to hire a professional to do the job, wait until you hire an amateur. -- Red Adair