August 28, 2008 at 9:38 am
Comments posted to this topic are about the item "Quick and dirty way to make a very large table".
Gaby
"In theory, theory and practice are the same. In practice, they are not." - Albert Einstein
September 4, 2008 at 7:58 pm
Gaby,
Ok... I realize that you said "quick and dirty", but I keep looking at this article and wondering... why? :blink: Why would someone need to generate a giga-byte sized test table with virtually no useful test data in it? The only thing I can think of (maybe) is to use it for testing backups.
And, that's a lot of RBAR ("Row By Agonizing Row") in that code... it can be done without any RBAR using the following...
--===== If the test table already exists, drop it so we can rebuild it
IF OBJECT_ID('dbo.BigTable','U') IS NOT NULL
DROP TABLE dbo.BigTable
GO
--===== Define a handful of dynamic SQL variables
DECLARE @SQL1 VARCHAR(8000),
        @SQL2 VARCHAR(8000),
        @SQL3 VARCHAR(8000)
--===== Define the "fixed" portion of the SELECT list
SELECT @SQL1 = '
SELECT TOP 265000
        IDENTITY(INT,1,1) AS RowID,
        GETDATE() AS EntryDate,
' + CHAR(9)
--===== Define the 220 GUID columns we want to build.
-- This uses the variable overlay technique of concatenation.
SELECT @SQL2 = COALESCE(@SQL2+','+CHAR(10)+ CHAR(9),'') + 'NEWID() AS Val' + CAST(Number AS VARCHAR(10))
FROM Master.dbo.spt_Values --Could use a Tally or Numbers table here
WHERE Type = 'P'
AND Number BETWEEN 1 AND 220
--===== Define the "Into" and the cross join that will spawn the rows.
-- This uses the same cross-join technique that's useful in building Tally and Numbers tables
SELECT @SQL3 = '
INTO dbo.BigTable
FROM Master.dbo.SysColumns sc1 WITH (NOLOCK)
CROSS JOIN Master.dbo.SysColumns sc2 WITH (NOLOCK)'
--===== Show what the final code looks like, just for fun...
-- PRINT @SQL1+@SQL2+@SQL3
--===== ... and then run it. (Takes about a minute on my ancient but useful box)
EXEC (@SQL1+@SQL2+@SQL3)
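For reference, uncommenting the PRINT above shows the assembled statement, which takes roughly this shape (abbreviated here, with Val2 through Val219 elided):
SELECT TOP 265000
       IDENTITY(INT,1,1) AS RowID,
       GETDATE() AS EntryDate,
       NEWID() AS Val1,
       ...
       NEWID() AS Val220
  INTO dbo.BigTable
  FROM Master.dbo.SysColumns sc1 WITH (NOLOCK)
 CROSS JOIN Master.dbo.SysColumns sc2 WITH (NOLOCK)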
--Jeff Moden
Change is inevitable... Change for the better is not.
September 5, 2008 at 7:14 am
Hmmm...true. 🙂 Ran it very quickly on my box as well.
I discovered one more way to create a big table a day or so after submitting my script: create your table with one dummy_column that defaults to NULL (or to GETDATE()) and give the rest of the columns DEFAULT NEWID(), along the lines of the sketch below.
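A minimal sketch of what that table could look like, with hypothetical column names (a real run would use many more NEWID() columns than the three shown here):
-- Hypothetical example table: one seed column plus DEFAULT NEWID() columns
CREATE TABLE dbo.bigtable
(
    dummy_column DATETIME NULL DEFAULT GETDATE(), -- seed column; a NULL default works just as well
    Val1 UNIQUEIDENTIFIER NOT NULL DEFAULT NEWID(),
    Val2 UNIQUEIDENTIFIER NOT NULL DEFAULT NEWID(),
    Val3 UNIQUEIDENTIFIER NOT NULL DEFAULT NEWID() -- ...and so on, one per GUID column
)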
Insert one row into the table with default values, then loop like so:
-- Seed the table with a single row built entirely from column defaults
INSERT INTO bigtable DEFAULT VALUES

DECLARE @ctr INT,
        @num_iterations INT
SET @ctr = 0
SET @num_iterations = 18 -- this is a DOUBLING number (2^18 = 262,144 rows from one seed row), so anything past 18 and you're getting into slow territory

WHILE @ctr < @num_iterations
BEGIN
    -- Reinsert every existing row; the DEFAULT NEWID() columns fill themselves in
    INSERT INTO bigtable (dummy_column)
    SELECT dummy_column FROM bigtable

    SET @ctr = @ctr + 1
END
Dummy_column can be anything, including NULL, as long as the other columns have DEFAULT NEWID() set. This is also a pretty quick way to create a large table.
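If you want to confirm how big the result actually got, the standard sp_spaceused system procedure gives a quick answer:
-- Report row count and reserved/data/index space for the test table
EXEC sp_spaceused 'dbo.bigtable'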
Gaby
"In theory, theory and practice are the same. In practice, they are not." - Albert Einstein
September 5, 2008 at 6:08 pm
You still haven't answered the question, though... why would anyone create such a table of 1 gig of almost nothing but NEWID()s?
--Jeff Moden
Change is inevitable... Change for the better is not.
September 8, 2008 at 6:39 am
Jeff Moden (9/5/2008)
You still haven't answered the question, though... why would anyone create such a table of 1 gig of almost nothing but NEWID()s?
I guess it was to generate a large enough backup for a third-party backup utility (e.g., Litespeed) to see how it performed. I didn't want to toy with any production servers.
Gaby
"In theory, theory and practice are the same. In practice, they are not." - Albert Einstein
September 8, 2008 at 7:05 pm
Thanks, Gaby... 🙂
--Jeff Moden
Change is inevitable... Change for the better is not.