Here's a quick-and-dirty way to generate a massive amount of data. The table's columns are, in order: Identity(1,1), datetime, uniqueidentifier1...uniqueidentifierN, and the uniqueidentifier columns are set to populate themselves automatically via defaults. I generated about 1 GB (roughly 260,000 rows at a maximum row size of about 8,000 bytes) in 3.5 minutes on a basic desktop. I capped the row size at 8,000 bytes so it would work on SQL 2000; I'm not sure how much larger you could go on 2005/2008. The reason for doing it this way is that the other approaches I've seen, the ones that use checksum for example, take a couple of hours to generate 1 GB of data.
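Here's a minimal sketch of what I mean (table and column names are just placeholders; add as many uniqueidentifier columns as you need to approach the 8,000-byte row limit):

```sql
-- Hypothetical table: the GUID columns default to NEWID(), so each INSERT
-- only needs DEFAULT VALUES and the engine fills everything in itself.
CREATE TABLE dbo.BigData
(
    Id        int              IDENTITY(1,1) PRIMARY KEY,
    CreatedAt datetime         DEFAULT GETDATE(),
    Guid001   uniqueidentifier DEFAULT NEWID(),
    Guid002   uniqueidentifier DEFAULT NEWID(),
    Guid003   uniqueidentifier DEFAULT NEWID()
    -- ... repeat up to GuidNNN; at 16 bytes per uniqueidentifier you can fit
    -- roughly 490 of them before hitting the 8,000-byte row limit
);

SET NOCOUNT ON;
DECLARE @i int;
SET @i = 0;
WHILE @i < 260000
BEGIN
    INSERT INTO dbo.BigData DEFAULT VALUES;
    SET @i = @i + 1;
END;
```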
I did notice that when the columns were given their defaults as part of the table definition, generation was quicker than explicitly inserting a newid() for every applicable column.
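For comparison, the slower variant would look something like this (same hypothetical names), spelling NEWID() out per column instead of relying on the defaults:

```sql
-- Slower: generate the GUIDs in the INSERT statement itself.
INSERT INTO dbo.BigData (CreatedAt, Guid001, Guid002, Guid003 /* ... */)
VALUES (GETDATE(), NEWID(), NEWID(), NEWID() /* ... */);
```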
One more thing: set the database this table lives in to grow in fixed 500 MB chunks rather than by a percentage; it greatly speeds up the generation.
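The growth setting can also be changed from T-SQL; this is just a sketch with a hypothetical database and logical file name, so substitute your own:

```sql
-- Grow the data file in fixed 500 MB steps instead of a percentage.
ALTER DATABASE TestDataDb
MODIFY FILE (NAME = TestDataDb_Data, FILEGROWTH = 500MB);
```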