December 11, 2011 at 8:40 am
1Ankit1 (12/11/2011)
If your requirement is to insert all those rows at one time into the concerned table, you might as well try using the BULK INSERT command.
Very nice post on how to do such a thing, Ankit. 🙂
I do, however, have to ask... Why? What will using BCP or BULK INSERT buy anyone here? It will only double the size of the table, which may still be quite a bit less than 400,000 rows.
--Jeff Moden
Change is inevitable... Change for the better is not.
December 27, 2011 at 4:05 am
Jeff and Dev,
It was intended to present one more possible way 😛 . You got that right; with just 400K rows, it might call for more effort than it's worth. My bad :hehe:. At the same time, it does help a great deal when the number of rows is really huge, and it is actually a better insert option than the usual INSERT query (this was the purpose of the post).
December 27, 2011 at 2:25 pm
1Ankit1 (12/27/2011)
Jeff and Dev, It was intended to present one more possible way 😛 . You got that right; with just 400K rows, it might call for more effort than it's worth. My bad :hehe:. At the same time, it does help a great deal when the number of rows is really huge, and it is actually a better insert option than the usual INSERT query (this was the purpose of the post).
Ummm... OK... why do you think BCP will be better than a direct INSERT query if the original data lives in a table... regardless of size?
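To make the comparison concrete, a direct INSERT query here is a single set-based statement. A minimal sketch (dbo.SourceTable and its Col1/Col2 columns are hypothetical names) that simply copies the existing rows back into the same table and doubles the row count:
-- Set-based insert: copy every existing row back into the same (hypothetical) table
INSERT INTO dbo.SourceTable (Col1, Col2)
SELECT Col1, Col2
FROM dbo.SourceTable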
--Jeff Moden
Change is inevitable... Change for the better is not.
December 27, 2011 at 2:49 pm
Jeff Moden (12/27/2011)
1Ankit1 (12/27/2011)
Jeff and Dev, It was intended to present one more possible way 😛 . You got that right; with just 400K rows, it might call for more effort than it's worth. My bad :hehe:. At the same time, it does help a great deal when the number of rows is really huge, and it is actually a better insert option than the usual INSERT query (this was the purpose of the post).
Ummm... OK... why do you think BCP will be better than a direct INSERT query if the original data lives in a table... regardless of size?
Depending on the database recovery model, bulk import might generate less log growth. Otherwise, yeah, not much benefit to bulk import over Insert Select. (Of course, the log growth is compensated for by the backup size being increased, if I remember correctly.)
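To illustrate the recovery model point (a sketch only; the database name, target table, and file path below are hypothetical): under the SIMPLE or BULK_LOGGED recovery model, a bulk load with a TABLOCK hint can be minimally logged, which is where the reduced log growth comes from.
-- Switch to a bulk-logged recovery model before the load
ALTER DATABASE TestDB SET RECOVERY BULK_LOGGED
-- Bulk import from a native-format data file, taking a table lock so the load can be minimally logged
BULK INSERT dbo.TargetTable
FROM 'C:\Data\export.dat'
WITH (TABLOCK, DATAFILETYPE = 'native')
-- Switch back to the original recovery model afterwards
ALTER DATABASE TestDB SET RECOVERY FULL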
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
December 27, 2011 at 2:58 pm
For the original question, I recommend getting a copy of RedGate's Data Generator (http://www.red-gate.com/products/sql-development/sql-data-generator/), and using that for the rows you need.
Building 400k rows of test data for a table with relationships as complex as those of AdventureWorks.Production.Product isn't as easy as you might like. If you use random data generation that you build yourself, you'll run into constraint violations, index violations, and foreign key violations. An off-the-shelf third-party solution will handle that for you, if I'm not mistaken.
So, download the trial, test it out, make sure it will do what you need, and then buy the software and use it if it will. That'll be easier than trying to build something yourself.
I could build a data generator for that table, but I'd have to bill you for it, since it's not as easy as writing a simple Insert Select statement, not for a table like that. RedGate's solution will be faster, easier, and cheaper for you, and will handle more than that one table, if you need it to.
Search online for competitors, of course. I merely recommend the RedGate one here because this is their site, and I know their product works. Note: I don't work for them, I don't get paid to recommend or endorse their products, nor do I otherwise benefit from recommending it.
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
January 3, 2012 at 5:31 am
I got a task to do. I need to insert/update/delete 400,000 new rows in an AdventureWorks DB table in order to test how proactive caching works in different scenarios.
I need to show a POC (Proof of Concept).
This thread has a lot of discussion about a technical solution but nothing much about the business requirements.
samyuktha, you say you are new to SQL Server, and I think maybe also new to data warehouse design. What is the business requirement you are trying to solve? Why are you doing a POC, and what do you hope to achieve?
Original author: https://github.com/SQL-FineBuild/Common/wiki/ 1-click install and best practice configuration of SQL Server 2019, 2017, 2016, 2014, 2012, 2008 R2, 2008 and 2005.
When I give food to the poor they call me a saint. When I ask why they are poor they call me a communist - Archbishop Hélder Câmara
January 3, 2012 at 8:58 am
Hi
Here is a query for the data population. You will need to alter it depending on what data you want and which table you want to populate; just make the necessary changes.
-- Create the temp table first (adjust the columns to match the table you actually need)
IF OBJECT_ID('tempdb..#TEMP') IS NOT NULL DROP TABLE #TEMP
CREATE TABLE #TEMP (ID INT, NAME VARCHAR(20))

DECLARE @ID INT
SELECT @ID = 1

-- Loop and insert 400,000 numbered rows, one row per iteration
WHILE (@ID <= 400000)
BEGIN
    INSERT INTO #TEMP (ID, NAME)
    SELECT @ID, 'NAME' + CAST(@ID AS VARCHAR(10))

    SET @ID = @ID + 1
END
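For reference, the same 400,000 rows can also be generated with one set-based INSERT ... SELECT instead of a loop. A sketch, reusing the #TEMP table above; the cross join of sys.all_columns with itself is only there to provide enough rows to number:
-- Set-based alternative: number 400,000 rows in a single statement
INSERT INTO #TEMP (ID, NAME)
SELECT n.ID, 'NAME' + CAST(n.ID AS VARCHAR(10))
FROM (SELECT TOP (400000)
             ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS ID
      FROM sys.all_columns a
      CROSS JOIN sys.all_columns b) AS n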
All the best. Hope it works for you.
Thanks & Regards
Syed Sami Ur Rehman
SQL-Server (Developer)
Hyderabad
Email-sami.sqldba@gmail.com
January 3, 2012 at 10:58 am
Hi,
Thanks a lot for your reply. I need to do it to test proactive caching. I need to test the data by inserting/updating/deleting rows and see with which storage mode I'm achieving an immediate update in the cube.
It's just to test whether we can use storage modes in our project.
But I have done it in a different way by using AdventureWorks.
Thanks & Regards,
Samyuktha