August 15, 2013 at 8:23 pm
Hi,
I want to insert records from one table to another in batches.
Code:
declare @min int
declare @end int
set @min = (select min(id) from emp)
set @end = (select max(id) from emp)
while (@min < @end)
insert into tableb
select * from tablea
where id between @min and @end
When inserting, I need to do it in increments. For example, if @min = 1 and @end = 10000, first I want to insert ids 1 to 1000, then 1001 to 2000, and so on. Right now it inserts 1 to 10000 in one go, not in batches. I also need to make sure the upper bound of each batch does not exceed @end.
Please help me.
August 15, 2013 at 10:49 pm
I think that's because of your logic: in the first cycle @min is 1 and @end is 10000, so you insert all the records into the other table in one go; even if you then increment @min, the second cycle is 1001 to 10000 and repeats the same action. Also, where are your BEGIN and END statements? And why do you want to do this with a WHILE loop, may I ask?
Anyway, you could change the logic like this:
DECLARE @min INT
DECLARE @interval INT
DECLARE @end INT

SET @min = (SELECT MIN(id) FROM emp)
SET @interval = 1000
SET @end = (SELECT MAX(id) FROM emp)

WHILE (@min <= @end)
BEGIN
    INSERT INTO tableb
    SELECT *
    FROM tablea
    -- BETWEEN is inclusive on both ends, so subtract 1
    -- from the upper bound to avoid overlapping batches
    WHERE id BETWEEN @min AND (@min + @interval - 1)

    SET @min = @min + @interval
END
August 16, 2013 at 8:01 am
Why are you breaking this into batches in the first place? Why not just do something like this?
INSERT INTO TableB(id, etc)
SELECT id, etc
FROM TableA;
If you need to run them in batches, what if you skipped the cursor completely and ran something like this until everything was done?
INSERT INTO TableB(id, etc)
SELECT TOP (1000) id, etc
FROM TableA
WHERE NOT EXISTS (SELECT 1
FROM TableB
WHERE TableB.id = TableA.id)
ORDER BY id;
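The "until everything was done" part can be wrapped in a loop; here is one sketch, reusing the TableA/TableB names and the `id`/`etc` columns assumed above, with @@ROWCOUNT driving termination:

```sql
-- Sketch: re-run the TOP batch until a batch copies no rows
-- (assumes TableA/TableB from the examples above, id as the key).
WHILE 1 = 1
BEGIN
    INSERT INTO TableB(id, etc)
    SELECT TOP (1000) id, etc
    FROM TableA
    WHERE NOT EXISTS (SELECT 1
                      FROM TableB
                      WHERE TableB.id = TableA.id)
    ORDER BY id;

    -- @@ROWCOUNT holds the number of rows the INSERT just affected;
    -- when a batch inserts nothing, the copy is complete.
    IF @@ROWCOUNT = 0
        BREAK;
END
```

This also tolerates gaps in the id range, which the interval-based loop does not.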
Another way of duplicating the table is to do something like this:
SELECT * INTO TableB
FROM TableA;
The bottom line of what I'm suggesting is to avoid the cursor if at all possible. Many times, the simplest approach is the correct one. Not always, but many times.
August 16, 2013 at 9:46 am
One reason I can think of is log size. I had a similar situation where I had to move around 200 million rows from one table to another with a slightly different structure. I looped through 500K rows at a time, committing each iteration so that my log file did not grow much more than I wanted it to.
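To keep the log small, each batch can be committed in its own transaction. A sketch along these lines (table and column names are assumed from the thread, not from my actual job; under the SIMPLE recovery model the log space becomes reusable after each commit):

```sql
-- Sketch: commit each 500K-row batch separately so transaction log
-- space can be reused between batches (names assumed from the thread).
DECLARE @rows INT = 1;

WHILE @rows > 0
BEGIN
    BEGIN TRANSACTION;

    INSERT INTO TableB(id, etc)
    SELECT TOP (500000) id, etc
    FROM TableA
    WHERE NOT EXISTS (SELECT 1
                      FROM TableB
                      WHERE TableB.id = TableA.id)
    ORDER BY id;

    SET @rows = @@ROWCOUNT;

    COMMIT TRANSACTION;  -- log records for this batch can now be cleared
END
```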