January 4, 2017 at 2:28 pm
Hello,
We are doing a migration and need to import in batches of 500 rows. The table that we are migrating has 60,000 rows.
Let's say the schema looks like this. I will keep it simple.
table name is custData.
columns: custnum varchar(10), prodnum int, productdesc varchar(50)
Now I need to query the table.
select custnum, prodnum, productdesc from custData
Now, I need to create batches, or groups. I want to group the data into sets of 500 rows. Within each batch, every row needs a row number (line number) up to 500. At 500 rows, I need a new batch number and the line number reset to 1.
I am working toward the resultset.
Batchnum  LineNum  custnum  prodnum  productdesc
1         1        124      11       table
1         2        126      12       chair
... up to 500 ...
1         500      128      13       sofa
now increment the batch and reset the line numbers:
2         1        129      14       shelf
I actually found the code to reset line numbers in groups of 500.
select custnum, 1 + ((row_number() over (order by custnum) - 1) % 500) as LineNumber
from custData
My challenge is creating incrementing batch numbers for each group of 500 rows. I am pretty sure I could do this with a temp table, a WHILE block, and a counter for an outer loop plus a counter for an inner loop.
I would like to see if I can do this as one query if possible. I've tried DENSE_RANK and NTILE but it is not working. Anyone have an idea?
Thanks so much.
Tony
Things will work out. Get back up, change some parameters and recode.
January 4, 2017 at 2:30 pm
I think the batch number would be:
1 + ((row_number() over (order by custnum) - 1) / 500)
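Combined with the LineNumber expression from the original post, the whole thing fits in the single query Tony asked for. This is a sketch using the column names given above; the division is integer division because ROW_NUMBER() returns a bigint:

```sql
SELECT 1 + ((ROW_NUMBER() OVER (ORDER BY custnum) - 1) / 500) AS BatchNum,  -- integer division: new batch every 500 rows
       1 + ((ROW_NUMBER() OVER (ORDER BY custnum) - 1) % 500) AS LineNum,   -- modulo: resets to 1 within each batch
       custnum, prodnum, productdesc
FROM   custData
ORDER BY BatchNum, LineNum;
```

Both expressions must use the same ORDER BY so that the batch and line numbers stay in sync.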
SQL DBA,SQL Server MVP(07, 08, 09) "It's a dog-eat-dog world, and I'm wearing Milk-Bone underwear." "Norm", on "Cheers". Also from "Cheers", from "Carla": "You need to know 3 things about Tortelli men: Tortelli men draw women like flies; Tortelli men treat women like flies; Tortelli men's brains are in their flies".
January 4, 2017 at 2:45 pm
Another option is to use the OFFSET clause, which was introduced in SQL Server 2012 and was specifically designed for paging.
DECLARE @page_size INT = 500, @page_num INT = 1;

WHILE @page_num > 0
BEGIN
    SELECT custnum, prodnum, productdesc
    FROM custData
    ORDER BY custnum, prodnum -- the order needs to be deterministic
    OFFSET (@page_num - 1) * @page_size ROWS
    FETCH NEXT @page_size ROWS ONLY;

    IF @@ROWCOUNT < @page_size
        SET @page_num = 0;
    ELSE
        SET @page_num = @page_num + 1;
END
Drew
Edit: forgot to increment the page number.
J. Drew Allen
Business Intelligence Analyst
Philadelphia, PA
January 4, 2017 at 2:53 pm
That is so excellent. I figured I was just missing something. It worked as needed. Thanks so much!
Things will work out. Get back up, change some parameters and recode.
January 4, 2017 at 2:55 pm
Thanks Drew. I actually had not seen the OFFSET clause before. I will review it as well.
Thanks.
Things will work out. Get back up, change some parameters and recode.
January 10, 2017 at 12:09 pm
Hi there,
There's also another method if you don't need exactly 500 rows, but rather 500 rows or fewer per page.
You can use the following statement:
SELECT NTILE(1 + (SELECT COUNT(*) FROM Tablename) / 500) OVER (ORDER BY <order columns>) AS PageNr,
       <other columns>
FROM Tablename
SQL Server divides the rows into buckets of 500 (or slightly fewer) rows each, based on the ORDER BY.
You will get a page number for each row, and you will probably use a CTE to generate it.
After that, the page number can be part of a ROW_NUMBER() OVER (PARTITION BY PageNr ORDER BY ...) expression to get row numbers starting at 1 and running up to 500 (or slightly fewer) for each page.
This way a WHILE loop is no longer needed.
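A sketch of that CTE approach, assuming the custData columns from the original post and ordering by custnum:

```sql
WITH Paged AS
(
    SELECT custnum, prodnum, productdesc,
           -- COUNT(*)/500 + 1 buckets => each bucket holds 500 rows or slightly fewer
           NTILE(1 + (SELECT COUNT(*) FROM custData) / 500)
               OVER (ORDER BY custnum) AS PageNr
    FROM custData
)
SELECT PageNr,
       ROW_NUMBER() OVER (PARTITION BY PageNr ORDER BY custnum) AS LineNum,
       custnum, prodnum, productdesc
FROM Paged
ORDER BY PageNr, LineNum;
```

Note that NTILE balances the buckets, so unlike the modulo approach, most pages will come out a few rows short of 500 rather than only the last one.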
Regards JJ
January 10, 2017 at 12:59 pm
And the resetting row number can be achieved by using modulo.
SELECT *,
       (ROW_NUMBER() OVER (ORDER BY SomeColumn) + 499) / 500 AS BatchNum,
       ISNULL(NULLIF(ROW_NUMBER() OVER (ORDER BY SomeColumn) % 500, 0), 500) AS LineNum
FROM dbo.SomeTable
January 10, 2017 at 3:17 pm
WebTechie (1/4/2017)
Hello,We are doing a migration and need to import in batches of 500 rows. The table that we are migrating has 60,000 rows.
Just curious, but are these new columns, batch number and line number, going to be part of the new table after the migration? If they are, then you have been given some good options. If they are not, I am just wondering why you need to move 60K rows in batches. If it were 60 million the argument could be made that this technique is necessary to minimize the log file. But 60K, is, well, nothing! :w00t:
__________________________________________________________________________________________________________
How to Post to get the most: http://www.sqlservercentral.com/articles/Best+Practices/61537/
January 10, 2017 at 7:25 pm
LinksUp (1/10/2017)
WebTechie (1/4/2017)
Hello,We are doing a migration and need to import in batches of 500 rows. The table that we are migrating has 60,000 rows.
Just curious, but are these new columns, batch number and line number, going to be part of the new table after the migration? If they are, then you have been given some good options. If they are not, I am just wondering why you need to move 60K rows in batches. If it were 60 million the argument could be made that this technique is necessary to minimize the log file. But 60K, is, well, nothing! :w00t:
You beat me to it. Why only 500 rows on a 60K row table? How wide is the table and where are you importing from?
--Jeff Moden
Change is inevitable... Change for the better is not.