February 13, 2020 at 1:48 am
I work on SQL Server 2012.
I have a temp table that receives data from Excel, and based on the data in that spreadsheet I insert into a table inside a loop.
The temp table always holds a large amount of data, maybe 5,000 or 10,000 or 15,000 rows or more.
I need each iteration to insert the next 5,000 rows from the temp table.
So I need the best solution for that in terms of speed and memory.
And if anything in my logic is not correct, please tell me.
My query is below:
create table #Temp(
DocumentPartID int identity(1,1),
CompanyName VARCHAR(4000),
[AffectedProduct] NVARCHAR(4000),
[ReplacementPart] VARCHAR(4000) ,
[ReplacementCompany] VARCHAR(4000) ,
[Category] VARCHAR(4000) ,
DocumentID int null,
CompanyID VARCHAR(4000) null,
PartID int null,
ReplacementPartID int null,
CategoryID int null,
[Status] VARCHAR(4000) null
)
insert into #Temp
(
CompanyName ,
[AffectedProduct],
[ReplacementPart],
[ReplacementCompany],
[Category]
)
values
('Nokia','RF1550','RF1550','HTS','HTS'),
('IPHONE','TF1545','TF1545','Corning Incorporated','HTS2')
DECLARE @MaxValue int = ( select Max(DocumentPartID) from #Temp)
DECLARE @Currentindex int =0
DECLARE @Rows [dbo].[Type_ValidationInPut];
while @Currentindex < @MaxValue
begin
    DELETE @Rows

    INSERT INTO @Rows (RowNumber, GivenPartNumber, GivenManufacturer)
    select TOP 5000 DocumentPartID, isnull(AffectedProduct,''), isnull(CompanyName,'')
    FROM #Temp
    where (CategoryID = 517884 or CategoryID = 1110481)
      and DocumentPartID > @Currentindex
      and [Status] is null

    INSERT INTO @Rows (RowNumber, GivenPartNumber, GivenManufacturer)
    select TOP 5000 DocumentPartID, isnull(substring(ReplacementPart,0,70),''), isnull(ReplacementCompany,'')
    FROM #Temp
    where DocumentPartID > @Currentindex
      and [Status] is null
      and ReplacementPart is not null

    DECLARE @NewID nVARCHAR(4000) = newID()

    insert into [ls30].[validation].[dbo].PartsData (BatchID, RowNumber, GivenPartNumber, GivenManufacturer)
    SELECT @NewID, 0, GivenPartNumber, GivenManufacturer from @Rows

    set @Currentindex = @Currentindex + 5000

    DELETE @Rows
end
February 13, 2020 at 5:40 am
I need every iteration increased by 5000 rows insert from temp table
Why not something like
SELECT TOP (5000) <field list>
FROM SourceTable st
WHERE NOT EXISTS (SELECT 1 FROM DestinationTable dt
WHERE dt.ID = st.ID);
?
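A fuller sketch of that idea, looped until no rows remain. The table and column names here (SourceTable, DestinationTable, ID, PartNumber, Manufacturer) are placeholders, not the poster's actual schema:

```sql
-- Sketch only: assumes a shared ID key between source and destination.
-- Repeat the batched insert until an iteration moves no rows; the
-- NOT EXISTS predicate makes each pass skip rows already copied.
DECLARE @moved int = 1;

WHILE @moved > 0
BEGIN
    INSERT INTO DestinationTable (ID, PartNumber, Manufacturer)
    SELECT TOP (5000) st.ID, st.PartNumber, st.Manufacturer
    FROM SourceTable st
    WHERE NOT EXISTS (SELECT 1
                      FROM DestinationTable dt
                      WHERE dt.ID = st.ID);

    SET @moved = @@ROWCOUNT;  -- 0 once every source row exists in the destination
END;
```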
February 13, 2020 at 7:16 am
According to your solution, it would run only one time.
I need a repeated check,
so it will not be valid to use.
February 13, 2020 at 9:22 am
The query will eliminate the sets that have already been added. That's what the NOT EXISTS predicate is for.
I guess if you don't like it, you could always write your own?
February 13, 2020 at 10:30 am
I need to replace the while loop above with anything else that can loop but with good performance.
Meaning I don't want to loop using a cursor or a while loop;
I need any alternative with good performance.
February 13, 2020 at 1:14 pm
Sometimes a loop is the best way to do something, and processing data in batches can be one of those times. Have you tested the performance of a loop? Do you know it's poor, or is it just "loop bad, must not use"?
How to post a question to get the most help http://www.sqlservercentral.com/articles/Best+Practices/61537
February 13, 2020 at 4:09 pm
Hold the phone here... inserting 5000, 10000, 15000 rows isn't a large, time consuming task for SQL Server. Why do you think you need to "chunk" the data in this fashion to begin with?
--Jeff Moden
Change is inevitable... Change for the better is not.
February 13, 2020 at 4:31 pm
Jeff Moden wrote:Hold the phone here... inserting 5000, 10000, 15000 rows isn't a large, time consuming task for SQL Server. Why do you think you need to "chunk" the data in this fashion to begin with?
I've been increasingly "chunking" workloads for concurrency reasons. Maybe OP is too. Just a thought.
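One common concurrency reason to chunk is keeping each statement's lock count under SQL Server's lock-escalation threshold (roughly 5,000 locks), so a big modification doesn't escalate to a table lock and block other sessions. A hedged sketch of that pattern; BigTable, Archive, and the Processed flag are illustrative names only:

```sql
-- Sketch: move rows in batches small enough to stay under lock escalation.
DECLARE @batch int = 4000;   -- below the ~5,000-lock escalation threshold

WHILE 1 = 1
BEGIN
    -- Delete one batch and archive the deleted rows in the same statement.
    DELETE TOP (@batch) FROM dbo.BigTable
    OUTPUT deleted.* INTO dbo.Archive
    WHERE Processed = 1;

    IF @@ROWCOUNT = 0 BREAK;  -- nothing left to move
END;
```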
February 13, 2020 at 8:19 pm
Actually I think the existing code looks like it could end up duplicating rows from the #temp table into the PartsData table. If there is a spreadsheet that has both "ReplacementPart" rows and "CategoryID = 517884 or 1110481" rows, then each iteration through the loop could be grabbing up to 10,000 rows from the #temp table but the counter will only be incremented by 5000.
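One way to close that gap, sketched against the poster's own variable and column names (the window logic is my assumption, not tested against their data): capture the batch's key range once, filter both inserts on it, and advance the counter to the last key actually processed instead of a fixed +5000.

```sql
-- Sketch: determine the upper key of this batch deterministically
-- (TOP without ORDER BY gives no guaranteed row set).
DECLARE @LastID int =
    (SELECT MAX(DocumentPartID)
     FROM (SELECT TOP (5000) DocumentPartID
           FROM #Temp
           WHERE DocumentPartID > @Currentindex AND [Status] IS NULL
           ORDER BY DocumentPartID) AS batch);

-- ...then run both INSERT ... SELECT statements filtered on
-- DocumentPartID > @Currentindex AND DocumentPartID <= @LastID,
-- so the two selects cover the same rows exactly once...

SET @Currentindex = @LastID;  -- advance by what was actually processed
```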
January 1, 2022 at 3:41 am
Jeff Moden wrote:Hold the phone here... inserting 5000, 10000, 15000 rows isn't a large, time consuming task for SQL Server. Why do you think you need to "chunk" the data in this fashion to begin with?
I've been increasingly "chunking" workloads for concurrency reasons. Maybe OP is too. Just a thought.
I agree but with only 15,000 rows? What size "chunks" do you normally use, Patrick?
And sorry for the wicked late response. I just saw this and I'm curious.
--Jeff Moden
Change is inevitable... Change for the better is not.