July 29, 2018 at 1:09 pm
Using ROW_NUMBER() can be problematic if the data can change between individual requests. There are different types of solution, each with different pros and cons depending on the precise circumstances:
1) Use modulo arithmetic, but on some deterministic value in the data, such as an id field
2) Use OFFSET/FETCH to page results in fixed-size blocks
3) Use async I/O in C# rather than threading, to start processing rows before the complete result set is returned
4) Use a queueing system to disperse the results
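For option 2, OFFSET/FETCH paging might look something like the sketch below. The parameter names are just illustrative, and the ORDER BY must be on a unique, stable key, or pages can overlap or skip rows:

```sql
-- Page through [Records] in fixed-size blocks.
-- @PageNumber and @PageSize are illustrative parameter names.
DECLARE @PageNumber int = 0;
DECLARE @PageSize   int = 2000;

SELECT id, name
FROM [Records]
ORDER BY id                          -- must be a unique, stable ordering
OFFSET @PageNumber * @PageSize ROWS
FETCH NEXT @PageSize ROWS ONLY;
```

Note that if rows are inserted or deleted between page requests, page boundaries can still shift, which is why option 1 recommends keying on a deterministic value such as an id.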
July 31, 2018 at 8:51 am
Jonathan AC Roberts - Friday, July 27, 2018 6:14 AM
polo.csit - Friday, July 27, 2018 5:59 AM
Jonathan AC Roberts - Thursday, July 26, 2018 3:25 PM
polo.csit - Thursday, July 26, 2018 3:20 AM
Hi All,
My table has 10000 records. I created 5 stored procedures, and my 5 applications execute them at the same time, each loading 2000 records. It works, but the problem is: when I run application 1 it loads 2000 records and takes 5 minutes; then when I run application 2 it loads only 1900 records. So the applications are working with duplicate records. How can application 2 load the remaining 10000 - 2000 = 8000 records in a threaded application? My stored procedure is like this below:
;with P as
(
    select *, ROW_NUMBER() over (order by id) as rownum,
           NTILE(5) over (order by id) as thread
    from [Records]
    where not exists (
        select name
        from [Data]
        where Convert(varchar(10), date, 103) = @date
          and [Records].name = [Data].name
    )
)
select name from P where thread = 1
Regards
Pols

I had to do something similar once when sending data to a CMS system over the internet. To get enough rows sent I had to create multiple processes to read the rows: sending each row over the internet to the CMS system meant a long wait for the response, but sending in parallel increased the rate at which they were sent. So if you are doing what I think you are doing, then you can split the rows using the T-SQL mod operator '%', with some code similar to this:
DECLARE @ThreadToFind int;
SET @ThreadToFind = 1;

;with P as
(
    select *, ROW_NUMBER() over (order by id) as rownum,
           id % 5 as thread
    from [Records]
    where not exists (
        select name
        from [Data]
        where Convert(varchar(10), date, 103) = @date
          and [Records].name = [Data].name
    )
)
select name
from P
where thread = @ThreadToFind;
This will split the rows into 5 different threads (0 to 4), and you won't get duplicates, because it relies only on the value of the id column to split the rows.

I implemented the id % 5 split, but some records are not loaded by any thread (stored procedure).
I would create one stored procedure that takes a parameter @ThreadToFind; each process that calls it would pass the thread number it wants (0 to 4).
I think all data should be returned, as long as you are calling the stored procedure with all the threads: 0, 1, 2, 3, 4.
You need to provide more information on what it's not returning, and ideally a sample script to illustrate, if you want a more specific answer.
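A sketch of that single parameterised procedure might look like this. The procedure name is made up for illustration; the tables, columns, and @date comparison are taken from the query above, not from the poster's actual code:

```sql
CREATE PROCEDURE dbo.GetRecordsForThread
    @ThreadToFind int,        -- 0 to 4
    @date varchar(10)         -- dd/mm/yyyy, matching CONVERT(..., 103)
AS
BEGIN
    SET NOCOUNT ON;

    ;WITH P AS
    (
        SELECT r.*,
               r.id % 5 AS thread
        FROM [Records] r
        WHERE NOT EXISTS (SELECT 1
                          FROM [Data] d
                          WHERE CONVERT(varchar(10), d.date, 103) = @date
                            AND r.name = d.name)
    )
    SELECT name
    FROM P
    WHERE thread = @ThreadToFind;
END
```

Each of the five processes then runs `EXEC dbo.GetRecordsForThread @ThreadToFind = 0, @date = '26/07/2018'` (and so on for 1 to 4) with its own thread number.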
It is working fine, but the % split fixes which records belong to each thread: when one thread has completed, the remaining threads are still running, and their leftover records cannot be shared with the finished thread.
July 31, 2018 at 9:05 am
polo.csit - Tuesday, July 31, 2018 8:51 AM
It is working fine, but the % split fixes which records belong to each thread: when one thread has completed, the remaining threads are still running, and their leftover records cannot be shared with the finished thread.
You need 5 separate processes that each call the stored procedure with a different @ThreadToFind (0 to 4).
The SQL you pasted in won't work, as name is not in the select columns of the CTE.
If you can paste in the actual SQL you have, it would help.
Also, you could just try running "SELECT * FROM" the CTE in SSMS to see what values the query is returning and how it's calculating the thread value.
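For the uneven-completion problem, the queueing idea from the first reply could be sketched with an UPDATE ... OUTPUT claim pattern, so each worker takes the next unclaimed batch rather than a pre-assigned fifth of the table. The `claimed` column here is an assumed addition, not part of the original table:

```sql
-- Assumes a bit column [claimed] (default 0) has been added to [Records].
-- Each worker runs this repeatedly until no rows come back. READPAST lets
-- a worker skip rows another worker currently has locked, instead of blocking.
UPDATE TOP (100) r
SET r.claimed = 1
OUTPUT inserted.name
FROM [Records] r WITH (ROWLOCK, READPAST)
WHERE r.claimed = 0;
```

With this, a fast worker keeps pulling batches until the table is exhausted, so no rows are stranded with a slow thread, and each row is claimed exactly once.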