August 8, 2012 at 8:08 pm
I have a stored proc with two major parts in a WHILE loop: a SELECT and then an UPDATE.
The basic stored proc looks like this:
SET TRANSACTION ISOLATION ....
while ( @loop >0 )
BEGIN TRAN
1st part: it looks for the rows that match a criterion and creates a small table, i.e. SELECT * FROM LargeTableA A JOIN LargeTableB B ON A.id = B.id WHERE thisCriteria = 'found' AND columnWhileLoop = @loop
2nd part: it updates LargeTableA based on a join to the small table, i.e. UPDATE a SET a.SomeColumn = 'found' FROM LargeTableA a JOIN SmallTable b ON a.id = b.id
COMMIT
SET @loop= @loop -1
end
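In runnable form, the skeleton is roughly the following (table and column names such as SomeColumn are placeholders from the description above, and the @rows capture is only there so the per-loop count can be compared against a standalone run):

SET TRANSACTION ISOLATION LEVEL READ COMMITTED;  -- or whichever level is being tested

DECLARE @loop int = 10;   -- placeholder loop count
DECLARE @rows int;

WHILE ( @loop > 0 )
BEGIN
    BEGIN TRAN;

    -- 1st part: collect the matching ids into a small work table.
    SELECT A.id
    INTO #SmallTable
    FROM LargeTableA A
    JOIN LargeTableB B ON A.id = B.id
    WHERE A.thisCriteria = 'found'
      AND A.columnWhileLoop = @loop;

    -- 2nd part: update LargeTableA from the work table.
    UPDATE a
    SET a.SomeColumn = 'found'
    FROM LargeTableA a
    JOIN #SmallTable b ON a.id = b.id;

    SET @rows = @@ROWCOUNT;  -- capture the UPDATE's row count before anything resets it
    PRINT 'Loop ' + CAST(@loop AS varchar(10)) + ': '
        + CAST(@rows AS varchar(10)) + ' rows updated';

    DROP TABLE #SmallTable;

    COMMIT;

    SET @loop = @loop - 1;
END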
For some reason, the row count from the UPDATE (rows being updated) isn't the same as when the statement is executed in SSMS without the loop; inside the loop it is much lower.
The database has ALLOW_SNAPSHOT_ISOLATION on as well as READ_COMMITTED_SNAPSHOT on (see the check below). I have tried setting the transaction isolation level to all of these:
READ UNCOMMITTED | READ COMMITTED (which should match the result from SSMS, since SSMS defaults to READ COMMITTED) | REPEATABLE READ | SNAPSHOT | SERIALIZABLE
I have also tried all of these table hints:
HOLDLOCK | PAGLOCK | READCOMMITTED | READCOMMITTEDLOCK | REPEATABLEREAD | ROWLOCK | SERIALIZABLE | TABLOCK | TABLOCKX | UPDLOCK | XLOCK
on both the SELECT and the UPDATE statements.
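For reference, the database-level settings mentioned above can be confirmed with a query along these lines (the columns are from sys.databases):

SELECT name,
       snapshot_isolation_state_desc,
       is_read_committed_snapshot_on
FROM sys.databases
WHERE name = DB_NAME();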
The result is still an inconsistent count when executing as a stored procedure. I have also removed the indexes on the join and WHERE-criteria columns; with the indexes removed the result is even worse, with far fewer rows being updated.
Some think it is because the UPDATE is too slow; some think it is the SAN configuration.
Experts, how do I fix this problem?
If it helps, here is a slightly modified version of the query, changed to reflect a different industry. 😉
Many thanks!!
-- SET TRANSACTION ISOLATION LEVEL READ COMMITTED ;
--SET TRANSACTION ISOLATION LEVEL SERIALIZABLE ; -- 99 when loop
SET TRANSACTION ISOLATION LEVEL REPEATABLE READ ; --- 279, by itself. low cnt when loop.
--SET TRANSACTION ISOLATION LEVEL SNAPSHOT ; -- 99 when loop.
-- can be 342 for grp 11.
SET NOCOUNT ON;
declare
@ii_Days int = 3
, @RecordType varchar(50) = NULL
, @airport varchar(50) = NULL
, @airportCnt int
, @av_DelaySeconds varchar(10) = '00:00:10'
, @UpdateLoop int = 3 --- SS 3; 11=161. SS 6; 11=98
--)
--as
select @airportCnt = MAX(Airline_PK) FROM dbo.AirlineGroup
WHILE ( @UpdateLoop > 0)
BEGIN
WHILE ( @airportCnt > 0 )
BEGIN
if ( @airportCnt %3=2 )
begin
BEGIN TRAN
; with my_cte as(
SELECT
rn = ROW_NUMBER() over ( partition by p.MemberID order by e.VisitDate )
, MemberID = p.MemberID
, e.CustomerID
, e.Airline_PK
, e.VisitDate
, e.DepartDate
, e.VisitID
, e.TouristTypeTag
FROM
dbo.TravelLog e
left join dbo.Customer p
on e.CustomerID = p.CustomerID
WHERE
Airline_PK = @airportCnt
and MemberID <> 'NA'
)
update
e
set e.WeekendTravellerCount = 1
from
(
select
thisrow.MemberID
, thisRow.CustomerID
, thisRow.VisitID as ThisVisitID
, prevRow.VisitID
, thisRow.Airline_PK
, thisRow.VisitDate as ThisVisitDate
, prevRow.VisitDate as PrevVisitDate
, thisRow.DepartDate as ThisDepartDate
, prevRow.DepartDate as PrevDepartDate
, ( thisRow.VisitDate - prevRow.VisitDate ) + 1 as delta_day
, thisRow.TouristTypeTag as ThisTouristTypeTag
, prevRow.TouristTypeTag as PrevTouristTypeTag
from
my_cte thisRow
left join my_cte nextRow
on thisRow.rn = nextRow.rn - 1
and thisRow.MemberID = nextRow.MemberID
left join my_cte prevRow
on thisRow.rn = prevRow.rn + 1
and thisRow.MemberID = prevRow.MemberID
where
thisRow.VisitDate - prevRow.DepartDate > 0
and thisRow.VisitDate - prevRow.DepartDate <= @ii_Days
)
d
join
dbo.TravelLog e
on e.VisitID = d.ThisVisitID
where
PrevTouristTypeTag = 2 -- D
and ThisTouristTypeTag = 1 -- A
OPTION (MAXDOP 1) ;
WAITFOR DELAY @av_DelaySeconds ;
COMMIT TRAN
end -- if ( @airportCnt %3=2 )
SET @airportCnt = @airportCnt - 1 ;
END -- while ( @airportCnt > 0 )
SET @UpdateLoop = @UpdateLoop - 1
END --- WHILE ( @UpdateLoop > 0)
SET TRANSACTION ISOLATION LEVEL READ COMMITTED ;
August 9, 2012 at 8:30 am
If you have a one-to-many join, the count for the SELECT will be the count for the many side of the join, but the count for the UPDATE will be the count for the one side of the join. Since you haven't provided any sample data, we can't tell if that is the case here, but it most likely is.
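For illustration only (the table and column names below are made up for the example, not taken from the post), here is a minimal sketch of how the two counts diverge on a one-to-many join:

-- One parent row, three matching child rows.
DECLARE @Parent TABLE (id int, flag varchar(10));
DECLARE @Child TABLE (id int, parent_id int);

INSERT @Parent (id, flag) VALUES (1, NULL);
INSERT @Child (id, parent_id) VALUES (10, 1), (11, 1), (12, 1);

-- The SELECT counts the many side: it returns 3 rows.
SELECT p.id, c.id AS child_id
FROM @Parent p
JOIN @Child c ON c.parent_id = p.id;

-- The UPDATE counts the one side: the single parent row is updated once.
UPDATE p
SET p.flag = 'found'
FROM @Parent p
JOIN @Child c ON c.parent_id = p.id;

SELECT @@ROWCOUNT AS rows_updated;  -- 1, not 3

So if the UPDATE joins each target row to several source rows, it will legitimately report fewer rows than the corresponding SELECT returns.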
Drew
J. Drew Allen
Business Intelligence Analyst
Philadelphia, PA
August 9, 2012 at 12:24 pm
Hello Celko,
😀 I know the code is crappy. I love your books and need to start writing CELKO SQL. 😉
Well, each table is millions of rows; the loop is there to break them down. Yes, it is sort of like a customer table with gender types of male and female, where the data could be split into two tables; however, the types here number over 1,000, a UNION alone would run to over 5,000, and I won't be able to use dynamic SQL.
Yes, this is one of those customer behavior analyses where I need to find, from a bag of data, the lineage of a customer's behavior and then determine the prior, current, and next behavior of that customer.
It would be great if you could show me how best to do it as a set-based operation. I'll ... email you a beer.
cheers.