March 10, 2020 at 8:24 am
The queries below are used for inserting into and updating tables in a SQL Server database. The XQuery is slow when executed in SSMS for the first time. I am using SQL Server 2008 R2. The total time taken for 500 queries is 20 to 40 seconds. How can I optimise these queries to speed up execution?
Query
Update BalanceTable
set [daily_balance].modify('insert
        <Row><date>2007-05-10</date><Balance>-8528</Balance><Transactiondr>835</Transactiondr><Transactioncr>9363</Transactioncr><Rowid>2</Rowid></Row>
    as first into (/Root)[1]')
where [daily_balance].exist('/Root/Row[date=''2007-05-10'']') = 0
  and [daily_balance].exist('/Root') = 1
  and [AccountID] = 61
  and [Date] = '31-May-2007';

Update BalanceTable
set [daily_balance].modify('replace value of (/Root/Row[date=''2007-05-10'']/Balance/text())[1]
    with (/Root/Row[date=''2007-05-10'']/Balance)[1] - 3510')
where [AccountID] = 577
  and [Date] = '31-May-2007'
  and [daily_balance].exist('/Root/Row[date=''2007-05-10'']') = 1;

Update BalanceTable
set [daily_balance].modify('replace value of (/Root/Row[date=''2007-05-10'']/Transactioncr/text())[1]
    with (/Root/Row[date=''2007-05-10'']/Transactioncr)[1] + 3510')
where [AccountID] = 577
  and [Date] = '31-May-2007'
  and [daily_balance].exist('/Root/Row[date=''2007-05-10'']') = 1;
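(Part of the first-execution cost with 500 statements like these is that each hard-coded literal variant compiles its own plan. One possible change, sketched here with illustrative variable names and untested against this data, is to parameterize the date and amount with sql:variable() so repeated executions can share a cached plan:)

```sql
DECLARE @d   varchar(10)    = '2007-05-10';  -- target row date inside the XML
DECLARE @amt decimal(15,2)  = 3510;          -- transaction amount

UPDATE BalanceTable
SET [daily_balance].modify('
    replace value of (/Root/Row[date=sql:variable("@d")]/Balance/text())[1]
    with (/Root/Row[date=sql:variable("@d")]/Balance)[1] - sql:variable("@amt")')
WHERE [AccountID] = 577
  AND [Date] = '31-May-2007'
  AND [daily_balance].exist('/Root/Row[date=sql:variable("@d")]') = 1;
```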
Table schema
USE [Fitness Te WM16]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[BalanceTable](
[AccountID] [int] NULL,
[Type] [char](10) NULL,
[Date] [date] NULL,
[Balance] [decimal](15, 2) NULL,
[TRansactionDr] [decimal](15, 2) NULL,
[TRansactionCr] [decimal](15, 2) NULL,
[daily_Balance] [xml] NULL,
[AutoIndex] [int] IDENTITY(1,1) NOT NULL,
CONSTRAINT [PK_BalanceTable] PRIMARY KEY CLUSTERED
(
[AutoIndex] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
SET ANSI_PADDING OFF
GO
Execution plan
The execution plan is attached here https://www.brentozar.com/pastetheplan/?id=rkwgD7U4L
Sample data
The sample XML data for reference is given below.
<Root>
<Row>
<date>2007-05-31</date>
<Balance>-47718</Balance>
<Transactiondr>0</Transactiondr>
<Transactioncr>47718</Transactioncr>
<Rowid>7</Rowid>
</Row>
<Row>
<date>2007-05-29</date>
<Balance>-31272</Balance>
<Transactiondr>0</Transactiondr>
<Transactioncr>31272</Transactioncr>
<Rowid>6</Rowid>
</Row>
<Row>
<date>2007-05-18</date>
<Balance>-48234</Balance>
<Transactiondr>0</Transactiondr>
<Transactioncr>48234</Transactioncr>
<Rowid>5</Rowid>
</Row>
<Row>
<date>2007-05-11</date>
<Balance>-42120</Balance>
<Transactiondr>0</Transactiondr>
<Transactioncr>42120</Transactioncr>
<Rowid>4</Rowid>
</Row>
<Row>
<date>2007-05-10</date>
<Balance>-21060</Balance>
<Transactiondr>0</Transactiondr>
<Transactioncr>21060</Transactioncr>
<Rowid>3</Rowid>
</Row>
<Row>
<date>2007-05-08</date>
<Balance>-10530</Balance>
<Transactiondr>0</Transactiondr>
<Transactioncr>10530</Transactioncr>
<Rowid>2</Rowid>
</Row>
<Row>
<date>2007-05-04</date>
<Balance>-21060</Balance>
<Transactiondr>0</Transactiondr>
<Transactioncr>21060</Transactioncr>
<Rowid>1</Rowid>
</Row>
<Maxrowid>7</Maxrowid>
</Root>
March 10, 2020 at 4:28 pm
Work from the top right of your plan - try to get rid of that scan on BalanceTable (it's in a few places and removing it might completely change your plan). If the plan changes then re-post it and I'm sure someone will give it a quick glance.
You also have quite a few function calls in there - we can't see what they are doing from your posted code, but I never trust any kind of function. Is there any way you can lose a few of those calls?
MVDBA
March 10, 2020 at 6:33 pm
IT researcher wrote:The XQuery is slow while executing in SSMS for the first time. I am using SQL Server 2008 R2.
Heh... I know... big surprise there, right? 😀
IT researcher wrote:The total time taken for 500 queries is 20 to 40 seconds. How can I optimise this query to speed up the execution?
Get rid of the XML at the source and normalize the data as a flat file if you're importing the data. This is a trivial bit of data that should be entrusted to flat-file technology (CSV or TSV).
What you're not seeing is that the spaces (including a shedload of trailing spaces) combined with the extreme amount of tag bloat mean the data you posted occupies 2,226 bytes including end-of-line markers.
If you take that data down to a flat file (CSV in this case because you can't see TABs for demo purposes), you end up with this...
2007-05-31,-47718,0,47718,7
2007-05-29,-31272,0,31272,6
2007-05-18,-48234,0,48234,5
2007-05-11,-42120,0,42120,4
2007-05-10,-21060,0,21060,3
2007-05-08,-10530,0,10530,2
2007-05-04,-21060,0,21060,1
That's just 189 bytes. That's nearly 12 times smaller (2,226 ÷ 189 ≈ 11.8). BULK INSERT will import this into a table in less than a blink of an eye, and your network people will love you for reducing the data they have to store and transmit.
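(A BULK INSERT along these lines would do it; the file path and staging-table name below are made up for illustration:)

```sql
-- Illustrative: load the CSV into a staging table; path and table name are not from the thread
BULK INSERT dbo.DailyBalanceStage
FROM 'C:\import\daily_balance.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    TABLOCK
);
```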
Just say "NO" to using XML to transfer or store data. 😀
--Jeff Moden
Change is inevitable... Change for the better is not.
March 10, 2020 at 7:39 pm
Is there an index on the AccountID and Date columns of dbo.BalanceTable?
Those conditions appear in all your UPDATE statements, but the only index you've shown us is the primary key on AutoIndex.
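(If there isn't one, a nonclustered index on those two columns would let each UPDATE seek instead of scanning the clustered index; the index name here is my own:)

```sql
-- Illustrative index matching the WHERE clauses in the posted UPDATE statements
CREATE NONCLUSTERED INDEX IX_BalanceTable_AccountID_Date
    ON dbo.BalanceTable ([AccountID], [Date]);
```

The XML column itself can't go in the key, but the seek only needs to locate the rows to modify.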
March 11, 2020 at 8:59 am
Jeff Moden wrote:Get rid of the XML at the source and normalize the data as a flat file if you're importing the data. [...] Just say "NO" to using XML to transfer or store data. 😀
good call
MVDBA
March 11, 2020 at 6:32 pm
Jeff, are you starting an #XM(e)L2 tag? 😀
Handling such simple data sets is elementary; I've done quite a few more complex ones at 10-20K transactions / second.
First suggestion, normalize the data, i.e.:
DECLARE @TXML xml = -- illustrative: one daily_balance document from the table
 (SELECT daily_Balance FROM dbo.BalanceTable WHERE AccountID = 61 AND [Date] = '31-May-2007');
SELECT
 RW.DATA.value('(date/text())[1]' ,'DATE') AS [date]
,RW.DATA.value('(Balance/text())[1]' ,'INT') AS Balance
,RW.DATA.value('(Transactiondr/text())[1]' ,'INT') AS Transactiondr
,RW.DATA.value('(Transactioncr/text())[1]' ,'INT') AS Transactioncr
,RW.DATA.value('(Rowid/text())[1]' ,'INT') AS Rowid
FROM @TXML.nodes('/Root/Row') RW(DATA);
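(Once the XML is shredded into rows like that, the per-date XQuery updates become ordinary relational statements. A sketch against a hypothetical normalized table; the table name and key are my own, not from the thread:)

```sql
-- Hypothetical normalized replacement for the daily_balance XML blob
CREATE TABLE dbo.DailyBalance (
    AccountID     int           NOT NULL,
    [date]        date          NOT NULL,
    Balance       decimal(15,2) NOT NULL,
    Transactiondr decimal(15,2) NOT NULL,
    Transactioncr decimal(15,2) NOT NULL,
    CONSTRAINT PK_DailyBalance PRIMARY KEY (AccountID, [date])
);

-- The second and third posted XQuery updates collapse into one seekable UPDATE
UPDATE dbo.DailyBalance
SET Balance       = Balance - 3510,
    Transactioncr = Transactioncr + 3510
WHERE AccountID = 577
  AND [date] = '2007-05-10';
```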