February 27, 2013 at 5:04 pm
I have four tables with a varchar field that receives imported data and daily user updates. The field is an account number that always has a length of 10, including leading zeros. My current code uses:
UPDATE table.Statements
SET field03 = REPLACE(LTRIM(REPLACE(field03, '0', ' ')), ' ', '0')
The issue is that this runs against all four tables and takes around 5 minutes. After the code runs, I can see that it touched every record, which I don't want. I just want the code to look at the rows where this field actually needs to be changed and disregard the previously trimmed rows.
I know I need some kind of IF statement to accomplish this, but not sure how.
TIA, Scott
February 27, 2013 at 7:02 pm
Try
UPDATE table.Statements
SET field03 = REPLACE(LTRIM(REPLACE(field03, '0', ' ')), ' ', '0')
WHERE field03 like '0%'
This will only attempt the update where field03 starts with a 0. An index on this column would help.
However, if you only have numbers in the string, I would probably go for a statement more like
UPDATE table.Statements
SET field03 = CAST(CAST(field03 as bigint) as varchar(20))
WHERE field03 like '0%'
Otherwise you could do
UPDATE table.Statements
SET field03 = SUBSTRING(field03,PATINDEX('%[^0]%',field03),20)
WHERE field03 like '0%'
February 27, 2013 at 9:34 pm
Scott Milburn (2/27/2013)
I have four tables with a varchar field that receives imported data and daily user updates. The field is an account number that always has a length of 10, including leading zeros. My current code uses:
UPDATE table.Statements
SET field03 = REPLACE(LTRIM(REPLACE(field03, '0', ' ')), ' ', '0')
The issue is that this runs against all four tables and takes around 5 minutes. After the code runs, I can see that it touched every record, which I don't want. I just want the code to look at the rows where this field actually needs to be changed and disregard the previously trimmed rows.
I know I need some kind of IF statement to accomplish this, but not sure how.
TIA, Scott
If Field03 is an account number, I'm thinking that you probably have an index or two on it. The real problems may be that 1) you're updating both the table and the index(es) and 2) you're fragging the hell out of the index(es).
I'd recommend dropping the index(es), doing the update, then re-adding the index(es). You're likely going to have to rebuild them after this, anyway.
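For example, assuming a single nonclustered index with the hypothetical name IX_Statements_Field03 (the index and table names would need to match your actual schema), that pattern might look like this:

```sql
-- Hypothetical index name; adjust to your actual schema.
DROP INDEX IX_Statements_Field03 ON table.Statements;

-- Mass update with no nonclustered index to maintain or fragment.
UPDATE table.Statements
SET field03 = REPLACE(LTRIM(REPLACE(field03, '0', ' ')), ' ', '0');

-- Recreate the index fresh once the data has settled, rather than
-- leaving behind an index fragmented by the update itself.
CREATE NONCLUSTERED INDEX IX_Statements_Field03
    ON table.Statements (field03);
```

The same drop/update/recreate sequence would be repeated for each of the four tables.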
--Jeff Moden
Change is inevitable... Change for the better is not.
February 27, 2013 at 9:52 pm
mickyT (2/27/2013)
However if you only have numbers in the string, I would probably go for a statement more like
UPDATE table.Statements
SET field03 = CAST(CAST(field03 as bigint) as varchar(20))
WHERE field03 like '0%'
Otherwise you could do
UPDATE table.Statements
SET field03 = SUBSTRING(field03,PATINDEX('%[^0]%',field03),20)
WHERE field03 like '0%'
Why? Please don't say "for performance reasons" until you've actually tested it. I tested all of the solutions offered so far and there's virtually no difference. Try it yourself. Here's a million row test table...
--===== Create and populate a large test table on-the-fly
SELECT TOP 1000000
Field03 = RIGHT('0000000000'+CAST(ABS(CHECKSUM(NEWID()))%2147483647+1 AS VARCHAR(10)),10)
INTO #TestTable
FROM sys.all_columns ac1
CROSS JOIN sys.all_columns ac2
;
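A minimal way to time each candidate against that table would be something like the following sketch (SET STATISTICS TIME ON is a simpler built-in alternative):

```sql
-- Sketch: time one candidate UPDATE against the million-row test table.
DECLARE @StartTime DATETIME2 = SYSDATETIME();

UPDATE #TestTable
SET Field03 = REPLACE(LTRIM(REPLACE(Field03, '0', ' ')), ' ', '0')
WHERE Field03 LIKE '0%';

SELECT DATEDIFF(MILLISECOND, @StartTime, SYSDATETIME()) AS ElapsedMs;
```

Repeat with each of the other SET clauses, rebuilding #TestTable between runs so every method starts from identical data.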
--Jeff Moden
Change is inevitable... Change for the better is not.
February 27, 2013 at 10:43 pm
Why? Please don't say "for performance reasons" until you've actually tested it. I tested all of the solutions offered so far and there's virtually no difference. Try it yourself. Here's a million row test table...
Sorry Jeff, I wasn't suggesting them for performance reasons, just as other ways of doing it. I found the method the OP originally posted difficult to decipher at first glance.
February 27, 2013 at 11:10 pm
mickyT (2/27/2013)
Why? Please don't say "for performance reasons" until you've actually tested it. I tested all of the solutions offered so far and there's virtually no difference. Try it yourself. Here's a million row test table...
Sorry Jeff, I wasn't suggesting them for performance reasons, just other ways of doing it. I found the method the OP put up originally difficult to decipher at first glance.
Ah. Got it. Thanks for the feedback.
--Jeff Moden
Change is inevitable... Change for the better is not.
August 20, 2013 at 2:43 pm
Have you tried Val?
August 20, 2013 at 3:12 pm
Scott Milburn (2/27/2013)
I have four tables with a varchar field that receives imported data and daily user updates. The field is an account number that always has a length of 10, including leading zeros. My current code uses:
UPDATE table.Statements
SET field03 = REPLACE(LTRIM(REPLACE(field03, '0', ' ')), ' ', '0')
The issue is that this runs against all four tables and takes around 5 minutes. After the code runs, I can see that it touched every record, which I don't want. I just want the code to look at the rows where this field actually needs to be changed and disregard the previously trimmed rows.
I know I need some kind of IF statement to accomplish this, but not sure how.
TIA, Scott
If the majority of rows do not need to be updated, then you could just do this:
UPDATE table.Statements
SET field03 = REPLACE(LTRIM(REPLACE(field03, '0', ' ')), ' ', '0')
WHERE field03 <> REPLACE(LTRIM(REPLACE(field03, '0', ' ')), ' ', '0')
It would still have to look at every row, but the actual update, which is the most expensive part of the process, would be avoided.
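Since the OP has four such tables, the same filtered update would simply be repeated once per table (the table names below are placeholders, not the OP's actual schema):

```sql
-- Placeholder table names; the real schema has four such tables.
UPDATE dbo.Statements
SET field03 = REPLACE(LTRIM(REPLACE(field03, '0', ' ')), ' ', '0')
WHERE field03 <> REPLACE(LTRIM(REPLACE(field03, '0', ' ')), ' ', '0');

UPDATE dbo.StatementsArchive
SET field03 = REPLACE(LTRIM(REPLACE(field03, '0', ' ')), ' ', '0')
WHERE field03 <> REPLACE(LTRIM(REPLACE(field03, '0', ' ')), ' ', '0');
-- ...and likewise for the remaining two tables.
```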
August 20, 2013 at 10:54 pm
If anybody cares, here's my vote:
WITH CTE AS (
SELECT Field03
FROM #TestTable
WHERE Field03 LIKE '0%')
UPDATE CTE
SET Field03 = REPLACE(LTRIM(REPLACE(Field03 COLLATE LATIN1_GENERAL_BIN, '0', ' ')), ' ', '0');
Using Jeff's test harness, the CTE seemed to give a slight performance boost over methods that didn't use the CTE to limit the target row set.
And the binary collation sequence is a known "feature" of REPLACE.
Edit: Attached my test harness in case anyone wants to offer a critique.
My thought question: Have you ever been told that your query runs too fast?
My advice:
INDEXing a poor-performing query is like putting sugar on cat food. Yeah, it probably tastes better but are you sure you want to eat it?
The path of least resistance can be a slippery slope. Take care that fixing your fixes of fixes doesn't snowball and end up costing you more than fixing the root cause would have in the first place.
Need to UNPIVOT? Why not CROSS APPLY VALUES instead?
Since random numbers are too important to be left to chance, let's generate some!
Learn to understand recursive CTEs by example.