August 1, 2007 at 11:04 am
I have a cursor that walks through a table called CONTENT_BLOB, using UPDATETEXT to update a TEXT column that is also called CONTENT_BLOB. It replaces double apostrophes with single apostrophes in that text data type column. Here's an example of data currently in this column:
“As a favorite waitress at Joe''s Diner, Jenna is also a "pie genius," naming her tantalizing confections after the tumultuous events and emotions of her daily life. She''s hoping that one of her pastries, like her "Kick In The Pants" Pie, might even change her life…”
Here is the structure of the table:
content_blob_id INT IDENTITY
content_id INT
blob_type_id INT
content_blob Text
upd_tmstmp Datetime
PK is on content_id, blob_type_id
Total rows in table: 99,000+
Affected rows from below cursor query: 16,000+
The update below runs about 30 minutes in my QA environment. I would like to speed it up before putting it in our production environment. Any ideas on doing this? Thanks, Travis.
DECLARE @id INT
DECLARE @ptrval VARBINARY(16)
DECLARE @index INT

-- First, pull the qualifying keys out of the CONTENT_BLOB table.
DECLARE CursorQuery CURSOR FOR
SELECT content_blob_id
FROM CONTENT_BLOB
WHERE blob_type_id IN (3, 9)
  AND content_blob LIKE '%''''%'

OPEN CursorQuery
BEGIN TRAN T

-- Fetch each key into @id; we'll use it to update the matching row.
FETCH NEXT FROM CursorQuery INTO @id
PRINT 'Record Status: ' + CAST(@@FETCH_STATUS AS VARCHAR)

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Now update the content_blob column: delete the second apostrophe of
    -- each pair until no pairs remain. PATINDEX is 1-based but UPDATETEXT's
    -- offset is 0-based, so passing @index unchanged lands on the second
    -- apostrophe of the pair.
    SELECT @ptrval = TEXTPTR(content_blob),
           @index  = PATINDEX('%''''%', content_blob)
    FROM [CONTENT_BLOB]
    WHERE [content_blob_id] = @id

    WHILE @index > 0
    BEGIN
        UPDATETEXT [CONTENT_BLOB].[content_blob] @ptrval @index 1 ''

        SELECT @ptrval = TEXTPTR(content_blob),
               @index  = PATINDEX('%''''%', content_blob)
        FROM [CONTENT_BLOB]
        WHERE [content_blob_id] = @id
    END

    FETCH NEXT FROM CursorQuery INTO @id
END

CLOSE CursorQuery
DEALLOCATE CursorQuery

IF @@ERROR = 0
BEGIN
    COMMIT TRAN T
    PRINT 'Execution Successful'
END
ELSE
BEGIN
    PRINT 'An error has occurred - execution unsuccessful'
    ROLLBACK TRAN T
END
GO
August 1, 2007 at 9:39 pm
What's the max DATALENGTH of the text column, please?
--Jeff Moden
Change is inevitable... Change for the better is not.
August 2, 2007 at 12:15 pm
SELECT max(DATALENGTH(content_blob))
FROM content_blob
WHERE (BLOB_TYPE_ID IN(3,9)
and CONTENT_BLOB LIKE '%''''%')
RETURNS 5006
August 2, 2007 at 1:14 pm
FYI, I have made some changes to this that I believe will also help:
create table #id (content_blob_id int not null primary key)
insert into #id (content_blob_id)
SELECT content_blob_id FROM CONTENT_BLOB (NOLOCK) WHERE ( (blob_type_id IN (3,9)) AND content_blob LIKE '%''''%')
DECLARE CursorQuery CURSOR FAST_FORWARD FOR
SELECT content_blob_id FROM #id
Pretty sure this will speed it up a lot. Thanks again. Travis.
August 2, 2007 at 5:02 pm
That returns the max data length only for the items you want to update, not for all items. That notwithstanding, it looks to me like you have well under the 8000 characters that can fit into a VARCHAR(8000) column... why not make your life a lot easier and convert the TEXT column to a VARCHAR(8000) column (even if it has to be in a "sister" table to keep from violating the 8060-byte max of a row)?
Then, your replace (and a pot wad of other things) would be as simple as this...
UPDATE Content_Blob
SET Content_Blob = REPLACE(Content_Blob, '''''', '''')
August 3, 2007 at 11:19 am
Nope, I wish it were that easy. There are some rows in that table with more than 8000 characters in them; that is why VARCHAR(8000) was not used in the design. Thanks anyway.