September 17, 2012 at 9:08 am
DBCC CHECKDB with Data_Purity consistency errors
This is the first time I have run this command and gotten errors. I am new to the company, and they have never had a DBA before. Their SQL Server has been upgraded all the way from SQL Server 7 to 2008 R2.
I ran DBCC CHECKDB WITH DATA_PURITY and got consistency errors. There are about 11,000 of them; here are a few, if someone could point me in the correct direction to try to fix them.
They are all similar to these:
Msg 2570, Level 16, State 3, Line 1
Page (1:275392), slot 2 in object ID 536649255, index ID 1, partition ID 72057594088390656, alloc unit ID 72057594089635840 (type "In-row data"). Column "LTV" value is out of range for data type "real". Update column to a legal value.
Msg 2570, Level 16, State 3, Line 1
Page (1:275392), slot 8 in object ID 536649255, index ID 1, partition ID 72057594088390656, alloc unit ID 72057594089635840 (type "In-row data"). Column "LTV" value is out of range for data type "real". Update column to a legal value.
Msg 2570, Level 16, State 3, Line 1
Page (1:275392), slot 15 in object ID 536649255, index ID 1, partition ID 72057594088390656, alloc unit ID 72057594089635840 (type "In-row data"). Column "LTV" value is out of range for data type "real". Update column to a legal value.
Msg 2570, Level 16, State 3, Line 1
Page (1:275393), slot 0 in object ID 536649255, index ID 1, partition ID 72057594088390656, alloc unit ID 72057594089635840 (type "In-row data"). Column "LTV" value is out of range for data type "real". Update column to a legal value.
September 17, 2012 at 9:56 am
Take a look at this article. http://www.sqlservercentral.com/articles/65804/
There's a section in there on data purity errors.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
September 17, 2012 at 10:22 am
Thanks. I have started updating the tables to the correct data type. Would you know of an easy way to update the values in a table with, say, 400,000 rows? I have one table left to update; it complains about the large amount of data and then errors out.
Much appreciated, thanks.
September 17, 2012 at 10:37 am
You're going to have to identify which rows have the bad values and update them (or you can just update the entire table and set the column to, say, 0, but then you lose whatever meaning that column had).
The article I referenced has a link to a KB article that explains how to identify the rows with the bad values.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
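[Editor's note: the row-identification step described above can be sketched as follows. Table and key names here (dbo.Loans, LoanID) are hypothetical, since the real names come from the object ID in the errors. The idea is that a 4-byte real value is invalid to SQL Server when all eight exponent bits of its IEEE binary image are set (NaN or +/- infinity), so masking those bits finds the bad rows:]

```sql
-- Hypothetical names (dbo.Loans, LoanID); substitute your own.
-- A real value is out of range when its 8 exponent bits are all 1,
-- i.e. bits 0x7F800000 (= 2139095040) of its binary image are set.
SELECT LoanID,
       CONVERT(varbinary(4), LTV) AS raw_bytes
FROM dbo.Loans
WHERE CONVERT(int, CONVERT(varbinary(4), LTV)) & 2139095040 = 2139095040;
```

[This flags both NaN and infinity patterns, which are the values DATA_PURITY rejects for real columns.]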
September 25, 2012 at 8:40 am
Thanks, I have still been working on this. I have figured out that I need to change the data type to nvarchar and then update all the values that had NaN to 0. The problem I am having is that once I have finished updating those values, I change the data type to decimal; this works on the smaller tables, but the ones with 200,000-plus rows error out.
Any suggestions on how to get past that? I have even scripted out the table to move the data over and then drop the old table, but that also errors out.
Thanks
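[Editor's note: for the 200,000-plus-row tables that error out, doing the update in small batches usually avoids the log-growth problem. A sketch using the same hypothetical names as above, and assuming the column is currently nvarchar per the previous step; the decimal precision shown is also an assumption:]

```sql
-- Hypothetical names; replaces the literal string 'NaN' with '0'
-- in batches of 5000 rows so each transaction stays small
SET NOCOUNT ON;
WHILE 1 = 1
BEGIN
    UPDATE TOP (5000) dbo.Loans
    SET LTV = N'0'
    WHERE LTV = N'NaN';

    IF @@ROWCOUNT = 0 BREAK;  -- stop when no bad values remain
END;

-- Once the bad values are gone, the type change should succeed,
-- e.g. (precision/scale are illustrative):
-- ALTER TABLE dbo.Loans ALTER COLUMN LTV decimal(9, 4) NULL;
```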