April 22, 2015 at 8:59 am
Mark Finnie (4/21/2015)
Thank you Scott. I had to replace TableName with TblName because some audit tables had columns with that name, resulting in ambiguous column name errors. The query then ran successfully, although it took longer to run than my original query (10 min 52 sec vs. 7 min 41 sec).
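(For anyone else who hits that error, the collision Mark describes looks roughly like this; the temp table and column names below are made up purely for illustration.)
-- Minimal made-up repro: both sources expose a column named TableName,
-- so an unqualified reference fails with "Ambiguous column name 'TableName'".
create table #SearchResults (RowId int, TableName sysname);
create table #AuditLog (RowId int, TableName sysname);
select TableName -- error: ambiguous column name 'TableName'
from #SearchResults r
join #AuditLog a on a.RowId = r.RowId;
select r.TableName -- qualified with an alias (or renamed to TblName), this parses fine
from #SearchResults r
join #AuditLog a on a.RowId = r.RowId;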
Interesting. Just out of curiosity, did you have the print sql and/or debug flags on?
SQL DBA, SQL Server MVP (07, 08, 09) "It's a dog-eat-dog world, and I'm wearing Milk-Bone underwear." "Norm", on "Cheers". Also from "Cheers", from "Carla": "You need to know 3 things about Tortelli men: Tortelli men draw women like flies; Tortelli men treat women like flies; Tortelli men's brains are in their flies".
April 22, 2015 at 4:26 pm
No, I only had the execute flag on. There seems to be more activity on our server this morning. I have run both queries again to get another comparison. I ran my code first and this time it took 8 minutes 35 seconds. Then I ran your code and it took 15 minutes 24 seconds.
April 23, 2015 at 8:52 am
I did something like this a couple of years ago, looking for specific 3- to 6-character strings in every table in a database (what fools we were not to use ISO currency codes from the beginning...). I excluded character-based columns that were too short to hold the data, as well as all non-character columns:
select table_name, column_name
from information_schema.columns
where data_type like '%char%'
and (CHARACTER_MAXIMUM_LENGTH > 2 or CHARACTER_MAXIMUM_LENGTH < 0) -- < 0 covers the (max) types
I used that as the basis for the cursor, then created dynamic SQL that generated statements similar to
select table_name, column_name, count(*)
from table_name
where column_name like '%search text%'
group by table_name, column_name
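If it helps anyone, here is a minimal sketch of how that cursor plus dynamic SQL can be wired together; the #hits temp table, the cursor name, the hard-coded '%search text%' pattern, and the added table_schema handling are my own illustration choices, not the original script:
declare @sch sysname, @tbl sysname, @col sysname, @sql nvarchar(max);
if object_id('tempdb..#hits') is not null drop table #hits;
create table #hits (table_name sysname, column_name sysname, match_count int);
-- table_schema is included so tables outside dbo resolve correctly
declare col_cur cursor local fast_forward for
select table_schema, table_name, column_name
from information_schema.columns
where data_type like '%char%'
and (CHARACTER_MAXIMUM_LENGTH > 2 or CHARACTER_MAXIMUM_LENGTH < 0);
open col_cur;
fetch next from col_cur into @sch, @tbl, @col;
while @@fetch_status = 0
begin
    -- one generated statement per column; quotename() guards against odd identifiers
    set @sql = N'insert into #hits select ''' + @tbl + N''', ''' + @col + N''', count(*)'
             + N' from ' + quotename(@sch) + N'.' + quotename(@tbl)
             + N' where ' + quotename(@col) + N' like ''%search text%''';
    exec (@sql);
    fetch next from col_cur into @sch, @tbl, @col;
end;
close col_cur;
deallocate col_cur;
select * from #hits where match_count > 0 order by table_name, column_name;
This scans each table once per character column, which is a big part of why these searches take a while; folding all of a table's character columns into a single generated statement with OR'd predicates is one way to cut the number of scans.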