January 6, 2010 at 11:44 am
Comments posted to this topic are about the item "Search all columns in every table for a value".
January 25, 2010 at 3:38 am
One good idea is to search only those columns whose length is greater than the length of @searchValue.
set nocount on
--initial declarations
declare @rowID INT, @maxRowID INT
declare @sql NVARCHAR(4000)
declare @statements TABLE (rowID INT, SQLL NVARCHAR(MAX) COLLATE DATABASE_DEFAULT)
CREATE TABLE #results (tableName NVARCHAR(250) COLLATE DATABASE_DEFAULT, tableSchema NVARCHAR(250) COLLATE DATABASE_DEFAULT
, columnName NVARCHAR(250) COLLATE DATABASE_DEFAULT, foundtext NVARCHAR(MAX) COLLATE DATABASE_DEFAULT )
SET @rowID = 1
declare @searchValue NVARCHAR(100)
SET @searchValue = 'text to find'
DECLARE @pikkus INT --length of the search value
SET @pikkus = LEN(@searchValue)
--TEXT 35
--NTEXT 99
--VARCHAR 167
--CHAR 175
--NVARCHAR, SYSNAME 231
--NCHAR 239
--XML 241
--create CTE table holding metadata
;WITH MyInfo (tableName, tableSchema, columnName, XTYPE) AS (
SELECT sysobjects.name AS tableName, USER_NAME(sysobjects.uid) AS tableSchema
, syscolumns.name AS columnName, syscolumns.XTYPE
FROM sysobjects WITH(NOLOCK) INNER JOIN syscolumns WITH(NOLOCK)
ON (sysobjects.id = syscolumns.id)
WHERE sysobjects.xtype = 'U' AND sysobjects.category=0
AND sysobjects.name <> 'sysdiagrams' --MSSQL diagrams
AND syscolumns.XTYPE IN (35,99,167,175,231,239,241) AND syscolumns.prec >= @pikkus --241 = XML, handled by the CONVERT in the CASE below
)
--create search strings
INSERT INTO @statements
SELECT row_number() over (order by tableName, columnName) AS rowID, 'INSERT INTO #results SELECT '''+tableName+''', '''+tableSchema+''', '''+columnName+''', CAST(['+columnName+'] AS NVARCHAR(MAX)) FROM ['+tableSchema+'].['+tableName+'] WITH (NOLOCK) WHERE '+
CASE WHEN myInfo.XTYPE=241 --XML
THEN 'CONVERT(NVARCHAR(MAX),['+columnName+'])'
ELSE '['+columnName+']'
END+' LIKE ''%'+@searchValue+'%'''
FROM myInfo
--select * from @statements
--initialize while components and process search strings
SET @maxRowID = ( SELECT MAX(rowID) FROM @statements )
WHILE @rowID <= @maxRowID
BEGIN
SET @sql = (SELECT sqll FROM @statements WHERE rowID = @rowID )
EXEC sp_executeSQL @sql
SET @rowID = @rowID + 1
END
--view results and cleanup
SELECT * FROM #results
drop table #results
January 25, 2010 at 6:37 am
First of all, thank you for the query. If I can get it to run, it will definitely be a very handy tool.
Unfortunately, I am receiving the following error:
Msg 8170, Level 16, State 2, Line 1
Insufficient result space to convert uniqueidentifier value to char.
I tried adding "uniqueidentifier" to the "...data_type not in..." clause of the statement (after all, we certainly do not need to search those fields), but I still receive the same results.
Any ideas?
Thanks.
January 25, 2010 at 6:47 am
Thanks for the feedback Kevin - that data type should have been included.
I replicated the error (by creating a table with a GUID) and then modified the section that limits the types...all worked for me.
replace:
select table_name, table_schema, column_name from information_schema.columns where data_type not in ('image','text','timestamp')
with:
select table_name, table_schema, column_name from information_schema.columns where data_type not in ('image','text','timestamp','uniqueidentifier')
I'll update the script in the article. I also like the above suggestion to limit the search based on column size -- no use looking in a two-character field to see if a five-character string exists! That is the great part about a forum like this -- many people will suggest improvements. Also, to anyone else out there, I am aware of the sp_MSforeachtable procedure but decided to write the script as shown.
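For anyone who wants to fold that size check in now, here's a minimal sketch (assuming the article's approach of driving the search off information_schema.columns, with the search string in a variable -- @SearchVal is just a name I'm using here):
declare @SearchVal nvarchar(100)
set @SearchVal = 'text to find'
select table_name, table_schema, column_name
from information_schema.columns
where data_type not in ('image','text','timestamp','uniqueidentifier')
--character_maximum_length is -1 for (max) types, so keep those; non-character types are NULL here and drop out
and (character_maximum_length >= LEN(@SearchVal) or character_maximum_length = -1)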
January 25, 2010 at 6:47 am
I keep getting errors related to a view:
Msg 207, Level 16, State 1, Line 1
Invalid column name 'Short Name'.
Msg 4413, Level 16, State 1, Line 1
Could not use view or function 'dbo.assets_in_supportworks_not_in_dynamics' because of binding errors.
Msg 207, Level 16, State 1, Line 1
Invalid column name 'Short Name'.
Msg 4413, Level 16, State 1, Line 1
Could not use view or function 'dbo.assets_in_supportworks_not_in_dynamics' because of binding errors.
Msg 207, Level 16, State 1, Line 1
Msg 208, Level 16, State 1, Line 1
Invalid object name 'dbo.tq_config_temp1'.
Msg 4413, Level 16, State 1, Line 1
Could not use view or function 'dbo.Missing ME Relationships' because of binding errors.
Msg 102, Level 15, State 1, Line 1
Incorrect syntax near ''.
Msg 102, Level 15, State 1, Line 1
Incorrect syntax near ''.
Msg 102, Level 15, State 1, Line 1
Incorrect syntax near ''.
Is there a way I can get it to ignore errors?
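The closest I've come to working around it is the rough sketch below (it assumes the statements are built from information_schema and executed one at a time with sp_executesql, as in the article): keep only base tables so the broken views are never queried, and/or wrap each execution in TRY/CATCH so a single bad object doesn't stop the whole run. Is there a cleaner way?
--keep only base tables so views with binding errors are never queried
select c.table_name, c.table_schema, c.column_name
from information_schema.columns c
inner join information_schema.tables t
on t.table_schema = c.table_schema and t.table_name = c.table_name
where t.table_type = 'BASE TABLE'
and c.data_type not in ('image','text','timestamp','uniqueidentifier')
--and/or swallow per-statement errors inside the loop
begin try
exec sp_executesql @sql
end try
begin catch
print 'skipped: ' + ERROR_MESSAGE()
end catch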
January 25, 2010 at 7:36 am
I have a colleague working on a side project to search all columns for any errant information that looks like a credit card and I was wondering if there are performance issues involved with this script? He's going about the concept of creating "dynamic" stored procedures as they run faster, but still trying to optimize it. Will this script work efficiently for truly massive amounts of data?
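To illustrate the sort of check he's experimenting with (purely a hypothetical sketch against a made-up dbo.SomeTable/Comments column), it's essentially hunting for a run of 16 digits:
--hypothetical example: flag values containing 16 consecutive digits
select *
from dbo.SomeTable
where Comments like '%[0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9]%'
The leading wildcard in that kind of pattern is exactly what worries me performance-wise.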
Gaby________________________________________________________________"In theory, theory and practice are the same. In practice, they are not." - Albert Einstein
January 25, 2010 at 7:49 am
No, it will not run efficiently, because the conditions use CAST() and LIKE '%search%' operations, which cannot use indexes.
GabyYYZ (1/25/2010)
I have a colleague working on a side project to search all columns for any errant information that looks like a credit card and I was wondering if there are performance issues involved with this script? He's going about the concept of creating "dynamic" stored procedures as they run faster, but still trying to optimize it. Will this script work efficiently for truly massive amounts of data?
January 25, 2010 at 1:06 pm
You might consider removing the wildcards from the dynamic SQL and allowing them to be specified in the search string. That way, you could explicitly specify left-anchored LIKE statements that would be able to take advantage of indexes on some fields. That would certainly speed up *some* of the tests. But basically, by its very nature, you're doing a massive amount of text-based searching here, so the larger the data sets, the longer the result time.
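For example, a rough sketch of that change (assuming variables like @tableSchema, @tableName and @columnName coming out of the metadata loop, and that the statement builder currently hard-codes the % signs around @searchValue as the scripts above do):
--build the predicate without hard-coded wildcards...
set @sql = N'SELECT * FROM [' + @tableSchema + N'].[' + @tableName + N']'
+ N' WHERE [' + @columnName + N'] LIKE @pattern'
--...and let the caller decide: 'ABC%' can use an index on that column, '%ABC%' cannot
exec sp_executesql @sql, N'@pattern nvarchar(100)', @pattern = @searchValue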
This seems like a pretty cool script for those one-off "I wonder" kinds of tests where you don't mind waiting a bit, but if you're implementing something business-critical (like the credit card pattern checking example), you'll probably want to be a little more specific about which fields you're testing in which tables. No doubt your application doesn't allow updating of most fields through whatever the interface is, except certain textual fields. Focus on scanning only those and you'll get better results. No need to be testing date fields for possible entry of credit card numbers, right?
Tim
January 25, 2010 at 1:12 pm
Timothy Graffham (1/25/2010)
You might consider removing the wildcards from the dynamic SQL and allowing them to be specified in the search string. That way, you could explicitly specify left-anchored LIKE statements that would be able to take advantage of indexes on some fields. That would certainly speed up *some* of the tests. But basically, by its very nature, you're doing a massive amount of text-based searching here, so the larger the data sets, the longer the result time. This seems like a pretty cool script for those one-off "I wonder" kinds of tests where you don't mind waiting a bit, but if you're implementing something business-critical (like the credit card pattern checking example), you'll probably want to be a little more specific about which fields you're testing in which tables. No doubt your application doesn't allow updating of most fields through whatever the interface is, except certain textual fields. Focus on scanning only those and you'll get better results. No need to be testing date fields for possible entry of credit card numbers, right?
Tim
Definitely, restrict to char/varchar/nvarchar/(text?)
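Something along these lines in the metadata query, I'd imagine (just a sketch; text/ntext would still need a CAST before the LIKE):
select table_name, table_schema, column_name
from information_schema.columns
where data_type in ('char','varchar','nchar','nvarchar') --add 'text','ntext' only if you cast them first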
Gaby________________________________________________________________"In theory, theory and practice are the same. In practice, they are not." - Albert Einstein
January 25, 2010 at 1:49 pm
A few years back I needed to search for data in some DBs I wasn't familiar with, so I created a quick and dirty script to do this. I put together a few versions of it: one for string/character-based data, another for numerics, and so on as needed. Here is what I did for searching for strings. I used dynamic SQL to put together the queries and opted to print the results out (check the Messages tab) rather than select them to rows, so there are a few tricks I needed to use to implement this. It still has some bugs that I haven't worked out (for example, the text field type isn't implemented), but I thought I would present it as one way of doing this sort of thing.
--Script to look for varchar data in tables
--Run this replacing the value in @dataLookingFor with the exact string you are looking for
--check the Messages tab for info on what tables have the string
DECLARE
@staticSql nvarchar(max)
,@dynamicSql nvarchar(max)
,@combinedSql nvarchar(max)
,@dataLookingFor varchar(100)
,@parmDefinition nvarchar(100)
,@dateStart datetime
,@dateEnd datetime
,@dateTot datetime
,@print varchar(100)
;
SELECT @dateStart = GETDATE()
SET @dataLookingFor = 'SomeString';
SET @parmDefinition = N'@data varchar(100)'; --change to size/type needed
SELECT
@dynamicSql = theSql.value('/row[1]/column_alias[1]', 'nvarchar(max)')
FROM
(
SELECT column_alias
FROM
(
SELECT N'IF EXISTS(SELECT 1 FROM '
+ N'dbo.[' + C.TABLE_NAME + N']'
+ N' WHERE [' + C.COLUMN_NAME + N'] = @data) BEGIN '
+ N' SELECT @cnt = CAST(COUNT(*) AS varchar(20)) FROM ' + N'dbo.[' + C.TABLE_NAME + N']'+ N' WHERE [' + C.COLUMN_NAME + N'] = @data;'
+ N' RAISERROR (''' + C.TABLE_NAME + '.' + C.COLUMN_NAME + N' has data - %d times'', 0, 1, @cnt) WITH NOWAIT;'
+ N' END'
+ N';'
FROM INFORMATION_SCHEMA.COLUMNS C
INNER JOIN
INFORMATION_SCHEMA.TABLES T
ON C.TABLE_CATALOG = T.TABLE_CATALOG
AND C.TABLE_SCHEMA = T.TABLE_SCHEMA
AND C.TABLE_NAME = T.TABLE_NAME
WHERE T.TABLE_TYPE = 'BASE TABLE'
--add tables and column types that you don't want to check (columns should remain as is since they are not string data)
-- AND T.TABLE_NAME NOT IN ('???')
AND C.DATA_TYPE NOT IN (
'bigint'
,'decimal'
,'int'
,'numeric'
,'smallint'
,'money'
,'tinyint'
,'smallmoney'
,'bit'
,'float'
,'real'
,'datetime'
,'smalldatetime'
,'binary'
,'image'
,'varbinary'
,'cursor'
,'timestamp'
,'sql_variant'
,'uniqueidentifier'
,'table'
,'xml' --got this type below
,'text' --need to implement this?
)
FOR XML PATH(''), TYPE
) tab_alias (column_alias)
FOR XML RAW, TYPE
) tab_alias2 (theSql)
SELECT @dateEnd = GETDATE()
SELECT @dateTot = @dateEnd - @dateStart
SELECT @print = 'Time elapsed for running create query: ' + CONVERT(VARCHAR(20),@dateTot,114)
--note that @cnt has to be declared before the dynamic query above can use it,
--so we concatenate @staticSql with @dynamicSql
SET @staticSql = N' DECLARE @cnt int;'
+ N' RAISERROR (''Data being checked for - %s'', 0, 1, @data) WITH NOWAIT;'
+ N' RAISERROR ('''', 0, 1) WITH NOWAIT;'
SET @combinedSql = @staticSql + N' ' + @dynamicSql
-- PRINT @combinedSql
SELECT @dateStart = GETDATE()
EXEC sp_executesql
@combinedSql
,@parmDefinition
,@data=@dataLookingFor
;
SELECT @dateEnd = GETDATE()
PRINT ''
PRINT @print
SELECT @dateTot = @dateEnd - @dateStart
SELECT @print = 'Time elapsed for running query: ' + CONVERT(VARCHAR(20),@dateTot,114)
PRINT @print
--look in the XML now
SELECT @dateStart = GETDATE()
SELECT
@dynamicSql = theSql.value('/row[1]/column_alias[1]', 'nvarchar(max)')
FROM
(
SELECT column_alias
FROM
(
SELECT N'IF EXISTS(SELECT 1 FROM '
+ N'dbo.[' + C.TABLE_NAME + N']'
+ N' WHERE CAST([' + C.COLUMN_NAME + N'] AS varchar(max)) LIKE ''%' + @dataLookingFor + '%'') BEGIN '
+ N' SELECT @cnt = CAST(COUNT(*) AS varchar(20)) FROM ' + N'dbo.[' + C.TABLE_NAME + N']'+ N' WHERE CAST([' + C.COLUMN_NAME + N'] AS varchar(max)) LIKE ''%' + @dataLookingFor + '%'';'
+ N' RAISERROR (''' + C.TABLE_NAME + '.' + C.COLUMN_NAME + N' has data - %d times'', 0, 1, @cnt) WITH NOWAIT;'
+ N' END'
+ N';'
FROM INFORMATION_SCHEMA.COLUMNS C
INNER JOIN
INFORMATION_SCHEMA.TABLES T
ON C.TABLE_CATALOG = T.TABLE_CATALOG
AND C.TABLE_SCHEMA = T.TABLE_SCHEMA
AND C.TABLE_NAME = T.TABLE_NAME
WHERE T.TABLE_TYPE = 'BASE TABLE'
--add tables and column types that you don't want to check (columns should remain as is since they are not string data)
-- AND T.TABLE_NAME NOT IN ('???')
AND C.DATA_TYPE ='xml'
FOR XML PATH(''), TYPE
) tab_alias (column_alias)
FOR XML RAW, TYPE
) tab_alias2 (theSql)
SELECT @dateEnd = GETDATE()
SELECT @dateTot = @dateEnd - @dateStart
SELECT @print = 'Time elapsed for running create query: ' + CONVERT(VARCHAR(20),@dateTot,114)
--note that @cnt has to be declared before the dynamic query above can use it,
--so we concatenate @staticSql with @dynamicSql
SET @staticSql = N' DECLARE @cnt int;'
+ N' RAISERROR (''Data being checked for - %s'', 0, 1, @data) WITH NOWAIT;'
+ N' RAISERROR ('''', 0, 1) WITH NOWAIT;'
SET @combinedSql = @staticSql + N' ' + @dynamicSql
-- PRINT @combinedSql
SELECT @dateStart = GETDATE()
EXEC sp_executesql
@combinedSql
,@parmDefinition
,@data=@dataLookingFor
;
SELECT @dateEnd = GETDATE()
PRINT ''
PRINT @print
SELECT @dateTot = @dateEnd - @dateStart
SELECT @print = 'Time elapsed for running query: ' + CONVERT(VARCHAR(20),@dateTot,114)
PRINT @print
November 9, 2010 at 8:00 am
The only thing I'd like to mention is regarding the CONVERT in the WHERE clause. If you don't specify a length when you convert the current search column to varchar, it will, by default, truncate the column value to a length of 30. It will also trim trailing spaces, so it would really only search the first 30 characters of the current column. See my example below:
DECLARE @Value varchar(100)='12345678901234567890123456789012345678901234567890'
SELECT @Value, LEN(@Value), LEN(CAST(@Value AS varchar)), LEN(CONVERT(varchar, @Value)), LEN(CAST(@Value AS varchar(1000))), LEN(CONVERT(varchar(1000), @Value))
SET @Value=REPLACE(@Value, '0',' ')
SELECT @Value, LEN(@Value), LEN(CAST(@Value AS varchar)), LEN(CONVERT(varchar, @Value)), LEN(CAST(@Value AS varchar(1000))), LEN(CONVERT(varchar(1000), @Value))
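So if the dynamic WHERE clause converts the column before the LIKE, the fix is presumably just to give the conversion an explicit length, e.g. varchar(max). A quick sketch of the difference (not the exact line from the article):
DECLARE @col varchar(100) = REPLICATE('A', 35) + 'XYZ'
DECLARE @find varchar(10) = 'XYZ' --only appears after character 30
SELECT CASE WHEN CONVERT(varchar, @col) LIKE '%' + @find + '%' THEN 1 ELSE 0 END AS default_length --0: 'XYZ' truncated away
, CASE WHEN CONVERT(varchar(max), @col) LIKE '%' + @find + '%' THEN 1 ELSE 0 END AS explicit_length --1: found
--so in the dynamic SQL, use CONVERT(nvarchar(max), [column]) rather than leaving the length off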
May 25, 2011 at 8:43 am
Yes, thanks to Michael and others for this. Helped me quickly find an errant value in a field during a very large insert that failed.
Ken