December 26, 2009 at 4:59 am
I uploaded a bunch of Access tables to SQL Server and have to convert the Access data into the tables used by a SQL database program. The problem is that all the varchar-type fields in Access were created with a field length of 255. The fields in my SQL database program are much narrower, vary depending on what is being loaded (first_name has a width of 20, suffix a width of 5, etc.), and can't be changed.
In a nutshell, I need to check each Access table/field, determine the maximum length currently in use, and then identify which records might need manual adjusting before importing into the SQL database. I could run the following for each table/field, but that would be cumbersome:
select max(len(employee_name)) from employee_table
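(And to spot the specific records that would need adjusting, I could run something like the following per column, but again that's one query per table/field; the employee columns and the width of 20 are just an example.)
select employee_id, employee_name, len(employee_name) as current_length
from employee_table
where len(employee_name) > 20  -- 20 = the width the destination field allows
order by current_length desc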
Does anyone know of a utility that would loop through all the tables and all the fields that might find and report on this scenario?
Thank you.
Rog
December 26, 2009 at 7:37 am
Here you go, Roger; a project like this has to use the metadata to determine which columns to query, so using a cursor in this case is fine.
The logic is pretty obvious: get all the character-type columns, get their defined size, and find the maximum length of the data as it currently exists in each table.
I'm stuffing the data into a global temp table and then returning the info, ordered, with a final query of the results. Watch for MaxActualLength = 0: that means the column is all NULLs or the table has no rows, which might not be representative of future data.
example results:
TableName                             ColumnName   DefinedLength  MaxActualLength
example                               exampleid    8              0
example                               exampletext  30             0
Z_Data_Extract_Activity_C06350_1_TXT  RAWDATA      8000           779
the query:
-- re-runnable: drop the work table if it's left over from a previous run
if object_id('tempdb..##tmp') is not null
    drop table ##tmp

-- one row per column of every user table, with the defined size and a
-- placeholder for the measured maximum
select  sysobjects.name             as TableName,
        syscolumns.name             as ColumnName,
        type_name(syscolumns.xtype) as VariableType,
        syscolumns.length           as DefinedLength,  -- bytes, same unit as DATALENGTH below
        0                           as MaxActualLength
into    ##tmp
from    sysobjects
        inner join syscolumns on sysobjects.id = syscolumns.id
where   sysobjects.xtype = 'U'

declare @isql   varchar(2000),
        @tbname varchar(64),
        @clname varchar(64)

-- walk the character-type columns and measure each one with dynamic SQL
declare c1 cursor for
    select TableName, ColumnName
    from   ##tmp
    where  VariableType in ('varchar', 'char', 'nvarchar', 'nchar')

open c1
fetch next from c1 into @tbname, @clname
while @@fetch_status <> -1
begin
    -- DATALENGTH returns bytes, which matches DefinedLength;
    -- for nchar/nvarchar that is twice the character count
    select @isql = 'UPDATE ##TMP SET MaxActualLength = (SELECT ISNULL(max(DATALENGTH(@clname)),0) FROM @tbname) WHERE TableName =''@tbname'' and ColumnName =''@clname'''
    select @isql = replace(@isql, '@tbname', @tbname)
    select @isql = replace(@isql, '@clname', @clname)
    print @isql
    exec (@isql)
    fetch next from c1 into @tbname, @clname
end

close c1
deallocate c1

select  *
from    ##tmp
order by TableName, ColumnName
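Once ##tmp is populated, you could also flag the columns whose data will not fit the destination; the derived table of target widths below is only a placeholder (20 for employee_name, 5 for suffix), since the real widths live in your SQL database program:
select  t.TableName, t.ColumnName, t.DefinedLength, t.MaxActualLength, w.TargetWidth
from    ##tmp t
        inner join (
            select 'employee_table' as TableName, 'employee_name' as ColumnName, 20 as TargetWidth
            union all
            select 'employee_table', 'suffix', 5
        ) w on w.TableName = t.TableName and w.ColumnName = t.ColumnName
where   t.MaxActualLength > w.TargetWidth
order by t.TableName, t.ColumnName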
Lowell
December 26, 2009 at 7:38 am
What I would do is design the database properly in SQL Server and then pump all the data in. I am sure you will want to change the design somewhat once you are on a much better relational database; Access is not strong on this sort of thing. You can create the same table structure in SQL Server yourself and then pump the data in, rather than letting SQL Server design the tables for you.
In the long run it will take much less time than having some dinky app attempt it through an automated process and still get it wrong. Even if you have, say, 250 tables in Access, designing the same database in SQL Server would take you maybe a full day. But then you can rethink the design, solve some of the problems that Access's restrictions gave you, and let SQL Server create the proper relationships between tables. Take my advice: you will be much further ahead by doing some of the work by hand.
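To make that concrete, the move is roughly as follows; the table, the columns, and the widths are invented for the example, with employee_import standing in for the table that came over from Access:
create table dbo.employee (
    employee_id  int         not null primary key,
    first_name   varchar(20) not null,
    suffix       varchar(5)  null
)

insert into dbo.employee (employee_id, first_name, suffix)
select employee_id, first_name, suffix
from   dbo.employee_import
-- any row still too wide for the hand-designed columns fails with
-- "String or binary data would be truncated", which points you at the data to clean up first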
Andrew SQLDBA
December 26, 2009 at 9:05 am
Lowell, this is perfect! Thank you so much!
Roger