July 19, 2006 at 8:22 am
i'm working on a huge bcp solution for moving data quickly.
of course, getting the data out into some kind of flat file is easy, but the BCP IN is starting to throw me because of the classic error:
"Cannot insert duplicate key in object"
is there a quick way (per table) i can bypass any keys
which might interfere with the BCP IN process?
by the way... the source and destination are the same, and all data can be replaced at the destination.
thoughts?
_________________________
July 19, 2006 at 1:06 pm
exec sp_MSforeachtable 'Alter table ? nocheck constraint all'
-- do your bcp stuff
exec sp_MSforeachtable 'Alter table ? check constraint all'
keep in mind that at this point you are on your own regarding data quality.
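if you want the constraints re-validated against the existing rows when you turn them back on (so SQL Server marks them trusted again), there's a WITH CHECK variant; the doubled keyword is intentional:
exec sp_MSforeachtable 'Alter table ? with check check constraint all'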
* Noel
July 19, 2006 at 1:43 pm
tried this real quick on an export from the authors table, and upon inserting it into another db like pubs2 with bulk insert
i get the following:
Bulk insert data conversion error (truncation) for row 1, column 1 (au_id).
technically though... i never specified any 'truncation'
_________________________
July 19, 2006 at 1:45 pm
data conversion errors are usually associated with misalignment between the file and the columns on the table.
Are you using format files? If you are, you are going to have to check the column order in the file and the table.
Cheers,
* Noel
July 19, 2006 at 1:50 pm
not really, i just exported it with bcp using the -n switch, that's about it.
out with:
bcp pubs..authors out c:\authors.bcp -n
then importing it with the following:
bulk insert pubs2..authors from 'c:\authors.bcp'
_________________________
July 19, 2006 at 1:54 pm
To be in control of the column order you 'need' format files.
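for example, you can have bcp generate one straight from the table and then point bulk insert at it (a rough sketch; the -T trusted-connection switch is an assumption, swap in -U/-P if you use SQL logins):
-- generate a native-mode format file describing the authors columns
bcp pubs..authors format nul -n -f c:\authors.fmt -T
-- use it so bulk insert maps file fields to table columns explicitly
bulk insert pubs2..authors from 'c:\authors.bcp' with (formatfile = 'c:\authors.fmt')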
* Noel
July 19, 2006 at 1:55 pm
perhaps i should drop all keys/constraints etc. prior to the import.
is it possible to capture all the different types of keys prior to the drop, and then recreate them 'exactly' as they were after the import is completed? perhaps from INFORMATION_SCHEMA?
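something like this at least gets you the list (a minimal sketch; recreating the definitions exactly would also need the column lists from KEY_COLUMN_USAGE):
select table_name, constraint_name, constraint_type
from information_schema.table_constraints
where table_name = 'authors'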
question is... if this can be done, can it be automated via script prior to the 'bulk insert', with the keys/constraints recreated just afterwards??
thoughts?
_________________________
July 19, 2006 at 2:14 pm
the error that you are getting has nothing to do with constraints!!
You are simply importing the wrong value in the wrong column!!
* Noel
July 19, 2006 at 10:28 pm
The default data file type for bulk insert is "char". Try:
bulk insert pubs2..authors from 'c:\authors.bcp' with (datafiletype = 'native')
July 20, 2006 at 7:27 am
and there it is. somehow i think that will completely resolve the issue.
going to test this out around lunch time today.
thanks lance!!
_________________________
July 20, 2006 at 9:21 am
actually... i just tried this.
still have a problem though.
give it a go if you have time.
first:
bcp pubs..authors out c:\authors.bcp -n
then importing it with the following:
bulk insert pubs..authors from 'c:\authors.bcp' with (datafiletype = 'native')
more weirdness.
_________________________
July 20, 2006 at 11:25 am
You're referring to the original dup key issue? Bulk insert isn't going to clear your table for you before you load it, so that part's left up to you.
This works for me:
bcp pubs..authors out c:\temp\authors.bcp -n
Then...
alter table [titleauthor] nocheck constraint [FK__titleauth__au_id__0519C6AF]
delete from authors
bulk insert pubs..authors from 'c:\temp\authors.bcp' with (datafiletype = 'native')
alter table [titleauthor] check constraint [FK__titleauth__au_id__0519C6AF]
Note that since constraint names can be system-generated, yours may be different from mine. Here's a little snippet that you can use to generate your nocheck/check constraint statements.
select 'alter table ['+object_name(fkeyid)+'] nocheck constraint ['+object_name(constid)+']' from sysreferences where rkeyid = object_id('authors')
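the matching re-enable statements are the same query with 'check' in place of 'nocheck':
select 'alter table ['+object_name(fkeyid)+'] check constraint ['+object_name(constid)+']' from sysreferences where rkeyid = object_id('authors')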
July 20, 2006 at 12:06 pm
You have data quality issues in your input, which means you are going to have to do more than just BCP the data. I'd suggest a process something like the following (a rough T-SQL sketch follows the list):
1) Truncate an import table that has the column structure you want minus any constraints. This ensures you are only going to look at the fresh data.
2) BCP your data into the import table
3) Look for bad data, remove these records and report them to someone
4) Move the remaining contents of the import table to your production table
5) If it's a large amount of data, truncate the import table to free the space
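here's what that might look like for the authors case; [authors_import] and [authors_import_bad] are hypothetical staging tables (same columns as [authors], no keys or constraints), and the delete on [authors] assumes the titleauthor FK is nocheck'd as shown earlier:
-- step 1: start clean so only fresh data is examined
truncate table authors_import
-- step 2: load the raw native-mode file
bulk insert authors_import from 'c:\authors.bcp' with (datafiletype = 'native')
-- step 3: set aside rows with duplicate keys so someone can review them
-- (select into creates authors_import_bad; drop it first on re-runs)
select * into authors_import_bad
from authors_import
where au_id in (select au_id from authors_import group by au_id having count(*) > 1)
delete from authors_import where au_id in (select au_id from authors_import_bad)
-- step 4: replace the production contents (all data can be replaced at the destination)
delete from authors
insert into authors select * from authors_import
-- step 5: free the space
truncate table authors_import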