December 6, 2007 at 1:06 pm
Our old tables have varchar fields. We want to convert them to nvarchar so that we can handle Unicode data.
For each table I create a corresponding new table named TableName_Uni with the same structure, except with nvarchar fields instead of varchar.
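For illustration, a minimal sketch of the kind of table pair involved (the table and column names here are hypothetical placeholders, not my actual schema):

CREATE TABLE dbo.TableName (ID int IDENTITY(1,1), Name varchar(50));
-- same structure, but with Unicode-capable columns
CREATE TABLE dbo.TableName_Uni (ID int IDENTITY(1,1), Name nvarchar(50));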
Now I try to bcp the data out/in:
bcp dbName.dbo.TableName out c:\temp\TableName.bcp -w -T
bcp dbName.dbo.TableName_Uni in c:\temp\TableName.bcp -w -T
Sometimes it works, but sometimes the data in TableName_Uni does not match the data in TableName!
I tried using -c and -N instead of -w, with the same incorrect results.
Is there something I'm doing wrong? Or is bcp simply not going to work if the tables have different structures?
December 11, 2007 at 3:25 pm
Try using -n (note the lower case n)
December 11, 2007 at 3:41 pm
Jason,
I finally figured it out. The new table had an identity column, so bcp wasn't inserting garbage; it was simply renumbering my rows! Running SET IDENTITY_INSERT NewTable ON before running bcp in didn't help, so I had to specify the -E switch.
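In case it helps anyone else, the sequence that finally worked looked like this (dbName and TableName are placeholders); -E tells bcp in to keep the identity values found in the data file instead of generating new ones:

bcp dbName.dbo.TableName out c:\temp\TableName.bcp -w -T
bcp dbName.dbo.TableName_Uni in c:\temp\TableName.bcp -w -T -E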
December 11, 2007 at 7:20 pm
I don't understand why you need BCP for this... what's wrong with just doing a straight insert into the new tables?
--Jeff Moden
Change is inevitable... Change for the better is not.
December 11, 2007 at 10:04 pm
Jeff Moden (12/11/2007)
I don't understand why you need BCP for this... what's wrong with just doing a straight insert into the new tables?
Speed. An INSERT is fully logged in the transaction log, which takes time and space. I have some tables with 100M rows, and bcp out/bcp in proved to be two to three times faster than the old INSERT-based script.
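For reference, the INSERT-based approach being compared would be something along these lines (a sketch only; the table and column names are hypothetical, and whether the load is minimally logged depends on the database's recovery model):

-- keep the original identity values when copying
SET IDENTITY_INSERT dbo.TableName_Uni ON;

INSERT INTO dbo.TableName_Uni (ID, Name)
SELECT ID, Name
FROM dbo.TableName;

SET IDENTITY_INSERT dbo.TableName_Uni OFF;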