July 22, 2015 at 8:38 am
Hi all,
I'm testing, with SQL 2014 on the same DB, a procedure that extracts data from a table into a file and loads the data from that file into a different table that has the same columns as the initial table (I use a function to generate the CREATE TABLE statement from the source table and change the name of the destination table).
When doing my bcp -c, the record with the special character "é" doesn't make it to the file.
When doing a bcp -w, the record with the special character makes it to the file, but the bulk insert omits the whole record.
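Simplified, the two exports I'm comparing look something like this (database, table, and server names are placeholders):

bcp MyDB.dbo.SourceTable out c:\File4.csv -c -T -S MyServer
bcp MyDB.dbo.SourceTable out c:\File4.csv -w -T -S MyServer

-c writes character (ANSI) data, which is where the "é" row goes wrong; -w writes Unicode (UTF-16) data.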
The result file, in the case where the record makes it to the file, is Unicode.
I'm not using a format file (I don't see the need for one).
The bulk insert into the destination table, which contains columns identical to the source (a mixture of int, varchar, and char) --> didn't work.
I also tried building the columns of the destination table with nvarchars -- still doesn't work.
I tried the bulk insert with:
CODEPAGE = 'ACP' and CODEPAGE = 'RAW' --> didn't change anything; still didn't work.
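For reference, the load side is roughly this (table and file names are placeholders). Since -w produces a UTF-16 file, my understanding is that the matching BULK INSERT option would be DATAFILETYPE = 'widechar' rather than CODEPAGE, which only applies to char-type data:

BULK INSERT dbo.DestinationTable
FROM 'c:\ResultFile'
WITH (DATAFILETYPE = 'widechar'); -- matches a bcp -w export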
It's a complicated process that takes one XML record that contains information, plus the CREATE TABLE statement (to eventually be able to do this on a different server/DB), plus the title row for each column, plus the data... Each of these is created with a BCP command (all with the same options). They are then appended to each other with a copy /B c:\file1.txt + c:\File2.txt + c:\File3.txt + c:\File4.csv c:\ResultFile
Once the result file is created, I bulk insert the first 2 rows into one table, "TableA",
create the tmp table "TableB" with the CREATE TABLE statement that is in "TableA",
and do another bulk insert of the remainder of the file into the newly created table.
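In outline, the two loads look something like this (a sketch; names simplified, assuming FIRSTROW/LASTROW to split the file):

-- the header rows (the XML record and the CREATE TABLE statement) go into TableA
BULK INSERT dbo.TableA
FROM 'c:\ResultFile'
WITH (LASTROW = 2);

-- TableB is created by executing the statement stored in TableA,
-- then the remaining rows are loaded from row 3 onward
BULK INSERT dbo.TableB
FROM 'c:\ResultFile'
WITH (FIRSTROW = 3);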
What else can I try?
Should I be creating a format file? What are the benefits of a format file?
It's a very long procedure that does both the Extract and the Load (with 12 parameters), so I'm not sure what I should post here.
July 22, 2015 at 9:14 am
jghali (7/22/2015)
So what's the ultimate objective? Seems to me that there are a number of alternatives, such as Replication, Log Shipping, and Service Broker, where that kind of technical detail has already been figured out. I'm not sure I'd want to re-invent the wheel...
Steve (aka sgmunson) 🙂 🙂 🙂
Rent Servers for Income (picks and shovels strategy)
July 22, 2015 at 10:20 am
Hi sgmunson,
The systems are independent, but the structure can be identical.
I'm still in testing mode (so I'm testing on the same system), but eventually this will be going to different SQL Servers with different versions and collations...
The idea is that the file containing the structure and the data (as well as other information) will create a temporary table with the accompanying data... which will then trigger a process that will process the data and do what needs to be done.
As for the problem I was having, I found that it wasn't the character "é" that was making the record disappear; it was the bulk insert that was skipping the first record...
I had a similar problem earlier last week...
Another issue appeared:
I script a CREATE TABLE statement using information_schema.columns from the original server and create the temporary table with that same script...
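The scripting does something along these lines (heavily simplified; the real function also handles precision, scale, and NULLability):

SELECT 'CREATE TABLE dbo.TableB (' +
       STUFF((SELECT ', ' + COLUMN_NAME + ' ' + DATA_TYPE +
                     CASE WHEN CHARACTER_MAXIMUM_LENGTH IS NOT NULL
                          THEN '(' + CAST(CHARACTER_MAXIMUM_LENGTH AS varchar(10)) + ')'
                          ELSE '' END
              FROM INFORMATION_SCHEMA.COLUMNS
              WHERE TABLE_NAME = 'SourceTable'
              ORDER BY ORDINAL_POSITION
              FOR XML PATH('')), 1, 2, '') + ');'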
The data comes from that table, but when I do the bulk insert, I get "Bulk load data conversion error (truncation) for row 2, column 1 (name)."
When I go look at the file, my row terminator (!|!) is right before the data, followed by my column terminator (|!|).
As soon as I change the script to have an extra character, everything passes... this is mind-boggling.
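For context, the load in question spells out the terminators explicitly, something like (table and file names are placeholders):

BULK INSERT dbo.TableB
FROM 'c:\ResultFile'
WITH (
    FIELDTERMINATOR = '|!|',
    ROWTERMINATOR   = '!|!',
    FIRSTROW = 3
);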
Thanks
JGhali
July 22, 2015 at 11:00 am
jghali (7/22/2015)
I noticed that you dodged the question of why you're re-inventing the wheel, but perhaps that's moot, given that your process is now working...
Steve (aka sgmunson) 🙂 🙂 🙂
Rent Servers for Income (picks and shovels strategy)
July 22, 2015 at 11:33 am
Not sure how I dodged the question... (I didn't mean to.)
What would you suggest?
Right now the process in place uses a file system, and they want me to develop an ETL process that will extract and send changes or loads to a secondary system using SQL 2008/2012/2014 Express (compatibility between all 3 is imperative). This process can't be online; if the master DB is down, other systems need to continue without even knowing that the system is down, and vice versa.
I know Express doesn't have replication, and SQL Agent is disabled.
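For scheduling without Agent, my assumption is I'd fall back on Windows Task Scheduler calling sqlcmd, something along these lines (task name, server, and script path are hypothetical):

schtasks /create /tn "NightlyExtract" /sc daily /st 02:00 /tr "sqlcmd -S .\SQLEXPRESS -d MyDB -i c:\etl\extract.sql -b"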
Maybe I'm going at this all wrong.
Other opinions are good and might help me make a better decision.
Thanks
JG
July 22, 2015 at 2:18 pm
I finally figured out the problem and the solution:
the copy command adds a character when merging 2 files (I was using /B right after copy); I had to change it to use /b at the end of the copy command...
I didn't think where it was put would make a difference.
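In other words, this is the version that works. My understanding is that copy in ASCII mode appends an end-of-file character (0x1A) when concatenating, while a trailing /b forces binary mode for the whole operation:

copy c:\file1.txt + c:\File2.txt + c:\File3.txt + c:\File4.csv c:\ResultFile /b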