Bulk inserting a text file that contains special characters

  • Hello All,

    I'm using a Bulk Insert task to upload a file into a table. The file is plain text and contains a number of rows, each 520 bytes long, and each row ends with a record separator, which is a special character. The problem is that the text file also contains some special characters starting at position 480 and ending at position 500, and there is more data after the 500th position.

    The Bulk Insert task is able to insert the first 480 bytes, but it fails to insert the special characters.

    So, can anybody out there help sort out this issue?


    Thanks in advance


  • It's tough to give a specific answer without more specifics, so let me see if I understand your issue: does your data contain the same character that you are using as the row separator?

    This is common when you use the default newline as the row separator but are inserting text that contains newlines.

    If this is your issue, the answer is simple: use a different row separator. Remember that a row separator can be a sequence, which I've had to use in data porting, with separators like

    \013{\001!X\001!}\013

    Nothing ASSURES that this sequence never exists in the data, but you keep testing and checking. Sometimes it is simply easier to write something to parse the file yourself and generate the inserts, but that's an extreme case.
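    For illustration, here is the rough shape of such a BULK INSERT in T-SQL. This is a minimal sketch, not anyone's actual statement: the table name, file path, field delimiter, and the printable '|~|' terminator are all hypothetical stand-ins, and CODEPAGE = 'RAW' is one way to ask the server to pass bytes through untranslated.

        -- Hypothetical names throughout; '|~|' stands in for whatever
        -- multi-character sequence you settle on as the row terminator.
        BULK INSERT dbo.ImportTarget
        FROM 'C:\data\input.txt'
        WITH (
            FIELDTERMINATOR = ',',    -- the real column delimiter, if any
            ROWTERMINATOR   = '|~|',  -- custom multi-character terminator
            CODEPAGE        = 'RAW'   -- pass bytes through untranslated
        );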

    If I've misunderstood your issue, I apologize.

    Roger L Reid

  • First of all, I would like to thank you for spending the time to provide a solution to the above-mentioned issue.

    But the bottom line is, the special character I'm encountering is different from the one used as the row delimiter/row separator. As you know, the Bulk Insert task will not load any special characters into the table, so the data after the 480th position is getting truncated.

    It would be very helpful if you could guide me on how to get the ASCII values of the special characters.

  • Actually, I *don't* know of any issue with loading any particular characters. The notion of "special" characters is hardly relevant to a modern data server: what's so special about one character versus another?

    However, if your table or server was set up to handle only one limited character set, then it is correct for the server to reject char data with values outside that set. Again, given the lack of specifics (what character is the problem? What's the datatype of the field it's supposed to go into? What version of SQL Server? Are you using a collation, or is it purely Unicode?), it's hard to say what the issue is.
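    If you want to pin down which character it is, here is a hedged sketch, assuming the raw lines have been staged into a hypothetical table dbo.Staging with a varchar(520) column raw_line. ASCII() returns the byte value of a single-byte character, and UNICODE() returns the code point of an nchar.

        -- Hypothetical staging table and column; position 481 is the
        -- first byte after the 480 that load cleanly.
        SELECT ASCII(SUBSTRING(raw_line, 481, 1)) AS byte_value,
               UNICODE(SUBSTRING(CAST(raw_line AS nvarchar(520)), 481, 1)) AS code_point
        FROM dbo.Staging;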

    But if you have bad data (data that the table/database/server does not allow) then you need to fix your data file.

    Sorry I can't help more. I regularly store fields that mix Cyrillic, Hebrew, and Latin character sets in SQL Server, but I never force a codepage.
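    For what it's worth, a minimal sketch with a hypothetical table, showing that an nvarchar column and an N'' literal hold mixed scripts with no codepage forced anywhere:

        CREATE TABLE dbo.MixedText (val nvarchar(100));
        INSERT INTO dbo.MixedText VALUES (N'Latin, Кириллица, עברית');  -- Latin, Cyrillic, Hebrew in one value
        SELECT val FROM dbo.MixedText;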

    Roger L Reid
