April 10, 2009 at 8:46 am
I have a table that stores notes about a client. One field is ntext(8000), where users enter the history for a particular client/claim. The field contains CR/LF characters, which I can handle in SQL and in the application. However, when I export the data to a vertical-bar-delimited text file and our Oracle DBA tries to import it, the import breaks that field into multiple rows of data, splitting on the CR/LF. Having said all that, I know zip about Oracle and don't know how to help the Oracle DBA. I've tried ANSI and Unicode as export options, and that doesn't seem to matter.
Lastly, we cannot set up a linked server: based on my attempts to set one up, our version of Oracle (7.32) needs to be at 7.33, minimum, and there are no current plans to upgrade. Any thoughts on how/what I can try to get around this?
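If it helps, one option is to flatten the line breaks at export time rather than in the table itself. A minimal sketch, assuming SQL Server 2005 or later (so the ntext column can be cast to nvarchar(max)); the table and column names (dbo.ClientNotes, ClientID, Notes) are hypothetical stand-ins for yours:

-- Hypothetical names; assumes SQL Server 2005+ so ntext casts to nvarchar(max).
-- Replace CRLF pairs, then any stray CR or LF, with a placeholder token so the
-- exported record stays on one line and the Oracle side can restore breaks later.
SELECT ClientID,
       REPLACE(REPLACE(REPLACE(CAST(Notes AS nvarchar(max)),
                               CHAR(13) + CHAR(10), '{CRLF}'),  -- CR+LF pairs
                       CHAR(13), '{CRLF}'),                     -- lone CR
               CHAR(10), '{CRLF}')                              -- lone LF
           AS NotesFlattened
FROM dbo.ClientNotes;

Your DBA could then translate the '{CRLF}' token back into a line break on the Oracle side after the import, so the formatting isn't lost.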
-- You can't be late until you show up.
April 10, 2009 at 9:51 am
I started to do that. The table has 660K rows of data, and after about 10 or 15 minutes I finally canceled the query. I don't consider that row count extensive, but when it's searching through an 8000-character ntext field, maybe the run time isn't that extreme? And, of course, I cannot put an index on the field in question due to the data type. Maybe I'll offload that table to a development server, let the update run however long it takes, and go from there. Thanks for the reply.
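For what it's worth, a long-running cleanup like that is usually easier to live with if it runs in batches, so each transaction stays small and a cancel only loses the current batch. A minimal sketch, again assuming SQL Server 2005+ and the hypothetical dbo.ClientNotes/Notes names from above; since ntext can't be indexed, each pass still scans the table:

-- Hypothetical names; assumes SQL Server 2005+ so ntext casts to nvarchar(max).
-- Each iteration updates at most 10,000 rows that still contain CR or LF,
-- keeping individual transactions (and log growth) small.
WHILE 1 = 1
BEGIN
    UPDATE TOP (10000) cn
    SET Notes = REPLACE(REPLACE(CAST(Notes AS nvarchar(max)),
                                CHAR(13), ' '),
                        CHAR(10), ' ')
    FROM dbo.ClientNotes AS cn
    WHERE CAST(Notes AS nvarchar(max)) LIKE '%' + CHAR(13) + '%'
       OR CAST(Notes AS nvarchar(max)) LIKE '%' + CHAR(10) + '%';

    IF @@ROWCOUNT = 0 BREAK;   -- no rows left with embedded line breaks
END;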
-- You can't be late until you show up.