May 24, 2005 at 8:40 am
Hi
Does anybody know how to import 2,000,000 rows of CSV data that includes double quotes?
Some of the rows WILL have commas inside of quoted fields.
Some of the fields also have embedded quotes.
e.g. housename ,""My House"",
May 24, 2005 at 10:31 am
Sorry, there's no good way that I know of to do this directly. I might import each row as one large text field and then use T-SQL with CHARINDEX/PATINDEX to parse things out.
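A rough sketch of that approach (the table, column, and file names below are only placeholders): bulk load each whole line into a single wide column by using a field terminator that never appears in the data, then slice the pieces out with CHARINDEX/SUBSTRING.

-- Stage the file with one whole line per row
CREATE TABLE StagingRaw (RawLine varchar(8000))

BULK INSERT StagingRaw
FROM 'C:\data\houses.csv'
WITH (FIELDTERMINATOR = '\0',   -- null character: assumes it never occurs, so the line stays whole
      ROWTERMINATOR = '\n')

-- Collapse the doubled quotes first, then pull the fields out
UPDATE StagingRaw SET RawLine = REPLACE(RawLine, '""', '"')

SELECT
    RTRIM(SUBSTRING(RawLine, 1, CHARINDEX(',', RawLine) - 1))      AS HouseName,
    SUBSTRING(RawLine,
              CHARINDEX('"', RawLine) + 1,
              CHARINDEX('"', RawLine, CHARINDEX('"', RawLine) + 1)
                - CHARINDEX('"', RawLine) - 1)                     AS QuotedField
FROM StagingRaw

You would repeat the same CHARINDEX/SUBSTRING pattern for each remaining column. It gets ugly quickly, but it copes with the odd quoting.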
May 25, 2005 at 10:53 am
I'd consider using TextPad (www.textpad.com) to split the file via a macro, then import it as two files.
Or maybe, again using TextPad, delete the double quotes, unless you absolutely must have them. I don't follow how one file could have two different types of delimiters.
-----
Knowledge is of two kinds. We know a subject ourselves or we know where we can find information upon it. --Samuel Johnson
May 25, 2005 at 2:31 pm
Try opening the file with Notepad, for example, and replacing the "" with a single " (or with an empty space, if you don't need the quotes to be imported).
Vasc
May 25, 2005 at 5:38 pm
This is where a higher-powered editor shows its superiority. I built a five-million-record file; it loaded in TextPad in a matter of seconds, while NotePad took several minutes. I was quite surprised that NotePad loaded it at all! 😉
But I still miss the days of working in Brief; that was a heck of an editor in the DOS days.
-----
Knowledge is of two kinds. We know a subject ourselves or we know where we can find information upon it. --Samuel Johnson