June 13, 2007 at 7:59 am
Hi, I am currently developing an eCommerce system. The data for this system is fed from our core system, which is unfortunately a 16-bit multi-valued database system.
Transactions (records) arrive as delimited text files, one file per record type. Obviously the number of fields and so on differs between the various records (tables) moving between the two databases.
Originally I wrote a Windows service that read each file and imported the data using an ADO Command object and underlying stored procedures, but this obviously requires maintenance whenever a record layout changes.
I was thinking about switching to dynamic SQL instead (manipulating each data row and executing a dynamically built INSERT statement), but I'm a bit worried about the performance and overhead of running those statements, as the existing system handles thousands of transactions every hour.
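For what it's worth, the per-statement overhead of dynamic SQL can be kept low if the INSERT is built once per record type with parameter placeholders and then reused for every row, rather than concatenating values into a fresh string per row. Here is a minimal sketch of that pattern. It uses Python with an in-memory SQLite database purely for illustration; the table name, column names, and pipe delimiter are hypothetical stand-ins for whatever the real extracts contain, and in the actual system the same idea would apply through ADO against SQL Server.

```python
import sqlite3

# Stand-in for the target database; the real system would use ADO/SQL Server.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id TEXT, name TEXT, email TEXT)")

def dynamic_insert(conn, table, columns, rows):
    """Build one parameterized INSERT per record type and reuse it for
    every row, so the statement text is constructed only once."""
    placeholders = ", ".join("?" for _ in columns)
    sql = f"INSERT INTO {table} ({', '.join(columns)}) VALUES ({placeholders})"
    conn.executemany(sql, rows)
    conn.commit()

# Hypothetical pipe-delimited extract, two rows.
raw = "1|Alice|alice@example.com\n2|Bob|bob@example.com\n"
rows = [line.split("|") for line in raw.splitlines()]
dynamic_insert(conn, "customer", ["id", "name", "email"], rows)
print(conn.execute("SELECT COUNT(*) FROM customer").fetchone()[0])  # 2
```

Because the values travel as parameters rather than inline text, this also sidesteps quoting/injection problems with free-text fields.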
Anyone have any ideas?
June 13, 2007 at 9:53 am
I always use DTS to import data from text files. DTS can get very involved, but if all you're doing is importing all rows from a text file, the wizard will get you set up with just a few clicks.
Start by right-clicking your database in Enterprise Manager, then select All Tasks > Import Data. Going through the wizard and landing the data in a staging table you can manipulate afterwards is pretty straightforward. Then save your DTS package to SQL Server and schedule it (if you want)...
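The staging-table approach the wizard sets up has two distinct steps: bulk-load the file contents untouched, then transform into the real table with one set-based statement where all the cleanup lives. A rough sketch of that two-step flow, using Python and SQLite only to make it runnable here; the table names (`staging_orders`, `orders`) and the pipe-delimited layout are made up, and in practice DTS and T-SQL on SQL Server would do both steps:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Staging table holds each file line verbatim, one column, no typing yet.
conn.execute("CREATE TABLE staging_orders (raw_line TEXT)")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")

# Step 1: bulk-load the file untouched, one row per line (hypothetical data).
lines = ["1001|19.99", "1002|5.00"]
conn.executemany("INSERT INTO staging_orders VALUES (?)",
                 [(l,) for l in lines])

# Step 2: one set-based INSERT...SELECT splits and types the fields;
# any validation or cleanup would be added to this statement.
conn.execute("""
    INSERT INTO orders (order_id, amount)
    SELECT CAST(substr(raw_line, 1, instr(raw_line, '|') - 1) AS INTEGER),
           CAST(substr(raw_line, instr(raw_line, '|') + 1) AS REAL)
    FROM staging_orders
""")
conn.commit()
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())
```

Keeping the load and the transform separate means a malformed file fails in the staging step without touching the live tables.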
June 13, 2007 at 12:03 pm
You're saying the format of a given text file changes a lot? Do the text files have header information on the first line of the file?
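If the files do carry a header line, the importer can discover the column list per file instead of hard-coding a layout for each record type. A small sketch of that idea, with a hypothetical pipe-delimited sample (the real files' delimiter and field names may differ):

```python
import csv
import io

# Hypothetical extract: first line names the fields, remaining lines are data.
sample = "id|name|email\n1|Alice|alice@example.com\n"

reader = csv.reader(io.StringIO(sample), delimiter="|")
columns = next(reader)                      # header row drives the mapping
rows = [dict(zip(columns, r)) for r in reader]

print(columns)          # ['id', 'name', 'email']
print(rows[0]["name"])  # Alice
```

The `columns` list could then feed straight into a dynamically built, parameterized INSERT, so a new field in the feed needs no code change on the import side.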
--Jeff Moden
Change is inevitable... Change for the better is not.