December 22, 2010 at 11:55 pm
Hi All,
I have a text file in which columns are identified by character positions and laid out vertically. The column names are mapped to those positions in a separate document. I need to load the data into SQL Server tables.
Example: we have 5 columns in a file.
The data I receive will be in the format below:
----+----10---+----20---+----30---+----40
US.........................MA........ Fernandez
CS .................................................
identification code
0---+----100--+----110--+----120--+----130--+----140
999999999 patrick.......................
000000000 ............................................................
Column names are mapped to character positions:
Positions 0 to 20 - Column name - Country
Positions 20 to 30 - Column name - State
Positions 30 to 40 - Column name - Last name
Positions 0 to 100 (second line) - Column name - Mobile
Positions 100 to 110 (second line) - Column name - First name
Hope this example helps.
Can I have some inputs, please? Your suggestions would be worth a ton.
Thank you so much in advance.
December 23, 2010 at 3:12 am
I understand that your file consists of lines of text with fixed length columns. For each person there are 2 lines of text.
Try to parse this file according to the specifications, in .NET for example. Isolate the attributes (country, state, last name, mobile, first name) and build an XML document which you'll pass as a parameter to the stored procedure doing the insert. You can do the same job with DataSets and table-valued parameters.
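A minimal sketch of that parse-and-build-XML step in Python (the column offsets are taken from the example layout above; the field names and the people/person element names are my own assumptions, not anything fixed by the thread):

```python
import xml.etree.ElementTree as ET

# Hypothetical offsets, read off the example layout in the question:
# line 1 of each pair: country 0-20, state 20-30, last name 30-40
# line 2 of each pair: mobile 0-100, first name 100-110
LINE1_FIELDS = [("country", 0, 20), ("state", 20, 30), ("lastname", 30, 40)]
LINE2_FIELDS = [("mobile", 0, 100), ("firstname", 100, 110)]

def parse_records(lines):
    """Pair up lines (two per person) and slice out the fixed-width fields."""
    records = []
    for i in range(0, len(lines) - 1, 2):
        rec = {}
        for name, start, end in LINE1_FIELDS:
            rec[name] = lines[i][start:end].strip(". ")
        for name, start, end in LINE2_FIELDS:
            rec[name] = lines[i + 1][start:end].strip(". ")
        records.append(rec)
    return records

def to_xml(records):
    """Build the XML string that would be passed to the stored procedure."""
    root = ET.Element("people")
    for rec in records:
        person = ET.SubElement(root, "person")
        for name, value in rec.items():
            ET.SubElement(person, name).text = value
    return ET.tostring(root, encoding="unicode")
```

On the SQL Server side, such an XML parameter could then be shredded with the `nodes()`/`value()` methods inside the insert procedure.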
December 23, 2010 at 4:18 am
If I understand correctly, you have two questions:
a) how to load the file into SQL Server and
b) how to shred the data into tables
a) load the (unshredded) file (using a staging table in some scenarios)
There are numerous options: SSIS tasks, the bcp utility, BULK INSERT, or an external app (e.g. in .NET) that loads the file from the file system and passes it to a stored procedure as a varchar(max) parameter, just to name a few.
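For the varchar(max) route, a minimal external-app sketch in Python (the connection details and the procedure name dbo.LoadRawFile are my own assumptions, so the database call itself is left as a comment):

```python
def read_raw_file(path):
    """Read the whole fixed-width file into one string, ready to be
    passed to a stored procedure as a varchar(max) parameter."""
    with open(path, encoding="ascii") as f:
        return f.read()

# The database call would look roughly like the following
# (pyodbc and dbo.LoadRawFile are assumptions, not from this thread):
#
#   import pyodbc
#   conn = pyodbc.connect(connection_string)
#   conn.execute("EXEC dbo.LoadRawFile @raw = ?", read_raw_file("people.txt"))
```

The procedure would typically dump the string into a staging table, where the shredding described in b) takes over.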
b) shred the file:
There are numerous options here, too: SSIS, shredding the data with T-SQL against the staging table, or shredding the data in an external app and inserting directly into the target tables. (I'd vote against the last one, though, since I prefer to do such data manipulation on the SQL Server side rather than in the app. That way I usually only have to apply changes on one layer.)
Which way to go depends on several factors: file size, frequency, concepts currently used for similar tasks, the level of experience available, and so on.
Personally, I would probably use SSIS for both parts: load the file, separate the rows using a conditional split, extract the columns, and insert the data into the target table. Others may prefer bulk loading into a staging table...
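Outside SSIS, the conditional-split step above can be sketched like this in Python (assuming exactly two lines per person, so the split condition is simply the row number's parity — the function names are mine):

```python
def conditional_split(lines):
    """Route each row to one of two outputs by its position in the file,
    mimicking an SSIS conditional split on (row_number % 2)."""
    first_lines, second_lines = [], []
    for i, line in enumerate(lines):
        (first_lines if i % 2 == 0 else second_lines).append(line)
    return first_lines, second_lines

def pair_for_insert(first_lines, second_lines):
    """Re-join the two streams so each person's attributes end up
    in a single row for the target table."""
    return list(zip(first_lines, second_lines))
```

In SSIS itself, the row number for the split condition would come from something like a script component that numbers the incoming rows.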
December 23, 2010 at 6:40 am
😀
December 23, 2010 at 4:43 pm
Thank you so much for the options. Can I get some more insight into how this can be done using SSIS? I tried and wasn't successful.
Appreciate your response.
Thanks,
Swetha K
December 23, 2010 at 4:48 pm
I would need a sample file and the target table structure as DDL scripts so I have something to test my solution against.