How to load data into SQL Server when columns are formatted vertically

  • Hi All,

    I have a text file in which the columns are identified by character positions on each line and a single record is spread over several lines. The column names are mapped to those positions in a separate document. I need to load the data into SQL Server tables.

    Example: we have 5 columns in the file.

    The data I receive will be in the format below:

    ----+----10---+----20---+----30---+----40

    US.........................MA........ Fernandez

    CS .................................................

    identification code

    0---+----100--+----110--+----120--+----130--+----140

    999999999 patrick.......................

    000000000 ............................................................

    The column names are mapped to the line positions as follows:

    Line spacing 0 - 20 - Column name - Country

    Line spacing 20 - 30 - Column name - State

    Line spacing 30 - 40 - Column name - Last name

    Line spacing 0 - 100 - Column name - Mobile

    Line spacing 100 - 110 - Column name - First name

    Hope this example helps.

    Can I have some inputs, please? Your suggestions would be worth a ton.

    Thank you so much in advance.

  • I understand that your file consists of lines of text with fixed-length columns, and that there are two lines of text for each person.

    Try to parse this file according to the specification in .NET, for example. Isolate the attributes (country, state, last name, mobile, first name) and build an XML document which you pass as a parameter to the stored procedure doing the insert. You can do the same job with DataSets and table-valued parameters.
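
    As a rough sketch of the SQL Server side only: assuming a target table dbo.Person and an XML shape like <persons><person country="US" state="MA" ... /></persons> (both the table and the XML layout are just illustrations, not something from your post), the insert procedure could look like this:

    CREATE PROCEDURE dbo.Person_InsertFromXml
        @Data xml
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Shred the XML built by the .NET parser and insert one row per person.
        INSERT INTO dbo.Person (Country, State, LastName, Mobile, FirstName)
        SELECT
            p.value('@country',   'varchar(20)'),
            p.value('@state',     'varchar(10)'),
            p.value('@lastname',  'varchar(50)'),
            p.value('@mobile',    'varchar(100)'),
            p.value('@firstname', 'varchar(50)')
        FROM @Data.nodes('/persons/person') AS x(p);
    END;

    With a table-valued parameter you would replace the xml parameter by a user-defined table type and the .nodes() call by a plain SELECT from the parameter; the insert itself stays the same.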

  • If I understand correctly, you have two questions:

    a) how to load the file into SQL Server and

    b) how to shred the data into tables

    a) Load the (unshredded) file (using a staging table in some scenarios):

    There are numerous options: SSIS tasks, the bcp utility, BULK INSERT, or an external app (e.g. .NET) that reads the file from the file system and passes it to a stored procedure as a varchar(max) parameter, just to name a few.
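
    For illustration only, the bulk insert route could look roughly like this (the staging table, the view and the file path are purely made up, so adjust them to your environment):

    -- Staging table that keeps one raw file line per row.
    CREATE TABLE dbo.StagingRaw
    (
        LineNo  int IDENTITY(1, 1) PRIMARY KEY,  -- used later to pair the two lines per person
        RawLine varchar(8000) NULL
    );
    GO

    -- BULK INSERT maps file fields to table columns by position, so load through a view
    -- that exposes only the text column and let the identity number the lines.
    CREATE VIEW dbo.StagingRawLoad AS
        SELECT RawLine FROM dbo.StagingRaw;
    GO

    BULK INSERT dbo.StagingRawLoad
    FROM 'C:\import\persons.txt'  -- hypothetical path and file name
    WITH (
        ROWTERMINATOR = '\n',     -- covers the usual CRLF line ends; use '0x0a' for bare LF files
        CODEPAGE = 'RAW'
    );  -- the whole line lands in RawLine as long as the file contains no tabs (the default field terminator)

    One caveat: the identity values following the file order is itself an assumption; if that ever matters, add a real line number to the file upstream or load the file with SSIS instead.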

    b) Shred the file:

    There are numerous options here, too: SSIS, shredding the data with T-SQL against the staging table, or shredding the data in an external app and inserting directly into the target tables (I'd vote against the last one, though, since I prefer to do such data manipulation on the SQL Server side rather than in the app; that way I usually only have to apply changes in one layer).
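
    Just to illustrate the T-SQL variant against a staging table like the one above (the target table dbo.Person and the SUBSTRING positions are only assumptions based on the layout described in the first post and would have to be adjusted to the real mapping document):

    ;WITH numbered AS
    (
        SELECT
            RawLine,
            rn = ROW_NUMBER() OVER (ORDER BY LineNo)  -- renumber, so identity gaps can't break the pairing
        FROM dbo.StagingRaw
    )
    INSERT INTO dbo.Person (Country, State, LastName, Mobile, FirstName)
    SELECT
        -- odd rows = first line of a person, even rows = second line
        Country   = MAX(CASE WHEN rn % 2 = 1 THEN LTRIM(RTRIM(SUBSTRING(RawLine,   1,  20))) END),
        State     = MAX(CASE WHEN rn % 2 = 1 THEN LTRIM(RTRIM(SUBSTRING(RawLine,  21,  10))) END),
        LastName  = MAX(CASE WHEN rn % 2 = 1 THEN LTRIM(RTRIM(SUBSTRING(RawLine,  31,  10))) END),
        Mobile    = MAX(CASE WHEN rn % 2 = 0 THEN LTRIM(RTRIM(SUBSTRING(RawLine,   1, 100))) END),
        FirstName = MAX(CASE WHEN rn % 2 = 0 THEN LTRIM(RTRIM(SUBSTRING(RawLine, 101,  10))) END)
    FROM numbered
    GROUP BY (rn - 1) / 2;  -- two consecutive file lines form one person

    The MAX(CASE ...) construct simply folds each pair of lines into one row; with a wider file you would typically generate the SUBSTRING list from the mapping document instead of typing it by hand.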

    Which way to go depends on several issues: file size, frequency, concepts currently used for similar tasks, level of experience available and so on.

    Personally, I would probably use SSIS for both parts: load the file, separate the rows using a conditional split, extract the columns, and insert the data into the target table. Others may prefer bulk loading into a staging table...



    Lutz
    A pessimist is an optimist with experience.


  • 😀

    Thank you so much for the options. Can I get some more insight into how this can be done using SSIS?

    I tried but wasn't successful.

    Appreciate your response.

    Thanks,

    Swetha K

  • I would need a sample file and the target table structure as DDL scripts so I have something to test my solution against.



    Lutz
    A pessimist is an optimist with experience.

