Reading txt file then creating new table

  • I am trying to create a job that will run once a month.  It should go to a certain directory, read the txt file there, and create and populate a new table in the database.  Then it should go back to the directory and delete the file.  I don't think I can use DTS because I don't think it can read a txt file, and I would have to manually tell it which file to read and what to name the new table.  If anybody has any ideas, please pass them on!  I want to make it as automated as possible.  Thanks.

  • DTS does have the ability to read in a text file.  If the file name is dynamic, you'll just have to get creative in designing your DTS tasks.  You could either read in the file using an ActiveX task, or create a Data Pump for a fixed named text file and use ActiveX (or another way) to rename your file to match what the job is expecting. You could come up with a way to dynamically create your tables also.  Do you have a set naming convention in mind for your tables?

    Using DTS to do these types of tasks is common.  In fact, if you want to automate the import, it's your best bet IMHO.  If you can come up with the logical steps that you would take as a human to import the files, you can code all of the decision making into a DTS package.

    John Rowan

    ======================================================
    ======================================================
    Forum Etiquette: How to post data/code on a forum to get the best help - by Jeff Moden

  • Can you use SSIS? If yes, you can create a parameter to pass the file name, create the table from the flat file, do whatever processing you need, and even create a log file.

  • You can use BCP (for both SQL 2000 and SQL 2005).  Check Books Online under "bcp Utility".

    You can create a job to automate this process

    job step1: T-SQL: create table mytable (if needed)

    job step2: CmdExec: d:\loadfile.bat

    job step3: T-SQL: xp_cmdshell 'del d:\myfile.txt'

    ***in loadfile.bat file, you can have something like this

    bcp mytable in d:\myfile.txt ....

     

  • Our table names should be identical to the files, such as 2007Dec08Sprint, with the month and year changing as needed.

    I created an ActiveX script, which I have pasted below.  ActiveX is new to me, so I'm not sure what to do beyond getting it to open the file.  How do I actually create the table and import the data?  Thanks!

    Function Main()

    '   Open the source file with the FileSystemObject
        Dim oFSO
        Dim x

    '   Instantiate the scripting object
        Set oFSO = CreateObject("Scripting.FileSystemObject")

    '   Open the file (nothing is read yet -- it is only opened and closed)
        Set x = oFSO.OpenTextFile("E:\SprintBills\2006Dec08.txt")
        x.Close

        Main = DTSTaskExecResult_Success

    End Function
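
    For the table-creation side, one option is dynamic SQL keyed off the current date.  This is only a sketch, assuming the YYYYMmmDDSprint naming convention mentioned above; the column names and types are placeholders, not the real file layout:

    ```sql
    -- Build the table name from today's date, e.g. 2007Dec08Sprint
    DECLARE @table sysname, @sql nvarchar(4000)
    SET @table = CAST(YEAR(GETDATE()) AS char(4))
               + LEFT(DATENAME(month, GETDATE()), 3)
               + RIGHT('0' + CAST(DAY(GETDATE()) AS varchar(2)), 2)
               + 'Sprint'

    -- Placeholder columns -- substitute the real layout of the txt file
    SET @sql = N'CREATE TABLE dbo.' + QUOTENAME(@table)
             + N' (Col1 varchar(100) NULL, Col2 varchar(100) NULL)'
    EXEC (@sql)
    ```

    The same @table variable can then be reused to build a dynamic BULK INSERT or bcp command, so the job never needs a hard-coded table name.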

  • It's hard to give you more guidance on this without knowing more about the process.  Can you walk us through the logic of how you get a file, how you would look at a directory and figure out which file to use, and what the file format is (fixed, delimited, etc.)?

    John Rowan


  • Vivien, I have also been working on implementing your solution.  I have gotten the T-SQL step 1 working, but the job hangs when it reaches step 2.  Here is my command:

    C:\Documents and Settings\datamining\My Documents\loadfile.bat

    Here is what is inside loadfile.bat:

    bcp datamining.dba.telecom IN E:\SprintBills\Copyof2006Dec08.txt -c

    I do not have much data in the file currently (about 10 rows worth), so I don't know why it would be hung up.  The job says it is Executing.  I have stopped it from running after about 20 minutes.  Do you know why it would be taking so long? 

  • At the DOS command line, try to make the bcp command work first, then copy that command line into the bat file.

    cmd> bcp mytable in E:\SprintBills\Copyof2006Dec08.txt ......

    You may need more options for the bcp, like -S for server name, -f for format file.  Check books online, there are examples to follow. 

    Start from a simple file first, then see how it goes.
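
    As an illustration of those switches, a fuller command line might look like the following.  MYSERVER is a placeholder, not the poster's actual server; -T assumes Windows authentication (use -U/-P for a SQL login instead):

    ```
    bcp datamining.dba.telecom in E:\SprintBills\Copyof2006Dec08.txt -c -S MYSERVER -T
    ```

    Here -c loads the file as character data, -S names the server, and -T uses a trusted connection.  If the columns don't line up, generate a format file (bcp ... format -f myformat.fmt) and pass it with -f.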

  • Okay, I tried to run it from the cmd line, and I got the error "Unknown command 'and' on command line".  The path to my txt file is:

    C:\Documents and Settings\datamining\My Documents

    Can you not have those spaces?

  • You need to enclose the path in quotes:

    "C:\Documents and Settings\datamining\My Documents\loadfile.bat"

  • I am attempting to use Bulk Insert (statement below).  When I run it in Query Analyzer, I'm getting an error: Invalid number of columns in format file 'C:\documents and settings\datamining\my documents\format.fmt'.  I believe this is because some of my columns have null values.  I know I need to specify -k; where in the SQL statement do I add this? 

    BULK INSERT telecom..Jan082007Sprint

    FROM 'C:\documents and settings\datamining\my documents\CopyOf2006Dec08.txt' WITH (FORMATFILE='C:\documents and settings\datamining\my documents\format.fmt');

    GO
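
    For reference, BULK INSERT's equivalent of bcp's -k switch is the KEEPNULLS option, which goes in the WITH clause rather than on a command line.  A sketch using the same paths as above (note that the "Invalid number of columns in format file" error usually means the format file's column count doesn't match the file or table, rather than a NULL problem):

    ```sql
    BULK INSERT telecom..Jan082007Sprint
    FROM 'C:\documents and settings\datamining\my documents\CopyOf2006Dec08.txt'
    WITH (
        FORMATFILE = 'C:\documents and settings\datamining\my documents\format.fmt',
        KEEPNULLS  -- bcp -k equivalent: empty fields load as NULL, not column defaults
    );
    ```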

  • I've finally gotten a job that works!  I have three T-SQL steps:

    1. Create the table

    2. Load the table

    3. Delete the file out of the directory

    A few questions that could shorten the process even further - my txt file I'm importing has column names in the first line, and the data also starts on the first line.  Is there a way to ignore this?  I can't ignore the whole first line obviously. And is there a way I can copy a txt file from a mapped drive to a local drive on the server? 
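
    On those two questions: if the header row ever lands on its own line (as with this month's spreadsheet export), BULK INSERT can skip it with FIRSTROW, and a file can be copied to a local drive with xp_cmdshell.  A sketch with hypothetical paths -- note that mapped drive letters usually aren't visible to the SQL Server service account, so a UNC path is safer:

    ```sql
    -- Skip a header-only first line (only works if the header is on its own row)
    BULK INSERT telecom..Jan082007Sprint
    FROM 'E:\SprintBills\CopyOf2006Dec08.txt'
    WITH (FIRSTROW = 2, FORMATFILE = 'E:\SprintBills\format.fmt');

    -- Copy the file from a network share to a local drive first
    EXEC master..xp_cmdshell 'copy \\fileserver\share\2006Dec08.txt E:\SprintBills\';
    ```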

     

  • Your source file has column names AND valid data in the first line?  How does that work?

    Tim Mitchell, Microsoft Data Platform MVP
    Data Warehouse and ETL Consultant
    TimMitchell.net | @Tim_Mitchell | Tyleris.com
    ETL Best Practices

  • Right now, I manually open the file and wipe out the column names.  I'm not sure if they will always be saving the data this way; this month it was in an Excel spreadsheet, and the first row contained column names only - no data.

  • For you to be able to automate the import process, the source files need to be in a consistent format.  I would recommend talking to whoever generates that source file and get a format agreement in place.

    John Rowan

