Directory of .txt files as Linked Server-confused by column headings.

  • FALSE ALARM, NEVERMIND!

    The following was posted before I realized that I had not added the new file names to the .ini file in the directory holding the input .txt files. It is particularly embarrassing, since the whole activity involved the creation of the .ini file for the download data directory! I'll leave my tale of woe just in case others stumble over the same thing.

    Sorry...

    I have defined a directory of .txt files to be a linked server.

    (Provider: Microsoft Jet 4.0 OLE DB Provider

    Product name: Jet 4.0

    Data source: [path to source directory, on SQL Server]

    Provider string: Text)
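For anyone wanting to reproduce the setup, a linked server like the one described above can be created in T-SQL roughly as follows. The server name and directory path are placeholders, not the actual values from my environment:

```sql
-- Sketch of the text-directory linked server described above.
-- [MasterFiles] and the path are placeholders; substitute your own.
EXEC sp_addlinkedserver
    @server     = N'MasterFiles',
    @srvproduct = N'Jet 4.0',
    @provider   = N'Microsoft.Jet.OLEDB.4.0',
    @datasrc    = N'C:\Data\FocusMasters',  -- directory holding the .txt files
    @provstr    = N'Text';
```

With that in place, each .txt file in the directory shows up as a table named like [FILENAME#txt].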

    The files all have 5 comma-delimited fields, prefixed with some junk/decorative lines (which in the past have caused no problem, since I simply filter with a WHERE clause of [F1] LIKE 'FIELDNAME =%', which identifies the rows I am interested in). Here is a sample:

    $==================================================================$

    $ HOLD file created on 2009/07/27 at 04.57.50 by FOCUS 7.1.1 $

    $ Database records retrieved= 3012 $

    $ Records in the HOLD file = 3012 $

    $==================================================================$

    FILE=DOWNLOAD ,SUFFIX=FIX

    SEGNAME=DOWNLOAD,SEGTYPE=S0

    FIELDNAME =NFCORGCD ,E01 ,A16 ,A16 ,$

    FIELDNAME =ORG_STR_SH_N ,E02 ,A30 ,A30 ,$

    FIELDNAME =ORG_STR_NA_1 ,E03 ,A75 ,A75 ,$

    FIELDNAME =ORG_STR_NA_2 ,E04 ,A75 ,A75 ,$

    FIELDNAME =EMP_EXP_IND ,E05 ,A01 ,A01 ,$

    ...
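Given that layout, the rows of interest come back with the WHERE filter mentioned above; a minimal query sketch (the four-part name and file name are illustrative):

```sql
-- Pull only the FIELDNAME rows from one definition file, skipping
-- the decorative $=== banner lines. Names here are illustrative.
SELECT [F1], [F2], [F3], [F4], [F5]
FROM   [MasterFiles]...[TAB05MASTER#txt]
WHERE  [F1] LIKE 'FIELDNAME =%';
```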

    For my existing files, right-clicking the "#txt" file name and selecting Script Table as > SELECT To > New Query Editor Window typically yields something like:

    SELECT [F1]

    ,[F2]

    ,[F3]

    ,[F4]

    ,[F5]

    FROM [MasterFiles]...[TAB05MASTER#txt]

    However, two files which I recently added, instead yield:

    SELECT [$===============================================================]

    ,[F2]

    ,[F3]

    ,[F4]

    ,[F5]

    FROM [MasterFiles]...[BEARMASTER#txt]

    To try to avoid the bogus column heading, I tried appending a blank line (CR) to the file, and that left me with:

    SELECT [NoName]

    ,[F2]

    ,[F3]

    ,[F4]

    ,[F5]

    FROM [MasterFiles]...[BEARMASTER#txt]

    I can see nothing different in these problem files. I received them as attachments to a mail message, but I am not aware of any other difference. All files (old and new) appear to be ANSI encoding (in Notepad), so I am at a loss.

    Anyone know what might be going on??
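    (For the record, per the note at the top: the fix was adding a schema.ini entry for each new file in the data-source directory. A minimal entry for the Jet text driver looks roughly like this; the file name and generic column names are assumptions for illustration, and ColNameHeader=False is what stops Jet from treating the first junk line as column headings:)

```ini
[BEARMASTER.txt]
ColNameHeader=False
Format=CSVDelimited
Col1=F1 Text
Col2=F2 Text
Col3=F3 Text
Col4=F4 Text
Col5=F5 Text
```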

  • Hi Jim,

    I know you've already solved your problem here, but I'm curious. How big are these files? I know you said they've all got 5 columns, but how many rows do they typically have? Also, how long does that many rows of data take to import using this method?

    I'm asking because I may have a quicker method but I need to do some tests first. I could probably answer the questions I just asked but that's like the mouse guarding the cheese. 😛 Besides, you've already been through it and know the answers off the top of your head.

    Thanks and good "seeing" you again.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Jeff Moden (7/31/2009)


    Hi Jim,

    … How big are these files? I know you said they've all got 5 columns, but how many rows do they typically have? Also, how long does that many rows of data take to import using this method?

    These are column-definition files from the source system (FOCUS); the largest are between 200 and 300 rows. These definition files describe the column layout of a bunch of flat ASCII files that I import biweekly. Reading them is step 1 of a three-step process in which I dynamically create a big schema.ini file in the directory where the actual data flat files are first received via FTP.

    Step 2 is loading the input files into intermediate tables using BULK INSERT (all char columns).

    Step 3 then loads and converts them (e.g., converting sign overpunches) into final tables. The data tables are much larger, with up to 2+ million rows.
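As a sketch, step 2 amounts to something like the statement below. All names are hypothetical, and since the extracts are fixed-format (SUFFIX=FIX), the column layout would come from a format file rather than from field terminators:

```sql
-- Step 2 sketch: bulk-load one raw FOCUS extract into an all-char
-- staging table. Table, path, and format-file names are illustrative.
BULK INSERT dbo.Stage_Download
FROM '\\ftpdrop\focus\DOWNLOAD.txt'
WITH (
    FORMATFILE = '\\ftpdrop\focus\DOWNLOAD.fmt',  -- describes the fixed-width layout
    TABLOCK
);
```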

    (I have to migrate the processing of a bunch of new tables, currently imported in MS Access, to SQL Server, and need to review my old work to remember just what I did for the one table I currently import directly using SQL Server; my memory is dim, so I hope that is the sequence. My dim memory caused me to forget that I needed to add new schema.ini entries for the new definition files. Because I use these files to generate a schema.ini for the actual data files, I just figured the schema.ini file left in the definition-file directory was leftover output from testing. You may remember back when I had to get help from you guys to figure out why my .ini files didn't work: they had started out as Unicode rather than ASCII.)

    The whole process is something of a kludge, made necessary, I claim, because:

    1. SSIS (2005) cannot FTP data from an IBM mainframe (the FTP task foolishly insists on a "/" as the first character of a source path.)

    2. SSIS has no clean way to programmatically define source files with several hundred columns. (Perhaps I could edit the XML source definitions, but I was reluctant to open that can of worms.)

    3. The "new" version of Office (2007?), which I am being threatened with by the IT staff, has broken Access' ability to correctly import ASCII files with date columns in existing import definitions.

    Thanks, Jeff, any insights or suggestions you might have will be appreciated.

    ...And I forgot to mention time. So far, everything I do just in SQL Server has been within my time tolerance: if a process runs within 2 minutes wall time, I don't worry about speedups or indexes or looking at the query plan or optimizations or anything. (Q: When is the best time to optimize? A: Not yet!)

    But the part that has been taking much longer is the SSIS transfer of the tables I import in Access to SQL Server. A 2+ million row table has been getting progressively slower; it now takes more than an hour. I expect (hope) that time to be significantly reduced when I get everything migrated to SQL Server.
