Append query - best architecture

  • I have an Access application that needs to periodically extract data from a SQL Server 2000 db for processing locally (basically an Append query).

    Only records that have been created since the extract was last run need to be appended (I can use an ascending Seqno identity PK field to find these records quickly).

    There is likely to be a lot of data involved, so I want to optimise the amount of server/client network traffic by getting the query architecture right.

    So – the options I can think of:

    1. Write an Append query that uses the native Access table and a linked SQL Server table (right outer join, select nulls in native Access table, append them).

    2. Write an Append query that uses a Pass-Through query to get the SQL Server data and then checks the Seqnos against the native Access table, selecting only mismatches (very similar to (1) except for the use of the pass-through).

    3. Write some fancy VBA that finds the latest Seqno in the Access db and then submits a dynamic pass-through query and appends its results to the Access table (sounds good, but I haven't done this before).

    Number 3 sounds best to me.  Can anyone think of any better – ideally simpler – ways that I might have missed?

    Many thanks.

    The absence of evidence is not evidence of absence
    - Martin Rees
    The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
    - Phil Parkin

  • Your best bet, if I'm correct in assuming that all of the records are inserted in ascending ID order, would be to write a UDF in SQL Server that takes the last highest record ID as an input parameter and returns a table. You could then use a pass-through query to call this UDF and it would return your records in table format.
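    A minimal sketch of that UDF as an inline table-valued function (supported in SQL Server 2000). Table and column names here (dbo.Orders, Seqno, etc.) are assumptions - substitute your own:

    ```sql
    -- Hypothetical source table dbo.Orders with identity PK Seqno.
    CREATE FUNCTION dbo.fn_NewRecords (@LastSeqno int)
    RETURNS TABLE
    AS
    RETURN
    (
        SELECT Seqno, CustomerID, OrderDate   -- replace with your real column list
        FROM dbo.Orders
        WHERE Seqno > @LastSeqno
    )
    ```

    The pass-through query would then be something like SELECT * FROM dbo.fn_NewRecords(12345), with the literal value filled in each run.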

    An easier method, from the Access side, would be to create a trigger in SQL Server that writes the data to a second table whenever a record is inserted. Then, when you want to update the Access data, you just insert all of the records from that second table and delete them from it after they've been appended to the Access table. Both steps would be very quick and easy from the Access side (you'd have a regular query of SELECT * FROM SecondTable and a pass-through query of DELETE FROM SecondTable).

    The only caveat is to be sure you allow for the possibility of multiple rows per insert, because SQL Server fires the trigger only once for an INSERT statement that inserts multiple rows.
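    A sketch of such a trigger, assuming a staging table dbo.Orders_Staging that mirrors the source table (all names here are hypothetical). Selecting from the inserted pseudo-table keeps the trigger set-based, which handles the multi-row caveat above:

    ```sql
    CREATE TRIGGER trg_Orders_Insert ON dbo.Orders
    AFTER INSERT
    AS
    BEGIN
        -- inserted holds every row added by the triggering statement,
        -- so one multi-row INSERT still stages all of its rows.
        INSERT INTO dbo.Orders_Staging (Seqno, CustomerID, OrderDate)
        SELECT Seqno, CustomerID, OrderDate
        FROM inserted
    END
    ```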

  • Nice thinking and thanks for the response - you've approached it from an angle I hadn't considered. 

    But I had to get on with it and have already written a wedge of VBA that rewrites the WHERE clause of the pass-through query, based on the data in the Access table, and then executes it. It works fine and was interesting to write because I hadn't done it before.
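    For anyone trying the same thing, the approach can be sketched roughly like this in DAO. The query, table and field names (qrySqlNewRecs, tblLocal, dbo.Orders) are placeholders, not the actual ones used:

    ```vba
    ' Sketch only - assumes a saved pass-through query qrySqlNewRecs
    ' and a native Access table tblLocal with a Seqno field.
    Sub AppendNewRecords()
        Dim db As DAO.Database
        Dim lastSeqno As Long
        Set db = CurrentDb

        ' Highest Seqno already held locally (0 if the table is empty).
        lastSeqno = Nz(DMax("Seqno", "tblLocal"), 0)

        ' Pass-through queries cannot take parameters directly, so the
        ' SQL string is rewritten with the Seqno as a literal each run.
        db.QueryDefs("qrySqlNewRecs").SQL = _
            "SELECT * FROM dbo.Orders WHERE Seqno > " & lastSeqno

        ' Append the server-side results to the local table.
        db.Execute "INSERT INTO tblLocal SELECT * FROM qrySqlNewRecs", _
            dbFailOnError
    End Sub
    ```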

    Thanks again

    Phil


  • You didn't indicate HOW MUCH data you're going to transfer each time, but I have several setups using MS Access as a front end for some of the departments where I work. I created VIEWs (almost the same as a stored procedure without WHERE or ORDER BY clauses), which Access "sees" as another table. Attach the view in Access, write an import query to import your data, and you can pass parameters if necessary. As long as the keys are the same for the SQL table and the Access table, Access will NOT import the DUPES.

    I then wrote a macro in Access to TURN OFF warning messages, run the query, then turn warning messages BACK ON (IMPORTANT, so users get the normal warnings when adding or deleting, etc.). VIEWs are SO MUCH FASTER in Access it's not even funny - I guess it has to do with the execution plan already being in place.

    One of my updates appends records, then UPDATES ALL ADDRESSES to current (the first time we updated addresses we had over 800 out of 1100+ records change), then another query runs to update names (somebody got married and changed their name). The three queries took 7 to 8 minutes when I used real tables from SQL, and dropped to less than a minute using views.

    Hope this helps.
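    For reference, the server side of this approach is just a view over the source table (names here are made up for illustration), which Access can then link and query like a table:

    ```sql
    -- Hypothetical view; link it in Access and run an append query
    -- against it - matching PKs mean duplicates are rejected.
    CREATE VIEW dbo.vw_OrdersForAccess
    AS
    SELECT Seqno, CustomerID, OrderDate
    FROM dbo.Orders
    ```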


    Butch
