Creating an Access table from SQL Query Analyzer

  • I have a question that may or may not be easy - I don't know, and I've been unsuccessful in trying to find an answer by searching.

    I have two SQL tables: one is the 'Master File' and the other is a table where users enter data as it comes in.  Periodically (anywhere from 3 to 5 times per week, depending on the amount of data we get in), I take what's been entered in the second table and update those entries to the Master File.  This is done using a query in SQL Query Analyzer.

    This is my question: once the information is updated to the Master, the record of who entered it and when is lost.  To keep that, I created a simple Access DB linked to the SQL table where data is entered, and before I update the records to the Master I create a simple table in which I back up the changed records with the username and date intact.  I just name the table with that day's date and the word Bkup.  I am wondering if it is possible to incorporate this step into the SQL Query Analyzer query, or if it would be easier to move these tables from Access to SQL to accomplish this.

    I'm getting ready to go on maternity leave, and I'm trying to make the process as simple as possible for the person who will be doing this in my absence, as she isn't an "IT" person and just knows the basics of opening SQL queries and running them.

    Thanks for your help.

    By the way, I'm using Access 2000 with MS SQL Server 2000, which resides on a Windows 2000 server.

  • In my opinion, it would be much tidier and easier to manage all of this if you keep all of the data in SQL Server rather than mixing your db platforms.

    1) Assuming that you can change the design of the master table, you could add new columns (username, date) and update these as part of the append query - then you don't lose the data.

    2) Alternatively, add some extra columns to the source data (along the lines of UniqueID - a primary key, if there isn't one already - and DateUploaded).  When you do the update, write the UniqueID to the master table along with the other fields; then you can link from master to source at any time in the future and find the info you need. You should also set up referential integrity between the two UniqueID fields. This assumes (and ensures) that you will never delete the source records.

    From an audit point of view, solution (2) is preferable, as you are never deleting any data, though it is perhaps slightly more complex to set up and maintain. Rough sketches of both options are below.
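    These are untested, and the table and column names are only examples - substitute your own:

    -- Option 1: add audit columns to the master table (example names only)
    ALTER TABLE dbo.MasterFile ADD EnteredBy varchar(50) NULL, EnteredDate datetime NULL
    GO

    -- then carry them across in the existing append query
    INSERT INTO dbo.MasterFile (SomeColumn, EnteredBy, EnteredDate)
    SELECT SomeColumn, EnteredBy, EnteredDate
    FROM dbo.DataEntry

    -- Option 2: keep the source rows and key the two tables together
    ALTER TABLE dbo.DataEntry ADD DateUploaded datetime NULL
    GO
    -- assuming DataEntry.UniqueID is already the primary key
    ALTER TABLE dbo.MasterFile ADD SourceUniqueID int NULL
    GO
    ALTER TABLE dbo.MasterFile
        ADD CONSTRAINT FK_MasterFile_DataEntry
        FOREIGN KEY (SourceUniqueID) REFERENCES dbo.DataEntry (UniqueID)
    GO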

    The absence of evidence is not evidence of absence
    - Martin Rees
    The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
    - Phil Parkin

  • Look up triggers in SQL BOL.  If you choose to add the audit columns, you can use triggers to populate the columns without needing to modify your code to accommodate those new columns.  Heck, you might even decide to make a new audit table and have the triggers update this new table instead!  Hm.  Maybe change the permissions on this new table so the users can't see or update it.  Food for thought? <g>
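    Something along these lines, for example (untested, and the table and column names are made up - adjust them to your own schema):

    -- example names only: a separate audit table that the data-entry users never touch directly
    CREATE TABLE dbo.DataEntryAudit (
        AuditID     int          IDENTITY(1, 1) NOT NULL PRIMARY KEY,
        EntryID     int          NOT NULL,
        ChangedBy   varchar(128) NOT NULL DEFAULT SUSER_SNAME(),
        ChangedDate datetime     NOT NULL DEFAULT GETDATE()
    )
    GO

    -- log a row every time the entry table is inserted into or updated
    CREATE TRIGGER trg_DataEntry_Audit
    ON dbo.DataEntry
    AFTER INSERT, UPDATE
    AS
        INSERT INTO dbo.DataEntryAudit (EntryID)
        SELECT EntryID
        FROM inserted
    GO

    Then DENY SELECT, INSERT, UPDATE and DELETE on the audit table to the data-entry users' role and they can't see or fiddle with it.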

    Cheers,

    Ken

  • Thanks for the replies. 

    Phil, I have thought of adding those fields to the Master file, but it is used for many other things, so that probably isn't a good idea.  Also, there is a possibility of a record getting updated more than once, and they don't want to lose history.  I suppose what I'm keeping is a rudimentary audit trail.

    Ken, I'll try looking up triggers.  I'm still fairly new to SQL so I don't know much about those yet, but I've been wanting an excuse to study them anyway.

    I'll see what I can figure out.  Thanks again for your help.

  • Perhaps I'm missing the point here -

    a. To date-stamp new records, I would create a datetime field and set its DEFAULT value to getdate().

    b. User-stamping records depends on the interface to some extent - possibly do the same as for the date, but using a SESSION_USER default.

    Here is one example from Q-Analyser help: 

    USE pubs
    GO
    CREATE TABLE deliveries2 (
        order_id        int      IDENTITY(5000, 1) NOT NULL,
        cust_id         int      NOT NULL,
        order_date      datetime NOT NULL DEFAULT GETDATE(),
        delivery_date   datetime NOT NULL DEFAULT DATEADD(dd, 10, GETDATE()),
        delivery_person char(30) NOT NULL DEFAULT SESSION_USER
    )
    GO
    Also - it is very easy to create, update and manipulate Access tables using DTS.
     

    Kindest Regards,

    Ian Smith

  • Why not keep copies of the transaction records in a permanent SQL table instead of destroying or discarding them once the Master table has been updated?  How did Access get into the picture?
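    For example (untested - the table and column names below are just placeholders for whatever your entry table actually holds):

    -- example names only: a permanent archive table shaped like the entry table
    CREATE TABLE dbo.DataEntryArchive (
        EntryID      int         NOT NULL,
        EnteredBy    varchar(50) NOT NULL,
        EnteredDate  datetime    NOT NULL,
        ArchivedDate datetime    NOT NULL DEFAULT GETDATE()
    )
    GO

    -- run in the same Query Analyzer script, just before the Master update
    INSERT INTO dbo.DataEntryArchive (EntryID, EnteredBy, EnteredDate)
    SELECT EntryID, EnteredBy, EnteredDate
    FROM dbo.DataEntry

    One permanent table with an ArchivedDate column does the same job as a new Bkup table every day, and it is one less manual step for whoever covers for you.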



    Dana
    Connecticut, USA

  • Basically, the entire database used to be in Access (I work for a very small company).  However, they were outgrowing the capacity of Access and needed to switch to SQL Server to store all of their data.  The front end that people use for entering data is still in Access, though.  I'm dealing with a system that someone else set up, and I'm just trying to figure out the best way to do things.

    I had considered putting them into a SQL table as well, but hadn't decided what to do yet.  I just started the Access backup as a way to keep the data until I came up with a permanent solution.

    Thanks for the replies.

  • Snap. I'm in a very similar situation.

    Until I (eventually) get all the Access tables and functionality into SQL Server, I have to run a DTS update each night and use SMTP email to automatically advise administrators and management of data anomalies found in Access.

    Hope the responses were helpful, reply not necessary.

     


    Kindest Regards,

    Ian Smith
