October 8, 2002 at 9:33 pm
My client and I are trying to sort out the cause of, and resolution for, a recurring problem. They regularly need to upload new medical information from their back-office systems to the webserver that our production system runs on. While the upload is underway, some users of the website experience timeout errors. The uploads don't take long -- 30-60 seconds -- but they can cause 1 to 10 timeout errors across the user population.
The timeouts are all associated with the database being updated, and specifically with the table being uploaded to. Our automatic error reporting system tells us what SQL the user's page was executing when each timeout occurred, and the queries are mainly SELECTs, sometimes INSERTs, sometimes UPDATEs.
The upload process is like so... The client prepares the new data in an Access 2002 database. They have a VPN thru the firewalls to the webserver, and set up an ODBC DSN to the webserver database, using the IP address. (They may actually do it by mapping a drive, and setting up the DSN thru that; I can check.) In Access, they link to the target table on the webserver; so Access serves as their GUI into the MSSQL 2000 table they are updating. They then perform an INSERT from their Access table to the linked table in MSSQL.
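For clarity, the append they run in Access is effectively something like this (the table and column names here are made up for illustration; the real ones differ):

-- dbo_MedicalInfo is the linked SQL Server table as it appears in Access;
-- tblNewRecords is the local Access table holding the new data.
-- (All names are placeholders.)
INSERT INTO dbo_MedicalInfo ( PatientID, VisitDate, Notes )
SELECT PatientID, VisitDate, Notes
FROM tblNewRecords;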
Is there something inherently daffy about this technique that we should change? Or is this likely a matter of indexes on the target table? Would it be helpful to instead load a temp table on the webserver, and then do the final INSERT from that table to the target (to reduce the extra delay caused by the networking)?
Thanks in advance,
- Tom
October 9, 2002 at 4:12 am
The most likely cause is locks issued by Access on the table; when the inserts occur, the index changes further delay the release of those locks. Yes, your temp table process would be a far better bet. You could also consider creating a form that connects via ADO and submits the data one row at a time, either through a stored procedure or with a direct INSERT. Done that way, Access does not issue the locks: with a linked table, each update has to be sent back to Access for viewing, whereas with a code-based send there is no need for those data trips.

Also, for the fastest insert time, do not place an index on the temp table; let the insert occur, move the rows to the main table, then truncate the temp table. The code to move from temp to main can be triggered by an Access form connected using ADO.
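Roughly, the staging-table approach on the SQL Server side would look like this -- table and column names are only placeholders, adjust to your schema:

-- Staging table: deliberately no indexes, so the upload from Access/ADO is as cheap as possible.
CREATE TABLE dbo.MedicalInfo_Staging (
    PatientID int NOT NULL,
    VisitDate datetime NOT NULL,
    Notes varchar(500) NULL
)
GO

-- Step 1: the client appends the new rows into dbo.MedicalInfo_Staging.

-- Step 2: move the rows into the live table in one short server-side transaction.
BEGIN TRAN
    INSERT INTO dbo.MedicalInfo (PatientID, VisitDate, Notes)
    SELECT PatientID, VisitDate, Notes
    FROM dbo.MedicalInfo_Staging
COMMIT TRAN

-- Step 3: clear the staging table for the next upload.
TRUNCATE TABLE dbo.MedicalInfo_Staging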
"Don't roll your eyes at me. I will tape them in place." (Teacher on Boston Public)
October 9, 2002 at 4:30 am
Agree. Either one record at a time or in chunks (you need to test to see what chunk size works; append 500 or so at a time).
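If you go the staging-table route, the server-side move can itself be done in chunks. Purely a sketch -- same placeholder names as above, and it assumes PatientID plus VisitDate uniquely identifies a row:

-- Move rows from staging to the main table roughly 500 at a time,
-- so no single statement holds locks on the main table for long.
SET ROWCOUNT 500

WHILE 1 = 1
BEGIN
    INSERT INTO dbo.MedicalInfo (PatientID, VisitDate, Notes)
    SELECT s.PatientID, s.VisitDate, s.Notes
    FROM dbo.MedicalInfo_Staging s
    WHERE NOT EXISTS (SELECT 1
                      FROM dbo.MedicalInfo m
                      WHERE m.PatientID = s.PatientID
                        AND m.VisitDate = s.VisitDate)

    -- Stop once there is nothing left to copy.
    IF @@ROWCOUNT = 0 BREAK
END

SET ROWCOUNT 0
TRUNCATE TABLE dbo.MedicalInfo_Staging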
Andy
October 9, 2002 at 8:20 am
Andy/Antares686,
Thanks for the input! So it sounds like Access itself is pretty suspect, in how it (man-)handles the connection to MSSQL when used as both a source and a GUI. I'm guessing when it's used just as a raw source, it doesn't impose any restrictions on the connection?
So suppose we went this route: I write a VB GUI that uses OLE DB to connect to the MSSQL installation on the webserver on one side and to Access on the other. On the MSSQL side, it lets them pick a specific target database and table, and on the Access side it lets them pick an MDB file and table to use as the source. They hit the Upload button, and the process begins. I imagine this strategy would give these advantages:
- it gives me control over the connection parameters for both target and source
- it sidesteps the complications caused by ODBC/DSN
- it sidesteps the complications imposed by using Access as the GUI
- it allows me to do the INSERTs whichever way I want: all at once, singly, in batches, with/without a stored procedure (roughly sketched below), etc.
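For the stored-procedure option, I'm picturing something like this on the SQL side, called once per row (or per small batch) from the VB GUI through an ADO Command object -- table and column names are placeholders for illustration only:

-- Hypothetical single-row insert procedure; the real table and columns differ.
CREATE PROCEDURE dbo.usp_InsertMedicalInfo
    @PatientID int,
    @VisitDate datetime,
    @Notes varchar(500) = NULL
AS
    SET NOCOUNT ON

    INSERT INTO dbo.MedicalInfo (PatientID, VisitDate, Notes)
    VALUES (@PatientID, @VisitDate, @Notes)

    RETURN @@ERROR
GO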
Sound like a reasonable strategy? If so, any thoughts on the parameters to use or not use, in setting up the connections? (I'll dig through the reference books, but if you know off-hand to definitely not use Setting X, let me know!)
Thanks again,
- Tom
October 9, 2002 at 11:39 am
Or you could, of course, use Visual FoxPro, which would be far easier to control than, say, a VB application, since it is a database itself....
October 9, 2002 at 5:38 pm
Reasonable. DTS exists to do tasks a lot like this. I use a simple VB exe that runs on the server to pick up MDBs dropped for processing; it just runs a DTS package that is compiled in, processes the file, and deletes it. The nice part about doing this is that it's quick and fault tolerant: they can drop the file anytime, and you can process it off-peak if you need to.
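(Not what I do -- mine is compiled into the exe -- but if you save the package on the server you can also kick it off from the command line, or from T-SQL via xp_cmdshell. The server and package names below are made up:)

-- Runs a DTS package saved on the server as 'ImportMedicalInfo',
-- using a trusted connection (/E). Requires rights to xp_cmdshell.
EXEC master.dbo.xp_cmdshell 'dtsrun /S MyWebServer /E /N ImportMedicalInfo'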
Andy