November 2, 2003 at 7:44 pm
I'm trying to build a case that connecting Access 2000/XP to a production database for reporting will reduce the performance of the SQL database. For example, running queries ranging from selects to updates and deletes can cause records to be locked, or other users connecting to the SQL database can get timeouts. I would appreciate anyone sharing their experience or providing facts showing that Access causes performance problems.
Thanks
November 2, 2003 at 10:15 pm
Run a Profiler trace.
Access can at times really hinder performance because it asks for a bunch of data from SQL Server.
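While the trace is running, a quick way to see whether the Access connections are actually blocking anyone is to look at sysprocesses - just a rough sketch of a SQL 2000-era check, not part of the trace itself:
-- sketch: list the sessions that are currently blocked and what they are waiting on
SELECT spid, blocked, waittime, lastwaittype, program_name, hostname
FROM master..sysprocesses
WHERE blocked <> 0
Any rows that come back show the blocked spid, and the blocked column tells you which spid is doing the blocking.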
Steve Jones
http://www.sqlservercentral.com/columnists/sjones
The Best of SQL Server Central.com 2002 - http://www.sqlservercentral.com/bestof/
November 2, 2003 at 10:36 pm
As long as only one person is running reports from a database on SQL Server, and it's the same person using that database (effectively turning SQL Server into a single-user database), then you should be fine.
At a past employer, we had a lot of problems with blocking, timeouts and deadlocks in both Access and other apps if more than 2 or 3 people were running reports from Access. We eventually dumped Access for Crystal Reports.
Hope this helps
Phill Carter
--------------------
Colt 45 - the original point and click interface
November 3, 2003 at 3:31 am
Using Access Projects (where Jet is dumped and the Access file contains forms and reports but no data), it will work fine.
Using Access linked tables, forget it; you might as well run "begin tran while 1=1 select * from sysobjects with (tablockx)" each day instead.
Keith Henry
DBA/Developer/BI Manager
November 3, 2003 at 5:52 am
I guess I'll continue to be the lone Access supporter!
This soooo depends on what you're doing with Access. For example, if you're using only linked tables you'll get better performance than if you combine linked tables with local tables (that's a damn broad generalization). If you're holding pessimistic locks, that will decrease performance.
I guess what I'd expect a DBA to do is try it first and then tell me/show me the problems. Depending on the load it might be survivable, might require tuning on the front end or back end, or might not be doable. How much effort goes into it would depend on how much time it would take to move that functionality to a "real" app or a Crystal report or whatever.
Andy
November 3, 2003 at 8:09 am
I am fairly new to SQL Server but have some Access 97 experience.
I have gone down the Access 97 front end (unbound) / SQL Server back end route because I did not know any better (still don't).
I have put together a CRM-type application which will support 400+ users on our WAN.
99% of our corporate 'real apps' use Oracle. Our Access app compares extremely favourably performance-wise (fastest they have seen), with split-second responses.
I guess the key has been minimizing the data transferred across the WAN and optimising performance:
- only requesting data when absolutely required
- Storing reference data in the front end and checking for changes at logon
- Using ADO and pass-through queries (never linked tables)
- Never sending raw SQL from Access - always use stored procedures or functions (see the sketch after this list)
- Process on the server wherever possible
- Aggregate data
- Use drill down rather than large queries
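As a rough illustration of the "procedures, process on the server, aggregate" points above (the table, column and procedure names here are made up for the example, not from our app), the pattern is a procedure that does all the work server-side and returns only the summary rows:
-- hypothetical procedure: aggregate on the server, return only the summary rows
CREATE PROCEDURE dbo.usp_CustomerOrderSummary
    @RegionID int
AS
SET NOCOUNT ON
SELECT c.CustomerID, c.CustomerName,
       COUNT(*) AS OrderCount,
       SUM(o.OrderTotal) AS TotalValue
FROM dbo.Customers c
INNER JOIN dbo.Orders o ON o.CustomerID = c.CustomerID
WHERE c.RegionID = @RegionID
GROUP BY c.CustomerID, c.CustomerName
GO
The front end then just runs EXEC dbo.usp_CustomerOrderSummary @RegionID = 1 as a pass-through query and binds the handful of rows that come back, rather than dragging the detail across the WAN.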
I am certain there are better tools I should be using. But not knowing what they are or how to use them, and convincing anyone to let me buy them...
November 4, 2003 at 5:29 am
We have been using SQL Server with Access 97 for years and have recently upgraded to Access XP and SQL Server 2000. Our performance is consistently <=2 seconds to open a report with custom parameters against large data sets.
We keep no data in the Access database; we use pass-through queries to pull in data for reports and ad hoc querying.
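The SQL behind a report's pass-through query is typically nothing more than a procedure call with the user's parameters filled in, along these lines (the procedure name and parameters are only an illustration, not our actual code):
-- hypothetical pass-through query text built by the front end for one report
EXEC dbo.usp_SalesReport @StartDate = '2003-01-01', @EndDate = '2003-09-30', @BranchID = 12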
We are currently in the process of rewriting the application as an Access XP project (ADP), which will gain us additional performance.
November 5, 2003 at 3:02 am
I want you to read this very carefully
Here is the solution to the hang problem between Access and SQL Server 2000 or 7 🙂
When you want a report based on complicated queries joining several tables, and these tables are updated frequently, your report locks those tables (or waits for previous transactions' locks on them to clear before it can take its own locks). So everybody else using those tables is blocked too, and that is when the hang happens - they are all sitting in the lock queue for those tables!
If you build a query in Access and base a report on it, and in that query's properties you set "Record Locks" to "No Locks", the query will still lock some tables or pages (so although you chose "No Locks", it still takes locks - intent shared (IS) locks - because of a bug in how Access works with SQL Server).
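You can confirm this yourself while the report is open: find the Access connection's spid and look at the locks it is holding (the 55 below is only an example spid):
-- find the Access connection with sp_who, then inspect its locks (55 is a made-up spid)
EXEC sp_who
EXEC sp_lock 55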
The best thing is to run the query on the server (by making a pass-through query in Access) and, after each table name in the query, add the hint "with (nolock)". That way you read the data as it currently sits in the table (which can include changes from transactions that have not committed yet), you don't wait for locks to be freed, you don't place locks on the tables yourself, and every other transaction completes successfully (for you and for others) 🙂.
You will write something like this in your pass-through query in Access:
SELECT Invoices.*, Situation.*
FROM Situation WITH (NOLOCK)
INNER JOIN Invoices WITH (NOLOCK)
    ON Situation.SituationID = Invoices.Situation
WHERE Situation.SituationID = 1
Disadvantages:
- when using "with (nolock)" you are doing dirty reads, so the report you run at that moment may not match what is finally committed; run it again later and it will.
Advantages:
- no hang at all
- very fast response
- much less lock activity to wade through in the SQL logs.
Also: you may need to add more memory (RAM) if the server still hangs a little after you move your queries to pass-through queries, because every transaction or query now runs on the server first and only the processed data is passed back to your PC.
ALSO:
Try to put the DATA and LOG files of the SQL database on a partition formatted with a 32 KB or 64 KB cluster size (not the default 4 KB),
because the default page size in SQL Server is 8 KB.
If you use the default cluster size when formatting the partition, then to get 1 page (8 KB) you need 2 hits on the hard disk (4 KB each).
But if you use a 32 KB cluster size when formatting the partition, then with 1 hit on the hard disk (32 KB) you get 4 pages into memory (much faster, because the slowest thing in the whole system is reading/writing from the hard disk).
I use 32 KB, not 64 KB, because with 64 KB you may end up with a lot of empty pages (space lost in memory, and your DB may grow more).
This, of course, will help you solve the hang problem even further. Bye
Alamir Mohamed
Alamir_mohamed@yahoo.com