March 27, 2014 at 8:03 am
Hi
My current employer has a slew of homemade reports built in MS Access. At times these reports place a great deal of load on SQL Server, and I'm looking for ways to reduce resource usage. The Access database is set up with several linked tables, and the queries are then run against these linked tables.
I'm wondering if there would be any performance benefit to switching these reports over to SSRS. Will the reports run faster in SSRS? Will the CPU/IO/Memory be reduced on SQL?
I will be running some tests on a per-query basis to try to find out the difference, but in the meantime, if anyone has any experience with this, I'd be grateful for your insights. 😀
Be still, and know that I am God - Psalm 46:10
March 27, 2014 at 8:05 am
SSRS just issues the queries defined in its datasets against the database and that's it, so it depends on the queries you write in the reports.
With SSRS you also have caching and the ability to use snapshots.
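For example, a dataset query is nothing more than T-SQL that the report runs against your source when it executes. A rough sketch (table, column and parameter names are just placeholders, not anything from your system):

-- Hypothetical SSRS dataset query; @StartDate and @EndDate map to report parameters
SELECT o.OrderID,
       o.OrderDate,
       o.CustomerID,
       o.TotalAmount
FROM dbo.Orders AS o
WHERE o.OrderDate >= @StartDate
  AND o.OrderDate <  @EndDate;

Because the filter is in the query itself, only the rows the report needs come back from SQL Server.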
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
March 27, 2014 at 1:22 pm
david.gugg (3/27/2014)
Hi. My current employer has a slew of homemade reports built in MS Access. At times these reports place a great deal of load on SQL Server, and I'm looking for ways to reduce resource usage. The Access database is set up with several linked tables, and the queries are then run against these linked tables.
I'm wondering if there would be any performance benefit to switching these reports over to SSRS. Will the reports run faster in SSRS? Will the CPU/IO/Memory be reduced on SQL?
I will be running some tests on a per-query basis to try to find out the difference, but in the meantime, if anyone has any experience with this, I'd be grateful for your insights. 😀
Are you using MS Access pass-through queries against the base SQL Server, or just Access queries against the linked tables?
Check this:
http://www.mssqltips.com/sqlservertip/1482/microsoft-access-pass-through-queries-to-sql-server/
SSRS reports run their queries against the base SQL Server too, so the load depends on how well those queries are written. Encapsulating the queries in stored procedures will generally perform better than ad-hoc SSRS dataset queries.
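As a sketch of the stored procedure approach (all object names here are made up), the SSRS dataset would simply be pointed at a procedure like this:

-- Hypothetical procedure wrapping the report query; the SSRS dataset
-- just calls dbo.usp_SalesSummary with its report parameters
CREATE PROCEDURE dbo.usp_SalesSummary
    @StartDate date,
    @EndDate   date
AS
BEGIN
    SET NOCOUNT ON;

    SELECT c.Region,
           COUNT(*)           AS OrderCount,
           SUM(o.TotalAmount) AS TotalSales
    FROM dbo.Orders AS o
    JOIN dbo.Customers AS c
        ON c.CustomerID = o.CustomerID
    WHERE o.OrderDate >= @StartDate
      AND o.OrderDate <  @EndDate
    GROUP BY c.Region;
END;

That also gives you one place to tune the query instead of many queries embedded in individual reports.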
--
SQLBuddy
March 27, 2014 at 2:05 pm
I would have to say, based on my testing, that I disagree with the premise in those links. As a very simple example, I open a linked table in Access. The first 33 or so records are displayed in the table. I close the table without scrolling down. Profiler tells me there were 75 reads against the base object, which happened to be a view. I open the table a second time and this time scroll down to row 66. I close the table and Profiler shows 264 reads. This leads me to believe that Access is only pulling back the number of records needed for the current request.
The queries we have are against the linked tables. In a test, I executed a complex query in Access. This query joined a linked table to the results of two other queries, one of which was pulled from a subset of a third query. I got a total of ~3200 CPU, 420,000 reads, 2000 duration.
I ran the same query in SSMS by converting the "sub" queries to CTEs. The results were almost the same: ~3200 CPU, 420,000 reads, 2000 duration. If Access were pulling back all the records, I'd expect higher numbers. In fact, in SSMS I pulled out the WHERE clause - this should give me the same resource usage as Access if Access were applying the WHERE clause after pulling back all the records from SQL. This time I got ~9400 CPU, 442,000 reads, 13500 duration.
This leads me to believe Access is not pulling back all records then filtering based on the WHERE clause.
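If anyone wants to repeat this kind of comparison, the SSMS side can also be measured with SET STATISTICS instead of Profiler. A rough sketch of the shape, with made-up table names standing in for the real linked tables:

-- Illustrative only: hypothetical tables, not the actual report query
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

WITH FilteredOrders AS
(
    SELECT OrderID, CustomerID, OrderDate, TotalAmount
    FROM dbo.Orders
    WHERE OrderDate >= '20140101'
),
CustomerTotals AS
(
    SELECT CustomerID, SUM(TotalAmount) AS CustTotal
    FROM FilteredOrders
    GROUP BY CustomerID
)
SELECT c.CustomerName, ct.CustTotal
FROM CustomerTotals AS ct
JOIN dbo.Customers AS c
    ON c.CustomerID = ct.CustomerID;

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;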
Be still, and know that I am God - Psalm 46:10
March 27, 2014 at 2:20 pm
david.gugg (3/27/2014)
I would have to say, based on my testing, that I disagree with the premise in those links. As a very simple example, I open a linked table in Access. The first 33 or so records are displayed in the table. I close the table without scrolling down. Profiler tells me there were 75 reads against the base object, which happened to be a view. I open the table a second time and this time scroll down to row 66. I close the table and Profiler shows 264 reads. This leads me to believe that Access is only pulling back the number of records needed for the current request. The queries we have are against the linked tables. In a test, I executed a complex query in Access. This query joined a linked table to the results of two other queries, one of which was pulled from a subset of a third query. I got a total of ~3200 CPU, 420,000 reads, 2000 duration.
I ran the same query in SSMS by converting the "sub" queries to CTEs. The results were almost the same: ~3200 CPU, 420,000 reads, 2000 duration. If Access were pulling back all the records, I'd expect higher numbers. In fact, in SSMS I pulled out the WHERE clause - this should give me the same resource usage as Access if Access were applying the WHERE clause after pulling back all the records from SQL. This time I got ~9400 CPU, 442,000 reads, 13500 duration.
This leads me to believe Access is not pulling back all records then filtering based on the WHERE clause.
It depends. What kind of Access queries are you using? Native MS Access queries, or pass-through SQL queries?
--
SQLBuddy
March 27, 2014 at 2:24 pm
They do not have the Pass-Through button toggled, so I believe they are native Access queries.
Be still, and know that I am God - Psalm 46:10
March 27, 2014 at 5:36 pm
david.gugg (3/27/2014)
I would have to say, based on my testing, that I disagree with the premise in those links. As a very simple example, I open a linked table in Access. The first 33 or so records are displayed in the table. I close the table without scrolling down. Profiler tells me there were 75 reads against the base object, which happened to be a view. I open the table a second time and this time scroll down to row 66. I close the table and Profiler shows 264 reads. This leads me to believe that Access is only pulling back the number of records needed for the current request. The queries we have are against the linked tables. In a test, I executed a complex query in Access. This query joined a linked table to the results of two other queries, one of which was pulled from a subset of a third query. I got a total of ~3200 CPU, 420,000 reads, 2000 duration.
I ran the same query in SSMS by converting the "sub" queries to CTEs. The results were almost the same: ~3200 CPU, 420,000 reads, 2000 duration. If Access were pulling back all the records, I'd expect higher numbers. In fact, in SSMS I pulled out the WHERE clause - this should give me the same resource usage as Access if Access were applying the WHERE clause after pulling back all the records from SQL. This time I got ~9400 CPU, 442,000 reads, 13500 duration.
This leads me to believe Access is not pulling back all records then filtering based on the WHERE clause.
I don't think this testing was comprehensive enough to come to a conclusion; there are a lot of missing pieces. For now, I will leave it to you.
As for performance, a better approach is to convert a handful of the complex native Access queries into SQL pass-through queries and note the performance.
Then convert those same native Access queries into SSRS reports and compare. That way you have a like-for-like performance comparison.
I think a combination of SSRS reports and stored procedures will give you better performance and put less load on the SQL Server.
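For illustration (object names made up), the text of a pass-through query is just native T-SQL; Access sends it to the server unchanged over the ODBC connect string set in the query's properties, rather than trying to process it locally:

-- Example text of an Access pass-through query; sent to SQL Server verbatim
SELECT TOP (50)
       o.CustomerID,
       SUM(o.TotalAmount) AS TotalSales
FROM dbo.Orders AS o
WHERE o.OrderDate >= '20140101'
GROUP BY o.CustomerID
ORDER BY TotalSales DESC;

A pass-through query can also simply EXEC a stored procedure, which fits the SSRS-plus-stored-procedure approach.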
--
SQLBuddy
March 29, 2014 at 5:09 pm
If you're using pass-through queries, it means that your SQL syntax will work against SQL Server without modification. If not, you may have a lot of work to do rewriting queries. A lot of the differences are minor, but they do exist.
If you have rights in the SQL database, you might test some of the queries directly in SQL Server to see which ones will work without modification.
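To give a flavour of those minor differences (the table and columns here are hypothetical): Access SQL uses IIf(), Nz(), # date delimiters and * wildcards, while the T-SQL equivalents look more like this:

-- T-SQL versions of some common Access expressions
SELECT CASE WHEN o.Qty > 100 THEN 'Big' ELSE 'Small' END AS OrderSize, -- Access: IIf([Qty] > 100, "Big", "Small")
       ISNULL(o.Discount, 0) AS Discount,                              -- Access: Nz([Discount], 0)
       o.OrderDate
FROM dbo.Orders AS o
WHERE o.OrderDate >= '20140101'   -- Access: [OrderDate] >= #01/01/2014#
  AND o.CustomerID LIKE 'A%';     -- Access: [CustomerID] LIKE "A*"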
March 29, 2014 at 11:08 pm
I'm in a similar situation to yours. My employer runs all data import and export (reporting) through MS Access. However, we have developed a very complex module built for Access using VBA. All reporting is executed through assigned stored procedures, and MS Access only retrieves and renders the result set.
The performance issues I'm finding reside in MS Access itself. After switching to SSRS, we saw a significant improvement, as well as more flexibility for the entire team using reporting. Add to that the main benefit of moving from the MS Access client to a browser-based one, and everything is running much more smoothly.
Unfortunately, the original module was built on an older version of MS Access. As time went on, the teams upgraded MS Office and went from 32-bit to 64-bit, but the module stayed the same. So it was a big win for us to switch everything over to SSRS, since it was already available but no one was using it for reporting. Because the stored procedures for the module were already built, we simply had to build the report templates using Report Builder.
April 3, 2014 at 10:37 am
OK, I got time to run another test today, and the results were quite different. This test used three "sub" queries in Access, and one of them had a HAVING clause. Run from Access, the results took ~11 seconds to return, used 1100 CPU, and took ~80K reads. Run from an SSMS window, the query completed in 0.3 seconds and used 300 CPU and 750 reads. A huge improvement! I wonder if the HAVING clause made the big difference this time.
Either way, I now have a good example I can take to the Access database owner with an idea for substantial improvements.
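For context, a HAVING filter sits on top of an aggregate, so when the whole statement runs server-side the filtering happens before any rows come back to the client. Roughly this kind of shape (made-up names, not the real query):

-- Made-up example of the aggregate-plus-HAVING shape, run entirely on SQL Server
SELECT o.CustomerID,
       SUM(o.TotalAmount) AS CustTotal
FROM dbo.Orders AS o
WHERE o.OrderDate >= '20140101'
GROUP BY o.CustomerID
HAVING SUM(o.TotalAmount) > 50000;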
Be still, and know that I am God - Psalm 46:10