June 28, 2007 at 1:59 pm
I have an application (provided by a vendor) that has a View that links to a fairly large table via a linked server. Lately, responses from this table have been fairly slow, and the only thing I can think of is that there's just a lot of data going across the wire.
I know that with a View on a local table you can create a Clustered Index on the View to get another (differently ordered) copy of the data saved to disk. Will the same thing work for a linked server? Since the data in the base table only changes a few times a day, would a Clustered Index on the view "effectively cache" the data on the server that's linking across?
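For reference, this is the local-table pattern I mean (a minimal sketch with made-up names; dbo.BigTable, dbo.vw_BigTable, and KeyCol are placeholders, not the vendor's actual objects):

-- Indexed view over a local table (placeholder names)
CREATE VIEW dbo.vw_BigTable
WITH SCHEMABINDING   -- required before the view can be indexed;
                     -- only two-part names of local base tables are allowed here
AS
SELECT KeyCol, SomeCol, OtherCol
FROM dbo.BigTable;
GO

-- Materializes the view's result set on disk, ordered by KeyCol
-- (assumes KeyCol is unique)
CREATE UNIQUE CLUSTERED INDEX IX_vw_BigTable
    ON dbo.vw_BigTable (KeyCol);
GO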
There are a few queries that run joins to this data, and in some rare cases, when an insert runs on the other joined tables, we end up with cascading blocks while waiting for the initial select to clear. Since it's a Vendor Application, I can't just go in and modify the code.
Will this help at all, or should I just set up a task to copy the data across after the load processes complete? Thanks!
June 29, 2007 at 7:54 am
1) Remote joins are not a good idea.
2) Is the remote table indexed appropriately (both for the join as well as for insert/update activity from local apps against it)? Are statistics up to date?
3) I would strongly consider some mechanism to keep the data in sync locally, whether that be replication or a regularly scheduled (or programmatically controlled) batch job (see the sketch below).
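If you go the batch-job route, something along these lines, scheduled as a SQL Agent job after your load processes finish, is usually enough. This is just a sketch under assumed names: LinkedSrv.RemoteDB.dbo.BigTable stands in for your remote table and dbo.BigTable_LocalCopy for a local table with the same columns.

-- Refresh a local copy of the remote table (placeholder names throughout)
BEGIN TRANSACTION;

    -- TRUNCATE can be rolled back inside the transaction if the pull fails
    TRUNCATE TABLE dbo.BigTable_LocalCopy;

    -- Single pull across the linked server
    INSERT INTO dbo.BigTable_LocalCopy (KeyCol, SomeCol, OtherCol)
    SELECT KeyCol, SomeCol, OtherCol
    FROM   LinkedSrv.RemoteDB.dbo.BigTable;

COMMIT TRANSACTION;

Local queries can then join to the copy instead of reaching across the wire every time.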
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service