February 26, 2012 at 9:47 pm
Comments posted to this topic are about the item How to join a local table to a remote table by way of a linked server
February 26, 2012 at 10:09 pm
Thanks for the details. I am using linked servers to access data from remote servers, but I wonder whether I need to handle transactions manually if I use linked servers for writing data. What do you think?
February 26, 2012 at 11:21 pm
You may be interested in reading one of my posts, which explores the use of EXECUTE...AT clause for executing queries against linked servers: http://beyondrelational.com/blogs/nakul/archive/2011/01/10/executing-queries-against-linked-servers.aspx
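For readers who haven't used it, the EXECUTE...AT pattern looks roughly like this. The linked server name REMOTESRV and the table names are placeholders, and the "rpc out" server option must be enabled for the linked server:

```sql
-- Allow remote procedure/batch execution on the (hypothetical) linked server.
EXEC master.dbo.sp_serveroption
     @server = N'REMOTESRV', @optname = N'rpc out', @optvalue = N'true';

-- The whole statement runs on the remote server; ? is a positional parameter.
EXECUTE ('SELECT OrderID, OrderDate FROM SalesDb.dbo.Orders WHERE OrderDate >= ?',
         '20120101') AT REMOTESRV;
```

Because the query text is shipped as-is, the remote server's optimizer plans it, which is the main attraction of this approach.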
Thanks & Regards,
Nakul Vachhrajani.
http://nakulvachhrajani.com
Follow me on
Twitter: @sqltwins
February 27, 2012 at 12:39 am
Hi, any specific reason why you don't use the OPENQUERY command?
February 27, 2012 at 1:34 am
I believe the reason would be that OPENQUERY requires ad hoc queries to be enabled on the SQL Server, which, depending upon the environment and the situation, might be considered a security loophole.
Thanks & Regards,
Nakul Vachhrajani.
http://nakulvachhrajani.com
Follow me on
Twitter: @sqltwins
February 27, 2012 at 3:16 am
Linked servers are quite a big subject, with security levels, distributed transactions, etc., and I think this post could have gone into a little more detail without detracting from the subject matter.
I think OPENQUERY was worth mentioning, and I don't think it was avoided on account of security, as the way this linked server example has been set up is highly insecure anyway.
Two interesting points are worth making, though.
As a general rule, don't set up linked server security in the way shown in the article, because all connections will then be executed remotely as that one remote user, regardless of the local user. It is better to map all allowed local users to a remote login, and best to use Windows security locally if possible.
Secondly, there is a gotcha that's worth noting: a linked server with an inner join will not work with XML columns. You would have to use OPENQUERY, convert the remote XML field to varchar, and then convert it back again.
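A sketch of that workaround, with made-up server, table, and column names (REMOTESRV, dbo.Docs, DocXml):

```sql
-- XML columns can't travel through a four-part-name join, so cast to varchar
-- on the remote side inside OPENQUERY, then convert back to xml locally.
SELECT d.DocID,
       CONVERT(xml, d.DocXmlText) AS DocXml
FROM OPENQUERY(REMOTESRV,
     'SELECT DocID, CONVERT(varchar(max), DocXml) AS DocXmlText
      FROM MyDb.dbo.Docs') AS d;
```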
And finally, as touched on in the article, copying a remote table into a local temp table can often be a good idea, as it avoids the need for a distributed transaction and therefore reduces overhead and the complexity of the security setup.
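A minimal sketch of that pattern, assuming a linked server named REMOTESRV and placeholder table names:

```sql
-- Pull only the rows and columns you need into a local temp table...
SELECT c.CustomerID, c.Region
INTO #RemoteCustomers
FROM REMOTESRV.SalesDb.dbo.Customers AS c
WHERE c.Region = 'West';

-- ...then join locally, with no distributed transaction involved.
SELECT o.OrderID, rc.Region
FROM dbo.Orders AS o
JOIN #RemoteCustomers AS rc
  ON rc.CustomerID = o.CustomerID;
```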
Otherwise I thought the article was a pretty good introduction.
Dave
David Bridge
David Bridge Technology Limited
www.davidbridgetechnology.com
February 27, 2012 at 7:10 am
Copying the entire remote table into a local temp table is not an option when the remote table has millions of rows. Have a look at the REMOTE join hint. There are a couple of caveats with this hint, but if one can properly index the remote table, using this join hint provides very good performance.
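For illustration, the REMOTE join hint sits in the join clause itself (names below are placeholders; the hint applies only to INNER JOIN, and the right-hand table must be the remote one):

```sql
-- Ask the optimizer to perform the join on the remote server, so only the
-- joined result, not the whole remote table, comes back across the wire.
SELECT o.OrderID, c.CustomerName
FROM dbo.Orders AS o
INNER REMOTE JOIN REMOTESRV.SalesDb.dbo.Customers AS c
    ON c.CustomerID = o.CustomerID;
```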
February 27, 2012 at 7:36 am
I agree that it's best to map local logins to an account on the remote server, rather than have all logins access the linked server using a default security context. Otherwise, even an account with nothing but PUBLIC access will have equal access to the linked server.
Mapping local logins to remote accounts can be done even when the remote server doesn't have (or doesn't support) Windows authentication. Here is how one of my linked servers is configured where the remote server is Oracle. A Windows domain account, MYCORP\ReportingUser, can access the linked server under the context of the remote Oracle account, RPT_USER.
Local server login to remote server login mappings:
Local Login: MYCORP\ReportingUser
Impersonate: No
Remote User: RPT_USER (account on remote Oracle server)
Remote Password: ########
For a login not defined in the above list, connections will:
- Not be made
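In script form, that mapping could be set up along these lines (the linked server name ORALINK is a placeholder; the password is masked, matching the listing above):

```sql
-- Map one local Windows login to a specific remote Oracle account;
-- @useself = 'False' corresponds to "Impersonate: No".
EXEC master.dbo.sp_addlinkedsrvlogin
     @rmtsrvname  = N'ORALINK',
     @useself     = N'False',
     @locallogin  = N'MYCORP\ReportingUser',
     @rmtuser     = N'RPT_USER',
     @rmtpassword = N'########';

-- @locallogin = NULL is the default mapping for all unlisted logins; dropping
-- it means connections for logins not in the list will not be made.
EXEC master.dbo.sp_droplinkedsrvlogin
     @rmtsrvname = N'ORALINK',
     @locallogin = NULL;
```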
"Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho
February 27, 2012 at 8:19 am
When I require live data, I almost always join against OPENQUERY resultsets - much, much, MUCH faster than directly joining to the remote server.
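The shape of such a join, with placeholder names, assuming a linked server REMOTESRV already exists:

```sql
-- The inner query runs entirely on the remote server, so only the filtered
-- resultset crosses the wire before the local join happens.
SELECT o.OrderID, r.CustomerName
FROM dbo.Orders AS o
JOIN OPENQUERY(REMOTESRV,
     'SELECT CustomerID, CustomerName
      FROM SalesDb.dbo.Customers
      WHERE Active = 1') AS r
  ON r.CustomerID = o.CustomerID;
```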
Or else an SSIS job.
-sff
February 27, 2012 at 8:53 am
Nice article. In actual use, where the person creating the linked server is a company-level DBA, I find that providing scripts is a necessary component. Also be aware that you may need to add multiple users (we did). Some detail on linking to non-SQL Server servers (Oracle?) would also be useful to see.
Joshua
February 27, 2012 at 8:57 am
@GrassHopper - I don't usually use OPENQUERY; a linked server works better if you have multiple developers who know standard join syntax but are not as knowledgeable about OPENQUERY.
@all - I found it very useful to create SYNONYMs for the four-part access path. It cuts down on the typing and can be clearer.
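For example (linked server name and path are placeholders):

```sql
-- A synonym hides the four-part path behind a short local name.
CREATE SYNONYM dbo.RemoteCustomers
FOR REMOTESRV.SalesDb.dbo.Customers;

-- Queries now read like they hit a local table.
SELECT TOP (10) CustomerID, CustomerName
FROM dbo.RemoteCustomers;
```

If the remote data ever moves, only the synonym definition changes, not the queries.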
February 27, 2012 at 9:57 am
As a general rule, don't set up linked server security in the way shown in the article, because all connections will then be executed remotely as that one remote user, regardless of the local user. It is better to map all allowed local users to a remote login, and best to use Windows security locally if possible.
Not to mention you can run into the double hop security issue with Kerberos.
Ken
February 27, 2012 at 10:36 am
Nakul Vachhrajani (2/27/2012)
I believe the reason would be that OPENQUERY requires ad hoc queries to be enabled on the SQL Server, which, depending upon the environment and the situation, might be considered a security loophole.
Sorry, I don't think this is correct. Adhoc needs to be enabled for OPENROWSET and OPENDATASOURCE, but not for OPENQUERY.
When you use OPENROWSET or OPENDATASOURCE, you're basically creating a linked server on the fly, AFAICS, since the connection string is included in the arguments you pass the function. With OPENQUERY, the linked server already exists. That's what makes the first two "ad hoc" and OPENQUERY not "ad hoc", and should also be a clear indication of why the first two represent a different security risk than the third. See?
Yes, the Kerberos double-hop thing can be a bear and yes the security context has to be locked out. But, with a non-ad-hoc syntax, which OPENQUERY is, you have control over that.
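The distinction can be seen in the server option that gates the ad hoc functions. Server and connection-string details below are placeholders:

```sql
-- 'Ad Hoc Distributed Queries' gates OPENROWSET/OPENDATASOURCE, which carry
-- their own connection string; OPENQUERY needs only a pre-defined linked server.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;

-- Ad hoc: the connection details travel inline with the query.
SELECT *
FROM OPENROWSET('SQLNCLI',
                'Server=RemoteBox;Trusted_Connection=yes;',
                'SELECT TOP (5) name FROM master.sys.databases');
```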
For other people in this thread who feel that the fully-qualified syntax is better for developers than OPENQUERY, there are at least two reasons why this is not a great idea:
1 - Even when the linked server is MS SQL: if you decide you want to put the data someplace else, or just want to hit a test database rather than production, and you have set up your linked server using the MS OLE DB Provider for SQL Server or another provider (versus "Server type = SQL Server"), you can *name* the linked server to represent the data you're connecting to rather than the server itself, and the move is transparent to your developers because that name doesn't need to change. I realize that synonyms/aliases are another way to do it, but see point #2.
2 - You've limited the developer's thinking about heterogeneous queries to "queries between two MS SQL Servers". What happens when the HR system is Oracle or the CMS system is MySQL? Get them used to the right syntax now. As a bonus, they might not have to change much of anything in their code when you migrate a system to a different back end.
>L<
February 27, 2012 at 10:53 am
Is it still the case that any join (heterogeneous, to another SQL Server or to some other DB, say Oracle) between a linked server table and a local table will basically ignore any relevant remote WHERE clause constraints and bring the whole remote table across before applying the constraints locally?
This is not easy to track or assess, but temp space starts to fill up, and it hasn't always been released readily once the query finished.
The author's advice to create a local #temp table with the remote data essentially performs this remote copy explicitly and then does the join locally.
I have avoided heterogeneous joins between linked servers and find SSIS is vastly faster to accomplish such tasks.
It might be worth having the links though for development purposes to hit up against remote tables for the sake of schema and data assessment.
The descriptions people give below of the security issues make it seem somewhat tedious to manage robustly with a fine granularity of privilege control.
It was a good article in terms of explaining the ways you can set up linked servers though and I want to thank the author for his contribution.
February 27, 2012 at 11:11 am
What I typically do is execute a pass-through style query on the remote server using the EXEC ('..') AT REMOTE_SERVER syntax and output the result into a temp table, which I then join with my local query. It's not necessary to bring across the entire remote table; of course, one only needs specific columns, and the remote resultset can be filtered by date, category, client, etc.
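A sketch of that pattern (REMOTE_SERVER, the table, and the filter are placeholders; "rpc out" must be enabled on the linked server):

```sql
-- Land the filtered remote result in a local temp table.
CREATE TABLE #RemoteOrders (OrderID int, Amount money);

INSERT INTO #RemoteOrders (OrderID, Amount)
EXEC ('SELECT OrderID, Amount FROM SalesDb.dbo.Orders WHERE OrderDate >= ?',
      '20120101') AT REMOTE_SERVER;

-- Then join locally against the reduced resultset.
SELECT o.OrderID, t.Amount
FROM dbo.Orders AS o
JOIN #RemoteOrders AS t
  ON t.OrderID = o.OrderID;
```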
"Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho