August 1, 2013 at 1:50 pm
Hi guys,
So after 3 days of dev I'm now collecting enough data from remote DBs to be able to knock up an OK web front end. The last part was the most annoying.
I'm using OPENQUERY and exec(@Q) over my linked servers, but one script in particular proved very annoying.
The query searches for recently failed job steps that included a backup, then digs through the message column to find out which database was at fault. The problem is that this takes a bunch of CHARINDEX/SUBSTRING calls to get to it.
In PL/SQL you can write q'[some query]' and any apostrophes between the square brackets are taken literally, with no escaping needed. Does T-SQL have anything similar? This is the kind of mess I was left with:
set @query = 'insert into backup_history
(INSTANCE,DB_NAME,START_DATE,END_DATE,SIZE_MB,TYPE,STATUS)
SELECT * from openquery(' + @server + ',''SELECT @@SERVERNAME server,
SUBSTRING (sh.message,charindex(''''database '''''''''''',sh.message)+10,
case when ((charindex('''''''''''''''', sh.message, charindex(''''database '''''''''''',sh.message)+10))-(charindex(''''database '''''''''''',sh.message)+10)) = -10
then 0
else (charindex('''''''''''''''', sh.message, charindex(''''database '''''''''''',sh.message)+10))-(charindex(''''database '''''''''''',sh.message)+10) end) db
As you may have guessed, predicting the right number of apostrophes to include in each section was the main issue.
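For what it's worth, here is a rough sketch of what I'd rather be writing: the remote query lives in its own variable and REPLACE doubles up the quotes when it gets spliced into the OPENQUERY string, so I never have to count them by hand. The linked server name is made up and the column list is trimmed for the example.
-- Rough sketch only: MYLINKEDSERVER and the two-column insert are placeholders.
declare @server sysname
declare @remote nvarchar(max)
declare @query  nvarchar(max)

set @server = 'MYLINKEDSERVER'

-- The remote query is written with ordinary one-level escaping only.
set @remote = N'SELECT @@SERVERNAME AS instance,
       SUBSTRING(sh.message, CHARINDEX(''database '', sh.message) + 9, 128) AS db
FROM   msdb.dbo.sysjobhistory AS sh
WHERE  sh.run_status = 0'

-- REPLACE doubles every quote in @remote, so the extra nesting level is handled automatically.
set @query = N'insert into backup_history (INSTANCE, DB_NAME)
SELECT * FROM OPENQUERY(' + quotename(@server) + N', '''
          + replace(@remote, N'''', N'''''') + N''')'

print @query   -- inspect before running
-- exec (@query)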
August 1, 2013 at 3:39 pm
Hi,
Just an idea: would it work for you to execute a separate script from T-SQL, or from your code? Then you won't have those issues with the apostrophes.
Regards,
IgorMi
Igor Micev, My blog: www.igormicev.com
August 1, 2013 at 3:54 pm
Hi Igor,
What do you mean?
August 1, 2013 at 4:04 pm
Dird (8/1/2013)
Hi Igor, what do you mean?
I mean you could put the code with all the apostrophes into another script or stored procedure and call it from another place...
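For example, something like this (just a sketch, the procedure and linked server names are made up and the column list is trimmed): the parsing sits in an ordinary procedure on the remote instance, where only normal escaping is needed, and the central server just calls it over the linked server.
-- On each remote instance (created in msdb here; names are made up):
create procedure dbo.usp_failed_backup_dbs
as
begin
    select @@servername as instance,
           substring(sh.message,
                     charindex('database ''', sh.message) + 10,
                     charindex('''', sh.message, charindex('database ''', sh.message) + 10)
                       - (charindex('database ''', sh.message) + 10)) as db_name
    from   msdb.dbo.sysjobhistory as sh
    where  sh.run_status = 0
      and  sh.message like '%database ''%'
end
go

-- On the central server: RPC Out must be enabled on the linked server,
-- and INSERT...EXEC across servers may need MSDTC configured.
insert into backup_history (INSTANCE, DB_NAME)
exec [MYLINKEDSERVER].msdb.dbo.usp_failed_backup_dbs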
Igor Micev, My blog: www.igormicev.com
August 1, 2013 at 4:15 pm
But then it will still have many apostrophes? Just in a different location~
August 1, 2013 at 4:27 pm
Dird (8/1/2013)
But then it will still have many apostrophes? Just in a different location~
I think I get your issue. What about this:
set quoted_identifier off
declare @query nvarchar(1000)
declare @server nvarchar(100)
set @query = "insert into backup_history
(INSTANCE,DB_NAME,START_DATE,END_DATE,SIZE_MB,TYPE,STATUS)
SELECT * from openquery(" + @server + ",''SELECT @@SERVERNAME server,
SUBSTRING (sh.message,charindex(''''database '''''''''''',sh.message)+10,
case when ((charindex('''''''''''''''', sh.message, charindex(''''database '''''''''''',sh.message)+10))-(charindex(''''database '''''''''''',sh.message)+10)) = -10
then 0
else (charindex('''''''''''''''', sh.message, charindex(''''database '''''''''''',sh.message)+10))-(charindex(''''database '''''''''''',sh.message)+10) end)" db
Or maybe you'll put the double quotes in a different place. See http://msdn.microsoft.com/en-us/library/ms174393.aspx as well.
IgorMi
Igor Micev, My blog: www.igormicev.com
August 1, 2013 at 4:47 pm
Hi Igor,
No. The issue is "''''''''''''''''"... so many! Because for every ' I have to write '' to escape it. Is there another way to escape in T-SQL, so I can just put ' instead of ''?
August 1, 2013 at 4:56 pm
Dird (8/1/2013)
Hi Igor, No. The issue is "''''''''''''''''"... so many! Because for every ' I have to write '' to escape it. Is there another way to escape in T-SQL, so I can just put ' instead of ''?
That's what I'm trying to point out with these examples:
--example 1: with quoted_identifier off, double quotes delimit the string,
--so the single quotes inside need no doubling
set quoted_identifier off
declare @STR nvarchar(100)
set @STR = "I'm trying to escape ' "
print @STR
go
--example 2: with quoted_identifier on (the default), every embedded ' is doubled
set quoted_identifier on
declare @STR nvarchar(100)
set @STR = 'I''m trying to escape '' '
print @STR
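And, just as a sketch, the same idea applied to an openquery string like yours. The linked server name is made up, and note that quotes inside the remote query itself would still need the usual doubling for the openquery literal.
set quoted_identifier off

declare @server sysname
declare @query  nvarchar(max)

set @server = "MYLINKEDSERVER"

-- The outer string is delimited with double quotes, so the single quotes
-- around the remote query need no doubling at this level.
set @query = "select * from openquery(" + @server
           + ", 'select @@servername as server_name')"

print @query
-- exec (@query)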
Can't you apply it?
IgorMi
Igor Micev, My blog: www.igormicev.com
August 2, 2013 at 12:28 am
Oh I see. I will have a look, thanks 🙂
August 2, 2013 at 1:28 am
See http://www.sommarskog.se/dynamic_sql.html#OPENQUERY for some tips.
Erland Sommarskog, SQL Server MVP, www.sommarskog.se