February 28, 2011 at 11:43 am
I am trying to execute a SQL script file larger than 1 GB using the oSQL utility. It works perfectly in SQL Server Management Studio.
EXEC master.dbo.xp_cmdshell 'osql -S ServerName -U sa -P pwd -i "c:\SQL Script.sql"'
However, when I execute this statement via C# code, it pushes the data into only 7 of the 10 tables. The SQL command does not throw any exception, and the next C# code executes successfully. My command timeout is set to zero, and I am aware that if I don't specify the -t argument in the oSQL command, the default is no command timeout.
Then, what could be the issue?
February 28, 2011 at 12:11 pm
- Can't you upgrade to use SQLCMD instead of osql ?
- What's the reason to open xp_cmdshell to perform your sqlcmd ?
Is there a need to perform it _now_ in your app / sproc / ... ?
Did you check other options ?
Johan
Learn to play, play to learn !
Dont drive faster than your guardian angel can fly ...
but keeping both feet on the ground wont get you anywhere :w00t:
- How to post Performance Problems
- How to post data/code to get the best help
- How to prevent a sore throat after hours of presenting ppt
press F1 for solution, press shift+F1 for urgent solution 😀
Need a bit of Powershell? How about this
Who am I ? Sometimes this is me but most of the time this is me
February 28, 2011 at 12:49 pm
ALZDBA,
I could see using xp_cmdshell to avoid pushing a 1GB file across the wire. I do agree with you, however, that sqlcmd would be preferable over osql.
mailtosaravanan.m,
xp_cmdshell may not return an error just because your call to osql encounters an error within SQL Server while running your script; i.e., osql may be saying "I succeeded in executing your script against the database server as requested," but that says nothing about the effectiveness of the script within the database server, which is what we really care about.
I would recommend capturing the return code of xp_cmdshell just in case something catastrophic occurs and the command completely fails. More importantly I would recommend capturing all output from the command into a temp table so you can inspect it after execution:
DECLARE @cmd VARCHAR(8000),
@return_code INT ;
SET @cmd = 'sqlcmd.exe command line' ;
--create temp table to hold results
CREATE TABLE #temp
(
result_line VARCHAR(8000)
) ;
--capture results in temp table
INSERT #temp
EXEC @return_code = master.sys.xp_cmdshell
@cmd = @cmd ;
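Once the call returns, you can scan the captured output for failure lines. A minimal sketch (the LIKE patterns assume osql/sqlcmd's usual "Msg ..." error formatting and are only illustrative):

```sql
-- look for error lines in the captured command output
SELECT result_line
FROM   #temp
WHERE  result_line LIKE 'Msg %'
   OR  result_line LIKE '%Error%' ;

-- xp_cmdshell's own return code signals a hard failure of the command line itself
IF @return_code <> 0
    RAISERROR('osql/sqlcmd command failed', 16, 1) ;
```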
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
February 28, 2011 at 12:54 pm
PS: check out the -b option of sqlcmd. It may be useful in this context if you are not handling errors within your script and do want sqlcmd to set the DOS ERRORLEVEL when a SQL error occurs (which will set @return_code to non-zero in my code example above).
http://msdn.microsoft.com/en-us/library/ms162773.aspx
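For example, the OP's call could be rewritten with -b so a script error surfaces as a non-zero exit code (server name, login, and file path are the placeholders from the original post):

```
sqlcmd -S ServerName -U sa -P pwd -i "c:\SQL Script.sql" -b
```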
There are no special teachers of virtue, because virtue is taught by the whole community.
--Plato
February 28, 2011 at 7:31 pm
I know this is of no help and I apologize for that but I have to know... A billion+ byte SQL Script? What on Earth does it do?
--Jeff Moden
Change is inevitable... Change for the better is not.
March 1, 2011 at 5:51 am
opc.three (2/28/2011)
ALZDBA, I could see using xp_cmdshell to avoid pushing a 1GB file across the wire. I do agree with you, however, that sqlcmd would be preferable over osql.
mailtosaravanan.m,
xp_cmdshell may not return an error just because your call to osql encounters an error within SQL Server while running your script; i.e., osql may be saying "I succeeded in executing your script against the database server as requested," but that says nothing about the effectiveness of the script within the database server, which is what we really care about.
I would recommend capturing the return code of xp_cmdshell just in case something catastrophic occurs and the command completely fails. More importantly I would recommend capturing all output from the command into a temp table so you can inspect it after execution:
DECLARE @cmd VARCHAR(8000),
@return_code INT ;
SET @cmd = 'sqlcmd.exe command line' ;
--create temp table to hold results
CREATE TABLE #temp
(
result_line VARCHAR(8000)
) ;
--capture results in temp table
INSERT #temp
EXEC @return_code = master.sys.xp_cmdshell
@cmd = @cmd ;
Essentially, I need to back up a database from SQL Server 2008 and restore it into SQL Server 2005. Hence I decided to generate SQL scripts for the data and run them against SQL 2005. Since the script executes 10 million SQL statements, it would be really difficult to identify the exception details from the temp table.
Can you suggest some alternatives?
March 1, 2011 at 6:24 am
Why aren't you using an SSIS package to do that for you in a performant way?
Johan
March 1, 2011 at 6:56 am
Saravanan M (3/1/2011)
Essentially, I need to back up a database from SQL Server 2008 and restore it into SQL Server 2005. Hence I decided to generate SQL scripts for the data and run them against SQL 2005. Since the script executes 10 million SQL statements, it would be really difficult to identify the exception details from the temp table. Can you suggest some alternatives?
Yes I can! 😀 BCP the data out to a file in "native" format and BCP it back in. It will be MUCH faster. Behind the scenes, it's what Replication uses for the initial snapshot.
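A sketch of what that might look like from the command line (server, database, and table names are placeholders; -n selects native format, -T uses a trusted connection, and -E preserves identity values on the way back in):

```
bcp MyDb.dbo.MyTable out MyTable.dat -n -S SourceServer2008 -T
bcp MyDb.dbo.MyTable in  MyTable.dat -n -S TargetServer2005 -T -E
```

Note that loading native-format data exported from a 2008 instance into a 2005 instance assumes the table schema and column types are compatible on both sides.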
--Jeff Moden
Change is inevitable... Change for the better is not.