September 28, 2002 at 1:32 am
My application treats everything SQL Server returns as an error, so I need to suppress messages while running procedures.
For example, when I run a procedure with SET NOCOUNT ON I can suppress the "rows affected" messages.
The problem is that some warnings are still returned by the procedures, like this:
Warning: Null value is eliminated by an aggregate or other SET operation.
How do I suppress this warning?
Prakash
Prakash Heda
Lead DBA Team - www.sqlfeatures.com
Video sessions on Performance Tuning and SQL 2012 HA
September 29, 2002 at 8:35 am
I'm not sure this is exactly what you are looking for, but you might try the -r option of osql.
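For example (just a sketch, with made-up server/database/procedure names and a trusted connection assumed), you could run the proc from the command line with all message output pushed to stderr and thrown away:

osql -E -S MyServer -d MyDatabase -Q "EXEC dbo.MyProc" -r1 2>NUL

The -r1 switch redirects all message output (including PRINT) to stderr, so the 2>NUL discards the warnings while the result set still comes back on stdout.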
Gregory Larsen, DBA
If you're looking for SQL Server examples, check out my website at http://www.geocities.com/sqlserverexamples
Gregory A. Larsen, MVP
September 29, 2002 at 10:45 am
Try SET ANSI_WARNINGS OFF. If you don't want to modify your procs, you could just look for the text of this particular warning and strip it out before showing it to the user (if at all).
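For example, a sketch with made-up object names (the SET takes effect for the duration of the proc and reverts when it ends):

CREATE PROCEDURE dbo.MyProc
AS
SET NOCOUNT ON        -- no "rows affected" messages
SET ANSI_WARNINGS OFF -- no "Null value is eliminated by an aggregate..." warnings
SELECT SUM(SomeColumn) FROM dbo.SomeTable
GO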
Andy
September 30, 2002 at 8:34 am
I have a similar complaint.
I'm using BULK INSERT with the following stipulation:
If any lines in the import file error on insert, I want to be able to edit those lines and rerun the BULK INSERT without getting duplicates of all the rows that inserted correctly the first time.
My first solution was to put a primary key on the table, so duplicates would not be inserted. The problem I have with this is that as soon as the procedure attempts to insert something that violates the primary key, the procedure terminates.
Is there a good way to stop the procedure from terminating on a primary key violation? I looked into the CHECK_CONSTRAINTS option of BULK INSERT, but with or without it, the procedure still terminates.
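Roughly what the load looks like (table name, file path, and delimiters are just placeholders):

BULK INSERT dbo.ImportTable
FROM 'C:\imports\daily.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')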
Thanks,
Joel
September 30, 2002 at 9:18 am
Try making it a unique index (not a primary key) and set the IGNORE_DUP_KEY option.
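Something along these lines, with placeholder table and column names:

-- unique index instead of the primary key; duplicate rows are skipped with a warning
CREATE UNIQUE INDEX IX_ImportTable_KeyCol ON dbo.ImportTable (KeyCol) WITH IGNORE_DUP_KEY

Rows that would duplicate an existing key are then discarded instead of failing the whole batch, so rerunning the BULK INSERT only adds the corrected rows.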
Andy
October 1, 2002 at 12:37 am
jraha,
Whilst creating an index with IGNORE_DUP_KEY will work, I'm wondering how you go about identifying the lines that failed on the import, and hence why you don't just fix them beforehand.
When I've got a file of dubious character, if I can, I usually import into an import table with all columns as varchars and then process from there into the proper table via an sp. If that can't be done (due to varying column numbers, etc.) then I process the file in its entirety first.
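A rough sketch of that pattern (all names and column types are made up):

-- staging table: everything varchar so nothing is rejected on load
CREATE TABLE dbo.ImportStage (Col1 varchar(255), Col2 varchar(255), Col3 varchar(255))
GO
BULK INSERT dbo.ImportStage
FROM 'C:\imports\daily.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')
GO
-- the sp then validates/converts and moves only the good rows into the real table
INSERT INTO dbo.RealTable (Col1, Col2, Col3)
SELECT Col1, CONVERT(int, Col2), CONVERT(datetime, Col3)
FROM dbo.ImportStage
WHERE ISNUMERIC(Col2) = 1 AND ISDATE(Col3) = 1
GO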
But then you may have a scenario I haven't come across.
Cheers,
Mike.
January 22, 2003 at 10:11 am
Sorry I didn't respond on this. I guess I figured something out with the help of the all-knowing Andy and ran with it.
The scenario I was dealing with inserted ~2 million rows per day, per client, and each row was ~2K, so everything I was doing there was extremely time sensitive. The process required inserting into the db, then summarizing and joining with numerous other tables. I couldn't afford the time to insert the rows _and_ then move each row to another table based on its accuracy.
I ended up using a Perl script and regular expressions to ensure that each row of the import files was in the proper format, and then ran the BULK INSERT.
-J