October 20, 2004 at 3:05 pm
When I run this bulk insert embedded in a stored procedure as a scheduled SQL job, the first "bad data in one field on only 1 record" fails the job.
When I run it via Query Analyzer, the results pane reports that there was a bad record, but the rest of the records are loaded without aborting the load.
Any ideas????
BULK INSERT CARECLMTC_WRK FROM 'G:\CARESTAT\H7TCLMTC.DATA' WITH (BATCHSIZE=1000, TABLOCK, MAXERRORS=10, DATAFILETYPE='char', FORMATFILE='E:\scripts\Load_CARECLMTC.fmt')
October 20, 2004 at 6:40 pm
My understanding is that scheduled SQL job steps run as transactions, i.e. they either fail or succeed. Partial success is not a possible outcome, as it appears to be in QA, and this may explain the different behaviour you are experiencing.
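If that is the case, one way round it (a minimal sketch only, reusing the table, file and format-file names from the post above, and assuming the error is not severe enough to abort the whole batch, in which case the check below never runs) is to test @@ERROR inside the stored procedure and decide explicitly whether the step should fail:

-- Sketch: let the procedure, not the job step, decide what counts as failure
DECLARE @err int

BULK INSERT CARECLMTC_WRK FROM 'G:\CARESTAT\H7TCLMTC.DATA'
WITH (BATCHSIZE=1000, TABLOCK, MAXERRORS=10, DATAFILETYPE='char', FORMATFILE='E:\scripts\Load_CARECLMTC.fmt')

SET @err = @@ERROR

IF @err <> 0
BEGIN
    -- Log and carry on, or RAISERROR here if the step should fail on purpose
    PRINT 'BULK INSERT reported error ' + CAST(@err AS varchar(10))
END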
The absence of evidence is not evidence of absence
- Martin Rees
The absence of consumable DDL, sample data and desired results is, however, evidence of the absence of my response
- Phil Parkin
October 21, 2004 at 9:04 am
Another thing to point out: if the job is running an underlying DTS package, the package properties might specify that it fail on the first error.
See: Package|Properties|Logging
October 21, 2004 at 10:53 am
It may be a security issue. When running in Query Analyzer it uses your credentials; when running as a job, it uses the credentials of the service account.
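One quick check (a rough diagnostic sketch only; running xp_cmdshell needs the appropriate rights) is to compare the login each context runs under and confirm that the account the server uses to read the file can actually see it:

-- Run this both in Query Analyzer and as a job step, then compare the output
SELECT SUSER_SNAME() AS run_as_login

-- Can the account the server runs under see the data file?
EXEC master..xp_cmdshell 'dir G:\CARESTAT\H7TCLMTC.DATA'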
October 21, 2004 at 12:25 pm
As a follow-up to the security suggestion: there is a user option called XACT_ABORT that may be set differently for different credentials. It controls whether a run-time error aborts and rolls back the entire transaction.
See user options or XACT_ABORT in BOL for more info.
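For what it's worth, a small sketch of how you might check and then pin the setting inside the procedure (the 16384 bit of @@OPTIONS is the XACT_ABORT flag listed under user options in BOL):

-- Report whether XACT_ABORT is on for the current connection
IF (@@OPTIONS & 16384) <> 0
    PRINT 'XACT_ABORT is ON - a run-time error rolls back the whole batch'
ELSE
    PRINT 'XACT_ABORT is OFF'

-- Pin the behaviour explicitly rather than relying on connection defaults
SET XACT_ABORT OFF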