November 7, 2002 at 9:48 pm
Hi All
Just wondering if anyone can advise how I would get a count of the records processed by the BulkInsert task.
There doesn't seem to be a property for it like the DataPump task has.
Thanks
Phill Carter
--------------------
Colt 45 - the original point and click interface
November 11, 2002 at 12:06 pm
I think @@rowcount should give you the number of records inserted.
begin tran
select * into authors2 from authors
select @@rowcount
commit tran
November 11, 2002 at 3:49 pm
quote:
I think @@rowcount should give you the number of records inserted.
begin tran
select * into authors2 from authors
select @@rowcount
commit tran
That gives me how many rows ended up in the destination table. It doesn't give me how many rows were processed by BulkInsert. In comparison, the DataPump task has ProgressRowCount, RowsComplete and RowsInError. However, I don't want to use the DataPump task because I'm importing 2 million+ rows from a text file.
Thanks
Phill Carter
--------------------
Colt 45 - the original point and click interface
November 12, 2002 at 9:13 am
Have you tried using the BatchSize option? I think if you use this option with a specific size (e.g. 1000) you can find out how many batches were processed and calculate the number of records processed.
November 12, 2002 at 3:56 pm
quote:
Have you tried using the BatchSize option? I think if you use this option with a specific size (e.g. 1000) you can find out how many batches were processed and calculate the number of records processed.
That may work, but it raises two more questions.
1) How do you know how many "batches" have been processed?
2) How do you get a count of the final partial batch?
Thanks
Phill Carter
--------------------
Colt 45 - the original point and click interface
November 12, 2002 at 4:22 pm
What about @@rowcount along with the batch size?
November 13, 2002 at 1:34 am
quote:
What about @@rowcount along with the batch size?
Let me explain my situation.
I have a file that has around 2.2 million records in it. This file will be loaded into a table via DTS using the BulkInsert task.
I need to report a count of how many records were in the file, how many were successfully imported, and how many failed.
Doing a @@ROWCOUNT on the destination table will give me how many records were imported. How do I get the rest of the info?
When using the DataPump task I can easily get all these counts using a combination of the ProgressRowCount, RowsComplete and RowsInError properties. However, using the DataPump task is not an option due to the length of time it takes to import the file.
I can write a short VBScript that'd open the file and then move through it record by record, but that'd take ages as well.
Thanks
Phill Carter
--------------------
Colt 45 - the original point and click interface
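For what it's worth, counting the records in the source file doesn't have to mean stepping through it record by record. Reading the file in large binary chunks and counting newline characters is far faster than a row-by-row loop. A rough sketch of the idea, written in Python rather than VBScript purely for illustration:

```python
# Count the records in a large text file by reading it in big binary
# chunks and counting newline bytes, instead of looping record by record.
def count_records(path, chunk_size=1 << 20):
    count = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            count += chunk.count(b"\n")
    return count
```

One caveat: if the last record has no trailing newline it won't be counted, so a production version would also check whether the file ends with a newline and adjust by one.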
December 3, 2002 at 8:59 pm
Try importing all rows into a staging table, do your validation there, then import only the rows that passed into the final table. It should still be fast, and you'll have all of the SQL Server functions available for your validation and reporting.
December 9, 2002 at 4:50 am
quote:
Try importing all rows into a staging table, do your validation there, then import only the rows that passed into the final table. It should still be fast, and you'll have all of the SQL Server functions available for your validation and reporting.
That's what I do now. What I need to do is report that all the data from the text file made it into the staging table.
Thanks
Phill Carter
--------------------
Colt 45 - the original point and click interface
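Once a physical record count of the file and a row count of the staging table are both in hand, the reconciliation itself is just arithmetic: the failure count is the difference between the two. A trivial sketch of that report (the function and field names here are hypothetical, and Python is used only to show the arithmetic):

```python
# Reconcile a bulk load: given the number of records counted in the
# source file and the number of rows that landed in the staging table,
# report how many records failed and whether the load was complete.
def load_report(file_records, staging_rows):
    failed = file_records - staging_rows
    return {
        "in_file": file_records,
        "imported": staging_rows,
        "failed": failed,
        "complete": failed == 0,
    }
```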