April 9, 2005 at 3:25 pm
I have multiple files (in one folder) to import into, say, tableA.
There are about 10,000 records across all of these files. The import should not be interrupted by any error in the transformation; for example, if there is a datatype mismatch in one row, that row should be logged somehow and the import should continue with the next row.
Does anyone have any idea how to do this?
Thanks in advance
April 11, 2005 at 5:00 am
What I have done in this situation is create a stored procedure that calls the DTS package. In the DTS package I create Global variables to handle things such as path information, and parameters to any stored procedures it might call. I then call my DTS SP from another stored procedure passing the information regarding the specific upload/download.
If you do a web search you can find a number of examples of how to call a DTS package from a stored procedure. I typically create an ActiveX task to handle opening/creating the files and reading from or writing to a table, but there are a number of ways to go.
In any case, calling the DTS package from a stored procedure will allow you to trap the errors and move on to the next record as needed.
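Just as a rough sketch of the idea (the server and package names below are placeholders, and it assumes you have rights to xp_cmdshell), one of the simplest ways to kick off a package from T-SQL is dtsrun:

CREATE PROCEDURE usp_RunImportPackage
AS
BEGIN
    -- Build the dtsrun command line:
    --   /S = server, /E = trusted connection, /N = package name
    DECLARE @cmd varchar(500)
    SET @cmd = 'DTSRun /S "(local)" /E /N "MyImportPackage"'

    -- Shell out to run the package
    EXEC master.dbo.xp_cmdshell @cmd
END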
Good luck
April 11, 2005 at 11:14 am
Incubus Online,
Before you do anything else, save the package as a different DTS package. Then...
Two things to change within the DTS:
Go to the Package menu > Disconnected Edit, then expand Tasks and click on the task that is failing due to errors. Within that task's properties you will find MaximumErrorCount; change it to a large number. Click the Close button.
Go to the Package menu > Properties and click on the Logging tab. Check "Log package execution to SQL Server" and enter the server and authentication type. Under Error handling near the bottom, click Browse and enter a directory\filename to store more of your error information.
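Once that logging is turned on, the run history goes into msdb, so you can pull back the error text with a quick query like the one below (just a sketch; the package name is a placeholder and the column names are from memory, so double-check them on your server):

-- Most recent runs of the package, with any error text recorded
SELECT name, starttime, errorcode, errordescription
FROM   msdb.dbo.sysdtspackagelog
WHERE  name = 'MyImportPackage'   -- placeholder package name
ORDER BY starttime DESC
-- msdb.dbo.sysdtssteplog holds similar detail for each step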
Good luck.
Sincerely, Shieldj
April 12, 2005 at 2:10 pm
Thank you John and shieldj
Your ideas got me started.
John, I see your point. With @@ERROR, it can be checked whether there was an error, and if there was, the erroneous row can be logged (say, in a table). But that only works when we are dealing with each row separately (like with cursors). If I do a bulk insert and a row fails, how can I trap that row? The reason I am bothered about this is that the data may potentially grow to about 5 million rows, which makes it impossible to look for the erroneous row manually.
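Just so we are talking about the same thing, this is roughly the row-by-row pattern I mean (the staging and error table names are made up, and I realize some conversion errors in SQL 2000 kill the whole batch rather than just the statement, which is part of my worry):

-- Try each staged row; if the insert fails, log the row id and carry on.
DECLARE @id int, @err int

DECLARE row_cur CURSOR FOR
    SELECT id FROM StagingTable          -- made-up staging table

OPEN row_cur
FETCH NEXT FROM row_cur INTO @id
WHILE @@FETCH_STATUS = 0
BEGIN
    INSERT INTO tableA (col1, col2)
        SELECT col1, col2
        FROM   StagingTable
        WHERE  id = @id

    SET @err = @@ERROR                   -- nonzero means that row failed
    IF @err <> 0
        INSERT INTO ImportErrors (rowid, errorcode, loggedat)   -- made-up error table
        VALUES (@id, @err, GETDATE())

    FETCH NEXT FROM row_cur INTO @id
END
CLOSE row_cur
DEALLOCATE row_cur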
Shieldj, this is very useful info. I am going to implement it and test it with some rogue data rows.
Thank you guys
One more problem: since this data is confidential, all the comparisons have to go encrypted, which makes the whole process slow. Do you know how I can improve performance on this?
April 13, 2005 at 8:30 am
In my case, I created a new table (let's say table B) with the exact same fields as the table where the data will eventually be loaded (let's say table A). However, in table B, set all fields to varchar(255) or whatever you anticipate your longest text field to be. This will import all data from the text files into table B without any errors.
Now, write another script to check the data in table B. For instance, for a datetime field you can use ISDATE, or for a numeric field you can use ISNUMERIC. If the criteria do not match, enter that data into a third table (let's say table C) and delete it from table B.
The next step is to insert the remaining data in table B into table A. Since all the datatype checks have already been done, there should not be any errors.
You can also check for duplicate data in the table if you want to.
I am writing this on the fly and am not sure if I am clear enough. However, if you need further help, let me know and I can send you some code.
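Just to make the idea concrete, here is a rough sketch (all the table and column names below are made up, so adjust them to your own):

-- Table B holds everything as varchar, so the text import itself never fails.
-- 1) Move rows that will not convert into table C, then remove them from B
INSERT INTO TableC (OrderId, OrderDate, Amount)
SELECT OrderId, OrderDate, Amount
FROM   TableB
WHERE  ISNUMERIC(OrderId) = 0 OR ISDATE(OrderDate) = 0 OR ISNUMERIC(Amount) = 0

DELETE FROM TableB
WHERE  ISNUMERIC(OrderId) = 0 OR ISDATE(OrderDate) = 0 OR ISNUMERIC(Amount) = 0

-- 2) Whatever is left in B is clean, so loading it into A should not error
INSERT INTO TableA (OrderId, OrderDate, Amount)
SELECT CONVERT(int, OrderId),
       CONVERT(datetime, OrderDate),
       CONVERT(money, Amount)
FROM   TableB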
Thanks
Eric
April 13, 2005 at 8:40 am
In your Transform Data Task, use an ActiveX transformation. In the transformation, look at each column in each row. If there is a datatype mismatch, just skip that row by returning the DTSTransformStat_SkipInsert constant from the transformation.
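For example, the transformation script might look something like this (just a sketch; "OrderDate" is a made-up column name, and a real transformation would copy the rest of your columns the same way):

Function Main()
    ' Skip rows whose source value will not convert to a date;
    ' everything else is copied through and inserted as normal.
    If Not IsDate(DTSSource("OrderDate")) Then
        Main = DTSTransformStat_SkipInsert
    Else
        DTSDestination("OrderDate") = DTSSource("OrderDate")
        Main = DTSTransformStat_OK
    End If
End Function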