February 14, 2005 at 10:18 pm
Hi all,
I'm not able to handle the error in my SP. The code is as follows:
loop start ----
    select @statement1 = 'insert ' + @Table + ' values(' + @column + ')'
    exec(@statement1)
    set @m_err = @@error
    IF @m_err <> 0
    Begin
        Select @tablename
        select @errdesc = description from master..sysmessages where error = @m_err
        insert into dbo.tempLog values(STR(@m_err), @errdesc, @Tablename, STR(@lineno), @filename, getdate())
    End
loop end ----
I want to exit from the loop in case of an error, but this code is not able to handle it. Please suggest.
Thanks for the help
February 14, 2005 at 10:20 pm
What is the error you are receiving?
--------------------
Colt 45 - the original point and click interface
February 14, 2005 at 10:34 pm
Thanks Phillcart,
The error is as below.
Server: Msg 241, Level 16, State 1, Line 1
Syntax error converting datetime from character string.
But I want to process all the items in the loop. Can I just ignore the error by inserting the error details into an error log table and then process the next table/next record fetched by the cursor?
Thanks for the help!!!
February 14, 2005 at 10:42 pm
This has an introduction to handling errors in SPs:
http://www.sqlteam.com/item.asp?ItemID=2463
My Blog:
February 14, 2005 at 11:01 pm
I think you'll find that a "Syntax Error" aborts the batch.
If you want to ignore the error, why not validate the data before executing the statement? Maybe something like ISDATE() would help.
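For example, something along these lines inside the loop (a rough sketch using the variables from your procedure; @datevalue is just a placeholder for wherever you hold the date field from the file):

IF ISDATE(@datevalue) = 1
BEGIN
    SELECT @statement1 = 'insert ' + @Table + ' values(' + @column + ')'
    EXEC(@statement1)
END
ELSE
BEGIN
    -- bad date: log the row and carry on with the next one instead of letting the INSERT fail
    INSERT INTO dbo.tempLog
    VALUES ('241', 'Invalid datetime value in source row', @Table, STR(@lineno), @filename, GETDATE())
END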
On another topic altogether, why are you using cursors?? Surely you could perform better set-based data scrubbing??
--------------------
Colt 45 - the original point and click interface
February 14, 2005 at 11:06 pm
Is this related to this task you were asking about previously?
http://www.sqlservercentral.com/forums/shwmessage.aspx?forumid=8&messageid=159272&p=3
How big is the file you're importing?
--------------------
Colt 45 - the original point and click interface
February 14, 2005 at 11:39 pm
Hi Phill,
Yes, it's related to the same task, with a few changes... The file has around 50,000 records, or it can be more than that. In a few cases the table column positions don't match the source file... In that case it gives a datetime error... I want to ignore that error and proceed ahead... Is that possible? If yes, how?
Thanks
February 15, 2005 at 1:53 am
See if this helps
http://www.sommarskog.se/error-handling-II.html
http://www.sommarskog.se/error-handling-I.html
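The basic SQL 2000 pattern those articles describe is to capture @@ERROR into a local variable immediately after each statement, because the very next statement resets it. Roughly (the table and values here are only illustrative):

DECLARE @err int, @errdesc varchar(255)

INSERT INTO dbo.SomeTable (SomeDateColumn) VALUES ('20040107')
SELECT @err = @@ERROR      -- grab it straight away; any other statement clears it

IF @err <> 0
BEGIN
    SELECT @errdesc = description FROM master..sysmessages WHERE error = @err
    -- log it here, then RETURN from the procedure or BREAK out of the loop as needed
END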
--
Frank Kalis
Microsoft SQL Server MVP
Webmaster: http://www.insidesql.org/blogs
My blog: http://www.insidesql.org/blogs/frankkalis/
February 15, 2005 at 3:55 am
Thanks Frank,
I have taken help from the website and changed my code like this:
Begin
    select @statement1 = 'insert ' + @Tablename + ' values(' + @column + ')'
    print '2nd'
    exec @m_err = sp_executesql @statement1
    select @@error, @m_err
    print @m_err
    IF @m_err <> 0
    Begin
        select @errdesc = description from master..sysmessages where error = @m_err
        insert into dbo.PricingInsertErrorLog values(STR(@m_err), @errdesc, @Tablename, STR(@lineno), @filename, getdate())
    End
End
But it still can't trap the error.
Thanks for the help... please help me solve the problem.
February 15, 2005 at 3:16 pm
Ok, given the small amount of records, how about running a short VBScript over the file to split it into a separate file for each table. Once that is done you can import each file into its own table using the '|' character as the field separator. Having the data in separate fields will allow you to do some basic data scrubbing as well. You'll probably eliminate a lot of the dynamic querying and you'll have the ability to index separate fields which would in turn increase performance. Also, with each table in a separate file, you'll have the opportunity to process files in parallel.
I've taken this approach in a couple of projects I've worked on. We used to receive a file that had about 20 different record types across approx 100,000 records. The VBScript split out all the different record types into separate files and then fired off 20-odd scheduled jobs to import the files. It worked out pretty well. When they added a couple of new record types to the file it was as easy as creating another import routine for each of the new types and changing the VBScript to call those additional routines.
--------------------
Colt 45 - the original point and click interface
February 15, 2005 at 7:48 pm
Yes Phill, I think you are right... it will increase performance... but in that approach you will also read the file... you will generate an insert statement like this:
insert Table1 values('666','AAAA','08','03','k004','5300001702','FFFF','99991231','20040107','0000339130')
Here the values are out of position, so I am trying to put a datetime value into a char(4) column... and it's generating an error...
Actually it will work when we supply it like this:
insert Table1 values('666','AAAA','08','03','k004','5300001702','99991231','20040107','FFFF','0000339130')
But please consider that this is dynamic SQL... and we have only 15-20 records like this out of 50,000 or more.
If you think your idea can still solve my problem, please help me implement it.
Thanks for the help.
February 15, 2005 at 8:47 pm
You won't need the dynamic SQL.
From your previous post on this problem, you have a data file that looks like this,
A222|ZS08 |0005 |H350 |GENERAL_REPAIR |A |28.02.2004|01.02.2004|0000352836 |000
A223|ZS16 |0005 |02 |03 |03 |20W50FOL080KLG |31.12.2001|01.01.2001|0000044759
A222|ZS01 |0002 |02 |VG602-11311 |31.12.9999|01.01.2002|0000054528 |
A940|ZS01 |0002 |03 |01783-03760 |31.12.9999|20.09.2001|0000029494 |
A940|ZS01 |0002 |03 |04111-06012 |31.12.9999|20.09.2001|0000029495 |
A940|ZS01 |0002 |03 |04111-16124 |31.12.9999|20.09.2001|0000029496 |
Ok, so disregarding all the different columns and datatypes, you just use the first 4 characters to split up the file. The above data would end up in three separate files, A222.txt, A223.txt and A940.txt, and because they are in different files you don't need the first five characters in the file anymore, so the A940.txt file would just contain,
ZS01 |0002 |03 |01783-03760 |31.12.9999|20.09.2001|0000029494 |
ZS01 |0002 |03 |04111-06012 |31.12.9999|20.09.2001|0000029495 |
ZS01 |0002 |03 |04111-16124 |31.12.9999|20.09.2001|0000029496 |
This then becomes a very straightforward DTS datapump task to import the data into an "A940" worktable. Use the '|' character as the separator and you have the data separated into separate fields. It's probably a good idea to have these fields defined as varchar, that way you shouldn't run into any data conversion problems during the first import. Then you can run various data-scrubbing routines ( ISDATE, ISNUMERIC, etc... ) across the records to filter out, or correct, invalid data.
Then instead of executing an "INSERT ... VALUES ..." you execute "INSERT INTO ... SELECT .... FROM ... "
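Something like this, once the worktable is loaded (all table and column names here are only examples, and you may need SET DATEFORMAT dmy so ISDATE accepts the dd.mm.yyyy values from the file):

SET DATEFORMAT dmy

INSERT INTO dbo.A940 (Col1, Col2, Col3, PartNo, ValidTo, ValidFrom, DocNo)
SELECT Col1, Col2, Col3, PartNo,
       CONVERT(datetime, ValidTo, 104),    -- style 104 = dd.mm.yyyy, matching the file
       CONVERT(datetime, ValidFrom, 104),
       DocNo
FROM dbo.A940_Work
WHERE ISDATE(ValidTo) = 1
  AND ISDATE(ValidFrom) = 1

Rows that fail the ISDATE check stay behind in the worktable, so you can write those into your error log table instead of losing them.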
--------------------
Colt 45 - the original point and click interface
February 15, 2005 at 9:10 pm
According to you,
First I need to break down the flat file into separate smaller files, one per table. Then I need to execute a separate DTS task to transfer each one into the db.
Thanks
February 15, 2005 at 9:26 pm
That's right. That way you eliminate the frustrations you're experiencing now with poor performance, data quality, etc...
For the import part, you can have a single DTS package with multiple datapumps if you like.
--------------------
Colt 45 - the original point and click interface
February 15, 2005 at 9:47 pm
I am not very good with DTS... But as soon as I create a separate text file for each table... I may be able to handle it.
Please help me with splitting the data out into the text files...
Thanks