February 27, 2011 at 6:33 pm
I have a 1,000-row text file with five columns, like this:
7564829516842860  -1.00  08/10/10  768  Fwd Event
0946758927561238  15.00  09/07/10  144  Fwd Cash
It needs to be pulled into a SQL Server table with the following definition:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[Occurances](
[EventSequenceNbr] [int] IDENTITY(1,1) NOT NULL,
[AccountNbr] [varchar](16) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[TotalAmt] [money] NULL,
[Flag] [char](1) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[PostingDate] [datetime] NULL,
[EventCode] [char](4) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[description] [varchar](255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
[EventDate] [char](6) COLLATE SQL_Latin1_General_CP1_CI_AS NULL
) ON [PRIMARY]
GO
SET ANSI_PADDING OFF
Column mapping is as follows
Column0 --> [AccountNbr]
Column1 --> [TotalAmt]
Column2 --> [PostingDate]
Column3 --> [EventCode]
Column4 --> [description]
I am using a Data Flow task with the text file as the source,
verified that all columns are being identified,
and added an OLE DB destination.
The task failed with a conversion error.
Then I added a Derived Column transformation and mapped the columns accordingly.
For the troubling date column I used the following expression:
(DT_DBTIMESTAMP)(SUBSTRING([date string], 7, 4) + "-" + SUBSTRING([date string], 1, 2) + "-" + SUBSTRING([date string], 4, 2))
But I still get the error
'Error at the column2 failed converting the value to DATE format'
Please suggest, this is urgent.
[font="Verdana"]
Today is the tomorrow you worried about yesterday:-)[/font]
February 27, 2011 at 10:13 pm
If possible, take the date in as varchar while inserting, and later you can convert your date field to datetime.
Just giving an example of how to convert:
declare @date varchar(100) = '08/10/10'
print @date
-- CONVERT validates and converts the string to the date type;
-- assigning it back to the varchar variable turns it back into a string
set @date = CONVERT(date, @date)
print 'new date is ' + @date
So similarly, using an UPDATE statement you can convert your column values from varchar to date or datetime.
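For example, here is a minimal sketch of that approach, assuming the file is first loaded into a hypothetical staging table dbo.Occurances_Staging that has the same columns as dbo.Occurances but keeps PostingDate as varchar(10) so the load itself never fails:

-- Hypothetical staging table: dbo.Occurances_Staging, with PostingDate as varchar(10)
INSERT INTO dbo.Occurances (AccountNbr, TotalAmt, PostingDate, EventCode, [description])
SELECT AccountNbr,
       TotalAmt,
       CONVERT(datetime, PostingDate, 1),  -- style 1 = mm/dd/yy, matching the file
       EventCode,
       [description]
FROM dbo.Occurances_Staging;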
----------
Ashish
February 28, 2011 at 1:57 am
Take a look at the following topic:
I would suggest doing the conversion in the data flow, as you were already trying to.
As the data is already in memory, a derived column operation (which is non-blocking) is blazingly fast.
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
February 28, 2011 at 7:58 am
Open the Flat File Connection Manager Editor; in the Advanced section, rename each column and set the DataType and OutputColumnWidth according to your destination table.
Overall, it should look like this:
Column      Rename to        DataType                              OutputColumnWidth
Column0 --> [AccountNbr]     string [DT_STR]                       16
Column1 --> [TotalAmt]       currency [DT_CY]
Column2 --> [PostingDate]    database timestamp [DT_DBTIMESTAMP]
Column3 --> [EventCode]      string [DT_STR]                       4
Column4 --> [description]    string [DT_STR]                       255
This way, you don't need a Derived Column transformation. I tried your sample data without problems.
February 28, 2011 at 10:11 am
Thank you, 'crazy4sql'. But I don't have rights to edit the table definition, else I would have tried that approach. Thanks anyway 🙂
[font="Verdana"]
Today is the tomorrow you worried about yesterday:-)[/font]
February 28, 2011 at 10:14 am
@@Koen
Thanks for the link, and nice to learn that the derived column works faster.
@@wwei
[font="Verdana"]Thank you so much for your solution [/font], It worked just like that. I appreciate your time. If you dont mind can you throw some light on 'When to use a derived column block for this kind of operation and when not to use'
Thanks all.
[font="Verdana"]
Today is the tomorrow you worried about yesterday:-)[/font]