May 28, 2008 at 8:33 am
@pRecType="A",@pA010="XC",@pA020="270",@pA110="CDC LOCAL"
@pRecType="C",@pC010="M",@pC015="H",@pC050="20080408",@pC060="B",@pC070="E40245",@pC080="P",@pC110="95000",@pC112="000000000",@pC120="P",@pC121="N",@pC122="",@pC124="100000002274",@pC125="166759",@pC210="Y",@pC301="000061131006",@pC320="20080211",@pC321="20080211",@pC511="",@pC512="",@pC900=" ABY"
@pRecType="E",@pE100="1",@pE101="5789",@pE110="",@pE111="",@pE120="",@pE121="",@pE130="",@pE131="",@pE140="",@pE141="",@pE150="",@pE151="",@pE160="",@pE161="",@pE170="",@pE171="",@pE180="",@pE181="",@pE190="",@pE191=""
@pRecType="H",@pH110="8",@pH120="3",@pH130="1",@pH210="",@pH220="20080211",@pH230="20080211",@pH235="01",@pH240="20080211",@pH250="20080211",@pH310="",@pH320="",@pH710="",@pH711="",@pH712="",@pH510="",@pH511="",@pH512="",@pH520="",@pH521="",@pH522="",@pH530="",@pH531="",@pH532="",@pH540="",@pH541="",@pH542="",@pH550="",@pH551="",@pH552="",@pH560="",@pH561="",@pH562="",@pH570="",@pH571="",@pH572="",@pH580="",@pH581="",@pH582="",@pH590="",@pH591="",@pH592="",@pH600="",@pH601="",@pH602="",@pH610="",@pH620="",@pH630="",@pH640="",@pH650="",@pH660="",@pH670="",@pH680="",@pH690="",@pH713=""
@pRecType="M",@pM250="170"
@pRecType="P",@pP010="SE",@pP110="E",@pP130="",@pP150="721153832",@pP210="",@pP310="",@pP340="",@pP350="",@pP360="",@pP370=""
@pRecType="S",@pS010="",@pS020="200278911",@pS110="BABIN",@pS111="JENNIFER",@pS112="L",@pS120="",@pS121="",@pS125="",@pS126="",@pS127="",@pS211="BABIN",@pS212="JENNIFER",@pS213="L",@pS221="F",@pS222="19850919",@pS231="M",@pS310="1605 SOUTH SHIRLEY",@pS311="",@pS315="GONZALES",@pS316="LA",@pS317="70737",@pS410="N"
@pRecType="V",@pV130="AA",@pV140="000000100",@pV150=""
@pRecType="X",@pX010="20080211",@pX020="20080211",@pX131="45378SG",@pX133="5789",@pX134="",@pX141="000095000",@pX142="000000000",@pX147="001",@pX210="0490"
@pRecType="A",@pA010="XC",@pA020="270",@pA110="CDC LOCAL"
@pRecType="C",@pC010="M",@pC015="H",@pC050="20080408",@pC060="B",@pC070="E40245",@pC080="P",@pC110="9340",@pC112="000000000",@pC120="P",@pC121="N",@pC122="",@pC124="100000002527",@pC125="409034402",@pC210="N",@pC301="000061131007",@pC320="20080117",@pC321="20080117",@pC511="",@pC512="",@pC900=" ABY"
@pRecType="E",@pE100="1",@pE101="5856",@pE110="2",@pE111="V0382",@pE120="",@pE121="",@pE130="",@pE131="",@pE140="",@pE141="",@pE150="",@pE151="",@pE160="",@pE161="",@pE170="",@pE171="",@pE180="",@pE181="",@pE190="",@pE191=""
@pRecType="H",@pH110="7",@pH120="2",@pH130="1",@pH210="",@pH220="",@pH230="20080117",@pH235="",@pH240="20080117",@pH250="20080117",@pH310="",@pH320="",@pH710="",@pH711="",@pH712="",@pH510="33",@pH511="20071211",@pH512="",@pH520="",@pH521="",@pH522="",@pH530="",@pH531="",@pH532="",@pH540="",@pH541="",@pH542="",@pH550="",@pH551="",@pH552="",@pH560="",@pH561="",@pH562="",@pH570="",@pH571="",@pH572="",@pH580="",@pH581="",@pH582="",@pH590="",@pH591="",@pH592="",@pH600="",@pH601="",@pH602="",@pH610="",@pH620="",@pH630="",@pH640="",@pH650="",@pH660="",@pH670="",@pH680="",@pH690="",@pH713=""
@pRecType="M",@pM250="170"
Above is my sample file. How do I separate the rows?
Please give me some script.
May 28, 2008 at 8:47 am
Hi,
It might help to explain a little about what you've posted... 🙂
What constitutes a row? What do all those parameters represent? A bit of background about the source etc etc?
Then maybe someone can point you in the right direction.
Cheers
Kindest Regards,
Frank Bazan
May 28, 2008 at 11:27 am
I need to separate each and every column. The column delimiter is ",", so I need some script to split out the columns.
Also, some columns are null for some rows, and I need to store those nulls in the database table, like this:
flat file source----->script task------>database table
@pRecType="A" @pA010="XC" @pA020="270" @pA110="CDC LOCAL" NULL NULL
@pRecType="C" @pC010="M" @pC015="H" @pC050="20080408" @pC060="B" @pC070="E40245"
@pRecType="A" @pA010="XC" @pA020="270" null null null
according to the input file.
I think you get my point.
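(For illustration only: the split-and-pad behaviour described above can be sketched outside SSIS. This is a hypothetical standalone example in Python, not the actual package; the maximum column count of 6 is an assumption based on the sample rows shown.)

```python
# Hypothetical sketch of the idea: split each row on "," and pad
# short rows with None so every row has the same number of columns
# (the None entries become NULLs in the destination table).

MAX_COLS = 6  # assumed maximum column count; adjust to the real file

def split_row(line, max_cols=MAX_COLS):
    """Split one flat-file row on the comma delimiter and pad with None."""
    fields = line.rstrip("\r\n").split(",")
    fields += [None] * (max_cols - len(fields))  # short rows -> NULL columns
    return fields[:max_cols]

print(split_row('@pRecType="A",@pA010="XC",@pA020="270",@pA110="CDC LOCAL"'))
print(split_row('@pRecType="M",@pM250="170"'))
```

Note this assumes no commas inside the quoted values, which holds for the sample file above.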
May 28, 2008 at 3:43 pm
Samtel, you sound mad and frustrated. Come on, this is not the place to show it, and my apologies if I am wrong.
What I understand is: you have a flat file and you want to import its data into a database table.
Your problem of keeping nulls in the database table is simple. There is an option on the OLE DB destination called
"Keep nulls". (Remember, however, that this will override any defaults defined in the table, so use it wisely.
There is also a "retain null values" option on the flat file source to keep nulls coming from the pipeline.)
I believe your real problem is uneven columns per row, i.e. a different number of columns in
each row. Yes, we have a problem in SSIS when a flat file has an uneven number of columns per row;
in DTS 2000 this was easily handled.
If this is the problem, I can post a script task that handles it.
I had the same problem, followed by more because of data types.
I would also suggest you post more details about what you are trying to do.
thanks
May 28, 2008 at 8:48 pm
Save your file as a csv file. Since the first row determines how many columns will be read in, you need to pad that row with commas.
The easiest way is to open the file in Excel and scroll over to the last column of data and put a space in the first row for that column (row 1, column BC in your case). Then save your file.
Open the file in Notepad and scroll to the end of the first row. There should be a space after the last column. You need to delete this.
Now bring the file into your program as a csv file and select "retain null values". You will not need the script.
I have found that there is a limit to how many rows will be filled out with commas. Your file is small enough that this is not an issue.
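(The manual Excel/Notepad padding step above could also be scripted. A minimal sketch, assuming the comma delimiter and no commas inside quoted values as in the sample file, that pads the first row with trailing commas up to the widest row:)

```python
# Sketch: pad the first line of the file with trailing commas so the
# flat file connection sees the maximum column count on row one.
# Assumes "," delimits columns and never appears inside a value.

def pad_first_row(lines):
    """Return the lines with the first one padded to the widest row."""
    max_cols = max(line.count(",") + 1 for line in lines)
    first_cols = lines[0].count(",") + 1
    return [lines[0] + "," * (max_cols - first_cols)] + lines[1:]

print(pad_first_row(["a,b", "1,2,3,4"]))  # first row gains two trailing commas
```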
May 28, 2008 at 9:38 pm
Kbattta, I don't agree with you. That might be a solution, but what if you load a file daily and the load is automatic? You don't want to open each day's flat file and check how many columns are in the first row. Also, I don't believe only the first row is scanned when you make the flat file connection; I believe it scans around 200 rows (you can check this with the preview button).
In SSIS it is very hard to handle uneven columns per row, because the flat file is parsed only once, for rows, for better performance. In DTS the flat file was parsed twice, once for rows and once for columns, so these uneven files could be easily handled (you can try this yourself).
I have a script component to handle this.
Another option is to write a script component that adds the missing column delimiters and then import the file; or there is something called CHARINDEX which might help solve the uneven-columns issue.
Thanks
May 29, 2008 at 12:14 pm
Hi Old hand, you are exactly correct: I have uneven columns.
I would appreciate it if you could send me the script.
May 29, 2008 at 1:25 pm
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper

Public Class ScriptMain
    Inherits UserComponent

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        'Column0 carries the whole unparsed row; pull out each token in turn
        Row.Col1 = Tokenise(Row.Column0, "|", 1)
        Row.Col2 = Tokenise(Row.Column0, "|", 2)
        Row.Col3 = Tokenise(Row.Column0, "|", 3)
        Row.Col4 = Tokenise(Row.Column0, "|", 4)
        Row.Col5 = Tokenise(Row.Column0, "|", 5)
        Row.Col6 = Tokenise(Row.Column0, "|", 6)
        Row.Col7 = Tokenise(Row.Column0, "|", 7)
        Row.Col8 = Tokenise(Row.Column0, "|", 8)
    End Sub

    'Private function that parses out one column of the whole row
    Private Function Tokenise(ByVal input As String, ByVal delimiter As String, ByVal token As Integer) As String
        Dim tokenArray As String() = input.Split(delimiter.ToCharArray()) 'Split the string by the delimiter
        If tokenArray.Length < token Then 'Protect against a request for a token that doesn't exist
            Return ""
        Else
            Return tokenArray(token - 1)
        End If
    End Function

End Class
***********
Import each row as a single column and then use this script component inside the data flow to solve it. Make sure the code is modified accordingly to cover the maximum number of columns per row.
One more issue you will see is if you have a smalldatetime data type in your SQL table, but we can handle that too via a derived column.
Thanks