April 26, 2007 at 2:48 pm
I have a DTS development task.
I have to get data from a file into the database but here comes the catch.
There are two Excel files with the following columns:
1. col1, col2, colx
2. col1, col2, colx
Col1 and Col2 are the same in both files; Colx contains Quantity in one file and Price in the other.
I need to create a single DTS package capable of reading these files, which have the same names, and inserting the data into a table depending on which Colx is present.
Any ideas on how to achieve this?
April 26, 2007 at 3:54 pm
There may be a better way, but here's what I would start with.
1. Create 2 staging tables, one for the Quantity data and one for the Price data.
2. Import the files into the respective tables.
3. Insert rows into the destination table by joining the 2 staging tables together by col1 and col2:
INSERT INTO DEST(col1, col2, Quantity, Price)
SELECT t1.Col1, t1.Col2, t1.Quantity, t2.Price
FROM Quantity_Stage t1
INNER JOIN Price_Stage t2
ON t1.Col1 = t2.Col1 AND t1.Col2 = t2.Col2
WHERE <your filtering criteria here>
4. Drop staging tables.
April 26, 2007 at 4:45 pm
Is there a way to read the header columns of the Excel file using an ActiveX script and, based on that, map the values from the Excel file into different columns of the table? Let me explain what I mean in steps:
1. The ActiveX script reads the header of the Excel file.
2. Note that the Excel file name does not change (but the contents do).
3. Based on the type of the third column, whether (price) or (qty),
I load the file contents into the table.
4. The table format is col1, col2, colPrice, colQty.
5. So the third Colx, as mentioned in my previous post, can be mapped into this table accordingly.
Is this possible?
April 27, 2007 at 8:57 am
I don't know if you can read the column names in via ActiveX, though there may be a way that I am not aware of. What about using the data itself to determine whether it is quantity or price? Will all of the price values be currency? Read the data in, and if there is a decimal in the value, you know it is price; if it is a whole integer, you know it is quantity. Will that work for you?
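To illustrate that heuristic outside DTS (a Python sketch, just to show the logic; the function name and sample values are made up, and it assumes every value in the third column parses as a number):

```python
# Sketch: decide whether the third column holds Price or Quantity
# by checking its values for a fractional part, as suggested above.

def classify_third_column(values):
    """Return 'price' if any value has a decimal part, else 'quantity'."""
    for v in values:
        if float(v) != int(float(v)):  # fractional part -> currency value
            return "price"
    return "quantity"

print(classify_third_column(["10", "3", "7"]))        # whole numbers
print(classify_third_column(["10.50", "3", "7.25"]))  # has decimals
```

Note the caveat in the thread: this only works if prices always carry decimals; a whole-dollar price file would be misclassified as quantity.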
April 27, 2007 at 6:35 pm
You can get the names through ActiveX, but it is a bit of a task. You could also use a Data Driven Query task to create a single table from the first file (with a nulled fourth column), then use that task to do a lookup for each record and update the corresponding record with the related value. You'll find that the previously mentioned method, loading two staging tables and then joining, is the easiest prior to SQL 2005 and Integration Services.
May 1, 2007 at 9:43 am
I have reached this point; now I need some help.
While reading the values from the file into the source table I am using the following script:
'**********************************************************************
' Visual Basic Transformation Script
'************************************************************************
' Copy each source column to the destination column
Function Main()
    DTSDestination("Col1") = DTSSource("Col1")
    DTSDestination("Col2") = DTSSource("Col2")
    DTSDestination("Col3") = DTSSource("Col3")
    ' If the row carries a quantity value, map it; otherwise map revenue
    If Not IsNull(DTSSource("Total Qty")) Then
        DTSDestination("Mixed") = DTSSource("Total Qty")
    Else
        DTSDestination("Mixed") = DTSSource("Total Rev")
    End If
    Main = DTSTransformStat_OK
End Function
The problem here is that DTSSource("Col") gives the values, row by row, of the column named in the double quotes. Is there any way to get the names of the columns in the Excel file, i.e. the first row of the Excel file?
Maybe something like a Get DTSSource Column function?
July 23, 2007 at 2:38 am
I have a similar problem to the one above.
I tried setting the Extended Properties of my Excel file connection to
HDR=NO
which makes the first row (the column headers) be read as data.
However, this doesn't really meet my user requirement: in a column that stores numeric values, the header name of that column (which is a string) goes missing, and that defeats my goal, because I want to check the header against a predefined standard every time the user uploads the Excel file.
Setting IMEX=1 in Extended Properties does not work well either: although it can read the previously hidden header names, the numeric values in that column come through as rounded numbers, since it renders all data as strings.
I'd like to know if there is an alternative solution to this problem.
July 27, 2007 at 1:32 pm
I suggest that you convert the Excel file to a CSV first, for two reasons. First, I have had many cases where values in an Excel data source never make it to the DTS target, so I ALWAYS convert Excel to CSV and then use a text file data source in DTS.
Second, you will then be able to read the header field names as the first row of data, if that is what you want to do, though I'm not sure you really want to.
As you noticed, your ActiveX script in the data pump task processes one row at a time. You could simply check whether the source fields are null. Instead of field names, you can also refer to fields by numeric index (1 through x).
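To sketch the CSV idea outside DTS (Python here purely for illustration; the header names "Total Qty"/"Total Rev" are taken from the earlier post, and the sample data is made up):

```python
import csv
import io

# Sketch: once the Excel file is saved as CSV, the header is just the
# first row of data, so you can read it and decide where the third
# field should be mapped.
sample = io.StringIO("col1,col2,Total Qty\nA,B,10\nC,D,20\n")

reader = csv.reader(sample)
header = next(reader)  # first row = field names
target = "ColQty" if header[2] == "Total Qty" else "ColPrice"

rows = [(r[0], r[1], r[2]) for r in reader]  # remaining rows are data
print(target, rows)
```

The same check-the-first-row approach is what the HDR=NO setting tries to give you inside DTS, but without the type-inference problems described above.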
Check SQLDTS.com for good info.
July 27, 2007 at 2:10 pm
Thanks for the conversion option; I had that in mind too, but the files are coming in from an FTP location from a client.
So I just wanted to work with the file format provided. I do agree that a CSV file offers more flexibility when handling the data.
July 30, 2007 at 9:44 am
You could have a connection to the Excel file (with the path dynamically adjustable through a Global Variable) and a SQL task with a SELECT that returns the contents of the Excel file into a recordset output parameter (Global Variable). This GV can then be queried using OpenRecordSet in ActiveX and the column names returned.
If you need any help on any of the above, just shout up.