August 29, 2002 at 5:01 pm
I have a simple DTS package that reads a simple Excel file (only a few lines of headers plus one line of data) into a SQL Server database. See my code below:
SELECT DISTINCT F2 AS base_scenario_id, F4 AS ClientID, F5 AS ScenarioName
FROM [scenarioDetail$]
WHERE (F4 IS NOT NULL)
For some reason, not all of the data reads correctly. Here is what my Excel file looks like:
,,ScenarioDetail
,,,,
,Base ScenarioID,,ClientID,ScenarioName,,,
,1,,1,test 1,,,
But when I preview the results, it only shows
1 (from ClientID) and "test 1" (from ScenarioName). I don't know why it does not read the first 1 (the Base ScenarioID) correctly. Any ideas?
Thanks.
August 29, 2002 at 7:25 pm
When DTS parses Excel, it generates a column mapping. When you connect to the file in the transformation, the Excel tab should show you what each data type is and how the values came through. Check to make sure the mappings match and try again. Sorry, no server here, so this is a little vague.
"Don't roll your eyes at me. I will tape them in place." (Teacher on Boston Public)
September 5, 2002 at 6:27 pm
Antare:
Thank you for the input based on just my description of the problem.
You are right, the mapping in the transformation is very important. However, even when the mapping is 100% correct, the data load can still have problems. Here are a few bad experiences I have had:
1. DTS can't read the 2nd column in the Excel file. I had to move the data from the 2nd to the 3rd column -- no clue at all what the difference is between the 2nd and 3rd columns.
2. When there is only one row of data, DTS can't read any data at all. But it reads fine when there is more than one row.
3. Sometimes it misses the 3rd column in the title line, but misses the 5th column in the data line. It's just really unpredictable.
Do you know of anything that can ensure stable data reading? Is there any guideline for the title line (below is what I mean by title line)?
First Second Third <--- title line
A B C <--- data
September 10, 2002 at 10:18 am
Importing from Excel is always an adventure and rarely a pleasant one. First, I recommend you go to http://www.sqldts.com and search on "Excel" to see some of the issues.
quote:
2. When there is only one row of data, DTS can't read any data at all. But it reads fine when there is more than one row.
Just to be sure we're talking about the same thing here, do you mean one row in addition to the header row? For example, if I have a spreadsheet:
A
1 Row 1
2 Test
3 Something
By default, if I import this, only "Test" and "Something" would be imported, because DTS (or, more accurately, the driver DTS is using) treats row 1 as a header row. A couple of the articles at SQLDts.com tell how to get around this.
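If it helps while you're there: the usual workaround I've seen (from memory, so double-check the exact syntax against those articles) is to set HDR=NO in the Excel connection's extended properties so the first row is treated as data rather than a header, and IMEX=1 to force mixed-type columns through as text. Something like this (the file path is made up; substitute your own):

```text
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\MyFiles\scenarioDetail.xls;
Extended Properties="Excel 8.0;HDR=NO;IMEX=1"
```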
The other two problems I couldn't reproduce; I would probably have to see your actual spreadsheet.
quote:
Do you know of anything that can ensure stable data reading?
No. I don't mean to sound flippant, but by their nature spreadsheets are unstable compared to databases. The spreadsheet owner can enter anything he wants and change the type/structure at will. For example, say column A has the following:
805
22
Happy
Birthday
(blank)
12/25/2002
8888.993
How should DTS interpret this? As a varchar, with all the numbers coming in as text? As numbers, with the text and the blank coming in as nulls?
And the answer is: it depends! I copied the above data into a sheet, into rows 2 through 8 of columns A and B respectively, with column A in the default (General) format and column B formatted as text. Results:
1) It ignored the 805 in both columns, and actually treated it as the HEADER in column B (even though it is in the 2nd row of the sheet!).
2) In column A, the text and blank were treated as null, and the date was imported as a number.
3) In column B, the blank was treated as a null, and everything else came in as straight text.
Conclusion: I didn't really do enough testing to conclude anything concrete, except that DTS seems to depend on column formatting to determine how it interprets values.
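For what it's worth, the guessing behavior above can be sketched roughly in code. This is my own simplification, assuming only the widely documented sampling behavior of the Jet Excel driver (the TypeGuessRows setting, default 8 rows); it does not model the date-to-number conversion or the header quirk from my experiment, and none of these function names come from the driver itself.

```python
# Rough simulation (my own code, not the actual Jet driver) of the
# "sample the first rows, pick a majority type, NULL the rest" behavior.
TYPE_GUESS_ROWS = 8  # the driver's documented sampling default

def is_number(v):
    try:
        float(v)
        return True
    except ValueError:
        return False

def guess_type(values):
    """Majority vote over the non-blank sample rows: number vs text."""
    sample = [v for v in values[:TYPE_GUESS_ROWS] if v != ""]
    numeric = sum(1 for v in sample if is_number(v))
    return "number" if numeric >= len(sample) - numeric else "text"

def import_column(values):
    kind = guess_type(values)
    out = []
    for v in values:
        if v == "":
            out.append(None)  # blanks come through as NULL either way
        elif kind == "number":
            # values that don't fit the guessed type are dropped to NULL
            out.append(float(v) if is_number(v) else None)
        else:
            out.append(str(v))
    return out

# The column from the example above (General format, blank row included)
col = ["805", "22", "Happy", "Birthday", "", "12/25/2002", "8888.993"]
print(import_column(col))
# → [805.0, 22.0, None, None, None, None, 8888.993]
```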
When I'm asked to DTS spreadsheet data, the first thing I do is see if there's any way they can send the data as text rather than a spreadsheet. If it's end users and they don't know what I'm talking about, I take the sheets and dump them to text, comma-delimited if at all possible. There's just too much left to chance with spreadsheets.
Good luck.