February 18, 2008 at 5:07 pm
When I tried to import data from a comma-delimited flat file into SQL Server, I got the error below. If I have 25 rows in the flat file, I get just 24 rows inserted. Will you please tell me how I can solve this issue? I am using the Import/Export Wizard to import the data; the file has a .csv extension and is comma delimited. Moreover, I appended the data to an existing table, so how could I just add the remaining row? Or what would be the solution for this - do I need to do the whole import again?
Initializing Data Flow Task (Success)
- Initializing Connections (Success)
- Setting SQL Command (Success)
- Setting Source Connection (Success)
- Setting Destination Connection (Success)
- Validating (Success)
- Saving (Success)
- Prepare for Execute (Success)
- Pre-execute (Success)
Messages
Information 0x402090dc: Data Flow Task: The processing of file "\\Syndevsql\DAS\Roxane\1997-112000 Roxane\Chargebacks2000_Roxane.csv" has started.
(SQL Server Import and Export Wizard)
- Executing (Success)
- Copying to [Kms_Roxane].[dbo].[Chargebacks1997_Roxane] (Stopped)
Messages
Warning 0x8020200f: Data Flow Task: There is a partial row at the end of the file.
(SQL Server Import and Export Wizard)
Information 0x402090de: Data Flow Task: The total number of data rows processed for file "\\Syndevsql\DAS\Roxane\1997-112000 Roxane\Chargebacks2000_Roxane.csv" is 1987721.
(SQL Server Import and Export Wizard)
Information 0x402090df: Data Flow Task: The final commit for the data insertion has started.
(SQL Server Import and Export Wizard)
Information 0x402090e0: Data Flow Task: The final commit for the data insertion has ended.
(SQL Server Import and Export Wizard)
- Post-execute (Success)
Messages
Information 0x402090dd: Data Flow Task: The processing of file "\\Syndevsql\DAS\Roxane\1997-112000 Roxane\Chargebacks2000_Roxane.csv" has ended.
(SQL Server Import and Export Wizard)
- Cleanup (Success)
Messages
Information 0x4004300b: Data Flow Task: "component "Destination - Chargebacks1997_Roxane" (67)" wrote 1987720 rows.
(SQL Server Import and Export Wizard)
Thanks,
Chinna
It's the Journey which gives you Happiness, not the Destination -- Dan Millman
February 18, 2008 at 5:56 pm
If there's no confidential info in the file, I recommend you attach the file to a post and let's take a peek. Can't really tell, otherwise.
--Jeff Moden
Change is inevitable... Change for the better is not.
February 18, 2008 at 8:34 pm
Too easy. The message says exactly what it is: the last row of your flat file does not have enough fields to complete the final record, which is a long way of saying it is "incomplete".
I just reproduced the problem by trying to import this file
SYSDEVICES STATUS,0,0
default disk,1,0
physical disk,2,0
logical disk,4,0
backup device,16,0
serial writes,32,0
read only,4096,0
deferred,8192
into this table
CREATE TABLE [OLE DB Destination] (
[Column 0] VARCHAR(50),
[Column 1] VARCHAR(50),
[Column 2] VARCHAR(50)
)
with default settings for failures etc.
Please note the last row of the file, "deferred,8192": it has only two of the three fields.
Cheers,
M.Sc.IT, M.B.A, MCTS BI 2008, MCITP BI 2008, MCTS SQL Dev, CSM, CDVDM
February 18, 2008 at 8:39 pm
The interesting thing is that if, for example, the next-to-last record is incomplete, SSIS does something weird (with default settings).
So the flat file below:
SYSDEVICES STATUS,0,0
default disk,1,0
physical disk,2,0
logical disk,4,0
backup device,16,0
serial writes,32,0
read only,4096,0
deferred,8192
sdf,234,23
would produce the following table
SYSDEVICES STATUS|0|0
default disk|1|0
physical disk|2|0
logical disk|4|0
backup device|16|0
serial writes|32|0
read only|4096|0
deferred|8192 sdf|234,23
Pretty ugly, isn't it?
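Because a broken record in the middle of the file gets silently absorbed like this instead of raising an error, it can be worth validating the file before importing it. Here is a rough sketch of one way to do that in T-SQL; the table name, the file path, and the assumption that the data contains no tabs or quoted commas are all mine for illustration:

-- Stage every raw line of the file into a single-column table.
-- FIELDTERMINATOR is a tab on the assumption that the comma-delimited
-- file contains no tabs, so each whole line lands in one column.
CREATE TABLE dbo.RawLines (Line VARCHAR(8000));

BULK INSERT dbo.RawLines
FROM 'C:\temp\sysdevices.csv'   -- hypothetical path
WITH (ROWTERMINATOR = '\n', FIELDTERMINATOR = '\t');

-- A valid 3-column row has exactly 2 commas; list any line whose
-- comma count is off so it can be fixed before the real import.
-- (Naive: does not handle quoted fields that contain commas.)
SELECT Line
FROM dbo.RawLines
WHERE LEN(Line) - LEN(REPLACE(Line, ',', '')) <> 2;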
M.Sc.IT, M.B.A, MCTS BI 2008, MCITP BI 2008, MCTS SQL Dev, CSM, CDVDM
February 18, 2008 at 9:30 pm
Hey, thanks, maybe that works; I shall try it and let you know. I have an idea on that... so you just want me to insert the last row separately using an INSERT statement?
Thanks,
Chinna
It's the Journey which gives you Happiness, not the Destination -- Dan Millman
February 18, 2008 at 10:26 pm
Not really. Since we can't see your flat file, I assumed it is broken right at the end, i.e. the last line does not have enough fields to complete the record.
To test that theory, I reproduced the problem with a simple table and a similar flat file whose last record is "broken".
Please check your flat file, or better, send it to us to check.
M.Sc.IT, M.B.A, MCTS BI 2008, MCITP BI 2008, MCTS SQL Dev, CSM, CDVDM
February 19, 2008 at 12:06 am
I have hundreds of thousands of rows in my flat files, so how could I find the broken or incomplete row? I could fix it if it is at the end of the flat file; I just want to know what to do if the broken row is somewhere else in the file. Do I fix it manually, or is there another way? Do I need to run the package again, and if so, do I need to specify any condition, or should I just change the row in the flat file? If I need to run the package again I have a problem, because I appended this flat file's data to an existing table that already had data, so how could I just run it again? Please do reply; thanks in advance. I'm new to SQL Server, so sorry about that.
Thanks,
Chinna
It's the Journey which gives you Happiness, not the Destination -- Dan Millman
February 19, 2008 at 3:06 pm
Post the file that makes the process fail here and we might be able to help.
Otherwise we're just guessing.
M.Sc.IT, M.B.A, MCTS BI 2008, MCITP BI 2008, MCTS SQL Dev, CSM, CDVDM
February 19, 2008 at 3:13 pm
... just a reminder... no privacy info, please. Also, if it's a big file, please zip it. Thanks.
--Jeff Moden
Change is inevitable... Change for the better is not.
February 19, 2008 at 3:16 pm
Thanks, Valek. I have just gone through that and it worked out; I found the error.
Thanks,
Chinna
It's the Journey which gives you Happiness, not the Destination -- Dan Millman
February 19, 2008 at 8:18 pm
What was the error... a partial file transmission, just a bad row, or maybe a row count at the end of the file?
--Jeff Moden
Change is inevitable... Change for the better is not.
February 19, 2008 at 9:13 pm
Also - a couple of tips when this happens:
- Check how many rows DID process: if 8,000 rows loaded, then chances are row #8001 in the file has a problem.
- Set SSIS to redirect any error rows to another file for you to inspect. Look under Configure Error Output (on the components inside the Data Flow Task).
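For instance, with the load in the first post (1,987,721 rows processed, 1,987,720 written), something along these lines would confirm how far the load got and, once the partial last line has been repaired by hand, let you add the one missing row without re-running the whole import. This is only a sketch: the column names and values are placeholders, since we never saw the table definition.

-- How many rows actually landed in the destination table?
SELECT COUNT(*) AS RowsLoaded
FROM [Kms_Roxane].[dbo].[Chargebacks1997_Roxane];

-- After fixing the partial last line, insert just that one row instead
-- of re-importing the file. Column names and values are placeholders.
INSERT INTO [Kms_Roxane].[dbo].[Chargebacks1997_Roxane] (Col1, Col2, Col3)
VALUES ('value1', 'value2', 'value3');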
----------------------------------------------------------------------------------
Your lack of planning does not constitute an emergency on my part...unless you're my manager...or a director and above...or a really loud-spoken end-user..All right - what was my emergency again?
April 20, 2010 at 11:45 am
I have a similar issue and I am running out of ideas. I am trying to load a file with over 300 million rows; sometimes it loads 126 million, and if I run it again it loads a little more or less. I have seen this can also be related to space issues (not sure if anyone else has encountered this behavior), but this time I have enough room for these records. We have verified the data and it seems to be OK.
But the strange part is that the package logs as completed every time, even though it does not load the complete source file.
Binoj
CodeRewind.com
April 19, 2011 at 4:24 am
Hi
I too am getting the same error: 'Partial record at the end of the file'.
The file has around 303,400 records. How can I find out which record got rejected or appended to other records?
Thank you
Magek
April 20, 2011 at 3:52 am
I don't know how to do it in SSIS, but I do have the same issue: a bunch of text files sent by FTP to be imported into a database daily.
With each text file there is also a one-row file sent (I demanded that); its content is the number of rows in the text file.
I use BULK INSERT to import the text files into a staging table, and then the number of rows is counted and compared to the number in the submitted "control file". Before loading the production table (with SQL substring functions) I also check the last field.
If anything goes wrong, an e-mail is sent and the procedure is stopped. I drive this from VB.NET (2005 or 2008).
I sleep well at night with this strategy, because it has prevented bad-quality text files from being accepted.
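Roughly, the count check looks like this in T-SQL. The file paths and table names below are made up for illustration, and the e-mail step lives in the VB.NET wrapper:

-- Staging tables: one for the raw data, one for the one-row control file.
CREATE TABLE dbo.Staging_Import  (RawLine VARCHAR(8000));
CREATE TABLE dbo.Staging_Control (ExpectedRows INT);

-- Load whole lines; a tab FIELDTERMINATOR assumes the data has no tabs.
BULK INSERT dbo.Staging_Import
FROM '\\ftpserver\incoming\data.txt'           -- hypothetical path
WITH (ROWTERMINATOR = '\n', FIELDTERMINATOR = '\t');

BULK INSERT dbo.Staging_Control
FROM '\\ftpserver\incoming\data_control.txt'   -- hypothetical path
WITH (ROWTERMINATOR = '\n');

-- Compare the loaded row count with the count the sender promised.
DECLARE @Expected INT, @Actual INT;
SELECT @Expected = ExpectedRows FROM dbo.Staging_Control;
SELECT @Actual   = COUNT(*)     FROM dbo.Staging_Import;

IF @Expected <> @Actual
BEGIN
    -- In the real setup an e-mail goes out here and the load is aborted.
    RAISERROR('Row count mismatch: expected %d, loaded %d.', 16, 1,
              @Expected, @Actual);
    RETURN;
END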
Best regards
Gosta M