January 14, 2011 at 2:18 pm
brian.willis (1/14/2011)
Thanks for the replies. USE DATABASE is not a part of my SQl command...it's a very simple SELECT that has a GROUP BY. Would USE DATABASE be something the package would append automatically behind the scenes, by chance, or is it only something that the user explicitly codes?When executing the sequence container(s) individually in BIDS I get mixed results. Day before yesterday all ran fine and added rows as expected. Yesterday I tweaked the GROUP BY and the container in question failed with:
"The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020."
I thought this error occurred because cached memory had run out of space, but the change to the GROUP BY was actually one that should return fewer rows, so I was confused and reverted to the original groupings. When executed it failed again with the same error, and no matter what I tried I could not get this container to execute without failure. My experience has been that the BIDS environment can sometimes be a bit buggy, so I completely deleted this container and started from scratch. It executed successfully (go figure... or my lack of knowledge is getting the best of me). The Data Flow view showed rows written to the tables from the source connection. At this point I placed all related packages for this ETL into an Agent job and executed it successfully, but no rows were written to the tables in steps 2-4 of the job (the container I was testing above is set as step #2 of the job).
Step 1 in the job (the one that actually writes to the destination tables) runs a package that uses 'table or view - fast load' option (no SQL command), and again with the same connection managers.
OK, first let's see if your query returns any data from the source. In the BIDS designer, add a grid data viewer to each of your data flow sources and run each step to verify that your query is returning data. If it is, then the problem is at the destination. Is your source a local or remote server? If it's remote, your destination should be OLE DB and not SQL Server. Are you loading data into a view or a table?
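One way to follow that advice outside the designer is to run the source query directly and check that it returns at least one row before blaming the destination. A minimal sketch (the connection string, database, and query below are placeholders, not details from this thread):

```python
def source_returns_rows(cursor, query):
    """Execute the source query and report whether it yields at least one row."""
    cursor.execute(query)
    return cursor.fetchone() is not None

# Usage sketch (requires pyodbc and a reachable SQL Server -- all names
# below are assumptions for illustration, untested here):
# import pyodbc
# conn = pyodbc.connect("Driver={SQL Server};Server=localhost;"
#                       "Database=SourceDb;Trusted_Connection=yes;")
# print(source_returns_rows(conn.cursor(),
#       "SELECT SomeColumn, COUNT(*) FROM dbo.SomeTable GROUP BY SomeColumn"))
```

If this reports no rows, the problem is upstream of the package; if it reports rows, the data viewer in BIDS should show the same thing and the destination becomes the suspect.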
Thanks
MCTS: BI 2008, MCITP: BI 2008
Stay Thirsty My Friends
January 17, 2011 at 9:14 am
So I said I was a novice, right? lol... it appears my problem was a bit of user error. I inherited this package from someone no longer with the company, and in trying to make it more efficient I failed at first to notice my destinations in steps 2-4... same table names but totally different schemas.
I've adjusted and the job appears to be running normally now, with rows being loaded into their rightful places. Thanks for your input though... I had never used the data viewers, so those will be very handy in the future.
Cheers!
January 17, 2011 at 11:16 am
You can't learn how to run without falling from time to time. 😀
Glad you found the solution!
Need an answer? No, you need a question
My blog at https://sqlkover.com.
MCSE Business Intelligence - Microsoft Data Platform MVP
January 18, 2011 at 9:01 am
brian.willis (1/17/2011)
So I said I was a novice, right? lol... it appears my problem was a bit of user error. I inherited this package from someone no longer with the company, and in trying to make it more efficient I failed at first to notice my destinations in steps 2-4... same table names but totally different schemas. I've adjusted and the job appears to be running normally now, with rows being loaded into their rightful places. Thanks for your input though... I had never used the data viewers, so those will be very handy in the future.
Cheers!
Anytime.
It's interesting that you didn't get an error that the tables didn't exist. Anyway, I am glad you figured out the problem, and believe me, that is the best way to get grounded in this business - learning by making mistakes. Also, get familiar with the data viewers; they are a great tool for troubleshooting data flow in your packages.
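The mix-up above — identical table names under different schemas — is easy to check for once you have the metadata. A hedged sketch (in practice the (schema, table) pairs would come from querying sys.tables joined to sys.schemas; here they are sample data for illustration):

```python
from collections import defaultdict

def duplicate_table_names(rows):
    """Given (schema, table) pairs, return table names that appear
    under more than one schema, mapped to the sorted schema list."""
    by_name = defaultdict(set)
    for schema, table in rows:
        by_name[table].add(schema)
    return {t: sorted(s) for t, s in by_name.items() if len(s) > 1}

# Sample metadata rows; in practice run:
#   SELECT s.name, t.name FROM sys.tables t
#   JOIN sys.schemas s ON s.schema_id = t.schema_id
rows = [("dbo", "Sales"), ("staging", "Sales"), ("dbo", "Customers")]
print(duplicate_table_names(rows))  # {'Sales': ['dbo', 'staging']}
```

Any name that shows up in the result is a candidate for the kind of wrong-schema destination described in this thread.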
Thanks
MCTS: BI 2008, MCITP: BI 2008
Stay Thirsty My Friends
April 16, 2012 at 7:01 am
I'm experiencing the same problem too, but when I preview it shows null... no records are showing. That table has 3 records. Can you help with this?
May 21, 2012 at 12:13 pm
I too was experiencing this same issue. The resolution for me, however, was changing the source of the data flow task to ADO NET. Originally, I had it set to OLE DB. Hopefully, this helps.
-KM
June 30, 2014 at 10:57 pm
Using BIDS 2005, and I am having exactly the same problem, but in my case flat files (txt) are the source. I have left the default at zero rows to skip. The only additional thing to add in my case is that my delimiter is ASCII 0254, aka the thorn character, but the previewers for flat file sources and flat file tasks show the data populating the columns and delimiting correctly according to the supplied ASCII value... it's just not flowing to the OLE DB destination.
execution results:
"[Flat File Source [64]] Warning: The end of the data file was reached while reading header rows. Make sure the header row delimiter and the number of header rows to skip are correct."
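That warning usually means the header row delimiter configured in the Flat File Connection Manager doesn't match what is actually in the file. A quick way to check is to inspect the raw bytes yourself. A minimal sketch, assuming the thorn delimiter is byte 0xFE (decimal 254, as in Latin-1) and CRLF row endings (the file name is a placeholder):

```python
def inspect_flat_file(path, column_delim=b"\xfe", row_delim=b"\r\n"):
    """Read the first chunk of a flat file and report how many row
    delimiters it contains and how many columns the first row has."""
    with open(path, "rb") as f:
        chunk = f.read(4096)
    return {
        "rows_seen": chunk.count(row_delim),
        "columns_in_first_row": chunk.split(row_delim)[0].count(column_delim) + 1,
    }

# Usage sketch -- file name is an assumption:
# print(inspect_flat_file("extract.txt"))
```

If `rows_seen` comes back as 0, the row delimiter in the connection manager is the mismatch, which would explain the package reaching end of file while still "reading header rows".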
Would really appreciate help.
June 1, 2016 at 5:52 am
The SSIS package executes successfully, and with the data viewer's help I can see the data, but in SQL Server the data is not showing - only the tables are showing.
June 1, 2016 at 7:53 am
Is this a question? This is a very old thread; you should start a new thread and give us a bit more detail about what you are attempting to do.
April 9, 2019 at 7:21 pm
I am using SSIS 2015 and receive the following:
[OLE DB Destination 3 [791]] Information: The final commit for the data insertion in "OLE DB Destination 3" has ended.
[SSIS.Pipeline] Information: "OLE DB Destination 3" wrote 0 rows.
The SSIS flat file source shows rows sent to the OLE DB destination, but nothing loads and there are no errors.
I am using "OpenRowset Using FastLoad"
April 9, 2019 at 8:10 pm
I fixed it by closing and reopening the package and rebuilding.