September 11, 2012 at 7:18 am
Argghhhhhhhh
And now I have another issue.
I go into Process, Change Settings and add the settings to log any errors. Click OK and use it.
Then when I go back, the settings have reverted to how they were before I changed them. So it doesn't seem like you can set the processing up to do this all the time, only on the fly when you want to.
Surely you should be able to set them up to always do this. What a nightmare. I thought I was onto something there.
September 11, 2012 at 8:13 am
That's a major beef of mine, having to set your settings Every Time. I'm sure there is a way around it, but, damned if I know what it is!!
Yes, the script is the same as right-clicking in AS or the SSIS component. Process Full is supposed to first process dimensions and then fact tables.
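If the dialog settings won't stick, one workaround (a sketch only; the database ID, dimension ID and log path below are placeholders, not taken from this thread) is to bake the error logging into the XMLA processing script itself. A Batch accepts an ErrorConfiguration element that applies to everything it processes, so the settings persist in the script rather than in the dialog:

```xml
<!-- Hypothetical XMLA batch: ProcessFull with persistent error logging.
     DatabaseID, DimensionID and the log file path are placeholders. -->
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Parallel>
    <Process>
      <Object>
        <DatabaseID>MyASDatabase</DatabaseID>
        <DimensionID>Provision</DimensionID>
      </Object>
      <Type>ProcessFull</Type>
    </Process>
  </Parallel>
  <ErrorConfiguration>
    <KeyErrorLimit>-1</KeyErrorLimit>                 <!-- log every key error -->
    <KeyErrorLogFile>C:\Logs\ProcessErrors.log</KeyErrorLogFile>
    <KeyErrorAction>ConvertToUnknown</KeyErrorAction> <!-- send bad keys to Unknown -->
    <KeyNotFound>ReportAndContinue</KeyNotFound>      <!-- log it, keep processing -->
  </ErrorConfiguration>
</Batch>
```

This is the same shape of script SSMS generates when you right-click, change the settings and choose Script, so running it on a schedule (Agent job or an SSIS Execute DDL task) keeps the error logging without touching the dialog each time.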
Things are moving here - I get to call Microsoft! First I will get to be reminded that it's my data, it has to be my data, well, I'm sure it's your data... Our software director (kind of the BI sponsor, the only person here that I can talk ETL/SSIS/AS with) has not weighed in on this yet.
Whatever comes of it, I'll share, as nobody deserves to be stuck with this, and everywhere on the internet "It's Your Data" is all anyone can say. Except the guy that gave up on that lame answer and resorted to rebuilding his cubes every time this happens. What bugs me most is his saying "every time this happens."
Memory utilization on the processing server has also been mentioned as a potential problem, and I noticed ours was continuously high. Got a reboot to take that down a bit, but no change on processing the cubes. Sheesh....
September 11, 2012 at 8:26 am
Nightmare.....
I'm scuppered again.... I was very excited about setting up the log files to see what happens in the overnight process,
but if you can only set them up manually I'm back to not knowing. That is rubbish design, not being able to set the processing options the way you want them used all the time.
Having to run the thing manually completely negates one of the best things about it: overnight runs.
September 11, 2012 at 9:42 am
I don't want to get too excited but I think you can set up error logging through Integration Services....
I could be wrong I need to test but.........
I shall try to not get too excited
September 11, 2012 at 10:48 am
YES, you can set up logging via SSIS:
A great article I keep handy.
The file log provider is very easy; just remember that it is a new data connection, so whoever configures the packages must be sure to check the box for it and to ensure that the file path is set.
I added Event Handlers to some fact ETLs and am digging being able to see how long different steps take, as that is visibility I don't get at run time (I fire packages against a farm of hosted servers, so in dev I can only hit one or two servers). The thing to remember there is that errors in the event handlers count as package errors (found that out through early botching).
Can't say I have tried to log anything from AS, so I will be interested in how you get along with that.
September 12, 2012 at 4:14 am
Here I go, running tests with the new logging in Integration Services.
Test 1.
The package says it's worked (all tasks go green; usually the two cubes go red and I get fail emails).
I have errors in my log files.
I got a little confused. I thought I had opened the sen.log, but I was reading about a problem with the Provision dimension, so that was confusing.
Provision Dimension
20 errors: "The attribute key cannot be found when processing"
Keys:
377630, 377611, 377651, 377632, 377633, 377636, 377646, 377628, 377623, 377604, 377655, 377618, 377654, 377640, 377643, 377601, 377635, 377634, 377624, 377616
The one thing I have noticed when running the script with the above criteria is that there are NULLs in the data, specifically in some VARCHARs.
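The usual check for "attribute key cannot be found" is to look for fact keys that have no matching dimension row. A sketch only: dbo.dim_Provision and dbo.fact_SEN_Fact are assumed names here (the fact table and DW_Prov_ID column do come up later in this thread, the dimension table name is made up):

```sql
-- Hypothetical table names: list fact keys with no match in the dimension.
-- Any rows returned are the keys AS will complain about during processing.
SELECT DISTINCT f.DW_Prov_ID
FROM dbo.fact_SEN_Fact AS f
LEFT JOIN dbo.dim_Provision AS d
       ON d.DW_Prov_ID = f.DW_Prov_ID
WHERE d.DW_Prov_ID IS NULL;
```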
Test 2. Change all the NULLs in the Provision source SQL script to blanks (dates are fine being left as NULL).
Delete the errors in the log files and try again.
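Test 2 as a sketch, for anyone following along: wrap each nullable VARCHAR in ISNULL so the dimension never receives a NULL, and leave the dates alone. Every name here except DW_Prov_ID is made up for illustration:

```sql
-- Hypothetical Provision source query: blank out NULL VARCHARs only.
SELECT
    DW_Prov_ID,
    ISNULL(Prov_Code, '')        AS Prov_Code,         -- blank instead of NULL
    ISNULL(Prov_Description, '') AS Prov_Description,  -- blank instead of NULL
    Prov_Start_Date                                    -- dates are fine as NULL
FROM dbo.Provision_Source;
```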
I have found another issue.... I can't see Provision in the IS package, but the Provision details seem to be being sent to the SEN log.
I now have different errors in the SEN dimension error log.
Table: 'fact_SEN_Fact'
Column: 'DW_Prov_ID' (but this is from the Provision dimension, NOT the SEN dimension)
Measure Group: COP Stage Pupil Business Measures
Something is not right at all……
1. Checked the Analysis Services source data for both Provision and SEN and it's fine.
2. Checked the Analysis Services dimensions and they are also fine.....
3. Added Provision.log to
\\d-db94\db94scripts\SSISDeploy\Dependencies\CAYA_DataWarehouse_dev
because I had missed this out before (whoops).
4. Checked the Analysis Services Processing task for SEN and it's attached to the SEN dimension, which writes to the SEN log... so why is it showing errors for the Provision keys? Deleted the dimension and then re-added it.
5. Added the Provision dimension to the Integration Services project BEFORE SEN.
I now have NO ERRORS!!!!!
I don't want to get too excited but I think I may have cracked it.........
I'm going to redeploy the manifest file and see what happens in the overnight run tonight. Exciting.......
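In case it helps anyone, the ordering fix in step 5 can also be expressed straight in XMLA: commands in a Batch run one after another when they are not wrapped in Parallel, so listing Provision before SEN guarantees it is processed first. The IDs below are placeholders, not the real project names:

```xml
<!-- Hypothetical sequential batch: no <Parallel>, so each Process
     completes before the next begins. IDs are placeholders. -->
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Process>
    <Object>
      <DatabaseID>MyASDatabase</DatabaseID>
      <DimensionID>Provision</DimensionID>
    </Object>
    <Type>ProcessFull</Type>
  </Process>
  <Process>
    <Object>
      <DatabaseID>MyASDatabase</DatabaseID>
      <DimensionID>SEN</DimensionID>
    </Object>
    <Type>ProcessFull</Type>
  </Process>
</Batch>
```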
September 13, 2012 at 8:33 am
Ah, Steps Forward!!!! YAAAY!!!
I was delayed in launching a case with MS and haven't learned much yet, but he did confirm that the "Attribute Key Not Found" error is rather a red herring (which we knew).
We don't allow nulls in our warehouse tables so that was not our problem and I assume that you don't have nulls anywhere for keys.
The item that may apply to both of us is some kind of string limitation. Apparently they have no specific error message for it, so it presents as this thing we are getting.
If you use strings for Keys, it could affect you. Our keys are all integers, so that's not our issue. BUT, he said that even long attribute names could hit the limit, and I'm wondering if that's our issue as the last thing I added (granted, MONTHS AGO) was a new text field to one dsv query to create a Short Name field off of a description field in the underlying dw. We do sometimes have very long attribute names which mostly has to do with creating nice naming via the user BI tool as it does strange stuff with names.
Right now MS is researching and will get back to me. The delay helps, as overnight something went wrong on that server: a one-hour package did not complete and ate log space (oh crap, I don't know anything about package use of log space), so I am heading in a different direction.
I am sooooo glad you are getting somewhere!!!!