Viewing 15 posts - 76 through 90 (of 470 total)
I think I've just had a brainwave.
We currently store all the document data vertically in Table1. It contains (amongst other things):-
Document name
Field name
Field value
Now, what we then do is split...
June 24, 2020 at 1:12 pm
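A minimal sketch of the vertical Table1 layout described in the post above; the Document name / Field name / Field value shape comes from the post, but the exact column names, widths and types here are assumptions:

CREATE TABLE dbo.Table1
(
    DocumentName varchar(100) NOT NULL,   -- width is an assumption
    FieldName    varchar(100) NOT NULL,   -- width is an assumption
    FieldValue   varchar(max)     NULL    -- type is an assumption
);

-- One row per document field, e.g.:
INSERT INTO dbo.Table1 (DocumentName, FieldName, FieldValue)
VALUES ('Doc001', 'CustomerName', 'Acme Ltd'),
       ('Doc001', 'OrderTotal',   '125.00');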
Hi Jeff
We have to pull in everything from the documents as we don't know what is needed for reporting purposes (orders from the higher-ups).
This is the only document that is...
June 24, 2020 at 6:07 am
That's what I was trying to avoid but it might be necessary in this case.
Should keep things interesting with the code if nothing else as we build the tables automagically.
June 23, 2020 at 1:29 pm
And the plot thickens......
I've just verified that the large_value_types_out_of_row is 1 on the table in question.
I'm still getting an error that reads:-
"Cannot create a row of size 8093 which is...
June 23, 2020 at 8:54 am
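A quick way to confirm the setting on the table mentioned in the post above is the catalog view (the table name below is an assumption). Note that this option only moves varchar(max)/nvarchar(max)/varbinary(max)/xml values off-row, so it does not help if the rest of the row is what breaches the roughly 8,060-byte in-row limit:

SELECT t.name,
       t.large_value_types_out_of_row
FROM   sys.tables AS t
WHERE  t.name = N'Table1';   -- hypothetical table name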
Aaaaarrrrrggggghhhhh!!!!!!
Thanks Jeff.
I've just checked and it's been switched off. This is potentially because we had a bug in our code which meant that the table was being dropped and recreated...
June 22, 2020 at 3:39 pm
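If the drop/recreate is what keeps resetting it, the option can be switched back on as part of whatever rebuilds the table; a minimal sketch, with the table name assumed:

EXEC sys.sp_tableoption
     @TableNamePattern = N'dbo.Table1',                 -- hypothetical table name
     @OptionName       = 'large value types out of row',
     @OptionValue      = 'ON';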
Changing the option only applies to new rows. The rows that were in the table before you changed the option will stay where they were. Could that be...
June 22, 2020 at 2:21 pm
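As a sketch of what that implies in practice: after the option is turned on, existing large values stay in-row until they are rewritten, and a self-assigning update is one common way to force them off-row (the table and column names here are assumptions):

UPDATE dbo.Table1
SET    FieldValue = FieldValue;   -- rewriting the MAX column re-stores existing values under the new option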
I actually did a drop/create on the table and then tried a reload but still ran into the same issue.
June 22, 2020 at 1:58 pm
These fields are part of the primary key:-
[pkDocumentID] varchar(30)
[pkEditedByStaffID] varchar(25)
[pkEditedDateTime] datetime
This is the only index on this table.
Like I said in my OP, I did find a method of forcing...
June 22, 2020 at 1:38 pm
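For reference, a sketch of the key as described in the post above; the column names and types come from the post, while the table name and the NOT NULL/clustering choices are assumptions:

CREATE TABLE dbo.DocumentEdits   -- hypothetical table name
(
    pkDocumentID      varchar(30) NOT NULL,
    pkEditedByStaffID varchar(25) NOT NULL,
    pkEditedDateTime  datetime    NOT NULL,
    -- ... remaining document columns ...
    CONSTRAINT PK_DocumentEdits
        PRIMARY KEY (pkDocumentID, pkEditedByStaffID, pkEditedDateTime)
);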
I've tried recoding the query with *some* success (a single run is now down from an hour to 42 minutes).
Can someone have another look and check I've got the...
May 20, 2020 at 1:48 pm
Thanks both
It looks like it's the ROW_NUMBER() section that's taking the time to work out.
Is there any sort of alternative to that bit?
I need to get the records in order...
May 20, 2020 at 6:11 am
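A sketch of the general pattern being discussed in the post above, plus one common mitigation; all object names here are assumptions, since the actual query isn't shown in this excerpt:

SELECT  d.pkDocumentID,
        d.pkEditedDateTime,
        ROW_NUMBER() OVER (PARTITION BY d.pkDocumentID
                           ORDER BY d.pkEditedDateTime) AS rn
FROM    dbo.DocumentEdits AS d;   -- hypothetical table

-- An index whose key order matches the PARTITION BY / ORDER BY columns
-- often removes the expensive sort behind ROW_NUMBER():
CREATE INDEX IX_DocumentEdits_Doc_EditDate
    ON dbo.DocumentEdits (pkDocumentID, pkEditedDateTime);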
Thanks for that.
I'll dig through our FACT table data and see what I can find.
March 16, 2020 at 8:49 am
I'm back again.
I've changed to running the XMLA in an "Analysis Services Execute DDL Task" in SSIS (I grabbed the generated XMLA from SSMS and pasted that in).
I've now got...
February 27, 2020 at 3:16 pm
Thanks for that, I'll have a look and see if it offers any improvements.
February 18, 2020 at 4:31 pm
That was the first of two cubes that have been failing but I'm fairly certain the root cause will be the same for both (so the same fix should apply).
Just...
February 18, 2020 at 4:03 pm
Looks like I'm going blind and/or stupid again.
In desperation, I ticked all of the DIMs to be processed (full) and let it run through.
That gave me an error on a...
February 18, 2020 at 3:50 pm