June 17, 2006 at 5:43 am
Hi,
My fact table contains 70 million rows.
The machine is x64 with 44 GB of memory and 8 CPUs.
The processing fails after 2 hours.
Have I passed the maximum row limit?
Is there a way to overcome the problem?
thanks.
ron
June 19, 2006 at 10:09 am
Is there a particular error reported when the proc fails?
DEX
😀
The more you help the business, the more they will help you...well sometimes anyway.
June 28, 2006 at 9:42 pm
You are crazy!!! That's too big a table!!!!
June 29, 2006 at 12:00 am
Monitor your memory on the SQL Server side. We had an issue about 6 months ago where the tables (views) that we query began to consume all server memory. Available server memory would drop to 80-100 MB and bounce around there for a while (I presume swapping out to the page file), and then the cube would return a failure. We resolved this by going from 2 GB of server memory to 4 GB utilizing the /3GB switch. (We have a dual-server setup: one AS server with 2 GB RAM and one SQL server with 4 GB RAM.)
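If you want to watch for that same memory drop while the cube is processing, one simple way is to poll available physical memory on the server and flag when it falls toward the level described above. This is only a minimal sketch in Python using psutil; the threshold and polling interval are arbitrary values I picked for illustration, not anything from this thread.

import time
import psutil

def watch_available_memory(threshold_mb=100, interval_s=30):
    """Poll available physical memory and warn when it falls below a threshold."""
    while True:
        available_mb = psutil.virtual_memory().available / (1024 * 1024)
        print(f"available memory: {available_mb:.0f} MB")
        if available_mb < threshold_mb:
            print("WARNING: available memory is below the threshold; processing may be about to fail")
        time.sleep(interval_s)

# Run on the database server while the cube is processing, e.g.:
# watch_available_memory(threshold_mb=200, interval_s=60)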
Afterwards, we reduced the number of rows by splitting the cube into two separate cubes based on business unit, and did some tweaking with the views to reduce complexity. We've further reduced the problem by partitioning the cube based on year.
Once your fact table reaches a certain number of rows (I can't really tell you an exact figure, but I'd say 10-20 million is a good threshold), you should look for ways to avoid processing it all in one fell swoop. Break it up somehow and spread the load out.
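To make the "break it up" idea concrete: the usual approach is to bind each partition to a date-bounded query against the fact table, so that no single processing job touches all 70 million rows at once. Here is a rough sketch of generating those per-month binding queries; FactUsage and UsageDate are made-up names for illustration, substitute your own fact table and date column.

from datetime import date

def month_ranges(start: date, end: date):
    """Yield (first_day, first_day_of_next_month) pairs covering [start, end)."""
    current = date(start.year, start.month, 1)
    while current < end:
        nxt = date(current.year + 1, 1, 1) if current.month == 12 else date(current.year, current.month + 1, 1)
        yield current, nxt
        current = nxt

def partition_queries(start: date, end: date):
    """Build one source query per monthly partition of the (hypothetical) fact table."""
    for lo, hi in month_ranges(start, end):
        name = f"FactUsage_{lo:%Y%m}"
        sql = (f"SELECT * FROM FactUsage "
               f"WHERE UsageDate >= '{lo:%Y-%m-%d}' AND UsageDate < '{hi:%Y-%m-%d}'")
        yield name, sql

for name, sql in partition_queries(date(2006, 1, 1), date(2006, 7, 1)):
    print(name, "->", sql)

Each generated query becomes the source for one partition, and you can then process the partitions individually instead of the whole measure group in one pass.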
June 29, 2006 at 8:22 am
As per Jay, I would definitely be looking at using partitions (probably many, many partitions) to load this up.
Steve.
June 29, 2006 at 8:29 am
Thank you all.
The business is telecom.
Many features per subscriber.
It worked really well until last month.
I already work with partitions, monthly.