April 14, 2008 at 10:44 am
Currently I have a table in our data warehouse with about 9 million records using about 2.6 GB (1035832 KB data + 1697096 KB index). It holds about 13 years of data. I've just been told by our data architect that this table will grow by about 100 million records per year, and we will retain data for 5 years. That's 500 million records! Should I be concerned? Any suggestions on steps I should take to avoid problems later?
Paul
April 14, 2008 at 1:07 pm
Hello.
Maybe other guys here can help you better than I can. What comes to mind right now is Microsoft's Project REAL and the data lifecycle documentation produced by that team.
http://www.microsoft.com/technet/prodtechnol/sql/2005/realpart.mspx
I think it's good reading, and you can pick up some useful information from there.
Hope this helps!
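The core technique the Project REAL material covers for tables like this is range partitioning on the date key with a "sliding window" for aging data out. A minimal sketch of the DDL (table, column, and boundary values here are hypothetical examples, not taken from the article):

```sql
-- Partition a fact table by month on its date column (SQL Server 2005+).
CREATE PARTITION FUNCTION pfFactDate (DATETIME)
AS RANGE RIGHT FOR VALUES ('2008-01-01', '2008-02-01', '2008-03-01');

CREATE PARTITION SCHEME psFactDate
AS PARTITION pfFactDate ALL TO ([PRIMARY]);

CREATE TABLE dbo.FactSales (
    SaleDate   DATETIME NOT NULL,
    CustomerID INT      NOT NULL,
    Amount     MONEY    NOT NULL
) ON psFactDate (SaleDate);

-- An aged-out partition can then be switched to an identically structured
-- archive table almost instantly, instead of deleting rows one by one:
-- ALTER TABLE dbo.FactSales SWITCH PARTITION 1 TO dbo.FactSales_Archive;
```

With 5 years of retention at 100 million rows per year, switching out the oldest month each period keeps purges fast and keeps queries that filter on the date column touching only the partitions they need.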
April 14, 2008 at 9:08 pm
The answer is "Yes", you should be concerned. The article at the URL Fernando provided is a very good start for planning. Not all of the information is about tables, but much of what is couched in terms of "cubes" is very good.
--Jeff Moden
Change is inevitable... Change for the better is not.
April 15, 2008 at 10:50 am
If 9 million records take about 2 GB, then 500 million records will take about 120 GB (plus or minus). So, assuming you're using relatively modern hard drives, storage shouldn't be too much of a problem.
But you'll definitely need to pay attention to the techniques Microsoft recommends on the page referenced above to make sure you'll be able to query the data with any speed at all.
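That estimate can be sanity-checked from the exact sizes in the first post; counting the index as well as the data, the projection comes out a little higher, around 145 GB:

```sql
-- Back-of-envelope projection from the figures in the original post.
DECLARE @data_kb     BIGINT;
DECLARE @index_kb    BIGINT;
DECLARE @rows_now    BIGINT;
DECLARE @rows_future BIGINT;
SET @data_kb     = 1035832;     -- current data size (KB)
SET @index_kb    = 1697096;     -- current index size (KB)
SET @rows_now    = 9000000;     -- current row count
SET @rows_future = 500000000;   -- 100M rows/year * 5 years retained

SELECT
    (@data_kb + @index_kb) * 1024.0 / @rows_now AS bytes_per_row,   -- ~311
    (@data_kb + @index_kb) * 1024.0 / @rows_now * @rows_future
        / (1024.0 * 1024 * 1024)                AS projected_gb;    -- ~145
```

This assumes row size and index overhead stay roughly constant as the table grows, which is optimistic but fine for capacity planning at this scale.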
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon