July 22, 2008 at 6:19 am
I want to design a database that will take in about 20 GB of data per hour, so the data will grow into the terabytes. How do I design the load so that data gets into the server quickly, and how do I improve performance in terms of data retrieval?
July 22, 2008 at 7:07 am
That is not nearly enough information to help you.
July 22, 2008 at 8:43 am
Use good indexes, especially the clustered index. Look into partitioned tables where appropriate. Use partitioned views where appropriate. Only denormalize where you can guarantee a tested performance benefit. Test everything before you deploy it to production.
That's about as much as can be said without a lot more details.
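For example, a minimal partitioned-table sketch (the table, column names, and boundary dates below are made-up placeholders; you'd pick boundaries that match your own load pattern):

    -- Partition function: one boundary per day (hypothetical dates)
    CREATE PARTITION FUNCTION pfDaily (datetime)
    AS RANGE RIGHT FOR VALUES ('2008-07-01', '2008-07-02', '2008-07-03');

    -- Partition scheme: map all partitions to PRIMARY for simplicity
    CREATE PARTITION SCHEME psDaily
    AS PARTITION pfDaily ALL TO ([PRIMARY]);

    -- Clustered index aligned to the partitioning column
    CREATE TABLE dbo.FeedData
    (
        FeedID   bigint         NOT NULL,
        LoadDate datetime       NOT NULL,
        Payload  varbinary(max) NULL,
        CONSTRAINT PK_FeedData PRIMARY KEY CLUSTERED (LoadDate, FeedID)
    ) ON psDaily (LoadDate);

In production you'd spread the partitions across multiple filegroups rather than putting them all on PRIMARY, so old partitions can be switched out and archived cheaply.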
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
July 22, 2008 at 12:18 pm
With the little you've given, all I can really say is "make sure you have enough drive space".
- Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
Property of The Thread
"Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
July 23, 2008 at 12:56 am
GSquared (7/22/2008)
With the little you've given, all I can really say is "make sure you have enough drive space".
and more than one independent RAID array.
Gail Shaw
Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability
July 24, 2008 at 9:29 am
hari255 (7/22/2008)
I want to design a database that will take in about 20 GB of data per hour, so the data will grow into the terabytes. How do I design the load so that data gets into the server quickly, and how do I improve performance in terms of data retrieval?
You do this by hiring an experienced professional to do it for you. If you have to ask a question like this my guess is that you have no chance of success if you try to do it yourself. Please excuse the bluntness of my statements. I assure you it is not a personal attack, just a statement born out of years of cleaning up messes created by people who thought they really could do such things. 🙂
As others have stated, I really don't see how we could provide you with any useful input given such a nebulous request either. :w00t:
Best,
Kevin G. Boles
SQL Server Consultant
SQL MVP 2007-2012
TheSQLGuru on googles mail service
July 24, 2008 at 11:02 pm
Ditto. But rest assured, the right SQL Server setup will meet this requirement very well; there are concerns at every tier that have to be addressed to make it work.
July 24, 2008 at 11:14 pm
Make sure you have a good box with enough RAM.
"Keep Trying"
July 26, 2008 at 5:48 am
20 GB an hour is a lot of data - how do you know it is really 20 GB? That is not something you can guess at. At that rate you are looking at roughly 480 GB a day and over 3 TB a week, so you would have to schedule your backup strategies and index rebuilds very carefully - full backups would never complete daily on a volume that size, and with growth like that you would need to archive data on a regular schedule. Are you sure it is 20 GB and not 20 MB?
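For what it's worth, at that kind of volume you would probably lean on weekly fulls with daily differentials and frequent log backups rather than daily full backups - a rough sketch, with made-up names and paths:

    -- Weekly full backup, daily differential, frequent log backups (hypothetical paths)
    BACKUP DATABASE FeedDB TO DISK = 'H:\Backups\FeedDB_full.bak' WITH INIT;
    BACKUP DATABASE FeedDB TO DISK = 'H:\Backups\FeedDB_diff.bak' WITH DIFFERENTIAL;
    BACKUP LOG FeedDB TO DISK = 'H:\Backups\FeedDB_log.trn';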
July 26, 2008 at 11:54 pm
It sounds like you have a huge task to achieve...
For this you can opt for the Bulk Insert task in SSIS, or plain BCP, to load your data.
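A minimal BULK INSERT sketch (the table name, file path, and delimiters are just placeholders; bcp takes equivalent switches from the command line):

    -- Minimally logged load from a flat file (hypothetical path and format)
    BULK INSERT dbo.FeedData
    FROM 'E:\Loads\feed_20080726.dat'
    WITH (
        FIELDTERMINATOR = '|',
        ROWTERMINATOR   = '\n',
        TABLOCK,          -- enables minimal logging under the right recovery model
        BATCHSIZE       = 100000
    );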
Thanks and Regards,
Venkatesan Prabu, 😛
My Blog:
http://venkattechnicalblog.blogspot.com/
July 27, 2008 at 1:00 am
hari255 (7/22/2008)
I want to design a database that will take in about 20 GB of data per hour, so the data will grow into the terabytes. How do I design the load so that data gets into the server quickly, and how do I improve performance in terms of data retrieval?
I would suggest you create a test environment with the same scenario. Once you do that, you will have a clearer picture of how the data behaves. As already suggested, you could go for a RAID configuration. Make sure the data and log files are placed on separate drives. Keep the queries as short and simple as possible. A well-chosen clustered index can do the trick for fast retrieval of data.
When designing the database, also consider normalization techniques.
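For example (the table and column names here are placeholders), clustering on the column that most range queries filter by keeps retrieval scans sequential:

    -- Hypothetical table: cluster on the column used in most range predicates
    CREATE CLUSTERED INDEX CIX_Orders_OrderDate
        ON dbo.Orders (OrderDate);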