November 16, 2006 at 1:46 pm
OK, my shop blessed me with a ZIP+4 database to have for reference purposes;
BCP'd it into a clean database, and it's a whopping 14 gig in size; substantially larger than the typical 100 MB databases I usually get to play with. It's on a developer machine with a gig of RAM, but the machine is not doing anything else.
This whole ordeal is press a button and come back the next day.
Adding a primary key to the data took 12 hours, as it thrashed the data into the new order dictated by the PK.
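That long sort is what you'd expect if the PK went on as a clustered index, which physically re-sorts the whole heap into key order. A sketch of the kind of statement involved; table and column names here are hypothetical:

```sql
-- Adding a clustered PK forces SQL Server to physically rewrite the
-- entire 14 GB table in key order -- hence the overnight run.
-- (table/column names are hypothetical)
ALTER TABLE dbo.ZipPlus4
    ADD CONSTRAINT PK_ZipPlus4
    PRIMARY KEY CLUSTERED (Zip5, Plus4);
```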
A TOP xx search takes no time at all to run; doing something with a WHERE clause not covered by the PK took forever... so...
Adding an additional index on city/state/countycode took an hour;
Decided I didn't like that index, dropped it, and recreated it with an additional column... an hour and a half.
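One thing that can save the double rebuild in that situation: CREATE INDEX ... WITH DROP_EXISTING replaces an index in a single operation instead of a separate drop and create. A sketch, with hypothetical index and column names:

```sql
-- One-step replacement of an existing index of the same name,
-- avoiding a separate DROP INDEX + CREATE INDEX pass.
-- (index/column names are hypothetical)
CREATE NONCLUSTERED INDEX IX_ZipPlus4_Location
    ON dbo.ZipPlus4 (City, State, CountyCode, ExtraColumn)
    WITH DROP_EXISTING;
```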
Peeked at the DB and it's jumped to 20 gig.
Decided I needed the CountyName in the table, so I don't have to join on an external table... the ALTER TABLE ADD COLUMN took an hour and a half.
The UPDATE of CountyName from a different table has been running 9 hours. The DB has bloated the log file to 70 gig and tempdb to 12 gig.
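For what it's worth, a single UPDATE across the whole table is logged as one giant transaction, which is where a 70 gig log comes from. Batching the update commits between chunks so the log can truncate. A sketch, assuming hypothetical table/column names, a SIMPLE recovery model, and that CountyName starts out NULL in the big table:

```sql
-- Update in chunks so each batch commits and the log can truncate.
-- Assumes SIMPLE recovery and that CountyName is NULL until updated.
-- (table/column names are hypothetical)
SET ROWCOUNT 100000;          -- SQL 2000-era batch size limiter
WHILE 1 = 1
BEGIN
    UPDATE z
       SET z.CountyName = c.CountyName
      FROM dbo.ZipPlus4 z
      JOIN dbo.County c ON c.CountyCode = z.CountyCode
     WHERE z.CountyName IS NULL;   -- only rows not yet touched

    IF @@ROWCOUNT = 0 BREAK;
END;
SET ROWCOUNT 0;
```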
I just found it interesting how much additional space you need when you add columns to a very large database (does 20-30 gig count as very large?)
Lowell
November 16, 2006 at 2:09 pm
Last time I checked (a year plus ago), the Wal-Mart DB was 500 TB. They had only 5 DBAs.
So 12 gig is still medium-small in my mind... bigger than most will ever work on, but that's another story.
November 17, 2006 at 3:41 am
Welcome
One of my tables is comfortably in excess of 80 GB... and growing a couple of gig a day... I would love to stick an index on it, and change the clustering, but I think it would lead to a day or so of downtime...
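If that box happens to be on SQL Server 2005 Enterprise Edition, an online index build might dodge most of that downtime: the table stays readable and writable while the index is built. Slower and hungrier for tempdb, but no day-long outage. A sketch with hypothetical names:

```sql
-- Online build keeps the table available during the operation.
-- Requires SQL Server 2005 Enterprise Edition.
-- (table/column names are hypothetical)
CREATE NONCLUSTERED INDEX IX_BigTable_SomeCol
    ON dbo.BigTable (SomeCol)
    WITH (ONLINE = ON);
```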
Ah well.
November 17, 2006 at 6:28 am
That should give you enough time to take a coffee break.
November 17, 2006 at 12:46 pm
"It's on a developer machine with a gig of RAM, but the machine is not doing anything else."
SET SARCASM_MODE ON -- with a dose of reality
... and you expect performance??? Unless it's a dual Xeon with 4 GB of RAM and a SCSI disk subsystem, you are going to have lots of time for coffee, meals, sleeping, etc...
SET SARCASM_MODE OFF
You really need to get a small server-sized machine. 19 GB databases are just above baby size. With a slightly better machine you might actually be able to get more than one thing done per day. As an example, I reindex 30 individual databases semi-concurrently, containing about 200+ GB of data, weekly in less than 6 hours. It's on a DL580 with quad 3.0 GHz Xeons and 4 GB of RAM.
Regards,
Rudy Komacsar
Senior Database Administrator
"Ave Caesar! - Morituri te salutamus."
November 17, 2006 at 12:49 pm
As if development work is easy on a machine... it can be much worse than a production environment, depending on the (in)experience of the user.
November 17, 2006 at 1:04 pm
The usual story around my shop... we have tons of desktop-type machines with a gig of RAM and a 2.0 GHz or better processor; I can have a dozen of 'em if I want... but trying to get a multi-proc server here that is NOT for production... impossible. Just gotta use the tools available.
I wrote some basic web services so that various queries based on address and such return required data like census tract, target area, and a bunch of other stuff like that; eventually that will get promoted to production, which really is a decent group of machines.
Lowell
November 17, 2006 at 1:14 pm
Well Lowell, if you can get your stuff to run efficiently on the present 'development' equipment you have, then by Jove, when it goes into production on the 'good stuff' it ought to run like greased lightning.
(Please excuse the colorful metaphors; it's 2:10 PM on Friday, all the servers are running smoothly, no calls, no tickets, and the exit door is in sight!)
Regards,
Rudy Komacsar
Senior Database Administrator
"Ave Caesar! - Morituri te salutamus."