Hi Steve (and community).
It's refreshing to see someone talk about competing products in civil tones. I'm of the same basic opinion as you are in that different databases, different...
December 13, 2006 at 6:07 am
Thanks for the post, Tom. I reindexed all tables, shrank the database again, did the update usage thing, and did another shrink file. The size is down to...
December 5, 2006 at 3:35 pm
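For reference, the sequence described above might look roughly like this in T-SQL, run against a hypothetical database named SalesDB with a table dbo.Orders and a logical data file SalesDB_Data (all names are placeholders, not taken from the thread):

```sql
USE SalesDB;
GO

-- Rebuild indexes on a table (repeat per table, or drive the list from sys.tables).
-- On SQL Server 2005+ the equivalent is ALTER INDEX ALL ON dbo.Orders REBUILD.
DBCC DBREINDEX ('dbo.Orders');
GO

-- Shrink the database as a whole, leaving roughly 10% free space
DBCC SHRINKDATABASE (SalesDB, 10);
GO

-- Correct any inaccurate space-usage metadata
DBCC UPDATEUSAGE (SalesDB);
GO

-- Shrink the data file toward a 500 MB target
DBCC SHRINKFILE (SalesDB_Data, 500);
GO
```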
Has anyone found a work-around to this problem yet? I'm frustrated by the same behavior as Carl:
I open a query against a database and do something trivial like SELECT COUNT(*)...
December 4, 2006 at 8:54 am
Hi Jeff,
I'd be happy to share some information, but it might be a few days before I can find the time to sit down and think through it well enough...
December 4, 2006 at 8:48 am
Hey Jeff,
You're right. You don't need a temp table and all the complexity for a fixed set of fields. I overlooked that part of his dilemma. Our situation...
December 1, 2006 at 3:47 pm
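For a fixed, known set of fields, the cross-tab can indeed be written inline with CASE expressions instead of a dynamically built temp table. A minimal sketch, using dbo.Sales, Region, Quarter, and Amount as hypothetical names:

```sql
-- Cross-tab over a fixed, known set of values, no temp table needed.
-- Table and column names are placeholders.
SELECT
    Region,
    SUM(CASE WHEN Quarter = 'Q1' THEN Amount ELSE 0 END) AS Q1,
    SUM(CASE WHEN Quarter = 'Q2' THEN Amount ELSE 0 END) AS Q2,
    SUM(CASE WHEN Quarter = 'Q3' THEN Amount ELSE 0 END) AS Q3,
    SUM(CASE WHEN Quarter = 'Q4' THEN Amount ELSE 0 END) AS Q4
FROM dbo.Sales
GROUP BY Region;
```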
The simple answer is you have to create a temp table dynamically based on the columns you want to pivot out. Then you have to populate that temp table...
December 1, 2006 at 12:25 pm
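One way the dynamic approach described above could be sketched: derive the column list from the data, then build and populate a temp table through dynamic SQL. The names (dbo.Sales, Region, Quarter, Amount) are placeholders, and this version leans on the SQL Server 2005 PIVOT operator for brevity; the original may well have used CASE expressions instead:

```sql
DECLARE @cols NVARCHAR(MAX), @sql NVARCHAR(MAX);

-- Build a comma-separated, bracketed list of the distinct values to pivot out
SELECT @cols = COALESCE(@cols + N', ', N'') + QUOTENAME(Quarter)
FROM (SELECT DISTINCT Quarter FROM dbo.Sales) AS q;

-- Create and fill a temp table with one column per pivoted value.
-- A global temp table (##) is used so it survives the sp_executesql scope.
SET @sql = N'SELECT Region, ' + @cols + N'
INTO ##Pivoted
FROM (SELECT Region, Quarter, Amount FROM dbo.Sales) AS src
PIVOT (SUM(Amount) FOR Quarter IN (' + @cols + N')) AS p;';

EXEC sp_executesql @sql;

SELECT * FROM ##Pivoted;
DROP TABLE ##Pivoted;
```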
It wasn't so much of a formula as it was calculating the average actual size of the varchar/char columns. All of our character fields were either nchar or nvarchar,...
December 1, 2006 at 12:16 pm
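A query along these lines could produce that per-column average. DATALENGTH returns bytes, so nchar/nvarchar columns report two bytes per character. dbo.Customers and its columns are hypothetical names:

```sql
-- Average actual storage, in bytes, of each character column
SELECT
    AVG(CAST(DATALENGTH(LastName) AS DECIMAL(10,2))) AS AvgLastNameBytes,
    AVG(CAST(DATALENGTH(Address1) AS DECIMAL(10,2))) AS AvgAddress1Bytes,
    AVG(CAST(DATALENGTH(City)     AS DECIMAL(10,2))) AS AvgCityBytes
FROM dbo.Customers;
```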
For anyone who might be interested, I did some querying and math, and I'm thinking the real data savings come in at about 21 MB. My guess is that when...
December 1, 2006 at 11:44 am
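The post doesn't show the exact math behind the 21 MB figure, but if the comparison was Unicode storage against a single-byte equivalent, an estimate could come from something like this (placeholder table and column names again):

```sql
-- Rough estimate of bytes reclaimed if the nchar/nvarchar columns were stored
-- single-byte: DATALENGTH counts 2 bytes per character, so the saving is half.
SELECT
    SUM(CAST(ISNULL(DATALENGTH(LastName), 0)
           + ISNULL(DATALENGTH(Address1), 0)
           + ISNULL(DATALENGTH(City), 0) AS BIGINT)) / 2
        / (1024.0 * 1024.0) AS EstimatedSavingsMB
FROM dbo.Customers;
```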
Thanks for the replies. I've done SHRINKFILE as well as SHRINKDATABASE and the UPDATEUSAGE. The size did go down from 1.2 GB to 1.1 GB, which is nice....
December 1, 2006 at 9:19 am
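To sanity-check the reported size after commands like those, sp_spaceused is handy; SalesDB is a placeholder database name:

```sql
USE SalesDB;
GO
-- Reports database_size, unallocated space, and data/index totals,
-- refreshing the usage metadata first
EXEC sp_spaceused @updateusage = N'TRUE';
```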