Viewing 11 posts - 1 through 11 (of 11 total)
Ian's script is exactly what I am looking for. It saves me the effort of recording the restore history myself.
Dunnjoe's suggestion also makes sense. I will try that.
Thanks everybody.
July 9, 2008 at 10:45 am
My guess: for each row, the database will have to run "SELECT ItemID FROM dbo.GetChilds(@parentItem)", and spend extra time getting the value of @parentItem and passing it to...
October 9, 2006 at 12:52 am
I suppose you have a 3-tiered system, and you just don't want users to see 20 fields in the GUI:
1. In your GUI, as Jon mentioned, you allow the user...
October 5, 2006 at 12:26 pm
I prefer frequent backups to autogrows.
If your system is so busy that your transaction log grows quickly, you should also consider differential backups if you don't have them yet. Of course,...
October 5, 2006 at 12:53 am
If I understand your question right (from Prasad's description), can you input the data after the report has been produced? Or in other words, if a non-PDF format is acceptable, why...
September 29, 2006 at 12:16 am
The same problems happened to me many times, and the reasons differed every time. I agree with CDB that everything is possible, and I would start by checking the memory and...
September 29, 2006 at 12:03 am
Our memory stabilized at 1 GB available memory and SQL is using 1.7 GB, which is fine for us. Thanks everybody! I will examine our databases and applications more closely so the...
December 14, 2004 at 10:57 am
Steve and Allen, thank you so much! I will look for a reasonable memory size for the OS and assign an upper bound for SQL.
December 2, 2004 at 8:32 am
All our databases, including the system databases, total 13 GB.
We have SP3a.
December 1, 2004 at 2:21 pm
You can also try Data Driven Subscription.
November 9, 2004 at 8:32 am