Sometimes yes, just make sure you give it a different name and use 'with move'...
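Something along these lines; the database, logical file names and paths below are placeholders, check yours first with RESTORE FILELISTONLY:
RESTORE DATABASE MyDB_Copy
FROM DISK = 'C:\Backups\MyDB.bak'
WITH MOVE 'MyDB_Data' TO 'C:\Data\MyDB_Copy.mdf',
     MOVE 'MyDB_Log' TO 'C:\Data\MyDB_Copy_log.ldf'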
January 12, 2007 at 10:22 am
I usually handle it the lazy way:
SELECT CASE GROUPING(det) WHEN 0 THEN det ELSE 'All' END,
       SUM(cnt)
FROM ( SELECT LEFT(determinant, 5) AS det, COUNT(*) AS cnt
       FROM response_master_incident
...
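For what it's worth, the full shape is probably something along these lines (the derived-table alias and the GROUP BY ... WITH ROLLUP are my guess at the truncated part):
SELECT CASE GROUPING(det) WHEN 0 THEN det ELSE 'All' END AS det,
       SUM(cnt) AS total
FROM ( SELECT LEFT(determinant, 5) AS det, COUNT(*) AS cnt
       FROM response_master_incident
       GROUP BY LEFT(determinant, 5) ) AS x
GROUP BY det WITH ROLLUP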
January 9, 2007 at 10:36 am
You can probably get hold of a DLL to use as an extended stored procedure to compress the data into a blob if you wanted.
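Registering one is just sp_addextendedproc from master; the proc and DLL names below are made up:
-- run in master; both names here are hypothetical
EXEC sp_addextendedproc 'xp_compress_blob', 'xp_compress.dll'
-- then call it like any other proc, e.g.
-- EXEC master..xp_compress_blob @plaintext, @compressed OUTPUT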
January 8, 2007 at 9:52 am
Have you got enough disk space...?
January 5, 2007 at 4:06 am
You could (should) simply declare either a table variable or a #TEMP table, depending upon the size of the result set. For smaller sets go with a table variable, no question. Easier...
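Roughly (the column list and source table are just examples):
-- small result set: table variable
DECLARE @Results TABLE (ID int, Name varchar(50))
INSERT INTO @Results (ID, Name) SELECT ID, Name FROM dbo.SomeTable

-- larger result set: #TEMP table (can be indexed and has statistics)
CREATE TABLE #Results (ID int, Name varchar(50))
INSERT INTO #Results (ID, Name) SELECT ID, Name FROM dbo.SomeTable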
January 3, 2007 at 6:45 am
Unless it's part of a transaction and the isolation level is appropriate, or an UPDLOCK hint or something similar has been issued.
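Something like this, for example, holds the rows against other writers until the transaction ends (table and key are made up):
BEGIN TRAN
SELECT * FROM dbo.Orders WITH (UPDLOCK) WHERE OrderID = 42
-- ... do the work ...
COMMIT TRAN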
January 3, 2007 at 3:42 am
Just reaching here, but sometimes you might see this depending on the data distribution.
In this example plan, 1001000046 may have a different data distribution than 1001000068, possibly producing...
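You can check what the optimiser thinks the distribution looks like with DBCC SHOW_STATISTICS (table and index names below are placeholders):
DBCC SHOW_STATISTICS ('dbo.MyTable', 'IX_MyTable_SomeID')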
January 2, 2007 at 4:04 am
You mean here: http://www.microsoft.com/technet/prodtechnol/sql/2000/maintain/ss2kidbp.mspx ?
Where it states almost the exact opposite, or am I missing your point?
When analyzing the output from DBCC SHOWCONTIG, consider the following:
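For reference, you get that output with something like this (table name is a placeholder):
DBCC SHOWCONTIG ('dbo.MyTable') WITH TABLERESULTS, ALL_INDEXES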
December 21, 2006 at 10:46 am
Try bcp'ing the data out to an intermediate text file, then BULK INSERT it back in.
Do it right and the performance gains can be massive.
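Roughly (server, database, table and path are placeholders):
-- export, from a command prompt
bcp MyDB.dbo.MyTable out C:\temp\MyTable.txt -n -T -S MyServer

-- re-import, from T-SQL
BULK INSERT MyDB.dbo.MyTable
FROM 'C:\temp\MyTable.txt'
WITH (DATAFILETYPE = 'native', TABLOCK)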
December 21, 2006 at 5:02 am
Hi, had a similar issue last year.
The compatibility collations are generally there for upgrades from previous versions/Access.
The only way around it I found (don't try to get a 'similar' one...
December 20, 2006 at 4:19 am
There is a chance you may be able to recover this by updating the statistics on the relevant tables. I have had experiences similar to this, where the growth rate...
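i.e. something like this (table name is a placeholder):
UPDATE STATISTICS dbo.MyTable WITH FULLSCAN
-- or, for everything in the database
EXEC sp_updatestats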
December 18, 2006 at 6:32 am
OLTP it is - also quite narrow. On the positive side, there are several data streams into it that can be turned off if required - gives us a nice...
December 15, 2006 at 4:21 am
Personally I suggest in the short term you just pop it in the one table; if it becomes difficult later you can simply add constraints to the underlying table and...
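When the time comes it's only something like this (table, constraint and column names are made up):
ALTER TABLE dbo.MyTable
ADD CONSTRAINT CK_MyTable_RecordType CHECK (RecordType IN ('A', 'B'))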
December 14, 2006 at 4:06 am
You could set up a scheduled ftp task?
Or you could look at log shipping: get the DB off once, then just apply the small transaction logs?
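The manual version of the log-shipping idea is simple enough (database name and paths are placeholders):
-- on the source, after the initial full backup/restore WITH NORECOVERY
BACKUP LOG MyDB TO DISK = 'C:\Backups\MyDB_log.trn'

-- on the destination
RESTORE LOG MyDB FROM DISK = 'C:\Backups\MyDB_log.trn' WITH NORECOVERY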
December 14, 2006 at 4:03 am