To build the format file I would have to manually configure each field's terminator based on its custom separator, i.e., one of these four conditions:
",
,"
","
,
That's for 500+ columns. That's...
March 3, 2016 at 3:36 pm
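For context, a minimal sketch of what those per-field terminators can look like in a non-XML BCP format file, assuming a hypothetical three-column extract laid out as "val1","val2",val3 (the column names, lengths, and the 12.0 version line are placeholders, and the real file would repeat this for all 500+ columns). The zero-length first field just consumes the opening quote of the first value:

    12.0
    4
    1   SQLCHAR   0   0      "\""      0   FirstQuote   ""
    2   SQLCHAR   0   8000   "\",\""   1   Col1         SQL_Latin1_General_CP1_CI_AS
    3   SQLCHAR   0   8000   "\","     2   Col2         SQL_Latin1_General_CP1_CI_AS
    4   SQLCHAR   0   8000   "\r\n"    3   Col3         SQL_Latin1_General_CP1_CI_AS

It would then be referenced from BULK INSERT (or bcp) roughly like this, with hypothetical table and file names:

    BULK INSERT dbo.Feed
    FROM 'C:\import\feed.csv'
    WITH (FORMATFILE = 'C:\import\feed.fmt');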
OH MY GOD! So simple, so elegant. So 99.25% on-time!!! Every single job is meeting the 10-second interval to within 750 milliseconds between each ProductTypeID pull, which...
June 16, 2009 at 1:50 am
Paul, thank you very much. I worked all night (and am now going to my day job) on restructuring an instance of the production code to include Barry's idea...
June 15, 2009 at 6:42 am
Paul White (6/14/2009)
1. ...
June 14, 2009 at 8:33 pm
I arrived at this when modifying ALLOW_ROW_LOCKS and ALLOW_PAGE_LOCKS and configuring multiple WITH clauses on the updates to use the ROWLOCK, PAGLOCK, and TABLOCK hints... all while watching in SQL...
June 14, 2009 at 5:21 pm
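For reference, a minimal sketch of the kind of locking experiment described above, with a hypothetical dbo.Orders table and index name: ALLOW_ROW_LOCKS / ALLOW_PAGE_LOCKS are index-level options, while ROWLOCK, PAGLOCK, and TABLOCK are statement-level table hints.

    -- Index options decide which lock granularities the engine may use at all
    ALTER INDEX PK_Orders ON dbo.Orders
        SET (ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = OFF);

    -- Table hints request a specific granularity for a single statement
    DECLARE @OrderID int = 42;  -- hypothetical key value

    UPDATE dbo.Orders WITH (ROWLOCK) SET Status = 'Processed' WHERE OrderID = @OrderID;
    UPDATE dbo.Orders WITH (PAGLOCK) SET Status = 'Processed' WHERE OrderID = @OrderID;
    UPDATE dbo.Orders WITH (TABLOCK) SET Status = 'Processed' WHERE OrderID = @OrderID;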
The verdict is in: the 50-second delays have now turned into 28 seconds with the FROM / JOINs fixed in the trigger.
Unless the trigger is super costly, I'm still leaning towards...
June 13, 2009 at 9:24 pm
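A minimal sketch of the shape of that fix, assuming a hypothetical dbo.Incoming feed table and a dbo.Summary table the trigger maintains; the point is that the UPDATE joins directly to the inserted pseudo-table, so it only touches the new rows instead of rescanning the base tables:

    CREATE TRIGGER trg_Incoming_AfterInsert
    ON dbo.Incoming
    AFTER INSERT
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Set-based: one UPDATE for the whole batch of new rows
        -- (assumes one inserted row per ProductTypeID per batch)
        UPDATE s
        SET    s.LastPrice   = i.Price,
               s.LastTradeAt = i.TradeTime
        FROM   dbo.Summary AS s
        JOIN   inserted    AS i
          ON   i.ProductTypeID = s.ProductTypeID;
    END;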
OH... YES, I need this in real time. But by real-time I mean I need about 2,500 new rows coming in per second; more would be grand. My data provider...
June 13, 2009 at 12:01 pm
That's good feedback. The two things you just mentioned not being a fan of were actually suggestions from a previous post that I originally wanted to avoid as well. ...
June 13, 2009 at 9:10 am
I spent some time writing a complete example (excluding the concurrency and jobs) for this. Please feel free to try this code out and play around with it. ...
June 13, 2009 at 2:29 am
Jeff,
Sorry for the cross-post; I wasn't sure whether this was a performance or T-SQL question.
THANK YOU!!! OMG! That is so obscure. It's driving me NUTS. I've been...
June 13, 2009 at 12:50 am
Okay, I was able to test the above solutions. First off, the CTE above uses about 4.8 GB of RAM and causes SQL Server to buffer about 7.5 GB of data...
June 10, 2009 at 3:33 pm
Agreed, I've been building an example based on Lynn's code for the past 15 minutes, and have already come up with like 9 other ways of doing this, similar...
June 5, 2009 at 2:55 pm
This question completely depends on WHY you need to count A's and E's as double characters. Is English the native language for this database? Because if Unicode characters...
June 5, 2009 at 1:49 pm
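A minimal sketch of the kind of weighted count in question, assuming the requirement is simply that every 'A' or 'E' adds two to the character count (the Unicode/accented-character concern raised above is ignored here):

    DECLARE @s nvarchar(4000) = N'Example phrase';

    -- DATALENGTH counts bytes (2 per nvarchar character) and, unlike LEN,
    -- does not trim trailing spaces, so the difference is exactly the number
    -- of A's and E's stripped out by the nested REPLACEs.
    SELECT DATALENGTH(@s) / 2
         + (DATALENGTH(@s) - DATALENGTH(REPLACE(REPLACE(UPPER(@s), N'A', N''), N'E', N''))) / 2
           AS WeightedLength;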
Lynn,
I am so sorry. I posted that message after it had sat on my desktop for an hour, and I have been trying to edit it via iPhone for the past hour...
June 4, 2009 at 7:22 pm
Lynn,
The inserts arrive a couple hundred rows at a time, which may help, since it scales back the number of computations per insert statement by that amount. Each row...
June 4, 2009 at 5:35 pm
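A minimal sketch of that batching effect, assuming the same hypothetical dbo.Incoming table as above and a staging table dbo.Stage that accumulates the provider's rows; one set-based INSERT means the trigger and any per-statement work fire once per batch of a couple hundred rows rather than once per row:

    DECLARE @BatchID int = 1;  -- hypothetical batch marker

    -- The AFTER INSERT trigger fires once for the whole batch
    INSERT INTO dbo.Incoming (ProductTypeID, Price, TradeTime)
    SELECT ProductTypeID, Price, TradeTime
    FROM   dbo.Stage
    WHERE  BatchID = @BatchID;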