February 20, 2008 at 2:23 pm
"no tables qualified by dbo. you're inviting a system-overhead in order to resolve the schema."
Is this still the case in sql 2005?
February 21, 2008 at 3:15 am
I can't say, as I'm not on 2005, but some testing/profiling would clear that up.
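For what it's worth, one way to test this yourself on 2005 is to run the same statement with and without the schema qualifier and then look at the plan cache. A rough sketch, using the 2005 DMVs; the table name is purely illustrative:

```sql
-- Run the same statement twice, with and without schema qualification,
-- then check whether the cache holds separate entries for each form.
SELECT * FROM Authors;        -- unqualified: name resolved per user
SELECT * FROM dbo.Authors;    -- qualified: resolves to one shared object

-- Inspect the plan cache (SQL 2005 dynamic management views)
SELECT cp.usecounts, cp.objtype, st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE st.text LIKE '%Authors%';
```

If the unqualified form shows up as a separate cache entry per user, you have your answer without taking anyone's word for it.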
February 21, 2008 at 4:56 am
It's a real shame that you're forced into the single DB model as you lose so many benefits, as others have espoused, such as
* being able to take individual DBs offline
* reduced lock contention: if some clients are running lots of queries and others are doing updates, at least they won't conflict
* having different versions per client
* moving DBs to another server eventually
* rolling back a single client's data with ease
etc
It's also a shame that so many object modelling tools are being blindly used and pushed these days 🙁 For a simple contacts application you can easily get away with it, but when an ORM tool is abused, or when it produces some horrible database designs, you're just making life difficult downstream. It's much nicer reporting off a database that was designed by a person without a particular object structure necessarily in mind 🙂 The old COM+ adage of component-based development, where a stateless method serves a particular purpose and returns just the data necessary, is no longer in vogue.
I'd be interested to see how it all goes for you though 🙂 If you have to resort to kludges like freeing the cache on a regular basis, then clearly your queries are not well crafted. Does the ORM tool let you specify stored procedures in place of auto-generated CRUD statements? If so, craft some yourself and see how you go.
February 21, 2008 at 5:45 am
I'd look for another job. You can never really get any performance out of apps like this, and once you've exhausted the hardware options you've nowhere to go. Your best bet is to match RAM to database size to force all the work to happen in memory; beyond that you have no other options. Looking at that code type, I expect you'll end up disabling parallel plans, thus crippling your performance even more.
[font="Comic Sans MS"]The GrumpyOldDBA[/font]
www.grumpyolddba.co.uk
http://sqlblogcasts.com/blogs/grumpyolddba/
February 21, 2008 at 5:51 am
I'd just add another point, having come from supporting an ERP system which used dynamic SQL: SQL 2005 doesn't seem to handle ad-hoc type queries half as well as SQL 2000. In fact, parameterisation and re-use can be really tricky. SQL 2005 is a superb product and far outstrips SQL 2000, but I'd say it's much less forgiving in terms of how it deals with code of questionable quality. TOP works fine; I can't say I've found a difference with ROWCOUNT other than maybe ordering, though I've only ever used ROWCOUNT a few times.
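To illustrate the parameterisation point: an unparameterised ad-hoc string can compile a fresh plan for every distinct literal value, whereas sp_executesql lets the server cache one plan and reuse it. A minimal sketch, assuming a Northwind-style dbo.Orders table:

```sql
-- Ad hoc with an embedded literal: each distinct value may compile a new plan
EXEC ('SELECT OrderID FROM dbo.Orders WHERE CustomerID = ''ALFKI''');

-- Parameterised with sp_executesql: one cached plan, reused for every value
EXEC sp_executesql
    N'SELECT OrderID FROM dbo.Orders WHERE CustomerID = @cust',
    N'@cust nchar(5)',
    @cust = N'ALFKI';
```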
I don't want to sound harsh, but if you don't have experience of the platform you should be very, very careful about posting, especially under performance tuning, where people would expect those posting to be using SQL 2005. The same applies to x64.
[font="Comic Sans MS"]The GrumpyOldDBA[/font]
www.grumpyolddba.co.uk
http://sqlblogcasts.com/blogs/grumpyolddba/
February 21, 2008 at 6:04 am
Indianrock (2/20/2008)
"no tables qualified by dbo. you're inviting a system-overhead in order to resolve the schema."Is this still the case in sql 2005?
Yep. It's actually worse, because the introduction of schemas (or enhancement of them, if you insist) means you may have lots of different owners within a system. Plus, the old recompile problem is still there. If you want to ensure your performance (let alone your access), you need to specify the owner of objects within the database.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
February 21, 2008 at 6:06 am
colin Leversuch-Roberts (2/21/2008)
I don't want to sound contrite but if you don't have experience of the platform you should be very very careful about posting , especially under performance tuning where people would expect those posting to be using sql 2005 , same applies to x64.
I'm not following you. We're using SQL 2005.
Our legacy product is either all FoxPro for smaller clients, or FoxPro with large tables on SQL Server 2005 for the largest clients. If the new product can't be made to work/perform, I presume Legacy can be enhanced, with some clients who already converted being migrated back. Right now that's a difficult scenario to imagine, since management is all gung-ho for the new thing.
February 21, 2008 at 6:09 am
If you want to ensure your performance (let alone your access), you need to specify the owner of objects within the database.
I'll have to dig for solid evidence on that, they won't take my word for it.
February 21, 2008 at 6:41 am
Indianrock (2/21/2008)
If you want to ensure your performance (let alone your access), you need to specify the owner of objects within the database.
I'll have to dig for solid evidence on that, they won't take my word for it.
Sorry. I sure didn't mean to leave you hanging on that. Here's MS's white paper on troubleshooting recompiles.
Here's the quote directly from the white paper:
Best Practice
It is best to owner qualify stored procedure names when you execute a procedure. This allows for better clarity and easier reuse of the existing execution plan by the current user. For example, if a user who is not the database owner (dbo) executes a dbo-owned stored procedure (called myProc in this example) in the pubs database, use the following statement: exec dbo.myProc
Instead of this: exec myProc
This technique eliminates confusion about other possible versions of the procedure by different owners from a coding and maintenance standpoint, and also allows SQL Server to access the execution plan for the specific procedure more directly.
By not qualifying the owner name, SQL Server enters the compile code and acquires a COMPILE lock on the procedure. However, eventually it determines that a new plan is not required (assuming no other reasons apply), so it does NOT recompile the plan at this point due to the lack of qualification. However, the extra step of getting a COMPILE lock on the procedure can cause blocking contention in severe situations. Refer to Q263889 INF: SQL Blocking Due to [[COMPILE]] Locks for more details on this situation.
If you owner qualify the procedure call with owner.procedure, you do not need to acquire the compile lock, so the contention is reduced.
And this white paper from Microsoft on performance tuning (which has a bunch of other stuff you might want to look at, considering your code):
Fully Qualify Database Objects
By fully qualifying all database objects with the owner, you minimize overhead for name resolution, and you avoid potential schema locks and execution plan recompiles. For example, the SELECT * FROM dbo.Authors statement or the EXEC dbo.CustOrdersHist statement performs better than the SELECT * FROM Authors or the EXEC CustOrderHist statements. In systems that have many stored procedures, the amount of time that is spent to resolve a non-qualified stored procedure name adds up.
There's a lot more material out there, especially if you hit the third party sites, like SSC, but I've always found referencing MS white papers carries a bit more weight than referencing some guy's blog, even if he is 100% correct.
Sorry I didn't back that up before. Good luck. You've got your work cut out for you.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
February 21, 2008 at 7:27 am
Thanks, Grant, I'll check those white papers. (Grant is somewhat rare, but it's my son's name also.) It does appear that qualifying the ownership of tables is important. As for stored procedures, our new product doesn't use them other than executesql. I'm just the messenger, not the designer here.
It will be nice when all of this sorts out. Right now we have two schools of thought, with the stored procedure school apparently being considered dinosaurs. We can't be the only shop trying this object-generated dynamic SQL approach, so it would be great to hear the success/failure stories. The no-proc/dynamic "Agile" approach has advantages, if it can be tweaked to perform at scale.
February 21, 2008 at 7:46 am
...the stored procedure school being considered dinosaurs apparently
please shoot these uneducated hunters, or your organisation will encounter the ice-age.
February 21, 2008 at 8:00 am
AndrewMurphy (2/21/2008)
...the stored procedure school being considered dinosaurs apparentlyplease shoot these uneducated hunters, or your organisation will encounter the ice-age.
That must be a vote for one school over the other, but I'm not sure which one you're advocating.
February 21, 2008 at 8:10 am
I suspect he's suggesting that the "we don't need no stinking procedures" school is going to drive your database to extinction, not the other way around.
We use generated code in a lot of our local projects, just like you. However, we've been very structured in the approach. We actually only generate the CUD part of CRUD (Create, Read, Update, Delete) procedures through code generation. When we do, we generate stored procedures and deploy those to a database and use them. This despite the fact that we could simply generate ad hoc queries too. It's recognized that the database and database server have strengths and purposes that client or application server side code does not. The read procedures are still coded by hand. This is because, of all the procedures that we work with, these are the most likely to require a human brain to determine the best way to get at the data. I'd push the same approach to anyone thinking about generating code.
BTW, for what it's worth, generated code and Agile development are separate critters. Code generation is not a requirement of Agile.
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning
February 21, 2008 at 8:32 am
OK. I thought having business logic stored on the database server (in procs) was not Agile, the ultimate in Agile being where you could switch to Oracle overnight. Just theory conflicting with reality, all over again.
February 21, 2008 at 8:39 am
Why is it that we're always going to decide, on the flip of a coin, to swap out the database server, so we need to avoid using any of the functionality offered? How come no one ever suggests, hey, write that VB code like we're going to switch to C# or Ruby or Lisp tomorrow? Because it's stupid. You need to take advantage of the language you're using. Well, you need to take advantage of the database server you're using too.
You need to beat these developers about the face & neck with a blunt object more often. Keep 'em in line. 😀
"The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood"
- Theodore Roosevelt
Author of:
SQL Server Execution Plans
SQL Server Query Performance Tuning