October 31, 2002 at 4:17 pm
Hi all!!
I have a table that is read very frequently, and I also have several SQL tasks doing inserts and updates against it, so I need to keep read performance at an optimal level. One way I suppose I could achieve this is by using dirty reads. My question is: how can I implement dirty reads, and would they actually enhance performance?
Thank you for your suggestions.
Engineer Programmer Jr.
October 31, 2002 at 4:53 pm
To perform dirty (uncommitted) reads in T-SQL, issue the following command:
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED
then proceed with your SELECT statements. Your SELECT statements will be able to read records that other sessions have not yet committed.
I'm not totally sure about the performance implications. But I'm guessing that reading a record that has not been committed shouldn't perform any I/O, since the dirty record has yet to be written to disk. Although maybe I'm oversimplifying the issue. Use Profiler to see whether reading uncommitted records is faster than reading committed ones.
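A minimal sketch of the two common ways to get dirty reads (the table and column names here are made up for illustration): the session-level isolation setting described above, and the equivalent per-table NOLOCK hint.

```sql
-- Session-level: every subsequent SELECT in this session reads without
-- taking shared locks and can see uncommitted data
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

SELECT OrderID, Status
FROM dbo.Orders;                -- hypothetical table

-- Per-table alternative: same effect for just this one table reference,
-- without changing the session's isolation level
SELECT OrderID, Status
FROM dbo.Orders WITH (NOLOCK);
```

The hint form is handy when only one hot table needs dirty reads and the rest of the query should stay at the default isolation level.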
Gregory Larsen, DBA
If you're looking for SQL Server examples, check out my website at http://www.geocities.com/sqlserverexamples
October 31, 2002 at 5:30 pm
Thank you Greg,
Your suggestion makes sense to me, I'll do some tests to see the results,
regards
Engineer Programmer Jr.
November 4, 2002 at 11:13 am
It should be faster if other concurrent accesses to the table(s) were getting locked out. The trade-off is concurrency versus data integrity: by lowering the isolation level you allow greater concurrency, but you also allow competing threads to read potentially erroneous data.
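To make that trade-off concrete, here is a two-session sketch (again with a hypothetical table) showing how a dirty read can return data that never actually commits:

```sql
-- Session 1: update inside an open transaction; locks are held,
-- nothing is committed yet
BEGIN TRANSACTION;
UPDATE dbo.Orders SET Status = 'Shipped' WHERE OrderID = 1;
-- ...transaction deliberately left open...

-- Session 2: at the default READ COMMITTED level this SELECT would
-- block on session 1's lock; with dirty reads it returns immediately
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
SELECT Status FROM dbo.Orders WHERE OrderID = 1;  -- sees uncommitted 'Shipped'

-- Back in session 1: the change session 2 already read is undone
ROLLBACK TRANSACTION;
```

Session 2 got its answer without waiting, which is the performance win, but the value it read was rolled back and never existed as committed data, which is the integrity risk.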
Steve Hendricks
MCSD, MCDBA
AFS Consulting Group
(949) 588-9800 x15
shendricks@afsconsulting.com