November 12, 2008 at 5:12 pm
Do I have any options to speed up my application, which establishes a connection to large database tables? My Access VBA application runs very slowly when it has to access a large table:
Set rs1 = New ADODB.Recordset
rs1.Open "LargeTable", CurrentProject.Connection, adOpenKeyset, adLockOptimistic
The table can contain anywhere from 500,000 to 5,000,000 records. The code needs to loop through the table values and perform calculations on every row. That takes time, but I can live with it. It is the initial opening of the table to establish the recordset that is painfully slow. Do I have any options to improve performance?
Thanks
November 13, 2008 at 3:51 am
If you can change the code that updates each record into an UPDATE statement that runs against the table, you will get a massive improvement in speed.
For example, this loops through all the records and multiplies a value by 10:
Set rs1 = New ADODB.Recordset
With rs1
    .Open "LargeTable", CurrentProject.Connection, adOpenKeyset, adLockOptimistic
    Do While Not .EOF
        ' ADO recordsets have no Edit method (that is DAO); just assign and call Update
        ![ValueField] = ![ValueField] * 10
        .Update
        .MoveNext
    Loop
    .Close
End With
Set rs1 = Nothing
This can be accomplished with a single SQL statement:
CurrentDb.Execute "UPDATE LargeTable SET ValueField = ValueField * 10", dbFailOnError
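If you would rather stay with the ADO connection you already have, the same statement can be run through Connection.Execute. This is just a sketch of that alternative: the adExecuteNoRecords flag tells ADO not to build a recordset for the result, and lngAffected receives the number of rows changed.
Dim lngAffected As Long
' Run the same UPDATE through the existing ADO connection
CurrentProject.Connection.Execute _
    "UPDATE LargeTable SET ValueField = ValueField * 10", _
    lngAffected, adCmdText + adExecuteNoRecords
Debug.Print lngAffected & " records updated"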
November 14, 2008 at 1:05 pm
Thanks Chris. I'll review my recordset loops in detail and attempt to change them to UPDATE SQL. In your experience, can most/all recordset looping operations be changed to a single UPDATE statement?
November 17, 2008 at 2:06 am
Many times you can, but sometimes you have to do it row by row.
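For example (a hypothetical case, with made-up field names), a running total where each row's new value depends on every earlier row is awkward to express as a single UPDATE in Access SQL, so a loop is still the practical approach:
Dim rs As ADODB.Recordset
Dim dblRunning As Double

Set rs = New ADODB.Recordset
' Open an ordered query rather than the raw table, since row order matters here
rs.Open "SELECT TranDate, Amount, RunningTotal FROM LargeTable ORDER BY TranDate", _
        CurrentProject.Connection, adOpenKeyset, adLockOptimistic

dblRunning = 0
Do While Not rs.EOF
    dblRunning = dblRunning + rs![Amount]   ' new value depends on all earlier rows
    rs![RunningTotal] = dblRunning          ' RunningTotal is a hypothetical field
    rs.Update
    rs.MoveNext
Loop

rs.Close
Set rs = Nothing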