April 11, 2007 at 1:53 am
Hello,
I am having a hard time minimizing data retrieval time from a SQL Server database. My database consists of 2 tables: one table has more than 10,000 entries and the other has more than 10 million entries. I have used the SQL Native Client for connecting to the database, and my goal is to find a value in the 1st table and then search for it in the 2nd table. The result is more than a hundred thousand rows. Now the problem is: retrieving those rows is very slow, approx. 12 minutes. Can this time be cut down? I have used the SqlClient connection to make sure it is accessing SQL Server directly.
Also, I am using SqlClient::SqlDataReader for reading the rows returned by my query.
Please help me out.
Thanks.
R. T.
April 11, 2007 at 2:53 am
Please post your query/code.
Do you have any [clustered] index on the large table?
N 56°04'39.16"
E 12°55'05.25"
April 11, 2007 at 4:56 am
Hi,
I have used the namespace System::Data::SqlClient for making and executing the queries, and SqlClient::SqlDataReader for reading the results of the queries.
The first query is simple:
Select * from Table1 where Entry1 = 'EntryToFind'
I used SqlClient::SqlDataReader to read the result into a variable Var.
Then I executed the second query inside a loop:
for (long Val = 0; Val <= 10000; Val++) // for values 0-10000 of Val
{
    Select * from Table2 where Entry_A = Var and Entry_C = Val
    // result rows are appended to a List
}
Now the main thing is that Table 2 is indexed on Entry_A and Entry_C. The database contains no primary key, but a composite key based on Entry_A and Entry_C.
I hope this explains the problem more clearly.
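For illustration, the access pattern is roughly the following (a sketch only; the connection string, the parameter names @e/@a/@c, and the column types are my assumptions):

    #using <System.Data.dll>

    using namespace System;
    using namespace System::Collections::Generic;
    using namespace System::Data::SqlClient;

    int main()
    {
        // Connection string is an assumption; adjust server/database/credentials.
        SqlConnection^ conn = gcnew SqlConnection(
            "Server=.;Database=MyDb;Integrated Security=SSPI;");
        conn->Open();

        // First query: look up the value called Var above.
        SqlCommand^ first = gcnew SqlCommand(
            "SELECT Entry_A FROM Table1 WHERE Entry1 = @e", conn);
        first->Parameters->AddWithValue("@e", "EntryToFind");
        Object^ var = first->ExecuteScalar();

        List<String^>^ results = gcnew List<String^>();

        // Second query: one round trip per value of Val (the slow part).
        for (long val = 0; val <= 10000; val++)
        {
            SqlCommand^ second = gcnew SqlCommand(
                "SELECT * FROM Table2 WHERE Entry_A = @a AND Entry_C = @c", conn);
            second->Parameters->AddWithValue("@a", var);
            second->Parameters->AddWithValue("@c", val);

            SqlDataReader^ reader = second->ExecuteReader();
            while (reader->Read())
                results->Add(reader->GetValue(0)->ToString());
            reader->Close();
        }

        conn->Close();
        return 0;
    }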
R.T.
April 11, 2007 at 5:12 am
This is wrong on so many levels...
Use this query to get all your expected records AT ONCE!
SELECT
    t2.Entry_Z
FROM Table2 AS t2
INNER JOIN Table1 AS t1 ON t2.Entry_A = t1.Entry_A
WHERE t2.Entry_C BETWEEN 0 AND 10000
    AND t1.Entry1 = 'EntryToFind'
ORDER BY t2.Entry_C
Also, you should not use * when fetching records.
ONLY FETCH THE RELEVANT columns to populate the list!
Now you use your DataReader to get all (at most 10,000) records in one operation, and put the values into your list!
I would estimate your time to get all records now will drop from 12 minutes to 4-6 seconds...
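For example, here is a minimal C++/CLI sketch of running that single query and reading only the one column (the connection string and the @entry parameter name are assumptions):

    #using <System.Data.dll>

    using namespace System;
    using namespace System::Collections::Generic;
    using namespace System::Data::SqlClient;

    int main()
    {
        // Connection string is an assumption; adjust for your server.
        SqlConnection^ conn = gcnew SqlConnection(
            "Server=.;Database=MyDb;Integrated Security=SSPI;");
        conn->Open();

        // One set-based query instead of ~10,000 round trips.
        SqlCommand^ cmd = gcnew SqlCommand(
            "SELECT t2.Entry_Z "
            "FROM Table2 AS t2 "
            "INNER JOIN Table1 AS t1 ON t2.Entry_A = t1.Entry_A "
            "WHERE t2.Entry_C BETWEEN 0 AND 10000 "
            "AND t1.Entry1 = @entry "
            "ORDER BY t2.Entry_C", conn);
        cmd->Parameters->AddWithValue("@entry", "EntryToFind");

        // Fetch only the relevant column, in a single reader pass.
        List<String^>^ results = gcnew List<String^>();
        SqlDataReader^ reader = cmd->ExecuteReader();
        while (reader->Read())
            results->Add(reader->GetValue(0)->ToString());
        reader->Close();

        conn->Close();
        Console::WriteLine("Fetched {0} rows", results->Count);
        return 0;
    }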
N 56°04'39.16"
E 12°55'05.25"
April 12, 2007 at 12:13 am
Hi,
Well, these things help reduce the time effectively.
Now the main point is: the loop was given there just to convey that the SQL query will run approximately that many times. I have tried implementing it inside a stored procedure and calling that stored procedure from C++ (as you might have noticed, it is a C++ "for" loop and not a standard SQL "WHILE" loop), once per result I get. It happens to be my constraint that I have to call this stored procedure from within C++.
Sorry if I was not able to make that clear.
The exact problem statement is that I have to make this loop work for every entry of Table 1; that means a nested loop initiated inside C++, calling 2 stored procedures, or 1 SELECT and 1 stored procedure, etc.
like:
for every entry in Table 1
    Var = Table1.curEntry
    call stored procedure with parameter Var

Stored procedure:
    for Val = 0 to 10000
        get all rows of Table 2 such that Table2.Entry_A = Var and Table2.Entry_C = Val
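For what it's worth, a sketch of that outer loop in C++/CLI could look like this (the procedure name GetTable2Rows, the parameter @Var, its type, and the connection string are hypothetical):

    #using <System.Data.dll>

    using namespace System;
    using namespace System::Data;
    using namespace System::Data::SqlClient;

    int main()
    {
        // Connection string is an assumption; adjust for your server.
        SqlConnection^ conn = gcnew SqlConnection(
            "Server=.;Database=MyDb;Integrated Security=SSPI;");
        conn->Open();

        // Read every entry of Table 1 into memory first, so the same
        // connection can be reused for the stored procedure calls.
        SqlDataAdapter^ adapter = gcnew SqlDataAdapter(
            "SELECT Entry_A FROM Table1", conn);
        DataTable^ table1 = gcnew DataTable();
        adapter->Fill(table1);

        // Hypothetical procedure; it would contain the inner
        // "for Val = 0 to 10000" logic described above.
        SqlCommand^ proc = gcnew SqlCommand("GetTable2Rows", conn);
        proc->CommandType = CommandType::StoredProcedure;
        SqlParameter^ varParam = proc->Parameters->Add("@Var", SqlDbType::Int); // type is an assumption

        for each (DataRow^ row in table1->Rows)
        {
            varParam->Value = row["Entry_A"];
            SqlDataReader^ reader = proc->ExecuteReader();
            while (reader->Read())
            {
                // consume the rows for this entry here
            }
            reader->Close();
        }

        conn->Close();
        return 0;
    }

This still does one round trip per Table 1 entry, which is why it stays slow compared to the single join suggested above.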
I was thinking that if I could make this stored procedure efficient, then I could handle the whole problem efficiently, but it seems that will not work here.
I also have one more option to look at: SQL Server Integration Services.
What I have in mind is an Integration Services data flow that loads Table 1 into memory using a "DataReader Destination", and somehow makes it available inside the stored procedure so the procedure can use it.
Regards,
R.T.