February 29, 2012 at 3:59 am
Hi there, I have a DB which has 10,000,00 rows in it.
When I run a SELECT statement against it, it takes a long time to retrieve data.
For instance, retrieving 320,000 rows takes 24 seconds, which is too much.
How can I improve my query?
I have also created indexes and I am using a stored procedure.
Thanks.
February 29, 2012 at 4:14 am
Follow these links to improve query performance.
Hope this helps.
February 29, 2012 at 4:32 am
Can you attach a copy of your execution plan?
______________________________________________________________________________
Never argue with an idiot; they'll drag you down to their level and beat you with experience.
February 29, 2012 at 4:46 am
yes.
February 29, 2012 at 4:57 am
Vahid.Ch (2/29/2012)
yes.
That query is selecting everything from the WebProxyLog table; there is no WHERE clause to tune.
It is going to take a fair amount of network time just to return 630 meg of data.
Do you need to return all columns? Are you doing additional processing after you get the table that could perhaps be done on the server first?
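For example, just the columns and the date range the report actually needs, something along these lines (I'm guessing at the column names, so adjust them to your real schema):

SELECT wpl.ClientIP,
       wpl.LogTime,
       wpl.bytessent,
       wpl.bytesrecvd
FROM dbo.WebProxyLog wpl
WHERE wpl.LogTime >= '20120201'   -- example date range only
  AND wpl.LogTime <  '20120301'

Cutting the column list and adding a LogTime filter keeps both the I/O and the network transfer down.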
Lowell
February 29, 2012 at 5:08 am
No, I don't need them all; that was just a sample.
Let me explain more.
I'll have a new database every day, with around 300,000 to 350,000 rows.
It stores user logs, such as URL, ClientIP, LogTime and so on.
Imagine that an admin wants a report for a month, which will be 30 databases.
First of all I join all of my tables and then select from them.
For instance:
1.mdb
2.mdb
3.mdb
...
30.mdb
Then I select from these tables for a specific LogTime range.
But it takes a long time to retrieve this huge amount of data.
Do you have any idea?
Thanks indeed.
February 29, 2012 at 5:15 am
Vahid.Ch (2/29/2012)
No, I don't need them all; that was just a sample. Let me explain more.
I'll have a new database every day, with around 300,000 to 350,000 rows.
It stores user logs, such as URL, ClientIP, LogTime and so on.
Imagine that an admin wants a report for a month, which will be 30 databases.
First of all I join all of my tables and then select from them.
For instance:
1.mdb
2.mdb
3.mdb
...
30.mdb
Then I select from these tables for a specific LogTime range.
But it takes a long time to retrieve this huge amount of data.
Do you have any idea?
Thanks indeed.
Why do you need a new db for each day? Is it not possible to append each day's data into a single table and then run your queries against that?
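For example, one table that each day's rows get appended to, with an index on the log time, so a monthly report becomes a single range query instead of thirty cross-database joins. A rough sketch, guessing at the column names and using an example database name:

CREATE TABLE dbo.WebProxyLogAll
(
    LogTime    DATETIME       NOT NULL,
    ClientIP   NVARCHAR(50)   NOT NULL,
    Url        NVARCHAR(2000) NULL,
    BytesSent  BIGINT         NULL,
    BytesRecvd BIGINT         NULL
)

CREATE CLUSTERED INDEX IX_WebProxyLogAll_LogTime
    ON dbo.WebProxyLogAll (LogTime)

-- run once per day against that day's database (the ISALOG name here is only an example)
INSERT INTO dbo.WebProxyLogAll (LogTime, ClientIP, Url, BytesSent, BytesRecvd)
SELECT LogTime, ClientIP, Url, BytesSent, BytesRecvd
FROM ISALOG_20120229_WEB.dbo.WebProxyLog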
________________________________________________________________
you can lead a user to data....but you cannot make them think
and remember....every day is a school day
February 29, 2012 at 5:18 am
Unfortunately no, it is not.
Look at my query; I've done it like this:
USE [MainTable]
GO
/****** Object: StoredProcedure [dbo].[ManagerTotalSendReceive] Script Date: 02/29/2012 15:45:37 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER PROCEDURE [dbo].[ManagerTotalSendReceive]
(
@UserName NVARCHAR(500),
@FromDate nvarchar(20),
@ToDate NVARCHAR(20),
@Diff INT,
@Date NVARCHAR(100),
@GetDay NVARCHAR(100)
)
AS
BEGIN
    -- Staging table that collects one aggregated result set per day
    CREATE TABLE ManagerTotalSendReceiveTable
    (
        SentData  NVARCHAR(500),
        Received  INT,
        FirstName NVARCHAR(500),
        LastName  NVARCHAR(50),
        UnitName  NVARCHAR(25)
    )

    DECLARE @ClientIP UNIQUEIDENTIFIER
    SELECT @ClientIP = ClientIP FROM Users
    WHERE UserName = @UserName

    DECLARE @SQl NVARCHAR(MAX)

    -- Each day's log lives in its own ISALOG_<date><day>_WEB database,
    -- so the database name has to be built dynamically for every day in the range
    WHILE @diff >= 0
    BEGIN
        SET @Sql = 'INSERT INTO ManagerTotalSendReceiveTable
            SELECT SUM(wpl.bytesrecvd) AS ''Sent'', SUM(wpl.bytessent) AS ''Received'',
                   U.FirstName, U.LastName, UN.UnitName
            FROM ISALOG_' + @Date + @getday + '_WEB.dbo.WebProxyLog wpl
            INNER JOIN MainTable.dbo.Users U
                ON U.ClientIP = wpl.ClientIP
            INNER JOIN MainTable.dbo.Units UN
                ON UN.UnitID = U.UnitID
            WHERE wpl.LogTime >= ''' + @FromDate + ''' AND
                  wpl.LogTime <= ''' + @ToDate + ''' AND U.isadmin = ''true'' AND U.username != ''' + @UserName + '''
            GROUP BY U.FirstName, U.LastName, UN.UnitName'

        SET @diff = @diff - 1

        -- Move to the next day, keeping the day part zero-padded to two digits
        SET @getday = @getday + 1
        IF LEN(@getday) = 1
        BEGIN
            SET @Getday = '0' + CAST(@getday AS NVARCHAR(10))
        END

        EXEC (@sql)
    END

    -- Final aggregation across all the days collected in the staging table
    SELECT SUM(CAST(SentData AS INT)) AS 'Sent', SUM(CAST(Received AS INT)) AS 'Received',
           FirstName, LastName, UnitName
    FROM ManagerTotalSendReceiveTable
    GROUP BY FirstName, LastName, UnitName

    DROP TABLE ManagerTotalSendReceiveTable
END
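As a side note on the dynamic SQL: the date range and user name could presumably be passed as parameters with sp_executesql instead of being concatenated into the string (only the database name really has to be built by concatenation). A rough sketch of the daily INSERT under that assumption; the @pFromDate / @pToDate / @pUserName names are just placeholders:

SET @Sql = N'INSERT INTO ManagerTotalSendReceiveTable
    SELECT SUM(wpl.bytesrecvd), SUM(wpl.bytessent), u.FirstName, u.LastName, un.UnitName
    FROM ISALOG_' + @Date + @getday + N'_WEB.dbo.WebProxyLog wpl
    INNER JOIN MainTable.dbo.Users u  ON u.ClientIP = wpl.ClientIP
    INNER JOIN MainTable.dbo.Units un ON un.UnitID = u.UnitID
    WHERE wpl.LogTime >= @pFromDate AND wpl.LogTime <= @pToDate
      AND u.isadmin = ''true'' AND u.username <> @pUserName
    GROUP BY u.FirstName, u.LastName, un.UnitName'

EXEC sp_executesql @Sql,
    N'@pFromDate NVARCHAR(20), @pToDate NVARCHAR(20), @pUserName NVARCHAR(500)',
    @pFromDate = @FromDate, @pToDate = @ToDate, @pUserName = @UserName

That keeps the quoting simpler and lets SQL Server reuse the plan for each day.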