Long-running queries

  • I have been facing this problem for a while. The developers run simple SELECT and UPDATE statements with WHERE clauses, GROUP BY, and date ranges. The only issue is that whenever they run queries, they have to pull millions of records, like 80-90 million, and sometimes the queries run for a couple of days. Our business really needs to pull that many records. Most of the time the CPU also hits 100% usage, and many other users are unable to query the databases.

    Is there a best practice, as a SQL admin, for managing this kind of heavy load so that everyone's queries run smoothly and the server never hangs?

  • If I have 90 million records, can they be updated in batches so that other processes can also run and the transaction log will not grow? If I have a table without any indexes or key fields, is there still a way I can update records in batches?

  • Here is the query that updates 90 million records:

    update a
    set planid = cast(rate.state as varchar(2))
    from dbo.Rev as a
    left join STATE_LOOKUP_STATE5.dbo.ratefieldlookup as rate
        on a.revcode = rate.rev
        and ltrim(rtrim(a.hmrevid)) = rate.hmrevid
        and a.tax = rate.tax
        and a.leglent = rate.legent

  • You could try the following (in a test environment first):

    declare @batchsize int;
    set @batchsize = 100000; -- 100,000 rows per batch

    while @batchsize <> 0
    begin
        -- Update the target through its alias; naming dbo.Rev directly here
        -- would introduce a second, unjoined instance of the table.
        update top (@batchsize) a
        set planid = cast(rate.state as varchar(2))
        from dbo.Rev as a
        left join STATE_LOOKUP_STATE5.dbo.ratefieldlookup as rate
            on (a.revcode = rate.rev and
                ltrim(rtrim(a.hmrevid)) = rate.hmrevid and
                a.tax = rate.tax and
                a.leglent = rate.legent)
        -- The WHERE keeps already-correct rows out of later batches,
        -- so the loop eventually runs out of work.
        where a.planid <> cast(rate.state as varchar(2));

        -- @@ROWCOUNT is the number of rows the UPDATE just touched;
        -- when a batch updates zero rows, the loop ends.
        set @batchsize = @@rowcount;
    end

    Also, this predicate could be slowing down your updates: ltrim(rtrim(a.hmrevid)) = rate.hmrevid.

    The function calls can prevent SQL Server from using any index on that column, resulting in a table or clustered index scan.
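
    One way around that, as a rough sketch (the table and column come from the thread; the index name is hypothetical, and this assumes hmrevid values only ever differ from the lookup values by leading or trailing spaces): clean the stored values once so the join can compare the raw columns and use an index.

    -- One-time cleanup so the join no longer needs LTRIM/RTRIM.
    -- On 90 million rows this should itself be batched with the
    -- same TOP (@batchsize) technique shown above.
    update dbo.Rev
    set hmrevid = ltrim(rtrim(hmrevid));

    -- With the column clean, an index lets the join seek instead of scan
    -- (the index name IX_Rev_hmrevid is hypothetical).
    create index IX_Rev_hmrevid on dbo.Rev (hmrevid);

    After that, the join predicate can simply be a.hmrevid = rate.hmrevid.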

  • Thanks.

    When running this query, will the transaction log be freed after every 100,000 records? I am just trying to understand how this improves performance compared to running it without batching. I have also seen this type of query use a time delay; how does that help?

  • Mike Levan (1/30/2009)


    Thanks.

    When running this query, will the transaction log be freed after every 100,000 records? I am just trying to understand how this improves performance compared to running it without batching. I have also seen this type of query use a time delay; how does that help?

    Actually, if your database is using the Full Recovery model, insert a transaction log backup after the set @batchsize = @@rowcount line. That will then manage your transaction log for you.
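
    A rough sketch of where that backup would sit in the loop (the database name and backup path are hypothetical placeholders):

    while @batchsize <> 0
    begin
        -- ... the batched UPDATE from above goes here ...
        set @batchsize = @@rowcount;

        -- Backing up the log after each batch lets the log space be
        -- reused instead of the file growing (name/path are placeholders).
        backup log YourDatabase
        to disk = 'D:\Backups\YourDatabase_log.trn';
    end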

  • All my databases are in SIMPLE mode.

  • Then you won't need the transaction log backup in the code.
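
    Under SIMPLE recovery, the log space is reused automatically once each batch commits, so nothing extra is needed there. On the time delay you asked about earlier: a short pause between batches gives blocked sessions a window to acquire locks and run their own queries, at the cost of a longer overall run time. A minimal sketch (the five-second delay is an arbitrary example to tune):

    while @batchsize <> 0
    begin
        -- ... the batched UPDATE from above goes here ...
        set @batchsize = @@rowcount;

        -- Pause between batches so other sessions can get in.
        waitfor delay '00:00:05'; -- hh:mm:ss
    end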

  • Hi friend,

    I think the first and foremost thing you need is to have proper indexes in place.

    I want to know which version of SQL Server you are using; this is very much expected behavior when you fetch 80-90 million records.

    Apart from running the process in batches, there are some options which can, to some extent, reduce the time with a logical trick. If it is SQL Server 2000, you can use archiving with the help of some temporary tables.

    But if you have SQL Server 2005, you can opt for table partitioning as well; see the sketch below.
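
    As a rough sketch of what SQL Server 2005 partitioning looks like (every name and boundary value here is hypothetical; the real partitioning key would come from your own schema):

    -- Partition function: assigns rows to ranges by an integer key.
    create partition function pf_RevRange (int)
    as range right for values (30000000, 60000000);

    -- Partition scheme: maps each range to a filegroup
    -- (all on PRIMARY here for simplicity).
    create partition scheme ps_RevRange
    as partition pf_RevRange all to ([PRIMARY]);

    -- A table created on the scheme is physically split into partitions,
    -- so scans and maintenance can target one slice at a time.
    create table dbo.Rev_Partitioned (
        revid  int not null,
        planid varchar(2) null
    ) on ps_RevRange (revid);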

  • Lynn,

    The batch processing trick worked well for me and saved a lot of time, thank you. From your query, does @@rowcount give the number of records updated? And do you think changing that number depending on the split would improve things further?

    Can I also speed up my processes for huge inserts and for loading a flat file into tables?

  • alim.shaikh (1/31/2009)


    Hi friend,

    I think the first and foremost thing you need is to have proper indexes in place.

    I want to know which version of SQL Server you are using; this is very much expected behavior when you fetch 80-90 million records.

    Apart from running the process in batches, there are some options which can, to some extent, reduce the time with a logical trick. If it is SQL Server 2000, you can use archiving with the help of some temporary tables.

    But if you have SQL Server 2005, you can opt for table partitioning as well.

    Just an FYI, table partitioning is only available in the Enterprise Edition (and by extension, the Developer Edition) of SQL Server 2005.

  • Mike Levan (2/2/2009)


    Lynn,

    The batch processing trick worked well for me and saved a lot of time, thank you. From your query, does @@rowcount give the number of records updated? And do you think changing that number depending on the split would improve things further?

    Can I also speed up my processes for huge inserts and for loading a flat file into tables?

    I'm not sure what you are asking regarding the @@rowcount.

    Regarding importing data from files, it depends on how you are doing it: SSIS, BCP, or BULK INSERT.
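
    For example, a flat-file load with BULK INSERT might look like this (the file path and terminators are hypothetical; with TABLOCK and the SIMPLE recovery model the load can be minimally logged):

    bulk insert dbo.Rev
    from 'C:\loads\rev_extract.txt'  -- hypothetical path
    with (
        fieldterminator = ',',
        rowterminator   = '\n',
        batchsize       = 100000,    -- commit every 100,000 rows
        tablock                      -- allows minimal logging
    );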

  • Another performance question.

    Developers run update queries on multiple databases at once; if they need to update 3 databases, they open 3 windows and run the queries simultaneously, which brings the CPU load to 100%. I don't think that's wise. If they instead ran all the updates in a single window, one after the other, do you think that would improve performance?

    thanks

  • Mike Levan (2/2/2009)


    Another performance question.

    Developers run update queries on multiple databases at once; if they need to update 3 databases, they open 3 windows and run the queries simultaneously, which brings the CPU load to 100%. I don't think that's wise. If they instead ran all the updates in a single window, one after the other, do you think that would improve performance?

    thanks

    It would serialize the updates.

    Quick question, why are developers updating production databases directly?

  • It would serialize the updates????

    It's a Dev environment, but when people run heavy updates, other developers cannot query those databases.
