December 17, 2015 at 12:20 pm
I am using SQL Server 2008 R2.
I want to get the result set of rows that have reached my threshold value at each account level. In my example the threshold value is 50.
I need the best approach to get accurate results with good performance. As I have read, SQL Server 2012 has a better approach and better performance (see the sketch after the running-total query below).
Below is a sample query where I calculate the running total using a correlated subquery.
My sample threshold value is 50, and I have included the expected result set as well.
--------------------
DECLARE @CustomerOrders table
(
id int,
account NVARCHAR(20),
deposit INT
)
INSERT INTO @CustomerOrders Values(1,'AAA',10)
INSERT INTO @CustomerOrders Values(2,'AAA',12)
INSERT INTO @CustomerOrders Values(3,'AAA',15)
INSERT INTO @CustomerOrders Values(4,'AAA',22)
INSERT INTO @CustomerOrders Values(5,'AAA',13)
INSERT INTO @CustomerOrders Values(6,'BBB',20)
INSERT INTO @CustomerOrders Values(7,'BBB',33)
INSERT INTO @CustomerOrders Values(8,'BBB',40)
INSERT INTO @CustomerOrders Values(9,'CCC',10)
INSERT INTO @CustomerOrders Values(10,'DDD',30)
INSERT INTO @CustomerOrders Values(11,'DDD',40)
INSERT INTO @CustomerOrders Values(12,'DDD',10)
SELECT * FROM @CustomerOrders
--- Query for running total
SELECT T1.id, T1.deposit, T1.account,
       (SELECT SUM(T2.deposit)
        FROM @CustomerOrders T2
        WHERE T2.id <= T1.id AND T2.account = T1.account) AS RunningTotal
FROM @CustomerOrders T1
-----------------------
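For comparison, here is a minimal sketch of the SQL Server 2012+ approach mentioned above. The windowed SUM with a frame clause is not available on 2008 R2, so this is shown for reference only:
-- Sketch only: requires SQL Server 2012 or later (not available on 2008 R2).
-- A windowed SUM replaces the correlated subquery for the running total.
SELECT id,
       account,
       deposit,
       SUM(deposit) OVER (PARTITION BY account
                          ORDER BY id
                          ROWS UNBOUNDED PRECEDING) AS RunningTotal
FROM   @CustomerOrders;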
---Expected results:
DECLARE @ExpectedResults table
(
id int,
account NVARCHAR(20),
deposit INT ,
runningTotal INT
)
INSERT INTO @ExpectedResults Values(1,'AAA',10,10)
INSERT INTO @ExpectedResults Values(2,'AAA',12,22)
INSERT INTO @ExpectedResults Values(3,'AAA',15,37)
INSERT INTO @ExpectedResults Values(4,'AAA',22,59)
INSERT INTO @ExpectedResults Values(6,'BBB',20,20)
INSERT INTO @ExpectedResults Values(7,'BBB',33,53)
INSERT INTO @ExpectedResults Values(10,'DDD',30,30)
INSERT INTO @ExpectedResults Values(11,'DDD',40,70)
SELECT * FROM @ExpectedResults
Thanks in advance
Vijay
December 17, 2015 at 2:02 pm
If you only care about which accounts have reached your defined threshold, you could just do something like this.
DECLARE @CustomerOrders table
(
id int,
account NVARCHAR(20),
deposit INT
)
INSERT INTO @CustomerOrders Values(1,'AAA',10)
INSERT INTO @CustomerOrders Values(2,'AAA',12)
INSERT INTO @CustomerOrders Values(3,'AAA',15)
INSERT INTO @CustomerOrders Values(4,'AAA',22)
INSERT INTO @CustomerOrders Values(5,'AAA',13)
INSERT INTO @CustomerOrders Values(6,'BBB',20)
INSERT INTO @CustomerOrders Values(7,'BBB',33)
INSERT INTO @CustomerOrders Values(8,'BBB',40)
INSERT INTO @CustomerOrders Values(9,'CCC',10)
INSERT INTO @CustomerOrders Values(10,'DDD',30)
INSERT INTO @CustomerOrders Values(11,'DDD',40)
INSERT INTO @CustomerOrders Values(12,'DDD',10)
SELECT account, SUM(deposit) AS Total
FROM @CustomerOrders
GROUP BY account
HAVING SUM(deposit) >= 50
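With the sample data above, this returns AAA (72), BBB (93), and DDD (80); CCC totals only 10 and is excluded.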
December 17, 2015 at 3:04 pm
Hi,
Thanks for reply.
But I want the rows up to the point where the running total reaches my threshold value. I don't want all records in the table; I want to display records for each account only until the running total reaches the threshold.
Please refer to @ExpectedResults in my first post.
Thanks in advance,
Vijay
December 17, 2015 at 9:32 pm
This should do the trick...
DECLARE @CustomerOrders table
(
id int,
account NVARCHAR(20),
deposit INT
);
INSERT INTO @CustomerOrders Values(1,'AAA',10);
INSERT INTO @CustomerOrders Values(2,'AAA',12);
INSERT INTO @CustomerOrders Values(3,'AAA',15);
INSERT INTO @CustomerOrders Values(4,'AAA',22);
INSERT INTO @CustomerOrders Values(5,'AAA',13);
INSERT INTO @CustomerOrders Values(6,'BBB',20);
INSERT INTO @CustomerOrders Values(7,'BBB',33);
INSERT INTO @CustomerOrders Values(8,'BBB',40);
INSERT INTO @CustomerOrders Values(9,'CCC',10);
INSERT INTO @CustomerOrders Values(10,'DDD',30);
INSERT INTO @CustomerOrders Values(11,'DDD',40);
INSERT INTO @CustomerOrders Values(12,'DDD',10);
-- The solution...
WITH cte_RunningTotal AS (
    SELECT
        co1.id,
        co1.account,
        co1.deposit,
        rt.RunningTotal,
        -- Number only the rows that have reached the threshold, per account,
        -- so the first row to reach it gets OverLimit = 1.
        OverLimit = CASE WHEN rt.OverLimit IS NOT NULL
                         THEN ROW_NUMBER() OVER (PARTITION BY rt.OverLimit ORDER BY rt.RunningTotal)
                    END,
        -- Total deposits for the account; used to exclude accounts that never reach 50.
        MaxDeposit = SUM(co1.deposit) OVER (PARTITION BY co1.account)
    FROM
        @CustomerOrders co1
        CROSS APPLY ( -- running total up to and including the current row (triangular join)
            SELECT
                RunningTotal = SUM(co2.deposit),
                OverLimit = CASE WHEN SUM(co2.deposit) >= 50 THEN MIN(co2.account) END
            FROM
                @CustomerOrders co2
            WHERE
                co1.account = co2.account
                AND co1.id >= co2.id
        ) rt
)
SELECT
    rt.id,
    rt.account,
    rt.deposit,
    rt.RunningTotal
FROM
    cte_RunningTotal rt
WHERE
    rt.OverLimit = 1                                   -- the row that first reaches the threshold
    OR (rt.OverLimit IS NULL AND rt.MaxDeposit >= 50)  -- earlier rows, only for accounts that reach it
ORDER BY
    rt.id;
December 18, 2015 at 8:30 am
Thank you very much. It is working as I expected.
But when I have a lot of records (my data set has 500 thousand rows, for example), the query takes a very long time.
Thanks,
Vijay
December 18, 2015 at 9:11 am
vijaykumar587 (12/18/2015)
Thank you very much. It is working as I expected. But when I have a lot of records (my data set has 500 thousand rows, for example), the query takes a very long time.
Thanks,
Vijay
Yea... I wouldn't expect any query that uses a triangular join to be an outstanding performer. That said, making sure you have appropriate indexes should help quite a bit.
If you don't already have it, try creating a nonclustered index on the customer table: account and id (in that order) would be the key columns, with deposit as an included column.
That should give you the optimal performance.
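For reference, a minimal sketch of that index, assuming the real data lives in a permanent table named dbo.CustomerOrders with the same columns as the sample (the table name is only an assumption, and a table variable on 2008 R2 cannot be given a standalone nonclustered index):
-- Sketch only: dbo.CustomerOrders is an assumed name for the real table.
CREATE NONCLUSTERED INDEX IX_CustomerOrders_account_id
    ON dbo.CustomerOrders (account, id)
    INCLUDE (deposit);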
December 18, 2015 at 6:13 pm
vijaykumar587 (12/18/2015)
Thank you very much. It is working as I expected. But when I have a lot of records (my data set has 500 thousand rows, for example), the query takes a very long time.
Thanks,
Vijay
Do you have anything like a transaction date column in the table to identify when the transaction took place? I ask because I have a method that will do this for you in just a couple of seconds, and I need to know about the date column so that I can set up 500,000 rows of test data to demonstrate how to use it and how fast it is.
--Jeff Moden
Change is inevitable... Change for the better is not.