January 5, 2016 at 6:07 pm
I am not sure how to begin writing this. I have tried using ROW_NUMBER() and RANK() and even tried a CTE, but I am not getting the results I need. Thanks in advance for any guidance, tips, etc.
Here is the initial sample query to start with. I added the last two CASE statements to show what the values would be in this instance, since the palletqty = 3:
SELECT 'TST' AS BOTYPE
,'SFM' AS CONSIGNEE
,S.SALESID AS REFERENCEORD
,'2208200' AS SOURCECOMPANY
,max(S.sfcstorenumber) TARGETCOMPANY
,MAX((convert(VARCHAR, (CONVERT(DATE, S.receiptdaterequested, 101)), 101))) AS SCHEDULEDATE
,CEILING(SUM(S.PACKINGSLIP_PALLET_QTY)) EXPECTEDPALLETS
,CAST(ROUND(sum(S.packingslip_case_qty), 0) AS INT) CASES
,'53' AS CUBE
,CAST(ROUND(sum(S.ITEMSHIPWEIGHT_CASE * S.packingslip_case_qty), 0) AS INT) WEIGHT
,case when CEILING(SUM(S.PACKINGSLIP_PALLET_QTY)) > 1 THEN round(CAST(ROUND(sum(S.packingslip_case_qty), 0) AS INT)/CEILING(SUM(S.PACKINGSLIP_PALLET_QTY)), 0) else CAST(ROUND(sum(S.packingslip_case_qty), 0) AS INT) end brkdwncase
,case when CEILING(SUM(S.PACKINGSLIP_PALLET_QTY)) > 1 THEN round(CAST(ROUND(sum(S.ITEMSHIPWEIGHT_CASE * S.packingslip_case_qty), 0) AS INT)/CEILING(SUM(S.PACKINGSLIP_PALLET_QTY)), 0) else CAST(ROUND(sum(S.ITEMSHIPWEIGHT_CASE * S.packingslip_case_qty), 0) AS INT) end brkdwnwght
FROM [dbo].[tablename] S
WHERE S.SEGMENTID = 'SR'
AND S.RECEIPTDATEREQUESTED = '01/08/2016' --for testing
GROUP BY S.SALESID
The above query would return:
TSTSFM0009185962208200201/08/2016 3125534044
TSTSFM0009185972208200701/08/2016 143531820
TSTSFM000918598220820070501/08/2016 251532216
If the palletqty is > 1, then divide the cases by the palletqty, divide the weight by the palletqty, round up the results, and create one row per pallet (3 individual rows in this case). So the output above would become:
TSTSFM0009185962208200201/08/2016 142531348
TSTSFM0009185962208200201/08/2016 142531348
TSTSFM0009185962208200201/08/2016 142531348
TSTSFM0009185972208200701/08/2016 143531820
TSTSFM000918598220820070501/08/2016 126531108
TSTSFM000918598220820070501/08/2016 126531108
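To make the arithmetic concrete, here is a small standalone sketch of what I am after (the source table is built inline from my sample numbers, and all names are made up for illustration; note the 1.0 multiplier, which prevents integer division from truncating before the round-up):

```sql
-- Standalone illustration of the desired split: each order's cases and
-- weight are divided across its pallets and rounded up, with one output
-- row per pallet (e.g. 125 cases / 3 pallets -> 3 rows of 42 cases).
WITH src AS (
    SELECT * FROM (VALUES
        ('000918596', 3, 125, 4044),
        ('000918597', 1,  43, 1820),
        ('000918598', 2,  51, 2216)
    ) AS v (salesid, palletqty, cases, weight)
), n AS (
    SELECT 1 AS num UNION ALL SELECT 2 UNION ALL SELECT 3
)
SELECT s.salesid,
       CEILING(1.0 * s.cases  / s.palletqty) AS brkdwncase,
       CEILING(1.0 * s.weight / s.palletqty) AS brkdwnwght
FROM src AS s
JOIN n ON n.num <= s.palletqty;
```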
I have tried the following and am not sure if I am on the right path:
WITH C
AS (
SELECT row_number () OVER (
PARTITION BY S.salesid ORDER BY S.receiptdaterequested DESC
) AS rownum
,'TST' AS BOTYPE
,'SFM' AS CONSIGNEE
,S.SALESID AS REFERENCEORD
,'2208200' AS SOURCECOMPANY
,max(S.sfcstorenumber) TARGETCOMPANY
,MAX((convert(VARCHAR, (CONVERT(DATE, S.receiptdaterequested, 101)), 101))) AS SCHEDULEDATE
,'53' AS CUBE
,CEILING(SUM(S.PACKINGSLIP_PALLET_QTY)) EXPECTEDPALLETS
,CAST(ROUND(sum(S.packingslip_case_qty), 0) AS INT) CASES
,CAST(ROUND(sum(S.ITEMSHIPWEIGHT_CASE * S.packingslip_case_qty), 0) AS INT) WEIGHT
FROM [dbo].[tablename] S
WHERE SEGMENTID = 'SR'
AND RECEIPTDATEREQUESTED = '01/08/2016' --for testing
GROUP BY S.SALESID
,S.receiptdaterequested
,S.sfcstorenumber
,s.packingslip_case_qty
,s.PACKINGSLIP_PALLET_QTY
)
SELECT BOTYPE
,CONSIGNEE
,REFERENCEORD
,SOURCECOMPANY
,TARGETCOMPANY
,EXPECTEDPALLETS
,CASE
WHEN CEILING(SUM(EXPECTEDPALLETS)) > 1
THEN round(CAST(ROUND(sum(CASES), 0) AS INT) / CEILING(SUM(EXPECTEDPALLETS)), 0)
ELSE CAST(ROUND(sum(CASES), 0) AS INT)
END cases
,CASE
WHEN CEILING(SUM(EXPECTEDPALLETS)) > 1
THEN round(CAST(ROUND(sum(WEIGHT * CASES), 0) AS INT) / CEILING(SUM(EXPECTEDPALLETS)), 0)
ELSE CAST(ROUND(sum(WEIGHT * CASES), 0) AS INT)
END weights
FROM C
GROUP BY BOTYPE
,CONSIGNEE
,REFERENCEORD
,SOURCECOMPANY
,TARGETCOMPANY
,EXPECTEDPALLETS
January 5, 2016 at 7:10 pm
Sounds like you need a Tally table. Here is a link to Jeff Moden's article.
It's hard to tell without table definitions, but if you joined the quantity in your query to the Tally table, you could do something like:
SELECT *
FROM yourQuery x INNER JOIN dbo.Utils.Tally t ON x.Quantity>=t.Number
...
You would get one record for each unit in your query... so if you wanted only part of that, you'd just do some integer division first.
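For example (an untested sketch, since I don't have your DDL; `yourQuery` stands in for your grouped SELECT, and `dbo.Tally(Number)` is assumed to already exist):

```sql
-- Untested sketch: one output row per pallet, with the per-pallet
-- figures computed by division rounded up (the 1.0 multiplier avoids
-- integer-division truncation).
SELECT x.REFERENCEORD,
       CEILING(1.0 * x.CASES  / x.EXPECTEDPALLETS) AS CASES,
       CEILING(1.0 * x.WEIGHT / x.EXPECTEDPALLETS) AS WEIGHT
FROM yourQuery AS x
JOIN dbo.Tally AS t ON t.Number <= x.EXPECTEDPALLETS;
```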
January 5, 2016 at 8:33 pm
You probably need to make sure you do not pick up the N=0 record from the Tally table, if it's present there:
AND t.Number > 0
_____________
Code for TallyGenerator
January 6, 2016 at 11:12 am
That worked, thanks! I am posting what I did in case someone else is in a similar situation:
CREATE PROCEDURE [dbo].[sfc_SPName]
@SCHEDULEDATE DATETIME
AS
SET NOCOUNT ON;
CREATE TABLE #splitmyrows (
Id INT
,BOTYPE VARCHAR(3)
,CONSIGNEE VARCHAR(3)
,REFERENCEORD VARCHAR(10)
,SOURCECOMPANY VARCHAR(10)
,TARGETCOMPANY VARCHAR(10)
,SCHEDULEDATE DATETIME
,EXPECTEDPALLETS INT
,CASES INT
,CUBE INT
,WEIGHT INT
)
BEGIN
INSERT INTO #splitmyrows
SELECT 1 ID
,'TST' AS BOTYPE
,'SFM' AS CONSIGNEE
,S.SALESID AS REFERENCEORD
,'2208200' AS SOURCECOMPANY
,MAX(S.sfcstorenumber) TARGETCOMPANY
,MAX((CONVERT(VARCHAR, (CONVERT(DATE, S.receiptdaterequested, 101)), 101))) AS SCHEDULEDATE
,CEILING(SUM(S.PACKINGSLIP_PALLET_QTY)) EXPECTEDPALLETS
,CASE
WHEN CEILING(SUM(s.PACKINGSLIP_PALLET_QTY)) > 1
THEN ROUND(CAST(ROUND(SUM(packingslip_case_qty), 0) AS INT) / CEILING(SUM(s.PACKINGSLIP_PALLET_QTY)), 0)
ELSE CAST(ROUND(SUM(S.packingslip_case_qty), 0) AS INT)
END AS cases
,'53' AS CUBE
,CASE
WHEN CEILING(SUM(PACKINGSLIP_PALLET_QTY)) > 1
THEN ROUND(CAST(ROUND(SUM(ITEMSHIPWEIGHT_CASE * packingslip_case_qty), 1) AS INT) / CEILING(SUM(PACKINGSLIP_PALLET_QTY)), 0)
ELSE CAST(ROUND(SUM(S.ITEMSHIPWEIGHT_CASE * S.packingslip_case_qty), 0) AS INT)
END WEIGHT
FROM [dbo].[myTableName] S
WHERE S.SEGMENTID = 'SR'
AND S.RECEIPTDATEREQUESTED = @SCHEDULEDATE --'01/08/2016' --for testing
GROUP BY S.SALESID
CREATE TABLE #numbers (num INT identity PRIMARY KEY)
DECLARE @i INT
SELECT @i = MAX(EXPECTEDPALLETS)
FROM #splitmyrows
WHILE @i > 0
BEGIN
INSERT #numbers DEFAULT
VALUES
SET @i = @i - 1
END
SELECT
BOTYPE
,CONSIGNEE
,REFERENCEORD
,SOURCECOMPANY
,TARGETCOMPANY
,CONVERT(VARCHAR, (CONVERT(DATE, SCHEDULEDATE, 101)), 101) SCHEDULEDATE
,1 EXPECTEDPALLETS
,CASES
,[CUBE]
,[WEIGHT]
FROM #splitmyrows
JOIN #numbers ON num <= EXPECTEDPALLETS
DROP TABLE #splitmyrows
DROP TABLE #numbers
END
GO
January 10, 2016 at 4:38 am
Cheryl McLaughlin-385812 (1/6/2016)
That worked, thanks! I am posting what I did in case someone else is in a similar situation:
<snip: stored procedure code quoted from the post above>
Although this code may do the intended work, it is horribly inefficient, and in fact I would not approve it nor promote it to any production environment. I suggest you post the DDL (CREATE TABLE) scripts and sample data as INSERT statements in a new thread in this forum, and we will then help you write proper set-based code for the operation.
😎
January 10, 2016 at 10:05 am
Cheryl McLaughlin-385812 (1/6/2016)
That worked, thanks! I am posting what I did in case someone else is in a similar situation:
<snip: stored procedure code quoted from the post above>
Not just performance issues, but you may (just making sure) have a really serious logical problem... The following snippet of your code...
,MAX((CONVERT(VARCHAR, (CONVERT(DATE, S.receiptdaterequested, 101)), 101))) AS SCHEDULEDATE
... will NOT correctly calculate the latest date for a given SalesID unless the SalesID is absolutely guaranteed to be unique (such as you might find for a CROSSTAB), in which case there's still the problem of the double formatting that you're doing during a CROSSTAB.
Also, what is the datatype of the S.receiptdaterequested column? We need to know so that we can help you trim down your "display formula" for "ScheduleDate".
Next, your "numbers" table population will be relatively slow. Heh... and don't say "Well, there's only going to be a couple of rows" because you don't actually have control over that and every bit of performance helps a lot in the big picture. Like Granny used to say, "Mind the pennies and the dollars will take care of themselves". With that in mind, the last part of the code should be written as follows...
--===== Determine the maximum number of expected pallets to control
-- the maximum value of the embedded numbers table that will be
-- used to create one row for each pallet.
DECLARE @MaxExpectedPallets INT
;
SELECT @MaxExpectedPallets = MAX(EXPECTEDPALLETS)
FROM #splitmyrows
;
--===== Using the value of ExpectedPallets in each row of the #SplitMyRows table,
-- duplicate that number of rows so that we have 1 row per pallet.
WITH
E1(N) AS (SELECT 1 FROM (VALUES (1),(1),(1),(1),(1),(1),(1),(1),(1),(1))E0(N))
,E6(N) AS (SELECT 1 FROM E1 a, E1 b, E1 c, E1 d, E1 e, E1 f)
,Tally(N) AS (SELECT TOP (@MaxExpectedPallets)
N = ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
FROM E6)
SELECT r.BOTYPE
,r.CONSIGNEE
,r.REFERENCEORD
,r.SOURCECOMPANY
,r.TARGETCOMPANY
,CONVERT(VARCHAR, (CONVERT(DATE, r.SCHEDULEDATE, 101)), 101) AS SCHEDULEDATE
,1 AS EXPECTEDPALLETS
,r.CASES
,r.[CUBE]
,r.[WEIGHT]
FROM #splitmyrows r
JOIN Tally t ON t.N <= r.EXPECTEDPALLETS
;
Note that I've not tested the above because I don't have your data nor your DDL to test with.
Also notice that I've removed the drops of the Temp Tables from the end of the code because that will actually slow SQL Server down a bit. SQL Server will auto-magically drop the Temp Tables when the stored procedure loses focus (completes) so there's no need to drop them. Further, if you drop them in code, they're gone and that overrides the ability of SQL Server to reuse the "skeleton" of the Temp Tables on successive runs. SQL Server does have the optimization of keeping the basic definition of Temp Tables unless they are dropped in code.
Also notice that @i means squat to most people. Always name your variables as to what they actually are.
Last but not least, you've made the code work and that's great. Next, the above code will help make it fast. Now you need to make it "pretty".
In line with the sequence of "Make it work, make it fast, make it pretty, and it ain't done 'til it's pretty", consider creating and adopting some standards and then sticking to them. I didn't change any of your column names just in case your server is {gasp!} 100% case sensitive, but using all upper or lower case makes for horrible readability.
Also consider that helpful comments will be a huge help to you in the future and to others that may need to maintain or modify your code in the future. Consider a standard header that identifies the purpose of the proc, what a standard usage example looks like, and a section for revision history. Add comments that explain the "WHY" of why a section of code is needed. I tell my guys that if all code were removed from a proc, the comments should allow someone to at least be able to draw a functional diagram from the comments that remain.
You also need to become more consistent and follow some best practices. If you use AS to define column aliases, then use AS on every column alias. Always terminate statements with a semi-colon because not using them has been deprecated and their absence will cause a blowup someday in the future. And always use table aliases on all columns of any statement when there's any kind of a JOIN present so that people don't have to go hunting for which table or CTE any given column is in. It will also prevent a failure if someone adds an identically named column to a table in the join that doesn't already have it.
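A contrived fragment (invented names) applying all three points at once, so you can see what they look like together:

```sql
-- Contrived example: every column alias uses AS, every column is
-- qualified by a table alias, and the statement ends in a semi-colon.
SELECT s.SALESID AS REFERENCEORD,
       t.Number  AS PALLETNUM
FROM dbo.tablename AS s
JOIN dbo.Tally     AS t
  ON t.Number <= s.PACKINGSLIP_PALLET_QTY;
```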
Remember that performance isn't just in the current code. It needs to be easily readable and the sections of the code need to be explained to help you or the next poor soul that may need to read your code. 🙂 Even if you're not the originator of the code, it doesn't take long to make the code more bullet proof, a little easier to read, and add a little documentation in the form of comments.
--Jeff Moden
Change is inevitable... Change for the better is not.
January 10, 2016 at 11:56 am
Jeff Moden (1/10/2016)
...Last but not least, you've made the code work and that's great. Next, the above code will help make it fast. Now you need to make it "pretty"...
Brilliant review and summary Jeff, respect!
😎
January 11, 2016 at 5:39 am
Eirikur Eiriksson (1/10/2016)
Jeff Moden (1/10/2016)
...Last but not least, you've made the code work and that's great. Next, the above code will help make it fast. Now you need to make it "pretty"...
Brilliant review and summary Jeff, respect!
😎
Gosh, I hope that's true. Looking back at it, it sounds a little like Celko. Hope I never get that way. It seemed like the poster was someone new, and I saw a whole lot of things that someone new could use some help on.
--Jeff Moden
Change is inevitable... Change for the better is not.
January 11, 2016 at 7:30 am
A little "Celko" isn't always a bad thing... I learned a great deal from him when I was first trying to learn.
Speaking from my own experience... Being told, in no uncertain terms, that there is a right way and a wrong way to do things and that you're doing it the wrong way, is actually quite helpful when I'm trying to learn something new.
So, to echo Eirikur's response... I thought it was an outstanding post!
January 11, 2016 at 10:42 am
<snip>
Jeff Moden (1/10/2016)
Also notice that I've removed the drops of the Temp Tables from the end of the code because that will actually slow SQL Server down a bit. SQL Server will auto-magically drop the Temp Tables when the stored procedure loses focus (completes) so there's no need to drop them. Further, if you drop them in code, they're gone and that overrides the ability of SQL Server to reuse the "skeleton" of the Temp Tables on successive runs. SQL Server does have the optimization of keeping the basic definition of Temp Tables unless they are dropped in code.
</snip>
Jeff, I'm inquiring further about this aspect of your exhaustive and detailed explanation. Do you have any links explaining this concept of a "skeleton" of a temp table in SQL Server? My understanding was that once the connection creating a temp table had terminated (whether an SP executing or an ad hoc query window closing) that the temp table was dropped exactly as if you'd issued a DROP TABLE. Your explanation makes it sound more like a behind-the-scenes TRUNCATE.
Thanks,
Rich
January 11, 2016 at 11:09 am
Rich Mechaber (1/11/2016)
Jeff Moden (1/10/2016)
Also notice that I've removed the drops of the Temp Tables from the end of the code because that will actually slow SQL Server down a bit. SQL Server will auto-magically drop the Temp Tables when the stored procedure loses focus (completes) so there's no need to drop them. Further, if you drop them in code, they're gone and that overrides the ability of SQL Server to reuse the "skeleton" of the Temp Tables on successive runs. SQL Server does have the optimization of keeping the basic definition of Temp Tables unless they are dropped in code.
Jeff, I'm inquiring further about this aspect of your exhaustive and detailed explanation. Do you have any links explaining this concept of a "skeleton" of a temp table in SQL Server? My understanding was that once the connection creating a temp table had terminated (whether an SP executing or an ad hoc query window closing) that the temp table was dropped exactly as if you'd issued a DROP TABLE. Your explanation makes it sound more like a behind-the-scenes TRUNCATE.
Thanks,
Rich
Also interested in this, since Paul White's tests indicate that explicitly dropping the temporary table does not affect temporary table caching: http://sqlblog.com/blogs/paul_white/archive/2012/08/17/temporary-object-caching-explained.aspx. The claim that explicit dropping prevents caching of temporary tables is called a myth by Aaron Bertrand here: http://dba.stackexchange.com/questions/52358/benefits-of-droping-a-temporary-table.
I just ran a couple quick tests and I see the cached temporary table whether or not I explicitly drop the temporary table in a procedure. I suppose it's possible there's some strange edge case where the explicit DROP changes things, but normal behavior seems to be that it doesn't matter.
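For anyone who wants to repeat that check, a rough sketch (the procedure name is taken from earlier in this thread; a cached temp table shows up in tempdb under a system-generated hex name such as #A1B2C3D4 rather than the name used inside the proc):

```sql
-- Rough sketch: run the proc, then look for the cached temp table
-- definition that SQL Server retains in tempdb after the proc completes.
EXEC dbo.sfc_SPName @SCHEDULEDATE = '20160108';

SELECT name, create_date
FROM tempdb.sys.tables
WHERE name LIKE '#%';
```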
Cheers!
January 11, 2016 at 5:27 pm
Thanks folks. I apparently missed Paul's good article on Temp Tables. I found where (I think) I first posted the notion (which, ironically, had a post after mine that listed Paul's article and I also missed that) and now I'm trying to find out where I got the information. Considering Paul's good article (although I've not done a deep dive on it, he's usually dead on for these types of things), I'd have to say that I was repeating a myth. Now, I have to scroll back in time and find out why I believed it to be true.
I appreciate the kind feedback. Even an old dog like me has things to learn. And very humble apologies for the mistake.
--Jeff Moden
Change is inevitable... Change for the better is not.