January 25, 2014 at 5:37 pm
ChrisM@Work (1/24/2014)
Same statement rewritten as UPDATE FROM, with nonsense logic commented out:
While I agree that the commented-out code is incorrectly written, I think it should be rewritten with a proper NULL test so that the logic actually works. The implied logic is: if a particular non-NULL condition exists, use it; if a NULL condition exists, preserve the NULL; and only if neither condition exists, apply the alternative. The bottom line of what I read as the intent of the code is that if the NULL condition exists, then DO NOT apply the alternative via the ELSE.
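Something along these lines would express that intent. It's only a rough sketch with made-up table and column names; the point is that the explicit IS NULL branch keeps the ELSE from overwriting an existing NULL:

UPDATE tgt
   SET SomeColumn = CASE
                       WHEN src.NewValue IS NOT NULL THEN src.NewValue  -- non-NULL condition exists: use it
                       WHEN tgt.SomeColumn IS NULL   THEN NULL          -- NULL condition exists: preserve the NULL
                       ELSE tgt.AlternateValue                          -- neither: apply the alternative
                    END
  FROM dbo.TargetTable tgt
  JOIN dbo.SourceTable src
    ON src.KeyCol = tgt.KeyCol;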
--Jeff Moden
Change is inevitable... Change for the better is not.
January 25, 2014 at 5:40 pm
wit_jp2001 (1/24/2014)
Chris - Thanks, works great.
If your final code is still using an "=" sign for a NULL test, then it might not actually be doing what it's supposed to be doing. If you have ANSI_NULLS turned off to make such bad code work, then you're going to be in a heap of trouble when MS no longer gives you the option to turn it off.
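For anyone following along, here's a quick illustration of the difference (hypothetical table and column names). Under the default, ANSI-compliant settings, a comparison to NULL is never TRUE:

-- Returns no rows even when SomeCol contains NULLs,
-- because "SomeCol = NULL" evaluates to UNKNOWN under SET ANSI_NULLS ON.
SELECT *
  FROM dbo.SomeTable
 WHERE SomeCol = NULL;

-- The proper NULL test:
SELECT *
  FROM dbo.SomeTable
 WHERE SomeCol IS NULL;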
--Jeff Moden
Change is inevitable... Change for the better is not.
January 25, 2014 at 6:03 pm
wit_jp2001 (1/24/2014)
Thanks, I will try out the new query this afternoon. Just so that everyone knows, this process is to update the 1st SKU with the heaviest SKU. When this system prints a pick ticket, it has to find the heaviest SKU to determine the box size from the start.
But this statement, plus a few others, just eats away at the time to do the actual run of the code.
Thanks again.
What if there's a quantity of 10 of the largest, heaviest SKU that you have and it's the only item for a given order? What then? And what about the problem of packing incompatible items? For example, you wouldn't want to pack an anvil with a carton of eggs.
Also, what if the "heaviest item" perfectly matches the max capacity of the "container"? You wouldn't want to add other items to the container.
This has all the earmarks of a "bin packing" problem and I don't believe that this code is going to solve it correctly. I also see a major problem in having pivoted the data into 15 stripes instead of processing the data as a single stripe and then pivoting the results.
--Jeff Moden
Change is inevitable... Change for the better is not.
January 27, 2014 at 1:45 am
Jeff Moden (1/25/2014)
wit_jp2001 (1/24/2014)
Thanks, I will try out the new query this afternoon. Just so that everyone knows, this process is to update the 1st SKU with the heaviest SKU. When this system prints a pick ticket, it has to find the heaviest SKU to determine the box size from the start.
But this statement, plus a few others, just eats away at the time to do the actual run of the code.
Thanks again.
What if there's a quantity of 10 of the largest, heaviest SKU that you have and it's the only item for a given order? What then? And what about the problem of packing incompatible items? For example, you wouldn't want to pack an anvil with a carton of eggs.
Also, what if the "heaviest item" perfectly matches the max capacity of the "container"? You wouldn't want to add other items to the container.
This has all the earmarks of a "bin packing" problem and I don't believe that this code is going to solve it correctly. I also see a major problem in having pivoted the data into 15 stripes instead of processing the data as a single stripe and then pivoting the results.
The other thread from the same OP is concerned with CROSS TABbing normalised SKU data into a structure similar to the sample data here. It would be far cheaper (and simpler) to assign the heaviest SKU to SKU1 prior to CROSS TABbing, and the exercise might open the door to a proper "bin packing" solution.
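To illustrate (just a sketch - the order detail table and column names here are invented), ranking each order's SKUs by weight before the CROSS TAB guarantees the heaviest SKU lands in the SKU1 column:

WITH Ranked AS
(
    SELECT OrderID,
           SKU,
           rn = ROW_NUMBER() OVER (PARTITION BY OrderID ORDER BY UnitWeight DESC)
      FROM dbo.OrderDetail
)
 SELECT OrderID,
        SKU1 = MAX(CASE WHEN rn = 1 THEN SKU END),
        SKU2 = MAX(CASE WHEN rn = 2 THEN SKU END),
        SKU3 = MAX(CASE WHEN rn = 3 THEN SKU END)
        -- ...and so on up to SKU15
   FROM Ranked
  GROUP BY OrderID;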
For fast, accurate and documented assistance in answering your questions, please read this article.
Understanding and using APPLY, (I) and (II) Paul White
Hidden RBAR: Triangular Joins / The "Numbers" or "Tally" Table: What it is and how it replaces a loop Jeff Moden