December 18, 2018 at 7:07 am
Hello again! The issue I am having today is dealing with a date change when certain conditions are met. As represented in the subject, whenever a certain condition is met, I want the date to be the selected parameter minus a specific amount of days. I know that we may be able to use DATEDIFF() for a problem like this, but I am not entirely sure how to apply it. As always, any help is greatly appreciated and the subject is only a representation of what I am trying to accomplish!
December 18, 2018 at 7:21 am
gQuery - Tuesday, December 18, 2018 7:07 AM
I think you want the DATEADD function:
DATEADD(dd, -30, '2018-01-11')
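For example, applied to a date parameter rather than a literal (a minimal sketch; the variable name and the 30-day offset are just placeholders):
DECLARE @rptDate date = '2018-01-11';
SELECT DATEADD(DAY, -30, @rptDate) AS ShiftedDate;   -- returns 2017-12-12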
December 18, 2018 at 7:54 am
Jonathan AC Roberts - Tuesday, December 18, 2018 7:21 AM
I've made some more changes and ran into another issue. I want the SET ... CASE WHEN condition to look at a transaction count (that is located within a subquery later in the code). Is there any way to get the query to identify that TransactionCount? I think it's going to work perfectly after this change is made.
DECLARE @InitialStartDate date,
@rptStartDate date,
@rptEndDate date
Set @InitialStartDate = '11/1/2018'
Set @rptStartDate = CASE WHEN [TransactionCount] = 2 Then DATEADD(dd,-30, @InitialStartDate)
WHEN [TransactionCount] = 1 Then DATEADD(dd,-30, @InitialStartDate) Else @InitialStartDate End
Set @rptEndDate = '11/30/2018'
--Count of transactions by equipment and fluid
(SELECT fcd.EquipmentID, ft.FluidID, COUNT(*) as 'TransactionCount'
FROM FluidTransfer AS ft left outer join
FluidContainer as FcD on ft.destinationcontainerid = fcD.id
where CAST(ft.TransferDate as date) between @rptStartDate and @rptEndDate and (isburntank = 1)
Group By fcd.EquipmentID, ft.FluidID) as ftc on fta.EquipmentID = ftc.EquipmentID and fta.FluidID = ftc.FluidID
WHERE (((e.AccountingCode) >= '0900' and (e.AccountingCode) <= '9999') AND ((e.IsDeleted)=0)
If needed, I can message you the entire query to look at if you think there is a better way to accomplish the same objective.
December 18, 2018 at 8:06 am
gQuery - Tuesday, December 18, 2018 7:54 AM
This makes no sense:
Set @rptStartDate = CASE WHEN 'TransactionCount' = 2 Then DATEADD(dd,-30, @InitialStartDate) Else @InitialStartDate End
'TransactionCount' is just a text string; there is no way it can ever be equal to 2.
Please can you paste in the entire query.
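For the CASE to do anything, the comparison has to be against a number, for example an int variable (a sketch only; how @TransactionCount actually gets populated is the open question here):
DECLARE @TransactionCount int = 2;   -- placeholder value; populating it is the real problem
DECLARE @InitialStartDate date = '2018-11-01';
DECLARE @rptStartDate date;

SET @rptStartDate = CASE WHEN @TransactionCount = 2
                         THEN DATEADD(DAY, -30, @InitialStartDate)
                         ELSE @InitialStartDate
                    END;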
December 18, 2018 at 8:13 am
DECLARE @InitialStartDate date,
@rptStartDate date,
@rptEndDate date
Set @InitialStartDate = '11/1/2018'
Set @rptStartDate = CASE WHEN [TransactionCount] = 2 Then DATEADD(dd,-30, @InitialStartDate)
WHEN [TransactionCount] = 1 Then DATEADD(dd,-30, @InitialStartDate) Else @InitialStartDate End
Set @rptEndDate = '11/30/2018'
--Count of transactions by equipment and fluid
(SELECT fcd.EquipmentID, ft.FluidID, COUNT(*) as 'TransactionCount'
FROM FluidTransfer AS ft left outer join
FluidContainer as FcD on ft.destinationcontainerid = fcD.id
where CAST(ft.TransferDate as date) between @rptStartDate and @rptEndDate and (isburntank = 1)
Group By fcd.EquipmentID, ft.FluidID) as ftc on fta.EquipmentID = ftc.EquipmentID and fta.FluidID = ftc.FluidID
WHERE (((e.AccountingCode) >= '0900' and (e.AccountingCode) <= '9999') AND ((e.IsDeleted)=0)) AND fta.FIFOTotalCost > 0)
Here are the changes. As stated before, I am wondering whether the SET statement can reference something like the TransactionCount from later in the query, because the way I have it set up now will not work; it underlines [TransactionCount] in red. I also tried ftc.TransactionCount and that doesn't work either.
December 18, 2018 at 8:41 am
DECLARE @InitialStartDate date,
/*@TransactionCount int,*/
@rptStartDate date,
@rptEndDate date
Set @InitialStartDate = '11/1/2018'
/* JUST TESTING SOMETHING HERE PLEASE IGNORE
Set @TransactionCount = Select ftc.TransactionCount as 'Transaction Count'
from Equipment as e left outer join
(SELECT COUNT(*) as 'TransactionCount' FROM
FluidTransfer AS ft left outer join
FluidContainer as FcD on ft.destinationcontainerid = fcD.id
where CAST(ft.TransferDate as date) between @rptStartDate and @rptEndDate and (isburntank = 1)) as ftc on fta.EquipmentID = ftc.EquipmentID and fta.FluidID = ftc.FluidID)*/
Set @rptStartDate = CASE WHEN /*@TransactionCount*/ = 2 Then DATEADD(dd,-60, @InitialStartDate)
WHEN /*@TransactionCount*/ = 1 Then DATEADD(dd,-60, @InitialStartDate) Else @InitialStartDate End
Set @rptEndDate = '11/30/2018'
Select e.AccountingCode As [Vehicle], CONCAT(e.year + ' ', e.make + ' ', e.model + ' ', CAST(e.[Description] as VARCHAR(80))) As Name,
IIF(cff.StringType = '', cff1.StringType, cff.StringType) as 'Operator', fta.Fluid, ftc.TransactionCount as 'Transaction Count', (CONCAT(ISNULL(CONCAT(CAST(nullif((sh.EndHour - sh.StartHour),0) as VARCHAR(80)) + ' Hours @ ' ,
CAST(nullif(nullif((fta.TotalGallons),0) / nullif((sh.EndHour - sh.StartHour),0),0) as VARCHAR(80)) + ' GPH'), '') , ISNULL(CONCAT(CAST(nullif((so.EndOdometer - so.StartOdometer), 0) as VARCHAR(80)) + ' Miles @ ',
CAST(nullif(nullif((so.EndOdometer - so.StartOdometer),0)/nullif((fta.TotalGallons),0),0) as VARCHAR(80)) + ' MPG'), '' ))) as ' ' , fta.FIFOTotalCost/fta.TotalGallons as [Avg Price],
fta.TotalGallons As [Total Gallons], fta.FIFOTotalCost As [Total Cost]
from Equipment as e left outer join
(SELECT cf.CustomFieldID, cf.EntityID, cf.StringType, cfc.FieldName, cfc.CategoryType, cfc.EntityName
FROM CustomField AS cf INNER JOIN
CustomField_Category AS cfc ON cf.CustomField_CategoryID = cfc.CustomField_CategoryID
WHERE cfc.FieldName = 'Operator' and cf.IsDeleted = 0) as cff on e.EquipmentID = cff.EntityID left outer join
(SELECT cf.CustomFieldID, cf.EntityID, cf.StringType, cfc.FieldName, cfc.CategoryType, cfc.EntityName
FROM CustomField AS cf INNER JOIN
CustomField_Category AS cfc ON cf.CustomField_CategoryID = cfc.CustomField_CategoryID
WHERE cfc.FieldName = 'Unassigned Description' and cf.IsDeleted = 0) as cff1 on e.EquipmentID = cff1.EntityID left outer join
-- The first and last hour meter reading value after the start date for each piece of equipment
(Select emh.Equipment_EquipmentID, MIN(emh.TrueReading) as 'StartHour', MAX(emh.TrueReading) as 'EndHour'
from Equipment_MeterHistory as emh
where CAST(emh.ReadingDate as date) >= @rptStartDate and CAST(emh.ReadingDate as date) <= @rptEndDate
group by emh.Equipment_EquipmentID) as sh on e.EquipmentID = sh.Equipment_EquipmentID left outer join
-- The first and last odometer reading value after the start date for each piece of equipment
(Select emh.Equipment_EquipmentID, MIN(emh.TrueReading) as 'StartOdometer', MAX(emh.TrueReading) as 'EndOdometer'
from Equipment_OdometerHistory as emh
where CAST(emh.ReadingDate as date) >= @rptStartDate and CAST(emh.ReadingDate as date) <= @rptEndDate
group by emh.Equipment_EquipmentID) as so on e.EquipmentID = so.Equipment_EquipmentID left outer join
-- Fuel total gallons and cost per container per fluid
(SELECT fcD.EquipmentID, fcD.FluidID as 'FluidID', MAX(f.code) as 'Fluid',
SUM(CASE WHEN ft.DestinationContainerID = ftd.DestinationContainerID THEN fti.Quantity ELSE -1* fti.Quantity END) as 'TotalGallons',
SUM(fifo.Cost * CASE WHEN ft.DestinationContainerID = ftd.DestinationContainerID THEN fti.Quantity ELSE -1* fti.Quantity END ) as 'FIFOTotalCost'
FROM FluidTransfer AS ft left outer join
fluidtransferdetail as ftd on ft.ID = ftd.FluidTransferID and (ft.DestinationContainerID = ftd.DestinationContainerID or ft.DestinationContainerID = ftd.SourceContainerID)left outer join
FluidTransferInventory as fti on ftd.id = fti.FluidTransferDetailID left outer Join
FluidContainer as fcD on ft.DestinationContainerID = fcD.ID inner join
fluid as f on ft.fluidid = f.ID inner join
FluidCost as fifo on fti.FifoCostID = fifo.ID
where CAST(ft.TransferDate as date) >= @rptStartDate and CAST(ft.TransferDate as date) <= @rptEndDate and ftd.Type in (0,1) and (isburntank = 1)
Group BY fcd.EquipmentID, fcD.FluidID) as fta on e.id = fta.EquipmentID left outer join
--Count of transactions by equipment and fluid
(SELECT fcd.EquipmentID, ft.FluidID, COUNT(*) as 'TransactionCount'
FROM FluidTransfer AS ft left outer join
FluidContainer as FcD on ft.destinationcontainerid = fcD.id
where CAST(ft.TransferDate as date) between @rptStartDate and @rptEndDate and (isburntank = 1)
Group By fcd.EquipmentID, ft.FluidID) as ftc on fta.EquipmentID = ftc.EquipmentID and fta.FluidID = ftc.FluidID
WHERE (((e.AccountingCode) >= '0900' and (e.AccountingCode) <= '9999') AND ((e.IsDeleted)=0)) AND fta.FIFOTotalCost > 0 AND ftc.TransactionCount = 2
order by AccountingCode
December 18, 2018 at 10:39 am
If anyone is having difficulty understanding what I am trying to do, let me know and I will try to clarify. 🙂
December 18, 2018 at 12:35 pm
You cannot reference the 'transaction count' prior to determining that count, so it isn't really clear what you are trying to accomplish. It also doesn't help that your code is very hard to read because it isn't well formatted.
It looks like you are using a lot of derived tables for outer joins... I would rewrite this with CTEs, with the first CTE being the one that generates the transaction count and calculates the report start/end dates. Looking at it again, though, that won't be possible as written, because you cannot calculate the transaction count using start/end dates that are themselves derived from that transaction count...
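One shape that might work, if the requirement allows the count to be evaluated over the fixed initial window before the start date is moved (that is an assumption; the table and column names below are lifted from the posted query), is to capture the count into a variable first and derive the dates from it:
DECLARE @InitialStartDate date = '2018-11-01',
        @rptEndDate       date = '2018-11-30',
        @rptStartDate     date,
        @TransactionCount int;

-- Count transactions over the unshifted window first
SELECT @TransactionCount = COUNT(*)
FROM FluidTransfer AS ft
     LEFT OUTER JOIN FluidContainer AS fcd
         ON ft.DestinationContainerID = fcd.ID
WHERE CAST(ft.TransferDate AS date) BETWEEN @InitialStartDate AND @rptEndDate
  AND isburntank = 1;

-- Then derive the report start date from that count
SET @rptStartDate = CASE WHEN @TransactionCount IN (1, 2)
                         THEN DATEADD(DAY, -30, @InitialStartDate)
                         ELSE @InitialStartDate
                    END;

-- ...the main report query then uses @rptStartDate and @rptEndDate as before.
Note that this produces a single count for the whole report; if the start date really needs to differ per equipment/fluid combination, a scalar variable won't do and the date logic would have to move into the query itself.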
Jeffrey Williams
"We are all faced with a series of great opportunities brilliantly disguised as impossible situations."
- Charles R. Swindoll
How to post questions to get better answers faster
Managing Transaction Logs
December 19, 2018 at 12:22 pm
gQuery - Tuesday, December 18, 2018 10:39 AM
For the past 30+ years, the convention on SQL groups has been to post DDL. Now we have to guess the keys, datatypes, constraints, references, defaults and everything else we would actually need to be able to program, because you couldn't be bothered. But looking at what you did post, I'll make some guesses.
Your whole approach to RDBMS and SQL is completely wrong. You probably don't know it, but you're writing a really bad combination of 1960s COBOL and some assembly language, and you just happen to be doing it in a really bad, highly proprietary SQL dialect.
We hate local variables. That was what we used in procedural languages; SQL is a declarative language.
You don't know how to name things. Database people will use the ISO 11179 naming conventions. While it's incredibly boring, you can find summaries in any good book on data modeling. For example, each data element should have its own distinct name, and not a generic one. Did you know that "year" is a reserved word in SQL? There is no such thing as a generic "make". These things have to belong to a particular entity in your data model; they just don't float around. I'm also bothered by the fact that you're identifying vehicles without using the VIN. I would think that the start of your query would look like this:
SELECT E.accounting_code, --- E.vin
E.vehicle_year, E.vehicle_make,
E.vehicle_model, E.vehicle_description
FROM Equipment AS E
LEFT OUTER JOIN
...
SQL is based on a tiered architecture. That means we get a query, pass the results (a table) to a presentation layer, and do all of that string formatting, concatenating, and CAST() crap there, not in the database tier of the system where you are doing it. You're doing it the way we did it in COBOL in 1960! ARRGH!
If I read your code correctly, you have to discover what's in the tables by using this weird mix of assembly-language flag conventions like (E.isdeleted = 0) and metadata.
Your use of "CAST(.. AS VARCHAR(80))" is really funny to me. I am old enough to remember 80-column punch cards, which is what you're doing, but you're doing it in SQL in 2018, not in COBOL or Fortran in 1960.
Why did you use a nonstandard control structure? IIF() is not part of standard SQL; we have a CASE expression.
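For example, the IIF() on the Operator column in the posted query has a direct CASE equivalent. A self-contained illustration with placeholder values (in the posted query the operands would be cff.StringType and cff1.StringType):
DECLARE @StringType varchar(80) = '', @Fallback varchar(80) = 'Unassigned';

SELECT IIF(@StringType = '', @Fallback, @StringType)                  AS Using_IIF,
       CASE WHEN @StringType = '' THEN @Fallback ELSE @StringType END AS Using_CASE;
-- Both columns return 'Unassigned' here; the two expressions evaluate identically.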
Things like "stringtype", "customfieldid", "entityid", "fieldname", "categorytype", "categoryid" or "entityname" make no sense in an RDBMS. A string is a datatype, not an attribute. You can't get more generic and vague than calling something "entity", etc. I always love "category_type" because it's two attribute properties strung together into a data element name. In English, it would be the equivalent of two adjectives next to each other without a noun to modify.
The proper way to name things, according to ISO and all the data modeling that's being done these days, is "<attribute>_<attribute property>".
Why do you have so many OUTER JOINs? In a properly designed schema, references to foreign keys eliminate the need for most of these sorts of things. You also don't seem to know what a field is in SQL.
"No matter how far you have walked down the wrong road, turn around!" - Turkish proverb
Please post DDL and follow ANSI/ISO standards when asking for help.
December 19, 2018 at 12:53 pm
Your whole approach to RDBMS and SQL is completely wrong. You probably don't know it, but you're writing a really bad combination of 1960s COBOL and some assembly language, and you just happen to be doing it in a really bad, highly proprietary SQL dialect.
After participating in forums discussing SQL with the drooling hordes of "accidental DBAs", idiots demanding that factory workers fashion their workflows specifically to adhere to the limitations of the relational handcuffs, and demands that untrained end users learn DDL syntax and update table definitions simply to avoid EAVs, I've since decided that SQL has sadly revealed itself to be a mistake. Not to mention the irreparable harm otherwise well-intentioned folks have done to the punched card industry.
Remember, if programs were hard to write, they should be hard to read!!!
Let's bring 1960s COBOL back and Make Procedures Great Again!!!
December 19, 2018 at 1:50 pm
patrickmcginnis59 10839 - Wednesday, December 19, 2018 12:53 PM
Let's bring 1960s COBOL back and Make Procedures Great Again!!!
You might want to look at the numbers. COBOL never left and represents tens of billions of lines of code in daily use. Several years ago, a younger friend of mine was making fun of me for not learning whatever the language du jour was. So I asked him what he was writing at work. It was an app for cell phones at a bank to look up account information. But when I pushed him on it, it had to go to a backend to get the COBOL data out of an ISAM file. I wonder if we shouldn't replace the term "legacy code" with "family curse code". 🙂
Please post DDL and follow ANSI/ISO standards when asking for help.
December 19, 2018 at 2:03 pm
jcelko212 32090 - Wednesday, December 19, 2018 1:50 PM
I don't think it's good architecture to have COBOL strictly on the back end. Fortunately, MicroFocus agrees!
https://www.microfocus.com/products/visual-cobol/personal-edition/
December 19, 2018 at 2:18 pm
patrickmcginnis59 10839 - Wednesday, December 19, 2018 2:03 PM
And it isn't "COBOL data." It is data stored in an ISAM database (file if you want).
December 19, 2018 at 3:02 pm
patrickmcginnis59 10839 - Wednesday, December 19, 2018 12:53 PM
And it isn't "COBOL data." It is data stored in an ISAM database (file if you want).
ISAM (indexed sequential access method) is not a database, but an access method.
I would call it COBOL data. It's stored as text in ASCII or EBCDIC, depending on the hardware and the age of the bank's systems. COBOL would've defined the datatypes and precision used. The data division in the host programs would have used COBOL features such as FILLER, OCCURS, PICTURE, etc. to determine the structure of the records. We would have no concept of a key.
It sure as heck will not be a CSV file. 🙂
Please post DDL and follow ANSI/ISO standards when asking for help.
December 19, 2018 at 3:46 pm
jcelko212 32090 - Wednesday, December 19, 2018 1:50 PM
And ISAM can be used by MySQL (or it used to be an option, who knows if Oracle kept it).
I still won't call it COBOL data, and I worked with COBOL and ISAM databases.
I bow to the narcissist's ego.