SQL isn't SQL

  • datascientist7 wrote:

    Nope, learned what DBA means here!

    I can't say no.  I always say yes.  When Bedford changes back from a town to a locality, I can make the change in 1 minute in my lookup master.  I can then get the report done for the state legislature.  No waiting months for the DBA to say no.

    And here I can see a very good reason for not letting you anywhere near a relational database.

    Yes, it's pretty easy to change the property of Bedford in a lookup master.

    But then what?

    Suddenly, all the historical orders which were executed according to the "town" rules become inappropriately placed, because Bedford is not a town but a locality. And after you have generated your yearly report, all those employees who placed/approved/executed those "town" orders now become subjects of HR actions, because they should not have done it for a "locality". And after your quick change there is no way to tell whether an order was correctly placed because the "locality" was still a "town" at the time, or whether it was an error from an employee.

    You must have both records for Bedford in your master lookup, one as a town and one as a locality, with temporal characteristics against each of them. And queries/references must include a consideration for choosing the correct record based on the rules around temporal properties; something like the sketch below.
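
    A minimal sketch of what that could look like, assuming hypothetical table names (dbo.PlaceLookup, dbo.Orders) and made-up dates:

    -- Temporally-versioned lookup: both Bedford records coexist, nothing is overwritten.
    CREATE TABLE dbo.PlaceLookup
    (
        PlaceLookupID int IDENTITY(1,1) PRIMARY KEY,
        PlaceName     varchar(100) NOT NULL,   -- e.g. 'Bedford'
        PlaceType     varchar(20)  NOT NULL,   -- 'Town' or 'Locality'
        ValidFrom     date NOT NULL,
        ValidTo       date NULL                -- NULL = record currently in force
    );

    INSERT INTO dbo.PlaceLookup (PlaceName, PlaceType, ValidFrom, ValidTo)
    VALUES ('Bedford', 'Town',     '2015-01-01', '2021-06-30'),
           ('Bedford', 'Locality', '2021-07-01', NULL);

    -- Reports join to the record that was in force on the order date,
    -- so history stays correct after the change.
    SELECT o.OrderID, o.OrderDate, p.PlaceType
    FROM dbo.Orders AS o
    JOIN dbo.PlaceLookup AS p
      ON p.PlaceName = o.PlaceName
     AND o.OrderDate >= p.ValidFrom
     AND (p.ValidTo IS NULL OR o.OrderDate <= p.ValidTo);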

    So, when you ask for such an update, "No" is the only right answer.

    _____________
    Code for TallyGenerator

  • >SSC-Insane, No-one like me queries any OLTP live servers. That's not what we are talking about.

    >As you know, the business guru is given read access to a daily snapshot server. From which I make my Excel datamart.

    >100% of the business report customers want output in either Excel or pdf (easy save-as from Excel) or Word (again, easy from VBA).

    It's my fault for assuming you're querying OLTP servers; my bad!

    Do your reports still go out when you're out sick? If you have a backup process, then I'm good with that too.

    I'm personally a fan of SSRS: it does all the output functions for Excel and PDF, and you can schedule reports as well, so no button is needed in that case. This doesn't mean that Excel has no worth, far from it! As far as Word is concerned, I remember producing some sort of reports with Word and VBA macros, but often they were meant to be run by the end-user consumer.

    >So it's much more efficient to keep the datamart in Excel for any table less than a few hundred thousand records.

    I'll revisit that in a second.

    >Data warehouses do not have quality control. I have quality control charts on all my datamart tables. Going back years.

    That's probably not a useful generalization. If I were to extrapolate this to its logical conclusion, you're saying that your datamarts are golden but the OLTP is crap. We'd have to figure this out before even touching on data warehouses. By applying quality control ONLY to your datamarts, you are creating a duplicate and inconsistent view of data, which now means you need a third set of data: the "mismatches" between your view and the rest of the company.

    Don't get me wrong, I've tracked OLTP malfunctions before; some transactions are just going to happen that are out of the norm, and documenting these probably makes sense. Oftentimes the datamarts and warehouses I've used and produced are specifically MEANT for quality control. I have one today (the 10-minute refresher mentioned below) that is used for directing corrective action in the OLTP, which is practically the definition of data quality control; a mismatch check like the one sketched below is usually where that starts. For process quality control, you should be able to apply this to any process regardless of which tool is used to implement it. We shouldn't infer that SQL isn't subject to quality control; it's a tool like anything else, and producing incorrect processes with any tool will result in a bad time for everybody, I guarantee it!
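
    A minimal sketch of the kind of mismatch check I mean, with purely hypothetical table and column names (dbo.Source_Customers standing in for the OLTP-side copy, dbo.Mart_Customers for the datamart copy):

    -- Rows that exist on only one side, or that disagree between the two copies.
    SELECT COALESCE(s.CustomerID, m.CustomerID) AS CustomerID,
           s.Region AS SourceRegion,
           m.Region AS MartRegion
    FROM dbo.Source_Customers AS s
    FULL OUTER JOIN dbo.Mart_Customers AS m
           ON m.CustomerID = s.CustomerID
    WHERE s.CustomerID IS NULL                              -- in the mart only
       OR m.CustomerID IS NULL                              -- in the source only
       OR ISNULL(s.Region, '') <> ISNULL(m.Region, '');     -- same key, different value

    The output of something like that is the "third set of data" in practice, and it's exactly what gets fed back into the OLTP for corrective action.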

    >Data warehouses are a day behind in their data and months behind in reporting capability. They are a mausoleum for data.

    That doesn't really make much sense either. I get the "day behind" thing, but on the other hand, it isn't impossible to vary the frequency of different processes. The datamart I use frequently refreshes every 10 minutes. And "months behind in reporting capability" is probably just not a useful generalization.

    As for the efficiencies of Excel datamarts, I don't really think that's a rule set in stone. There are so many attributes to explore with business processes that comparing a blanket "efficiency" metric is probably not that useful; heck, that "efficiency" metric is probably going to vary according to who's doing the measuring. It's really that subjective.

    Heck, if someone is a poor slob like me, having to go troubleshoot a VBA procedure that is now malfunctioning after the original author is long gone, you are going to get a vastly different assessment of that particular "efficiency."

    Just contributing contrasting viewpoints here; let me know if I'm mischaracterising your talking points.

     

  • SSRS: I have seen many SSRS reports.  Every user will export to Excel.  At which point either:

    1. All the data cells are merged.  Which makes the table unusable.
    2. The data cells are not merged.  Which makes the table unusable _and_ very dangerous _if_ used: on Sort, Excel will sort only partial columns and make a mess.

    Data quality:

    In my datamart, every table has metadata intimately and natively attached, readable, printable, etc etc in the customer's user interface (Excel):

    Script version timestamp.

    QueryString

    Title

    Date data last refreshed.

    Timestamp when the data or report was generated.

    Total Record count (invaluable once the user starts dinking with their data or report in any way).

    Source Table.

    Original Index Sort Key.

    LastModified and Created Timestamps on every record.

    I am not aware that SQL has that capability.  That's why I wrote the xlQL query language.

    you are creating a duplicate and inconsistent view of data which now means you need a third set of data

    No. No one uses the Data Warehouse data.  They come to the business.  Those who know the data know how to use it, maintain it, and have quality control.

     

     

  • datascientist7 wrote:

    SSRS: I have seen many SSRS reports.  Every user will export to Excel.  At which point either:

    1. All the data cells are merged.  Which makes the table unusable.
    2. The data cells are not merged.  Which makes the table unusable _and_ very dangerous _if_ used: on Sort, Excel will sort only partial columns and make a mess.

    Well sure. I mean, this looks exactly like a database load, much like some of the issues that I've encountered and solved, except the "database server" is Excel, which isn't my choice for doing this sort of thing. So I agree: I do not recommend doing database operations in Excel. I think we're on the same page there.

    Data quality:

    In my datamart, every table has metadata intimately and natively attached, readable, printable, etc etc in the customer's user interface (Excel):

    Script version timestamp.

    QueryString

    Title

    Date data last refreshed.

    Timestamp when the data or report was generated.

    Total Record count (invaluable once the user starts dinking with their data or report in any way).

    Source Table.

    Original Index Sort Key.

    LastModified and Created Timestamps on every record.

    I am not aware that SQL has that capability.  That's why I wrote the xlQL query language.

    One advantage SQL Server has is that it also comes with its own programming language, with the added benefit that there is a rather large and robust vendor behind it, as well as plenty of users falling over themselves trying to outdo each other in volunteering support. How big is the support organization for your programming language? Can you approximate the support budget for your database language? You can round it off to billions of dollars; that will probably make it easier to compare when we line it up alongside Microsoft's support organization. Exact figures aren't really needed here, but are you in the same ballpark, logarithmically speaking?

    SQL Server, and databases in general, do not prohibit the user from adding metadata to track create and update times and users; in fact, they really aren't all that restrictive in what design paradigm the programmer decides to use for any aspect (a quick sketch follows below). So while I think you are trying to make the point that SQL Server can't do, or somehow prohibits, the stuff you've mentioned, I hope we can just agree on what we disagree on here.
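
    A minimal sketch of what I mean, with hypothetical names throughout (dbo.MartOrders is just a stand-in table):

    -- Per-row Created / LastModified stamps, maintained by the table itself.
    CREATE TABLE dbo.MartOrders
    (
        OrderID      int           NOT NULL PRIMARY KEY,
        Amount       decimal(10,2) NOT NULL,
        CreatedAt    datetime2(0)  NOT NULL CONSTRAINT DF_MartOrders_Created  DEFAULT SYSUTCDATETIME(),
        LastModified datetime2(0)  NOT NULL CONSTRAINT DF_MartOrders_Modified DEFAULT SYSUTCDATETIME()
    );
    GO

    -- Keep LastModified current on every update.
    CREATE TRIGGER dbo.trg_MartOrders_Touch ON dbo.MartOrders
    AFTER UPDATE
    AS
    BEGIN
        SET NOCOUNT ON;
        UPDATE t
           SET LastModified = SYSUTCDATETIME()
          FROM dbo.MartOrders AS t
          JOIN inserted AS i ON i.OrderID = t.OrderID;
    END
    GO

    -- Table-level metadata (title, source, script version) via extended properties.
    EXEC sys.sp_addextendedproperty
         @name = N'Title', @value = N'Monthly order extract',
         @level0type = N'SCHEMA', @level0name = N'dbo',
         @level1type = N'TABLE',  @level1name = N'MartOrders';

    None of that is exotic; it's just a design choice whoever builds the table has to make.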

    you are creating a duplicate and inconsistent view of data which now means you need a third set of data

    No. No one uses the Data Warehouse data.  They come to the business.  Those who know the data know how to use it, maintain it, and have quality control.

    That's too bad. That's probably why your warehouse has no quality control: if nobody uses it, why bother? Or maybe nobody uses it because it doesn't have quality control? Chicken or egg, etc.

    Could you invite one of your QA officers to the thread so I can ping him on some questions I have?

    I am interested in your programming language: what sort of parser did you elect to go with?

  • >Could you invite one of your QA officers

    We are state government.  There is no such thing as a QA officer.

    >I am interested in your programming language: what sort of parser did you elect to go with?

    I need no parser.  Excel reads directly from SSAS.  Excel shares the SSAS engine with Pivot Tables, Power Pivot, Power BI, etc.

    Versus the Data Warehouse, which requires pipe-delimited text upstream!  How ancient is that?

    Techopedia's definition of a parser:

    A parser is a compiler or interpreter component that breaks data into smaller elements for easy translation into another language. A parser takes input in the form of a sequence of tokens, interactive commands, or program instructions and breaks them up into parts that can be used by other components in programming.

    A parser usually checks all data provided to ensure it is sufficient to build a data structure in the form of a parse tree or an abstract syntax tree

     

  • While this is an interesting discussion, I have no idea what this means for the community. Your opening post, and most of this discussion, seems to be about technical views, not the community.

    If this is something to discuss with regards to how to work together, please edit the original post and give some summary there. Please preface changes with "Update". If not, let's move this to another forum.

    Edit: Hey Steve, I started on this reply before yours; if you'd like, I'd not be upset if you removed my post. I really want to encourage practical approaches to solving problems, and I think it's important that we illustrate the downsides of atypical approaches. Still, if you think my post is unhelpful, like I said it's OK to remove this (or honestly any) post of mine (lol, like my preferences would matter there, amirite?)

    Could you invite one of your QA officers

    We are state government.  There is no such thing as a QA officer.

    If you can't afford a dedicated QA position, I'd still recommend getting at least some of the functionality that the position would otherwise provide. It's fine if we disagree.

    I am interested in your programming language: what sort of parser did you elect to go with?

    I need no parser.  Excel reads directly from SSAS.  Excel shares the SSAS engine with Pivot Tables, Power Pivot, Power BI, etc.

    That's fine. I suspected you didn't have a parser, and therefore didn't really produce a query language. I have no problem with that, honestly; implementing a programming language of any sort is probably not practical in your situation.

    Versus the Data Warehouse, which requires pipe-delimited text upstream!  How ancient is that?

    Been used quite a bit, I will grant that!

    Techopedia's definition of a parser:

    A parser is a compiler or interpreter component that breaks data into smaller elements for easy translation into another language. A parser takes input in the form of a sequence of tokens, interactive commands, or program instructions and breaks them up into parts that can be used by other components in programming.

    A parser usually checks all data provided to ensure it is sufficient to build a data structure in the form of a parse tree or an abstract syntax tree

    Used for implementing programming languages.

    I get it: you have some issues with the tech. I think we have all had our preferences refuted at one time or another.

    There is literally nothing wrong with creative and innovative approaches; all I'm encouraging is that you defend them robustly.

     


