Looking Back

  • Jeff Moden - Friday, August 4, 2017 7:27 PM

    Heh... you mean like EDI, XML, JSON, and PowerShell? 😉

    Well, PowerShell is a good example of a really horrible programming language (JavaScript would be far more useful, and so is good old-fashioned CommandShell).  Indeed, PowerShell is verbose to the extent that it's unbearable.  EDI isn't a programming language at all; it's a bloody silly collection of mutually incompatible standards for data interchange pretending to be a single standard.  Neither is XML a programming language - it's a horrible imitation of a mark-up language that's been vastly over-hyped and is usually misused as a hopelessly verbose and inefficient way of presenting data (and/or data formats).  Nor is JSON a programming language - it's yet another way of representing data, and the one good thing about it is that it's nowhere near as awful as XML.
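    As a rough illustration of the verbosity point (a sketch in Python, with a made-up record chosen purely for comparison), here is the same data carried as JSON and as XML:

```python
import json
import xml.etree.ElementTree as ET

# The same small record serialized both ways (field names are invented
# for illustration only).
record = {"id": 42, "name": "widget", "price": 9.99}

as_json = json.dumps(record)

# Build the equivalent XML document element by element.
root = ET.Element("record")
for key, value in record.items():
    child = ET.SubElement(root, key)
    child.text = str(value)
as_xml = ET.tostring(root, encoding="unicode")

print(as_json)  # {"id": 42, "name": "widget", "price": 9.99}
print(as_xml)   # <record><id>42</id><name>widget</name><price>9.99</price></record>
print(len(as_json) < len(as_xml))  # True: same data, fewer bytes as JSON
```

    Even on a trivial record the XML form is longer, and the gap widens as nesting and attributes pile up.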

    So yes, those things are examples of the sorts of mistake that I would expect to be the inevitable result of voting for a common language standard.  I'm thoroughly in favour of democracy where it's appropriate and applicable (except where it's so formulated as to give power to those for whom there is no genuine democratic support), but I regard democracy as absolutely useless when deciding questions about scientific and/or technological facts.  So regardless of the massive popular support for XML, I regard it as crap in most of the contexts in which it's used.

    I suspect we agree, but I guess I could be wrong.

    Tom

  • Jeff Moden - Sunday, August 6, 2017 10:35 AM

    I'm not so sure about that.  One of the quickest ways to stifle innovation is to establish standards and "Best Practices".  If you remove the opportunity to fail, you also remove the opportunity for success.

    I'm inclined to agree.  Most examples of "Best Practices" I've seen could have been better described as "worst" instead of "best".  Two examples I'm fond of are the good old Waterfall Method, where "Best Practice" often meant "do no testing at any stage before UAT" and, even better, "don't question incomprehensible requirements"; and (my favourite of all) the wonderful set of variants of the PRINCE project management standard, which is claimed to be project management best practice but requires the costing and planning to be determined before any technical analysis of the requirement has taken place.  Of course, PRINCE advocates deny this, but if you look at the standard documents you'll see (unless the PRINCE Version 2 documents have been radically changed since I last looked at them) that such denial is a blatant lie.

    Tom

  • Jeff Moden - Friday, August 4, 2017 10:58 PM

    Heh... now, if we could just get them to write a useful splitter. 😉

    Almost no chance at all that they'll do that any time soon - doing it soon would be an early admission that they got it wrong.  It's practically impossible to get software companies to admit they got it wrong, and near totally impossible to get them to make that admission quickly.  I guess it's slightly less impossible with MS than with Oracle, but I won't hold my breath while I'm waiting for them to do it.

    Tom

  • Jeff Moden - Sunday, August 6, 2017 4:37 PM

    bdcoder - Sunday, August 6, 2017 12:01 PM

    Jeff Moden - Sunday, August 6, 2017 11:28 AM

    bdcoder - Sunday, August 6, 2017 10:51 AM

    Jeff Moden - Sunday, August 6, 2017 10:35 AM

    bdcoder - Sunday, August 6, 2017 9:31 AM

    TomThomson - Sunday, August 6, 2017 6:45 AM

    bdcoder - Friday, August 4, 2017 10:33 PM

    I don't think the author was trying to standardize ALL languages, just some of the common tasks programmers (not necessarily SQL developers) use.  Loops may not have a place in the SQL syntax world -- but in the programming world (compiled or interpreted) -- you're damn right they do -- and I for one would LOVE to see a common syntax for loops, data types, and conditional operators.  Like the author said, if just those few points were addressed, everyone's life would be easier.  But alas, just looking at all the comments the article has generated proves the point that the programming world will remain wild.

    No, loops and GOTOs and all the associated control-flow stuff in procedural languages have NO place in programming at any level above assembly, which directly expresses machine code.  Since the late 60s we've had languages that are Turing-complete (i.e. capable of expressing any imaginable computation) without using any of that control-flow crap.  A lot of programming has been done in functional languages (various dialects of ML, Miranda, and so on - and recently quite a lot in Haskell) and in logic languages (mostly variants of Prolog and Parlog), and it has been clearly demonstrated for most application areas that developers writing in these languages manage to write far fewer bugs than developers writing in procedural languages.  Too many computer professionals are blind to this, either not sufficiently interested in learning anything about computing beyond the minimum they can get away with, or believing they have a vested interest in the procedural languages they have learnt, or having been "educated" in programming by professors whose knowledge is sadly restricted (and who hence have a vested interest in teaching the procedural languages that they know).
    I guess the continued general developer preference for rather horrible procedural languages is part of the continuing wildness of the programming world, and it will be a long time before common sense prevails.  Of course, there are some signs that things are getting better, for example Microsoft's production of F# (which has some functional structure mixed in with procedural stuff).
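    For what it's worth, the loop-free style described above can be sketched even in an everyday language (Python here, purely for illustration - not one of the functional languages named above):

```python
from functools import reduce

# A task usually written with an explicit loop: sum of squares of a list.
numbers = [1, 2, 3, 4, 5]

# Procedural version: mutable state driven by a loop.
total = 0
for n in numbers:
    total += n * n

# Functional version: no loop, no mutation -- a fold over the list.
functional_total = reduce(lambda acc, n: acc + n * n, numbers, 0)

# Or, closer to how an ML/Haskell programmer might write it: map then sum.
mapped_total = sum(map(lambda n: n * n, numbers))

print(total, functional_total, mapped_total)  # 55 55 55
```

    All three compute the same thing; the functional forms simply express *what* is computed rather than *how* the iteration proceeds.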

    Agreed -- but you have to admit, IF we did have some sort of standard within our soup of procedural languages, everyone's lives would have been so much better -- THAT is what I think the author was trying to get across; and I agree with him or her.

    I'm not so sure about that.  One of the quickest ways to stifle innovation is to establish standards and "Best Practices".  If you remove the opportunity to fail, you also remove the opportunity for success.

    I think (like the author mentioned in the article) that if the programming world had a bit more stability in it, like the SQL world does (i.e. SELECT / FROM / WHERE has not been "stifled" in any way over the years, and those same three small words are a de-facto standard), things would have been much better; and there would have been even more opportunities for "success". 🙂

    Several stifling attempts have been made in the world of SQL.  For example, there is and has been a move underfoot to make the UPDATE statement more like that of Oracle, in that they want to remove the FROM clause from the SQL Server version because it allows people to make a mistake.  There's also the stifling "Best Practice", touted by many, to never do direct date math even though the ANSI Standards specifically state that EndDateTime - StartDateTime shall result in a period (interval).  The same goes for shunning a "variable overlay" because "it's unreliable", which is only true if you use it the wrong way - people frequently do, while others use it with great success within its limitations.
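    The direct-date-math rule is easy to see outside SQL as well: in Python, for instance, subtracting one datetime from another yields an interval (a timedelta), much as the ANSI rule describes (the dates below are arbitrary):

```python
from datetime import datetime

# Direct date math: end - start yields an interval (a timedelta here),
# analogous to the ANSI SQL rule that EndDateTime - StartDateTime
# produces an interval. The timestamps are arbitrary examples.
start = datetime(2017, 8, 4, 19, 27)
end = datetime(2017, 8, 6, 10, 35)

interval = end - start
print(interval)                  # 1 day, 15:08:00
print(interval.total_seconds())  # 140880.0
```

    The point is the same in both worlds: the subtraction itself is well defined and standardised; banning it outright is a "Best Practice" that throws the standard away.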

    Good points, but I would still rather live (and work) in the SQL world than the "procedural programming" world of today!  I, like the author, have always found the SQL world contains many more "standards" than other areas of computing! All the best!

    Absolutely agreed on the "where to live and work" notion.  I gave up on the front-end world of programming at the end of 2002 and haven't looked back since, for many of the reasons stated and more.  But SQL isn't "standard" by any means... not even for SELECT.  In SQL Server, I can do variable overloading; in Oracle, I cannot (just as one example).  In SQL Server (not including the 4-letter words like SSIS, SSRS, SSAS, etc.), I get all of what is available, whether it's "standard" SQL or some of the proprietary functionality.  In Oracle, you have to change the way you code because you have "standard" SQL and PL/SQL, and they don't really play well with each other.

    I hear you!  Just think that some worlds (SQL) have more consistency than others (general programming)!  Cheers!

  • Good points, but I would still rather live (and work) in the SQL world than the "procedural programming" world of today!  I, like the author, have always found the SQL world contains many more "standards" than other areas of computing! All the best!

    Absolutely agreed on the "where to live and work" notion.  I gave up on the front-end world of programming at the end of 2002 and haven't looked back since, for many of the reasons stated and more.  But SQL isn't "standard" by any means... not even for SELECT.  In SQL Server, I can do variable overloading; in Oracle, I cannot (just as one example).  In SQL Server (not including the 4-letter words like SSIS, SSRS, SSAS, etc.), I get all of what is available, whether it's "standard" SQL or some of the proprietary functionality.  In Oracle, you have to change the way you code because you have "standard" SQL and PL/SQL, and they don't really play well with each other.

    I agree that the SQL world is a nicer place to work in than the procedural programming world.
    I also feel that the procedural programming world is pretty crazy in the way it accepts languages with new notations without any justification at all.  The SQL world (a wonderful cross between declarative and procedural) suffers from that to a rather lesser degree, solely because there are fewer people inventing new versions of a procedural quasi-relational calculus than there are inventing new versions of purely procedural stuff like Algol (under such wonderful names as "JavaScript") and Simula (under such wonderful names as "C++").  Yes, it's fair enough to have new languages that make it easier to write programs, but it's crazy to invent a new language that changes all the operator symbols and introduces nothing new - and there have been hordes of those.  I was already getting fed up with seeing them when part of my job, way back in the 1960s, was to conduct a survey of available programming languages.

    Tom

  • TomThomson - Sunday, August 6, 2017 7:20 PM

    Good points, but I would still rather live (and work) in the SQL world than the "procedural programming" world of today!  I, like the author, have always found the SQL world contains many more "standards" than other areas of computing! All the best!

    Absolutely agreed on the "where to live and work" notion.  I gave up on the front-end world of programming at the end of 2002 and haven't looked back since, for many of the reasons stated and more.  But SQL isn't "standard" by any means... not even for SELECT.  In SQL Server, I can do variable overloading; in Oracle, I cannot (just as one example).  In SQL Server (not including the 4-letter words like SSIS, SSRS, SSAS, etc.), I get all of what is available, whether it's "standard" SQL or some of the proprietary functionality.  In Oracle, you have to change the way you code because you have "standard" SQL and PL/SQL, and they don't really play well with each other.

    I agree that the SQL world is a nicer place to work in than the procedural programming world.
    I also feel that the procedural programming world is pretty crazy in the way it accepts languages with new notations without any justification at all (I've seen languages which replace the "/" for divide with "-"), and the SQL world (a wonderful cross between declarative and procedural) suffers from that to a rather lesser degree, solely because there are fewer people inventing new versions of a procedural quasi-relational calculus than there are inventing new versions of Algol (under such wonderful names as "JavaScript") and Simula (under such wonderful names as "C++").  Yes, it's fair enough to have new languages that make it easier to write programs, but it's crazy to invent a new language that changes all the operator symbols and introduces nothing new - and there have been hordes of those.  I was already getting fed up with seeing them when part of my job, way back in the 1960s, was to conduct a survey of available programming languages.

    Tom,
    I think an interesting experiment would be if someone (with lots of experience) would put together a site whereby the programmers in the trenches could vote on what programming syntax would make the most sense to them for a specific task (along with explanations like yours).  The voting would last for a period of one year -- it sure would be interesting to see what kind of syntax would bubble to the top of the list.  Or it might not be, as I suspect everyone would up-vote their own language preferences, which I believe the author (and you) have pointed out.  Dare to dream.  Drat -- I just woke up and have to get back to the 8,000 lines of C# I inherited from someone who apparently never knew that comments existed!  Cheers.

  • TomThomson - Sunday, August 6, 2017 4:44 PM

    Jeff Moden - Friday, August 4, 2017 7:27 PM

    Heh... you mean like EDI, XML, JSON, and PowerShell? 😉

    Well, PowerShell is a good example of a really horrible programming language (JavaScript would be far more useful, and so is good old-fashioned CommandShell).  Indeed, PowerShell is verbose to the extent that it's unbearable.  EDI isn't a programming language at all; it's a bloody silly collection of mutually incompatible standards for data interchange pretending to be a single standard.  Neither is XML a programming language - it's a horrible imitation of a mark-up language that's been vastly over-hyped and is usually misused as a hopelessly verbose and inefficient way of presenting data (and/or data formats).  Nor is JSON a programming language - it's yet another way of representing data, and the one good thing about it is that it's nowhere near as awful as XML.

    So yes, those things are examples of the sorts of mistake that I would expect to be the inevitable result of voting for a common language standard.  I'm thoroughly in favour of democracy where it's appropriate and applicable (except where it's so formulated as to give power to those for whom there is no genuine democratic support), but I regard democracy as absolutely useless when deciding questions about scientific and/or technological facts.  So regardless of the massive popular support for XML, I regard it as crap in most of the contexts in which it's used.

    I suspect we agree, but I guess I could be wrong.

    Agreed that EDI, XML, and JSON are not programming languages (especially EDI), but XML and JSON have language-associated standards when it comes to construction/shredding and the like, especially in T-SQL.  And, yes... you got what I was suggesting, and your suspicion that we agree on everything you've stated (especially on the notion of XML) is correct except, possibly, for one thing (although I strongly suspect we'll agree even on this).  Even supposedly carefully formulated or observed scientifically and technologically sound "facts" are frequently incorrect because of a lack of understanding that leads to improper testing or, perhaps, such a strong predisposition that a particular hypothesis is true that the tests are written (intentionally or unwittingly) in a biased fashion, both of which lead to incorrect conclusions.  Even when such tests are written correctly and the conclusions are correct for the given tests, there are sometimes exceptions that no one has thought of before, and such contrary exceptions are sometimes useful beyond anyone's expectations.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • bdcoder - Sunday, August 6, 2017 5:20 PM

    Jeff Moden - Sunday, August 6, 2017 4:37 PM

    bdcoder - Sunday, August 6, 2017 12:01 PM

    Jeff Moden - Sunday, August 6, 2017 11:28 AM

    bdcoder - Sunday, August 6, 2017 10:51 AM

    Jeff Moden - Sunday, August 6, 2017 10:35 AM

    bdcoder - Sunday, August 6, 2017 9:31 AM

    TomThomson - Sunday, August 6, 2017 6:45 AM

    bdcoder - Friday, August 4, 2017 10:33 PM

    I don't think the author was trying to standardize ALL languages, just some of the common tasks programmers (not necessarily SQL developers) use.  Loops may not have a place in the SQL syntax world -- but in the programming world (compiled or interpreted) -- you're damn right they do -- and I for one would LOVE to see a common syntax for loops, data types, and conditional operators.  Like the author said, if just those few points were addressed, everyone's life would be easier.  But alas, just looking at all the comments the article has generated proves the point that the programming world will remain wild.

    No, loops and GOTOs and all the associated control-flow stuff in procedural languages have NO place in programming at any level above assembly, which directly expresses machine code.  Since the late 60s we've had languages that are Turing-complete (i.e. capable of expressing any imaginable computation) without using any of that control-flow crap.  A lot of programming has been done in functional languages (various dialects of ML, Miranda, and so on - and recently quite a lot in Haskell) and in logic languages (mostly variants of Prolog and Parlog), and it has been clearly demonstrated for most application areas that developers writing in these languages manage to write far fewer bugs than developers writing in procedural languages.  Too many computer professionals are blind to this, either not sufficiently interested in learning anything about computing beyond the minimum they can get away with, or believing they have a vested interest in the procedural languages they have learnt, or having been "educated" in programming by professors whose knowledge is sadly restricted (and who hence have a vested interest in teaching the procedural languages that they know).
    I guess the continued general developer preference for rather horrible procedural languages is part of the continuing wildness of the programming world, and it will be a long time before common sense prevails.  Of course, there are some signs that things are getting better, for example Microsoft's production of F# (which has some functional structure mixed in with procedural stuff).

    Agreed -- but you have to admit, IF we did have some sort of standard within our soup of procedural languages, everyone's lives would have been so much better -- THAT is what I think the author was trying to get across; and I agree with him or her.

    I'm not so sure about that.  One of the quickest ways to stifle innovation is to establish standards and "Best Practices".  If you remove the opportunity to fail, you also remove the opportunity for success.

    I think (like the author mentioned in the article) that if the programming world had a bit more stability in it, like the SQL world does (i.e. SELECT / FROM / WHERE has not been "stifled" in any way over the years, and those same three small words are a de-facto standard), things would have been much better; and there would have been even more opportunities for "success". 🙂

    Several stifling attempts have been made in the world of SQL.  For example, there is and has been a move underfoot to make the UPDATE statement more like that of Oracle, in that they want to remove the FROM clause from the SQL Server version because it allows people to make a mistake.  There's also the stifling "Best Practice", touted by many, to never do direct date math even though the ANSI Standards specifically state that EndDateTime - StartDateTime shall result in a period (interval).  The same goes for shunning a "variable overlay" because "it's unreliable", which is only true if you use it the wrong way - people frequently do, while others use it with great success within its limitations.

    Good points, but I would still rather live (and work) in the SQL world than the "procedural programming" world of today!  I, like the author, have always found the SQL world contains many more "standards" than other areas of computing! All the best!

    Absolutely agreed on the "where to live and work" notion.  I gave up on the front-end world of programming at the end of 2002 and haven't looked back since, for many of the reasons stated and more.  But SQL isn't "standard" by any means... not even for SELECT.  In SQL Server, I can do variable overloading; in Oracle, I cannot (just as one example).  In SQL Server (not including the 4-letter words like SSIS, SSRS, SSAS, etc.), I get all of what is available, whether it's "standard" SQL or some of the proprietary functionality.  In Oracle, you have to change the way you code because you have "standard" SQL and PL/SQL, and they don't really play well with each other.

    I hear you!  Just think that some worlds (SQL) have more consistency than others (general programming)!  Cheers!

    I can't disagree with you there.  In fact, I strongly agree and that's a very large part of why I decided to dump the procedural language world in favor of SQL way back in 2002.


  • bdcoder - Sunday, August 6, 2017 8:29 PM

    TomThomson - Sunday, August 6, 2017 7:20 PM

    Good points, but I would still rather live (and work) in the SQL world than the "procedural programming" world of today!  I, like the author, have always found the SQL world contains many more "standards" than other areas of computing! All the best!

    Absolutely agreed on the "where to live and work" notion.  I gave up on the front-end world of programming at the end of 2002 and haven't looked back since, for many of the reasons stated and more.  But SQL isn't "standard" by any means... not even for SELECT.  In SQL Server, I can do variable overloading; in Oracle, I cannot (just as one example).  In SQL Server (not including the 4-letter words like SSIS, SSRS, SSAS, etc.), I get all of what is available, whether it's "standard" SQL or some of the proprietary functionality.  In Oracle, you have to change the way you code because you have "standard" SQL and PL/SQL, and they don't really play well with each other.

    I agree that the SQL world is a nicer place to work in than the procedural programming world.
    I also feel that the procedural programming world is pretty crazy in the way it accepts languages with new notations without any justification at all (I've seen languages which replace the "/" for divide with "-"), and the SQL world (a wonderful cross between declarative and procedural) suffers from that to a rather lesser degree, solely because there are fewer people inventing new versions of a procedural quasi-relational calculus than there are inventing new versions of Algol (under such wonderful names as "JavaScript") and Simula (under such wonderful names as "C++").  Yes, it's fair enough to have new languages that make it easier to write programs, but it's crazy to invent a new language that changes all the operator symbols and introduces nothing new - and there have been hordes of those.  I was already getting fed up with seeing them when part of my job, way back in the 1960s, was to conduct a survey of available programming languages.

    Tom,
    I think an interesting experiment would be if someone (with lots of experience) would put together a site whereby the programmers in the trenches could vote on what programming syntax would make the most sense to them for a specific task (along with explanations like yours).  The voting would last for a period of one year -- it sure would be interesting to see what kind of syntax would bubble to the top of the list.  Or it might not be, as I suspect everyone would up-vote their own language preferences, which I believe the author (and you) have pointed out.  Dare to dream.  Drat -- I just woke up and have to get back to the 8,000 lines of C# I inherited from someone who apparently never knew that comments existed!  Cheers.

    As you partially imply, it would be equally interesting to see what language would come out of the gin mill if folks with very little experience were used to conduct the same experiment.


  • Jeff Moden - Sunday, August 6, 2017 9:55 PM

    And, yes... you got what I was suggesting, and your suspicion that we agree on everything you've stated (especially on the notion of XML) is correct except, possibly, for one thing (although I strongly suspect we'll agree even on this).  Even supposedly carefully formulated or observed scientifically and technologically sound "facts" are frequently incorrect because of a lack of understanding that leads to improper testing or, perhaps, such a strong predisposition that a particular hypothesis is true that the tests are written (intentionally or unwittingly) in a biased fashion, both of which lead to incorrect conclusions.  Even when such tests are written correctly and the conclusions are correct for the given tests, there are sometimes exceptions that no one has thought of before, and such contrary exceptions are sometimes useful beyond anyone's expectations.

    I don't think we disagree about scientists sometimes getting it wrong - as you say, this can happen intentionally, e.g. the tendency to over-prescribe statins is there because pharma companies over-hyped them and withheld (and still have not revealed) the data from most of their trials, publishing only "summaries" that supported their hype - unfortunately, most of the medical profession is taken in by it; and it can certainly happen unwittingly too (the god-awful mess in the first version of TCP, for example).
    Often, when a standards committee (especially one whose members represent commercial enterprises involved in providing stuff that conforms to the standards being formulated) gets involved, science goes out of the window, so that any existing mistakes are preserved for the enjoyment of future generations and new mistakes are introduced to add some seasoning to the stew.  That can also happen when a research project gets transferred to development (and of course it's quite possible for the development team to misunderstand, or to decide to do something different for an important part of the project, as happened with IBM's RDBMS when Chamberlin and Boyce designed the language originally called SEQUEL and now known as SQL, which was not much like Codd's relational algebra - but was that a better language or a worse one?).  And supposedly independent and hence unbiased scientists and engineers are sometimes instructed by governments (or more often by civil servants) to be extremely biased when reviewing grant requests or standards proposals (and some may follow such instructions; fortunately, it seems to me that, at least in Britain and Western Europe, most, like me, don't).

    Tom

  • This topic has certainly produced some interesting comments; however, I really can't disagree strongly enough with the "purist" mind frame that one solution, technique, or algorithmic aim will solve all problems. There's a reason that we have so many languages, and that they have evolved from different backgrounds, even if many are almost indistinguishable in core capabilities now. No language, SQL or otherwise, is without its faults, and any capable language will give the developer enough leeway to cause problems through misuse or inappropriate use of the tools available.

    For example: automatic garbage collection and variable lifetime scoping is a (relatively) good thing at an application level and can prevent many basic errors, but only as long as the developer knows what they are doing and doesn't create an indescribably inefficient system that spends more time garbage collecting than it should. Such a language, however, is near useless at a lower level, where resources are intentionally tight and arbitrary code pauses caused by garbage collection are most definitely not desired. The more resource-constrained a system (typically physically small as well), the more important such things are. However, while wasting 90% of CPU time on housekeeping duties on more capable systems doesn't really impact most user-level applications too much, it is a shocking waste of computing resources for those used to coding in even a vaguely efficient manner. Do I care about garbage collection and variable life in SQL Server? Not really; it's the end result that matters, as long as it isn't too inefficient - and we all know MS SQL's approach to memory management is along the lines of "take it all". Do I care about the same with a client application which is heavily multi-threaded, with connections to various hardware devices? Very much yes.
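    A small sketch of that trade-off in Python (an illustration, not a benchmark): the cyclic garbage collector can be switched off around a latency-sensitive section and run explicitly afterwards, trading automatic pauses for one scheduled collection:

```python
import gc

def build_garbage(n=1000):
    # Create objects that participate in reference cycles, so the cyclic
    # collector (not plain reference counting) is what must reclaim them.
    objs = []
    for _ in range(n):
        a, b = [], []
        a.append(b)
        b.append(a)
        objs.append(a)
    return objs

gc.disable()                 # no automatic collection pauses from here on
was_enabled = gc.isenabled() # False inside the "critical section"
junk = build_garbage()
del junk                     # cycles are now unreachable but not yet freed
gc.enable()                  # restore normal behaviour
collected = gc.collect()     # one explicit, scheduled collection instead

print(was_enabled)       # False: collection really was off in the critical section
print(collected > 0)     # the deferred cycles were reclaimed in one pass
```

    Real low-level or real-time systems cannot even afford the deferred pass at an arbitrary moment, which is the point made above about resource-constrained environments.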

  • n.ryan - Monday, August 7, 2017 9:00 AM

    This topic has certainly produced some interesting comments; however, I really can't disagree strongly enough with the "purist" mind frame that one solution, technique, or algorithmic aim will solve all problems. There's a reason that we have so many languages, and that they have evolved from different backgrounds, even if many are almost indistinguishable in core capabilities now. No language, SQL or otherwise, is without its faults, and any capable language will give the developer enough leeway to cause problems through misuse or inappropriate use of the tools available.

    For example: automatic garbage collection and variable lifetime scoping is a (relatively) good thing at an application level and can prevent many basic errors, but only as long as the developer knows what they are doing and doesn't create an indescribably inefficient system that spends more time garbage collecting than it should. Such a language, however, is near useless at a lower level, where resources are intentionally tight and arbitrary code pauses caused by garbage collection are most definitely not desired. The more resource-constrained a system (typically physically small as well), the more important such things are. However, while wasting 90% of CPU time on housekeeping duties on more capable systems doesn't really impact most user-level applications too much, it is a shocking waste of computing resources for those used to coding in even a vaguely efficient manner. Do I care about garbage collection and variable life in SQL Server? Not really; it's the end result that matters, as long as it isn't too inefficient - and we all know MS SQL's approach to memory management is along the lines of "take it all". Do I care about the same with a client application which is heavily multi-threaded, with connections to various hardware devices? Very much yes.

    I don't think it will ever be appropriate to have only one programming language. There are problems best solved with functional languages, others better handled with logic languages, yet others with explicit process languages (CSP, CCS, ACCS, and so on), some with procedural languages that don't do a lot of behind-the-scenes housekeeping, some with very high-level procedural languages that conceal lots of stuff from the programmer, and some where enormous performance gains can be had by interpreting instead of compiling to machine level; some of the functional language stuff will need the eager evaluation model, others will need lazy evaluation, and others will put that in the hands of the programmer by having eager/lazy annotations. Despite that, it would be nice not to have to remember 37 different idioms for testing whether one number is smaller than another - wouldn't it be nice if that were always done using the symbol "<"?  Also, different hardware may need different languages - I can't imagine trying to write a compiler to get C++ executed efficiently on a distributed graph reduction engine, for example.  So to me the single perfect programming language is an amusing myth, not anything real in the foreseeable future.
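    The eager/lazy distinction mentioned above can be sketched in Python (for illustration only - Python is not one of the languages under discussion): a list comprehension evaluates eagerly, while a generator produces values only on demand, so it can even describe an unbounded sequence:

```python
import itertools

# Eager: the whole list is built before any element is used.
eager_squares = [n * n for n in range(10)]

# Lazy: elements are produced only on demand, so we can describe an
# unbounded sequence of squares and take just the prefix we need.
lazy_squares = (n * n for n in itertools.count())
first_five = list(itertools.islice(lazy_squares, 5))

print(eager_squares[:5])  # [0, 1, 4, 9, 16]
print(first_five)         # [0, 1, 4, 9, 16]
```

    Languages like Haskell make the lazy model the default; eager/lazy annotations, as noted above, hand the choice to the programmer per expression.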

    Tom

  • some problems where enormous performance gains can be had by interpreting instead of compiling to machine level

    What are some examples of those problems? In my simple mind, interpretation is invariably slower, so it'd be good to broaden my horizons on that one!

  • TomThomson - Monday, August 7, 2017 12:06 PM

    n.ryan - Monday, August 7, 2017 9:00 AM

    This topic has certainly produced some interesting comments, but I really can't disagree strongly enough with the "purist" frame of mind that one solution, technique or algorithmic aim will solve all problems. There's a reason that we have so many languages, and that they have evolved from different backgrounds, even if many are almost indistinguishable in core capabilities now. No language, SQL or otherwise, is without its faults, and any capable language will give the developer enough leeway to cause problems through misuse or inappropriate use of the tools available.

    For example: automatic garbage collection and variable lifetime scoping are a (relatively) good thing at the application level and can prevent many basic errors, but only as long as the developer knows what they are doing and doesn't create an indescribably inefficient system that spends more time garbage collecting than it should. Such a language, however, is near useless at a lower level where resources are intentionally tight and arbitrary code pauses caused by garbage collection are most definitely not desired. The more resource-constrained a system (typically physically small as well), the more important such things are. And while wasting 90% of CPU time on housekeeping duties on more capable systems doesn't really impact most user-level applications too much, it is a shocking waste of computing resources for those used to coding in even a vaguely efficient manner. Do I care about garbage collection and variable lifetime in SQL Server? Not really; it's the end result that matters as long as it isn't too inefficient - and we all know MS-SQL's approach to memory management is along the lines of "take it all". Do I care about the same in a heavily multi-threaded client application with connections to various hardware devices? Very much yes.

    I don't think it will ever be appropriate to have only one programming language. There are problems best solved with functional languages, others better handled with logic languages, yet others with explicit process languages (CSP, CCS, ACCS and so on), some with very high level procedural languages, some with procedural languages that don't do a lot of behind-the-scenes housekeeping, some with very high level procedural languages that conceal lots of stuff from the programmer, and some problems where enormous performance gains can be had by interpreting instead of compiling to machine level; some of the functional language stuff will need the eager evaluation model, others will need lazy evaluation, and others will put that in the hands of the programmer by having eager/lazy annotations. Despite that, it would be nice not to have to remember 37 different idioms for testing whether one number is smaller than another - wouldn't it be nice if that was always done using the symbol " < "? Also, different hardware may need different languages - I can't imagine trying to write a compiler to get C++ executed efficiently on a distributed graph reduction engine, for example. So to me the single perfect programming language is an amusing myth, not anything real in the foreseeable future.

    "Despite that, it would be nice not to have to remember 37 different idioms for testing whether one number is smaller than another - wouldn't it be nice if that was always done using the symbol " < "?" -- BINGO -- THAT is exactly what the original author was asking for -- and what we will never see!

  • TomThomson - Monday, August 7, 2017 12:06 PM

    (snip) Despite that, it would be nice not to have to remember 37 different idioms for testing whether one number is smaller than another - wouldn't it be nice if that was always done using the symbol " < "? Also different hardware may need different languages - I can't imagine trying to write a compiler to get C++ executed efficiently on a distributed graph reduction engine, for example. (snip)

    Comparing less than or greater than is usually fairly consistent... until one gets to a language where one has to use words instead (most recently, for me, PowerShell). Where I usually come unstuck at least once or twice when switching languages is the equivalence operators and the often barely comprehensible rules overlaid on them. This is made particularly troublesome in a language where pointers are hidden from the developer: sometimes the pointers are used in comparisons and other times they are not, sometimes objects are copied, and sometimes objects are passed by reference (known as pointers to everyone else).
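    A minimal C sketch of the kind of trap described above (the variable names are just for illustration): applying == to two char pointers compares the addresses, not the characters, so two strings with identical contents can still compare unequal.

    ```c
    #include <assert.h>
    #include <string.h>

    int main(void) {
        char a[] = "hello";
        char b[] = "hello";   /* same characters, but a separate array */

        /* == compares the two addresses, which differ here */
        assert((a == b) == 0);

        /* strcmp compares the contents, which are identical */
        assert(strcmp(a, b) == 0);

        return 0;
    }
    ```

    The same split (identity vs. value equality) shows up under other names in managed languages, e.g. reference equality vs. an overridden equals/== on objects.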

    One of the annoyances I frequently have with languages is their desire to hide the machine implementation so thoroughly that, for a developer who often needs to interact with hardware directly, working out the exact damn data size of a variable type is excessively annoying. Quite why the types in, for example, C (which is almost as low as you can get before assembly) aren't named int8, int16, int32, int64, uint8, uint16 and so on, I just can't fathom. I'm quite happy to accept that sometimes I just don't care about the size (within some constraints), in which case a type that is optimal for the target bitness of the processor is fine; on many other occasions, however, I really need to know for sure that when I refer to a uint8 I am always getting a uint8, and that it won't be compiled as a uint32 because that's more efficient for the processor to handle.
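    For what it's worth, C99's <stdint.h> did eventually add almost exactly these names (with a _t suffix): int8_t, uint8_t, int16_t and so on are required to be exactly that wide wherever they exist (they are optional on exotic hardware, but present on every mainstream platform). A quick sanity check, assuming a typical platform with 8-bit bytes:

    ```c
    #include <assert.h>
    #include <stdint.h>

    int main(void) {
        /* exact-width types from C99's <stdint.h> */
        assert(sizeof(int8_t)   == 1);
        assert(sizeof(uint8_t)  == 1);
        assert(sizeof(int16_t)  == 2);
        assert(sizeof(uint16_t) == 2);
        assert(sizeof(int32_t)  == 4);
        assert(sizeof(int64_t)  == 8);

        /* plain int, by contrast, only guarantees at least 16 bits */
        assert(sizeof(int) * 8 >= 16);

        return 0;
    }
    ```

    The header also offers int_fast8_t and friends for exactly the other case described above: "I don't care about the exact size, give me whatever is fastest for this processor."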
