January 12, 2012 at 1:57 pm
No Agile, back then. 😉
January 12, 2012 at 2:14 pm
Heh, New Guy on the block I guess.
Ctrl-F5 is my friend. Where'd I typo?
I want to worry about and keep in my head the object model, the inter-communication and the algorithmic intents. I don't want to worry so much about whether I typed INT or ITN. That's what the compiler and checkers are THERE for.
I understand what you're saying, I just disagree horribly. My brain has x space that I can juggle a lot of things in. Do I want to typo? No. Do I want to spend 15 minutes manually debugging syntax and typos when in 10 seconds the machine will do it for me and I *don't* have to try to get all that algorithmic methodology back into my head? No. I'd rather spend the mental energy on the method and let the machine let me know if I goofed up the instruction. The output will let me know (and that's the part *I* have to debug) if I goofed up the method.
Never stop learning, even if it hurts. Ego bruises are practically mandatory as you learn unless you've never risked enough to make a mistake.
For better assistance in answering your questions | Forum Netiquette
For index/tuning help, follow these directions. | Tally Tables
Twitter: @AnyWayDBA
January 12, 2012 at 2:30 pm
Revenant (1/12/2012)
No Agile, back then. 😉
Actually, if you read "The Mythical Man-Month" by Fred P. Brooks you will see that one of his peers was recommending the progenitor of agile back in the 1960s.
The Mythical Man-Month was written in 1975 and drew on Brooks's 25 years in the computer business. I was amused by one of the Amazon reviews of it that said it "taught me nothing new". Of course not - a lot of what it proposed became industry standard practice.
January 12, 2012 at 2:42 pm
Evil Kraig F (1/12/2012)
Heh, New Guy on the block I guess. Ctrl-F5 is my friend. Where'd I typo?
I want to worry about and keep in my head the object model, the inter-communication and the algorithmic intents. I don't want to worry so much about whether I typed INT or ITN. That's what the compiler and checkers are THERE for.
I understand what you're saying, I just disagree horribly. My brain has x space that I can juggle a lot of things in. Do I want to typo? No. Do I want to spend 15 minutes manually debugging syntax and typos when in 10 seconds the machine will do it for me and I *don't* have to try to get all that algorithmic methodology back into my head? No. I'd rather spend the mental energy on the method and let the machine let me know if I goofed up the instruction. The output will let me know (and that's the part *I* have to debug) if I goofed up the method.
This method ranks rite up their with using spell check and auto-correct to insure that you right in proper anguish.
Syntactically valid and logically consistent are sometimes very different things, and "It worked on my machine" is never something I want to try and explain to senior management in the heat of a crisis.
Now that I've gotten that out of my system... I do agree that lint tools are a good idea, and having them integrated into the IDE is even better, but every time I see a coder search through the auto-complete list for a function whose name sounds like it does X, I cringe. I've seen too many caveats in API docs about memory leaks that aren't obvious.
January 12, 2012 at 3:19 pm
Someone posted a comment on this site about not using a hammer to cut boards.
The right tool at the right time used in the right way delivers the goods.
Given the richness of most APIs I would not want to go back to wading through a tome like "Programming the Windows API" in order to find a function. Auto-complete plus context sensitive F1 is my friend.
Having got SQLPrompt set up just so I feel like my hands have been chopped off when I hot desk to a machine that doesn't have it.
For the most part I feel the same way about SQL Refactor. I personally wouldn't format my code the way that SQL Refactor does, but that isn't the point. A unified way of formatting code is very useful when reading other people's code, irrespective of my personal preferences.
I've said it before: when I taught myself C++ (using Ivor Horton's books) I became a much better VB programmer. It wasn't really a skill thing; it was a mentality thing. Which brings me back to the original point of the editorial: how do you foster a mentality that prizes quality and thought?
I'm a great fan of cycling people through the IT disciplines, including a stint of being on-call in operational support. A few 3am phone calls with a live priority-one incident change your perspective on what constitutes quality, and on why code can never be considered self-documenting. Walk a mile in another man's shoes and all that!
January 12, 2012 at 6:41 pm
As primarily a C# coder, I emphatically disagree with this proposition.
In fact, being able to compile frequently (and I often do it every couple of minutes) allows me to focus fully on the architecture and getting the important details right. The 'fluff' is then picked up by the compiler as errors and I can use it as a brainless 'to do' list.
The fact is, it's entirely possible (and usual, some might say) to write compilable but architecturally shocking code.
Also, the really tricky bugs often arise as a result of subtle procedural interactions which are far from the syntax or grammar of the language.
Compiling is the equivalent of spell and grammar checking a word document.
Keep in mind that my first programs were written on paper before being typed into an Apple II.
January 12, 2012 at 7:32 pm
ben.mcintyre (1/12/2012)
As primarily a C# coder, I emphatically disagree with this proposition. In fact, being able to compile frequently (and I often do it every couple of minutes) allows me to focus fully on the architecture and getting the important details right. The 'fluff' is then picked up by the compiler as errors and I can use it as a brainless 'to do' list.
The fact is, it's entirely possible (and usual, some might say) to write compilable but architecturally shocking code.
Also, the really tricky bugs often arise as a result of subtle procedural interactions which are far from the syntax or grammar of the language.
Compiling is the equivalent of spell and grammar checking a word document.
Keep in mind that my first programs were written on paper before being typed into an Apple II.
The word "architecture" may have a different ring on this forum, which deals with SQL Server, than on a forum that deals with, say, the physical representation of a UML domain model.
What exactly do you mean by "architecture" when you are scripting SQL Server (22 million lines of code) running on top of Win2k8 Server (55 million lines of code), within the (perhaps unstated) confines of both?
January 12, 2012 at 7:45 pm
Revenant (1/12/2012)
ben.mcintyre (1/12/2012)
As primarily a C# coder, I emphatically disagree with this proposition. In fact, being able to compile frequently (and I often do it every couple of minutes) allows me to focus fully on the architecture and getting the important details right. The 'fluff' is then picked up by the compiler as errors and I can use it as a brainless 'to do' list.
The fact is, it's entirely possible (and usual, some might say) to write compilable but architecturally shocking code.
Also, the really tricky bugs often arise as a result of subtle procedural interactions which are far from the syntax or grammar of the language.
Compiling is the equivalent of spell and grammar checking a word document.
Keep in mind that my first programs were written on paper before being typed into an Apple II.
The word "architecture" may have a different ring on this forum, which deals with SQL Server, than on a forum that deals with, say, the physical representation of a UML domain model.
What exactly do you mean by "architecture" when you are scripting SQL Server (22 million lines of code) running on top of Win2k8 Server (55 million lines of code), within the (perhaps unstated) confines of both?
The OP did specifically mention Turbo Pascal and other non SQL languages.
SQL not being an object oriented language, the architecture is less clear from the code structure. But I should clarify that I'm speaking of software architecture, not hardware/tiering.
So at the crudest, the equivalent in SQL would be correct normalisation of data (perhaps with judicious denormalisation), and indexing of primary keys, etc.
Of course there is usually an 'X' factor with software architecture once you get into the more complex realms; it usually involves having been there before and knowing the consequences of decisions eight steps down the track rather than at first sight.
My point is, I suppose, that you could write a script to create an improperly normalised data structure without helpful indexing, and it would execute correctly.
The true challenges are not at the level of spelling or grammar.
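That point can be sketched concretely. The snippet below is a hypothetical illustration (SQLite via Python, for brevity; the table and column names are invented, not from this thread): both the badly designed schema and the normalised one "compile" and run without a single error, so successful execution tells you nothing about the quality of the design.

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Improperly normalised: customer details repeated on every order row,
# no keys, no helpful indexing. It still runs without error.
con.execute("""
    CREATE TABLE orders_flat (
        order_no     TEXT,
        customer     TEXT,
        cust_address TEXT,   -- repeated for every order this customer places
        product      TEXT
    )
""")

# A normalised alternative: customer details held once, and orders
# reference them through a declared primary key.
con.execute("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL,
        address     TEXT
    )
""")
con.execute("""
    CREATE TABLE orders (
        order_no    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
        product     TEXT NOT NULL
    )
""")

print("both schemas created without error")
```

The syntax checker is equally happy with both; only a human looking at the design can tell which one will hurt at 3am.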
January 13, 2012 at 1:00 am
As a developer who started programming on home computers (a Vic 20 in my case), I missed the whole compile-a-day era but was taught using non-IDE compilers. That did allow using the compiler to statically check your code - which I see as a good thing - but the Edit/Compile/Run cycle being longer than it is today ensured that the cost of writing gibberish was too high. Unfortunately, the problem with some coders today is that they will not THINK about what they are producing in terms of what it is doing and how maintainable it will be.
Strangely, too many people are viewing their own code as black boxes in which they only care about the answer and not how they got it. "Wait a minute!" I hear you cry "Does it really matter?". Well, yes. If you run it and it works for one scenario because it has been "jiggled" (as one poster said) how confident are we that it will work under most, if not all, scenarios?
At college, a girl called Amanda once completed an exercise in learning to program in Ada (in the section on iteration), which was to print the alphabet backwards. She achieved it with the following code:
with Ada.Text_IO; use Ada.Text_IO;
procedure ReverseAlphabet is
begin
Put_Line("zyxwvutsrqponmlkjihgfedcba");
end ReverseAlphabet;
The result was correct but she had missed the point entirely (or maybe she hadn't and was struggling).
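For comparison, a sketch of what the exercise presumably intended - deriving the reversed alphabet by iterating rather than hard-coding the string (Python here for brevity; the original exercise was in Ada, and the function name is mine):

```python
import string

def reverse_alphabet() -> str:
    # Walk the lowercase letters from 'z' back down to 'a',
    # instead of typing the answer in as a literal.
    return "".join(reversed(string.ascii_lowercase))

print(reverse_alphabet())  # zyxwvutsrqponmlkjihgfedcba
```

Both versions produce identical output, which is exactly why output alone can't tell you whether the author understood iteration.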
Is jiggling until you get the right answer good enough for a professional software developer? I say no.
Gaz
-- Stop your grinnin' and drop your linen...they're everywhere!!!
January 13, 2012 at 7:53 am
At college, a girl called Amanda once completed an exercise in learning to program in Ada (in the section on iteration), which was to print the alphabet backwards.
I love that bit of code.
I know it was probably meant to start with the alphabet then rearrange it backward.
I'd like to see what the assignment actually said though, because if it WAS to print the alphabet backward, then it does EXACTLY what it is supposed to do.
It rather reminds me of an assignment, set by Edison (I think), to find the volume of a light bulb. After many measurements and calculations, he came back, filled it with water and poured it into a measuring cup.
As far as coders today not thinking about what they are producing, one of my first tasks was to debug a massive RPG program where the program flow consisted entirely of GOTO statements and this was production code that had been in place for quite a long time.
Sure, if you were smart you would put a lot of thought and time into your program if you had a single compile a day, but that wouldn't necessarily mean it was better. It would just have a better chance of compiling the first time.
January 13, 2012 at 9:14 am
ben.mcintyre (1/12/2012)
As primarily a C# coder, I emphatically disagree with this proposition. In fact, being able to compile frequently (and I often do it every couple of minutes) allows me to focus fully on the architecture and getting the important details right. The 'fluff' is then picked up by the compiler as errors and I can use it as a brainless 'to do' list.
The fact is, it's entirely possible (and usual, some might say) to write compilable but architecturally shocking code.
Also, the really tricky bugs often arise as a result of subtle procedural interactions which are far from the syntax or grammar of the language.
Compiling is the equivalent of spell and grammar checking a word document.
I think you might be the exception rather than the rule. I would like to think that the compiler is just a "to do" list for many people, but I think many of them write code, compile it, and hope it works rather than thinking things through.
January 13, 2012 at 12:00 pm
Steve Jones - SSC Editor (1/13/2012)
ben.mcintyre (1/12/2012)
As primarily a C# coder, I emphatically disagree with this proposition. In fact, being able to compile frequently (and I often do it every couple of minutes) allows me to focus fully on the architecture and getting the important details right. The 'fluff' is then picked up by the compiler as errors and I can use it as a brainless 'to do' list.
The fact is, it's entirely possible (and usual, some might say) to write compilable but architecturally shocking code.
Also, the really tricky bugs often arise as a result of subtle procedural interactions which are far from the syntax or grammar of the language.
Compiling is the equivalent of spell and grammar checking a word document.
I think you might be the exception rather than the rule. I would like to think that the compiler is just a "to do" list for many people, but I think many of them write code, compile it, and hope it works rather than thinking things through.
When coding SQL, the developer receives positive reinforcement, because thinking things through in the design results in a significantly shorter compile + run time.
"Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho
May 2, 2012 at 7:27 am
I'm inclined to agree with Craig: if some tool will remember things for me so that I have more room in my head for stuff that needs me, rather than a tool, to do it then that tool (unless it also has some nasty habits) is going to be welcome. Also of course I agree with Gus - some people will abuse the ability to have a tool check syntax for them and pay no attention to what they are doing because they think the tool will spot all the errors - it's essential to keep track of the things that the tool can't spot the errors in (like the semantics of the code, as opposed to its syntax).
The first computing I did was with paper tape and maybe two runs per day; so after a couple of days writing Fortran I produced very few syntax errors, because they took too long to correct. Later I was in a world where syntax-checker software was common, but it was still batch even though it used fewer resources than running the compiler. Still later I had interactive syntax checkers available, using a teletype or a VTU to access a remote interactive system, and that changed the world - I could now put more software together more quickly and much more economically (Revenant put his finger on the economic aspect), and that of course carried through into the desktop computer world.
I've worked with many different languages, so that I know several different syntaxes for describing the same passive structure; for example should I use
(int X, int Y, float Z)
or
(X int, Y int, Z float)
? Should I care and make sure I type it right first time, or just let the syntax check (usually done by a compiler) pick it up if I type it wrong? If I had dealt only with reasonably sane and sound languages I might think it sensible to do it myself, but having dealt with several real abominations (of which the worst was C++) I have no wish to waste brain space on the nonsense that some language designers have devised, so now I let the system find things for me. Nor do I want to restrict my typing speed by watching the keys I strike to make sure there are no adjacent-key transpositions, no double key hits, no mishits: if I'm typing text a spell checker can spot those for me, and if I'm typing in a programming language the compiler can spot them.
The other problem is that whatever modern system designers shave with, it clearly isn't Occam's Razor. The number of libraries of functions, procedures, subroutines, classes, datatypes, modules and macros that a typical language system now provides is such that it would, in my view, be foolish to try to remember them all (although perhaps some people use only one language, and if it's a parsimonious one they can maybe afford to learn all that junk). My own way of dealing with this is to keep reference books open in a spare window, plus allow autocompletion to make suggestions for me in systems where that makes sense (not in SSMS: I hate autocompletion as implemented there); some people I know allow autocompletion to make the decisions, but that in my view is insanity; other people type something and let the compiler tell them if it's invalid, which is OK as long as there aren't pairs of different but similarly named objects with compatible parametrisation.
So, I think the plus side of being able to let the machine do some of the grunt-work for us greatly outweighs the minus side; but some people do misuse it.
Tom
May 2, 2012 at 9:42 am
L' Eomot Inversé (5/2/2012)
. . . So, I think the plus side of being able to let the machine do some of the grunt-work for us greatly outweighs the minus side; . . .
Yeah, I have always wished for a programming language that would have only two statements: DO and UNDO. The compiler should figure out the rest from the context.
😀
May 2, 2012 at 10:08 am
Revenant (5/2/2012)
L' Eomot Inversé (5/2/2012)
. . . So, I think the plus side of being able to let the machine do some of the grunt-work for us greatly outweighs the minus side; . . . Yeah, I have always wished for a programming language that would have only two statements: DO and UNDO. The compiler should figure out the rest from the context.
😀
Perl can do a lot with one command.
"Do not seek to follow in the footsteps of the wise. Instead, seek what they sought." - Matsuo Basho