November 16, 2010 at 3:35 pm
Programming right isn't difficult to learn; just the basics will bring you a long way. But becoming a good programmer, and also a good DBA, takes much more effort, time, and experience. Some things you won't read in a book; some things you just can't learn. Dealing with a crappy database backup that you really, really have to restore takes you somewhere, especially when it's still the weekend and you know that on Monday 200+ people will come into the office building and expect the program to be working.

I'm a programmer by trade and a part-time DBA. What scares me is programmers who know nothing about SQL Server's workings and internals. I don't expect them to have read Kalen Delaney, but I do expect them to know about indexes and the basics of execution plans.

And it's not just programmers or DBAs; the problem extends to architects. If they don't know the systems they are using, they will design crappy systems, attempt impossible things, or ask for screens pulling too much data from too many sources. Going for speed is a joint effort; there is no single point of failure.
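To sketch the kind of basics I mean (the table, column, and index names below are made up for illustration), compare what a query costs before and after a covering index:

SET STATISTICS IO ON;  -- report logical reads for each statement

SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE CustomerID = 42;  -- no index on CustomerID: the plan shows a scan

CREATE NONCLUSTERED INDEX IX_Orders_CustomerID
    ON dbo.Orders (CustomerID)
    INCLUDE (OrderID, OrderDate);  -- covering index: the same query becomes a seek

Rerun the SELECT after building the index; the execution plan and the STATISTICS IO numbers tell the whole story.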
Wim van den Brink
November 16, 2010 at 4:34 pm
I was once asked to produce a one-page management report that necessitated 52 separate, seriously complex queries.
Some of the totals, from different parts of the several databases involved, produced conflicting results.
Exceedingly long and politically fraught testing established that the business rules for several parts of this national company were in direct conflict.
The result was that the requirement for the report was dropped.
If Jeff's simple "do it once, do it right" had been applied at the inception of each of the various systems, this situation would never have been able to develop.
Maybe I have an odd perspective on what he means. I don't have an issue with an ugly UI for the first cut. I don't have an issue with "that's a bit slow, mark it for some serious analysis". I do have an issue with bad design up front. The jury is still out, from my perspective, on marketing insisting on smoke and mirrors to get the first cut live. You need some insanely good architects to get away with that.
Peter Edmunds ex-Geek
November 16, 2010 at 5:02 pm
I didn't mean to imply you write bug-free code the first time you sit down. I don't think I ever said it would be perfect, nor do I think Jeff implies that.
The idea of doing it right the first time is that once you know a good way to do something, you use those skills to write better code. You spend a few minutes thinking about things rather than just coding in the most convenient way. You'll still have to test and debug, but you aim for the best practices, best techniques, and best methods you know of or have tried to learn about. You keep expanding your toolbox, so your "best" grows and improves all the time.
November 18, 2010 at 2:08 am
"Even though many of those constraints don't apply, I still try to be careful and watch my data types, watch my round trips, and be aware that there are actually bits being whipped across the ether or in and out of transistors in a not-so-little package on my computer system."
Even though constraints like that don't apply, I artificially create them. For example, in the debug version of my client-server code I add a sleep(1000), a one-second pause, each time the server (which is on the same machine) is called. After waiting that long on enough operations, my natural inclination is to cut down on those server calls. As a result, my release version goes like a rocket even over the internet, with its high latency.
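A T-SQL flavour of the same trick, as a rough sketch (the procedure and table names are hypothetical): in the debug build of the data layer, pad every call with an artificial wait so chatty callers feel each round trip.

-- Debug-only version: one second of fake latency per call
CREATE PROCEDURE dbo.GetCustomer_Debug
    @CustomerID int
AS
BEGIN
    WAITFOR DELAY '00:00:01';  -- simulate a slow network on a local server
    SELECT CustomerID, CustomerName
    FROM dbo.Customers
    WHERE CustomerID = @CustomerID;
END;

Point the debug client at the _Debug procedures and the urge to batch up round trips arrives all by itself.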
November 19, 2010 at 11:51 am
This talk about code efficiency reminds me of a "war story" from college (in the mid-1980s).
The college had a couple of mainframe VAX/VMS computers, with terminals in several different locations on campus. System accounts were assigned according to department and course number, and one of the features of the VAX was a "show users" command that let you see the other accounts currently online.
There was an ongoing feud of sorts between the computer science students and the petroleum engineering (abbr. Pet) students. The Pet majors had an upper-division course that involved a bunch of FORTRAN programming (for doing drilling simulations, IIRC). Their programs involved huge amounts of data, but all they were taught was how to put the code together and make it run. They weren't taught anything at all about how to make the programs run efficiently, which meant that their programs consumed hideous amounts of CPU time.
This was a source of constant annoyance to the CS students whose class accounts were on the same VAX as the Pet majors. The CS students, of course, were taught the principles of program efficiency, but they kept finding the VAX to be extremely slow if the Pet majors were on it in any great numbers. It got to the point where if a CS student needing to get some work done logged into the VAX, did a "show users," and found 15 or so Pet majors on the system, he might as well bag it and try again later (or even the next day), because with that many Pet majors siphoning system resources, getting work done just wasn't going to happen. Unsurprisingly, the CS majors got extremely peeved about paying the price for someone else's sloppy programming.
Needless to say, this sparked a lot of flame wars on the VAX's bulletin board, with the CS majors telling the Pet majors to "learn to program or get off the VAX" (to put it nicely) and signing the posts with lines like "A CS student who fell asleep waiting for his code to compile!"
The situation never changed while I was there, but it would have been interesting to have been a fly on the wall if the CS profs had ever complained to the Pet profs about it. Ahhh, memories ... :-)
November 19, 2010 at 4:29 pm
Geoffrey Wrigg (11/19/2010)
This talk about code efficiency reminds me of a "war story" from college (in the mid-1980s). The college had a couple of mainframe VAX/VMS computers, with terminals in several different locations on campus. System accounts were assigned according to department and course number, and one of the features of the VAX was a "show users" command that let you see the other accounts currently online.
There was an ongoing feud of sorts between the computer science students and the petroleum engineering (abbr. Pet) students. The Pet majors had an upper-division course that involved a bunch of FORTRAN programming (for doing drilling simulations, IIRC). Their programs involved huge amounts of data, but all they were taught was how to put the code together and make it run. They weren't taught anything at all about how to make the programs run efficiently, which meant that their programs consumed hideous amounts of CPU time.
This was a source of constant annoyance to the CS students whose class accounts were on the same VAX as the Pet majors. The CS students, of course, were taught the principles of program efficiency, but they kept finding the VAX to be extremely slow if the Pet majors were on it in any great numbers. It got to the point where if a CS student needing to get some work done logged into the VAX, did a "show users," and found 15 or so Pet majors on the system, he might as well bag it and try again later (or even the next day), because with that many Pet majors siphoning system resources, getting work done just wasn't going to happen. Unsurprisingly, the CS majors got extremely peeved about paying the price for someone else's sloppy programming.
Needless to say, this sparked a lot of flame wars on the VAX's bulletin board, with the CS majors telling the Pet majors to "learn to program or get off the VAX" (to put it nicely) and signing the posts with lines like "A CS student who fell asleep waiting for his code to compile!"
The situation never changed while I was there, but it would have been interesting to have been a fly on the wall if the CS profs had ever complained to the Pet profs about it. Ahhh, memories ... :-)
The CS students were apparently not competent enough to use the following commands to either kill the Pet major jobs, or to set their priority so low they couldn't bother anyone. It would require them to gain sufficient rights to do it, but that's just part of the challenge.
STOP/IDENTIFICATION=pid                        ! kill the process outright
SET PROCESS /IDENTIFICATION=pid /PRIORITY=2    ! or starve it of CPU instead
When I worked on an overloaded VAX, the first thing I did every morning was set my own process priority above the default level of 4 so that I didn't have to suffer like the mass of users. :-)
November 19, 2010 at 5:21 pm
Agreed! Though I was working on a few dozen Alpha timeshare machines for a few years, and we only used that measure in emergencies :-)
The issue of non-programmers (like the Pet majors) doing programming is an interesting one. I don't mean hobbyist programmers (whose quality varies plenty itself), but people who come in from side professions and are just trying to get a job done.
I have a friend who is a mechanical engineer, and he had to take over a role setting up A/V while the company looked for a replacement. Specifically, he had to write some code to run on a server and a few dozen clients to detect a remote-control signal from a doohickey he made for a serial port and adjust media player volume.
There was some problem, so I looked over the code. It was likely one of the worst (immoral, unnatural) things I'll ever see. And it's now in production. But it works, and I couldn't have done it, so kudos to him.
My long and meandering point is that it's hard for me to hold something against non-programmers, and the "right to program" almost seems like one of the rights we should be born with, even if the results are bad. We're not all going to be race-car drivers, but we're still allowed to drive normal cars, right?
November 22, 2010 at 8:51 am
The CS students were apparently not competent enough to use the following commands to either kill the Pet major jobs, or to set their priority so low they couldn't bother anyone. It would require them to gain sufficient rights to do it, but that's just part of the challenge.
STOP/IDENTIFICATION=pid                        ! kill the process outright
SET PROCESS /IDENTIFICATION=pid /PRIORITY=2    ! or starve it of CPU instead
When I worked on an overloaded VAX, the first thing I did every morning was set my own process priority above the default level of 4 so that I didn't have to suffer like the mass of users. :-)
The CS student accounts didn't have the right to screw with priority settings ... not officially, anyway. One of my buddies found a loophole and exploited it to set his account priority to 9. By way of comparison, student accounts were at 3 or 4, depending on class level; instructors were at either 5 or 6 (I forget which). Anyway, with his account priority juiced up, my friend was doing stuff like compiling huge amounts of code while the ill-taught Pet majors were struggling to log on. He had a great time until the SysAdmin caught him.
In what was apparently a plea-bargain-style arrangement, in exchange for telling the admins how he'd changed the priority, the only punishment he got was having his main account "demoted" to the slower VAX. And yes, after that the admins closed the loophole. Such was life in VAX-land. :-)
November 22, 2010 at 6:38 pm
I think doing it correctly the first time is better than whipping it together incorrectly. There may still be a bug, or an assumption or business requirement that turns out to be wrong and requires rework. But whipping something together as a band-aid is more costly in the end than doing it correctly the first time around, once you factor in the rework and research.
As Gus said, there are tradeoffs to be weighed, and knowing where that point lies is essential to effective coding.
Jason...AKA CirqueDeSQLeil
_______________________________________________
I have given a name to my pain...MCM SQL Server, MVP
SQL RNNR
Posting Performance Based Questions - Gail Shaw
Learn Extended Events