The human programmer
Life was hard for opinionated intellectuals before the internet. Getting published at all was an ordeal, even for the world’s greatest programmer. He got to consider things harmful once in a while, but he didn’t get to post a dispatch every week on any punch-card version of WordPress.
The only thing 1970s intellectuals could do to foist their opinions on the world was to give long speeches. People would transcribe these, and much later they would be put on the world wide web and periodically, reverently linked to. One of these exhumed speeches is about cargo cult science, which Dive into Mark thought relevant to Apple and open source. The metaphor was bent into service in a post titled “When the bough breaks,” though neither metaphor was ever meant to refer to the other.
The thing about these speeches is that they are deadly boring. They were prepared for captive, worshipful audiences. Not every word counted; apparently, not even every paragraph did. Most people pressing the Digg button for them can’t have gotten much farther than the esteemed byline and dateline. But regarded as cultural artifacts, they’re fascinating.
One that’s been around the block lately is Dijkstra’s speech, “The Humble Programmer.” Coderspiel felt obligated to slog through it, given its contrast with our “glory of software programming” dedication. So here we are, reporting from 1972.
Running in high heels considered silly
Dijkstra’s coming-of-age story is compelling. He became a programmer before it was a recognized profession, simply because he wanted to. Nowadays, there’s really no such thing as a recognized profession. Or perhaps anything you can dream up immediately becomes a recognized profession. But in the old days the marriage registry would only accept a recognized profession, so Dijkstra had to settle for calling himself a theoretical physicist. And so programming as a job emerged slowly and without much respect.
We’re summarizing by about one order of magnitude, but this is still boring. So let’s skip down to some trash talking. There were these horrible “third generation” computers back then, designed so badly that they retarded the whole field:
But the design embodied such serious flaws that I felt that with a single stroke the progress of computing science had been retarded by at least ten years: it was then that I had the blackest week in the whole of my professional life. … still so many people honestly believe that some law of nature tells us that machines have to be that way. They silence their doubts by observing how many of these machines have been sold, and derive from that observation the false sense of security that, after all, the design cannot have been that bad. But upon closer inspection, that line of defense has the same convincing strength as the argument that cigarette smoking must be healthy because so many people do it.
Sweet. Dude, where is ur blog? Oh right, didn’t have them then. But this quote finds its echo, surprisingly enough, in something posted just last week about how Groovy is as slow as an abacus:
Subj: 400 billion flies cannot be wrong
Most probably you will soon hear “If groovy were so slow, Fortune 500 companies would not use it - so it cannot be so slow.”
I have already given up on the subject.
Ha ha! Boom. Naturally, the reply message dutifully says that, yes, it does work in somebody’s big company without turning the hardware to a pile of smoldering plastic, so there! It’s all about the “context,” he says in capital letters. Hey, I get it. I’m going to write a script that is for some reason ideally suited to running in high heels. Something that demands dynamic typing, and also demands being run on a virtual machine that is horrible at dynamic typing. Let’s go!
The software chopportunity
This Dijkstra speech is really about “the software crisis,” and how to solve it. As machines became faster, humanity’s appetite for software only increased and programmers were increasingly failing to satisfy it. Dijkstra concludes (spoiler alert!—but you should be grateful) that programmers must take a modest view of their own abilities to avoid drowning themselves in software complexity.
This isn’t exactly what we expected, as it’s a personal application of modesty and it doesn’t conflict with programming being glorious. We could code modestly and still be carried down the streets by thankful masses who just have to have their Facebook, et cetera. Still, “glorious” is starting to wear thin and we’ll probably change it anyway. For the love of programming? Well, something like that.
But whatever happened to that software crisis? Dijkstra was particularly taking aim at the programming language PL/1, which helped the U.S. land on the moon but was at least as complicated as a lunar lander. Dijkstra, being the killer app that he was, was able to bury PL/1 in a trench somewhere off the coast of Madagascar. Near the gulf of goto.
But, surprise! We’re still having a software crisis. The man knew we would be, of course. The point of any speech is to motivate people, using white lies as needed. And who knows where we’d be if PL/1 hadn’t been completely destroyed. This could be a computing dark age, and it isn’t. But The New York Times tells us that since we now have “interconnected and interdependent networks” (hmm) we’re in real big trouble:
problems arising from flawed systems, increasingly complex networks and even technology headaches from corporate mergers can make computer systems less reliable. Meanwhile, society as a whole is growing ever more dependent on computers and computer networks, as automated controls become the norm for air traffic, pipelines, dams, the electrical grid and more.
Striving to be timely, the article casts the problem as one heavily aggravated by modern networks. Computers themselves have, of course, been involved with important (and often military) business as long as they’ve been around. And they can’t help being complex. (The demands always are.) Networks have been connecting them for almost as long, wreaking havoc while being enormously useful. Reporters might not have had much interaction with them until the mainstream internet, but networks have always been out there threatening to malfunction and blow us up with H-bombs.
Here’s a theory of software quality for you: software must be nurtured. The existence of bugs isn’t mysterious to any honest programmer. They are the product of neglect. Finding a bug in one’s code isn’t so much a surprise as a feeling of déjà vu. Ohhhh yesssss, I remember thinking I should check that condition. Programmers have complete control over the quality of their code and, when working on code they care about, tend to produce things that work. The secret is to care for the programmers, so that they take good care of the software.
Large organizations generally drive their software bees into combat formations, rather than nurturing colonies. Standardization is the mantra. Everyone will use one language, one set of libraries, and three ostentatious software “patterns.” In such forced conditions the feeling of code ownership vanishes and quality goes out with it. Weeks of planned system testing only reinforce the idea that bad code is expected and responsibility doesn’t exist. The late, unreliable, and inefficient software that is produced is everyone’s baby and no one’s, raised in a dysfunctional orphanage.
The alternative is obvious: break systems down into their smallest practical units, and let small groups of programmers determine implementations from top to bottom. Necessary interfaces between modules are hammered out according to the best available standards; unnecessary interfaces are not built. Coders get to work in their favorite technology, or learn a new one.
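To make that concrete, here is a minimal sketch, in Python purely for illustration, of what one of those necessary interfaces might amount to. The “inventory” module, its /stock path, and its fields are all invented for the example; the point is that the boundary between teams is a plain standard (HTTP and JSON here), not a shared framework.

# Hypothetical sketch: one small team owns the "inventory" module end to end.
# The only thing other teams ever see is this HTTP/JSON contract; how it is
# implemented (language, framework, storage) is the owning team's business.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Toy in-memory data standing in for whatever storage the team prefers.
STOCK = {"widget": 12, "gadget": 3}

class InventoryHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The agreed interface: GET /stock/<item> -> {"item": ..., "count": ...}
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "stock" and parts[1] in STOCK:
            body = json.dumps({"item": parts[1], "count": STOCK[parts[1]]})
            self.send_response(200)
        else:
            body = json.dumps({"error": "unknown item"})
            self.send_response(404)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body.encode("utf-8"))

if __name__ == "__main__":
    # Consumers only care that the contract above is honored at this address.
    HTTPServer(("localhost", 8080), InventoryHandler).serve_forever()

A consumer written in any other stack just requests http://localhost:8080/stock/widget and parses the JSON; the Django enthusiasts and the Struts die-hards only have to meet at the contract.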
There are obvious downsides to that approach, the first out of everyone’s mouth being that the code will be impossible to maintain because no one knows anything but Struts 1.0. Let’s imagine a portion of a system is built in Django, because there’s a lot of excitement about it these days. The programmers nail it, and build the software twice as well and in half the time that they would have in a forced Struts march. Not just because Django makes the work easier, but because they are motivated. Then they go out to celebrate and are poached by a rival corporation. Oh my God, how will you ever do phase two?!
Craigslist. Python programmers are out there, looking for a job where they can actually use their favorite language. They probably aren’t even used to being paid as much as corporate Java programmers. So, that could be good. Incidentally, Dijkstra had some thoughts on salaries:
in the coming years programmer salaries may be expected to go down. … perhaps the programmers of the past decade have not done so good a job as they should have done. Society is getting dissatisfied with the performance of programmers and of their products. … the price to be paid for the development of the software is of the same order of magnitude as the price of the hardware needed, and society more or less accepts that. But hardware manufacturers tell us that in the next decade hardware prices can be expected to drop with a factor of ten. … You cannot expect society to accept this, and therefore we must learn to program an order of magnitude more effectively.
Oops. His braininess just fell into a common pricing fallacy. It isn’t about what the buyer (in this case the employer) is willing to pay; it’s about what price the sellers (programmers) are willing to accept. Perversely, dissatisfaction with software produced in large organizations tends to increase salaries. The more programmers are disliked for failed software the more poorly they are treated, and the more poorly they are treated the more it costs to keep their seats filled. If only those seats didn’t have to be filled! Yeah. If only you didn’t need that software. And so it pays more to write bad software for large organizations than good software for small ones, if you can put up with constant, thinly veiled contempt. And regimented work conditions. And regular failure.
This is not a crisis—it is a condition. It’s brought on by lack of understanding between programmers and everyone else. We won’t make the mistake of predicting that it will be healed this decade, but we can at least say that the outlines of the problem are becoming clearer. It isn’t software in general that is so intolerably bad; it’s software in large organizations. As raganwald points out, that’s becoming more obvious:
As if that wasn’t enough, the really bad news is, when our users go home they have this thing called the Internet. … You see, the users get exposed to other ways of doing things, ways that are more convenient for users, ways that make them more productive, and they incorrectly think we ought to do things that way for them.
Again, let’s recognize that resentment by itself only makes the problem worse. You can fire an entire IT department and within five years the replacements will be smothered, and lulled, into the same state of malfunction. But as it becomes clear that the chaos of the internet runs circles around the lockstep of corporate methods, someone is bound to take the risk of modeling systems after the internet, allowing programmers to act independently, and internal competition to thrive. And when that works, it will spread. One thing those battleships do quickly is imitate any hint of success in their competition. Legitimate enterprise competition, of course.
Glory, glory, hallelujah
We didn’t find what we were looking for about professional attitudes from Dijkstra. He was too concerned, naturally, with the relationship between programmer and computer. We’ll have to ask Humphrey Cobbler instead. He’s a fictional character in Robertson Davies’s The Salterton Trilogy. Cobbler is an organist, an absurdly enthusiastic musician, and a constant scandal in society for his refusal to conform. And this is what he says about his work:
That’s what’s wrong with my job, too, you know. Too much talk about the nobility of it, and how the public ought to get down on its knees before the artist simply because he has the infernal gall to say that he is an artist, and not enough honest admission that he does what he does because that is the way he is made. [p. 365]
Since the 1954 publication of those words musicians have been so successful in rebranding themselves as “artists” that, if for some reason you’re listening to the most recent Christina Aguilera album, you can hear one of her fans pledging in a non-musical interlude that Christina is absolutely the best artist ever. Yikes.
Maybe programmers are just like the 1950s musicians who lacked Cobbler’s confidence, snatching desperately at public nobility. Thus far our attempts at title theft have been less successful than theirs, though not for lack of trying (e.g. software developer / architect / engineer). Like a musician’s, our work requires not only talent but years of practice, and we see ourselves as “different.” The average person cannot walk up to a piano, or a computer keyboard, and produce anything of value. Writers? Throw them in here too, certainly. Are reporters not a little too serious about being called “journalists”? Of course, this line of thought would annoy both of those professions, both being old, established, and respected compared to programming. How pathetic.
Creation is a wonderful thing, without a doubt. But there’s no point in trying to explain why, or how, one way of doing it is so wonderful; inevitably it makes you ridiculous. (We’ll be changing that Coderspiel dedication!) The best homage you can make to the work is simply to do it. Make something for yourself, and make something for other people. If it gives them pleasure, you just might be recognized for it.
But not as much as Christina Aguilera. Running, in high heels. While giving a speech. Bye!
Codercomments
Nice rant!
This was a rant? It was meant to be a calm rumination on the professional and personal culture of software programming. Oh well! That is what I get for all the coffee.
Wonderful essay. And thanks for the link!
“The best homage you can make to the work is simply to do it.”
I think you mean “do it well” :P
“We didn’t find what we were looking for about professional attitudes from Dijkstra.”
I don’t understand this part. What’s the definition of “professional attitudes”?
You would want to do programming well to make a good homage to it; I think that’s implied. What I wanted to emphasize is doing the job vs. describing it (as development, or engineering, or whatever comes next).
As for professional attitudes, you’re right that it’s not clear as written. I was talking about how programmers relate to the job rather than how they relate to the code (managing complexity, etc.). Dijkstra was talking about a different kind of “humility” than I expected.
http://www.cs.utexas.edu/users/EWD/transcriptions/EWD04xx/EWD498.html
Check that out. It sheds a bit of light on professional attitudes I think. A like for math, writing and the initiative to learn?
Please give me back my 30 mts.
Read faster.
“But as it becomes clear that the chaos of the internet runs circles around the lockstep of corporate methods, someone is bound to take the risk of modeling systems after the internet, allowing programmers to act independently, and internal competition to thrive.”
Huh. Would you call someone “Google”? Although you can’t work in any language, you do get your choice of a few, and Python isn’t half bad.
I’m talking about companies that use the phrase “for the enterprise” without irony. Whatever Google does, it is assumed to be uniquely possible because of their revenue and brains. But if a mobile phone operator, for example, ever shows marked improvement by implementing internal systems using internet methods, that might trigger changes in their competition and across the landscape of mediocre programming.
loved it!