Doddery Fodder: Better late than never

by Frank Wales

I'm sorry I couldn't write sooner, but I've been too busy trying to Get things Done to get anything done.

What's that you say? Why did I capitalize 'G' and 'D', but not 't' in "Get things Done"? It's funny you should ask me that. You see, there's this new way of working that's been spreading through geek circles faster than the combined fears of avian flu and broken iPod screens: 'Getting things Done'. It's a way of organizing all the projects in your life that's been formulated by David Allen, and isn't remotely like a cult (although I, for one, welcome our Next Action Overlord).

Alpha-geeks and the plain text fetish

Now, I'm sufficiently bad at organizing myself that I require professional psychiatric help, so I'm a sucker for anything that might boost my personal productivity. Oddly enough, GtD (as it's known among aficionados) actually seems to work for me, and does so without the need for expensive paper supplies or fancy software packages.

Over the last year or so, many web sites and mailing lists have arisen to feed the GtD frenzy, of which I list a piffling few:

  • 43 Folders (a reference to the GtD method for reminders)
  • Moleskinerie (all about the trendy little notebooks we GtD'ers seem to carry now)
  • 37 Signals (web-based organizing tools)

Those of us who have been programming computers since before PCs existed, and who have somehow managed to avoid having our brains addled by fancy development environments, still value very simple, architecture-neutral ways of handling information: plain-text files, hand-written notes, yelling. So it's not really a surprise to discover that many so-called alpha-geeks (as well as beta-geeks, and even released geeks) still rely on paper, big text files and other quaintly old-fashioned ways to manage personal work flow.

A Curious Coincidence

"That's all well and good", I somehow hear you say, "but how does this relate to programming?" Once again, I'm glad you asked.

Those who've dunked their head into the eXtreme Programming bucket will no doubt be aware of the XP notion of "stories" as a way to chop the endless serpent of new features into bite-sized chunks of billable work. One recommended way to manage the size of a "story" is to write it in your neatest handwriting on a 3-inch-by-5-inch index card. Such a raw, physical constraint on the amount of writing (even when not done in crayon) helps to limit a customer's natural tendency to expand any particular "story" into Harry Potter and the Half-Baked Program.

Oddly enough, the GtD horde has independently discovered the virtue of using 3x5 index cards as the basis for tracking Next Actions associated with projects. In fact, they've even gone so far as to invent the Hipster PDA, a card-and-clip alternative to the moribund Palm-type PDA.

Consequently, it seems just too obvious to use 3x5 cards with simple pieces of software functionality described on them, combined into GtD-like projects and managed according to GtD principles, as the basis for personal management of software development.

GsD

While he was head of HP Labs, Joel Birnbaum once gave a talk about how the fixed, simple standard of the mains wall socket has enabled startling innovation on either side of it. Using the miracle of Inappropriate Metaphor Transfer, beloved of desktop application designers everywhere, I have therefore decided that the 3x5 index card is the mains socket of stability that connects the 230 volts of eXtreme Programming with the Cuisinart of GtD. The result: something I've given the rather catchy name of "Getting software Done".

That's right, I'm trying to Get software Done using nothing but 3x5 cards, paper clips, cardboard files, little magnets, paper clips, clear plastic sleeves, lots of paper clips, and my very own labelling machine. Oh, and paper clips.

Lest you fear that I'm some kind of maniac for thinking like this, let me reassure you that I'm not alone. At least one other person on the face of the Earth is doing software management this way, and I bow before this awesome index card majesty.

A Theory of Testing

"But surely", I imagine you objecting in a tone of exasperated effrontery, "if GsD is to be of genuine value, it needs to be scientifically verified as being effective. Mankind cannot merely rely on the witty and erudite writing of one charismatic genius to be convinced. Nor, for that matter, on anything you might say." Well, now you're just being insulting.

The problem, of course, is that I can't redevelop the same software under identical circumstances using a wholly different method, in order to compare the outcomes. Fortunately, theatrical cosmology (also known as Star Trek) offers a solution. According to its "many worlds" theory, there are other mes in parallel universes already using other methods on the same software projects. So, all that all of me has to do is find a way to send notes inter-universally (perhaps on 3"x5"x7"x9" superhyper-index cards), and we'll discover which is best.

Given that I've thought of this in our universe, there must logically be a version of me that is more fired up about solving this problem than finishing this article. So, I can continue writing, safe in the knowledge that some-me else is working to complete this study, and communicate the results to all the rest of me. Hence, I can ignore the problem and wait for me to answer it anyway.

The only way this won't work is if it turns out that certain episodes of Star Trek are impossible: therefore, proving that GsD is valid becomes a special case of proving that Star Trek is completely possible, the so-called ST-complete theorem. (This is distinct from the ST-incomplete theorem, which posits that there is still at least one unmade Star Trek episode worth watching; unfortunately, Star Trek: Nemesis is an astonishing proof that the ST-incomplete theorem is false.)

Second-system Syndrome

To demonstrate that I'm not only in the GtD groove, but also Web 2.0-aware, I have put pictures of my GsD set-up on Flickr, and I've added the 'ppig' tag to some of my del.icio.us bookmarks (which you can get as an RSS feed too). Unless enough of you complain, I shall also be forced to create a podcast version.

"Web 2.0?!" comes your plaintive cry. And you thought the existing Web wasn't even out of beta-testing for version 1.0 yet. Well, listen up. Silicon Valley's cash hydrants are once again being loosened, in preparation for dousing dangerously inane and trivial ideas with suffocating amounts of filthy lucre. And this time, the danger has been identified as Web 2.0.

In short, it's like Web 1.0, but doubled.

Helpfully, Wired has a lengthy article that is even more hubristic and self-important than usual. But a simple way of imagining Web 2.0 is to remove anything from web pages that isn't computer-crunchable data, label it at random with misspelled words, and then let other people convert it into live video widgets, creating so-called "Ajaxified Tagsonomy Mash-Up Streams" (ATMUS). Worrying about what that means for society is called 'ATMUS fear', which is something we need more of around here.

Still, don't get your hopes up for a trendy programming job with Aerons and lattes, since Paul Graham thinks hiring is obsolete, so you're going to have to lose your own money this time. (Unless you disagree with Paul Graham, of course.)

And have some pity for Ted Nelson, who invented 'hypertext', but who seems to be as far from achieving his visions as ever. (Nelson also advocates creating software according to a cinematic model, with a visionary director in charge. Much as I'd be morbidly curious to see a spreadsheet by Quentin Tarantino, I don't think I'd trust my taxes to it.)

Jason Ajax: the crime-busting programmer with over-organized hair

Web 2.0's calling card is AJAX, a term invented by Adaptive Path's Jesse James Garrett to explain to management what we indignant programmers now whine that we've been doing for years anyway. (A valuable side-effect of the buzz around 'AJAX' is that many Dutch football fans have been driven into apoplexy upon discovering that 'ajax' is one of the hottest search terms online, leading them to worry that their favourite football club Ajax was in trouble.)

AJAX is short for 'Asynchronous Javascript and XML', but really refers to anything that tarts up the user interface by communicating with web servers without having to refresh the whole page. This decreases the time before some other web site you claim you didn't visit starts displaying pornography anyway, thus leading to the other kind of tarting up.
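
For the curious, the browser half of the trick looks roughly like this. It's a minimal sketch using the raw XMLHttpRequest object; the URL and the element id are invented purely for illustration:

    // Ask the server for a fragment of data without reloading the page.
    // '/latest-gossip' and the 'gossip' element are made up for this example.
    function refreshGossip() {
        var req = new XMLHttpRequest();           // older IE wants an ActiveXObject instead
        req.open('GET', '/latest-gossip', true);  // 'true' means asynchronous
        req.onreadystatechange = function () {
            if (req.readyState === 4 && req.status === 200) {
                // Poke the server's reply into the page; no refresh required.
                document.getElementById('gossip').innerHTML = req.responseText;
            }
        };
        req.send(null);
    }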

The poster boy for AJAX is Google Maps, although the AJAX part of that is actually pretty straightforward compared with the back-end system that serves up tileable, scalable maps of the entire civilized world plus Kansas.

But, as in the movies, it's the surface gloss that attracts the attention and the babes, so get ready to start polishing.

Those who have a constitutional dislike of XML-anything can instead consider using JSON (pronounced 'Jason'). 'Javascript Object Notation', to give it its fairly dowdy name, is a way of bundling data as a little Javascript program that gets executed to set variables.
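
To give you a flavour of it (the field names below are invented), a JSON payload is just Javascript literal syntax, so the laziest way to consume one is to hand it straight to the interpreter, assuming you trust whoever sent it:

    // A JSON reply from a hypothetical server: data dressed up as Javascript.
    var reply = '{ "project": "GsD", "nextAction": "buy more paper clips", "cards": 42 }';

    // Quick and dirty: let the Javascript engine evaluate it.
    // (A proper JSON parser is kinder to your security conscience.)
    var data = eval('(' + reply + ')');

    alert(data.nextAction);   // says "buy more paper clips"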

There is now a kerfuffle of companies announcing 'Web 2.0' products and services, including many that seem to be re-trying online business models that failed back in the last millennium. But with every new buzzword comes new money, new programmers and the potential for new Super Bowl commercials.

With all this, we also get the chance to write software for the alleged Web 2.0 platform, with its partially debugged, subtly incompatible, and often wholly absent features. But, of course, we'll be egged on by those clueless oldbies who feel a storm coming.

Programming for people with short attention...oh, look, a giraffe!

It seems that, with the incredible fragmentation of software creation that Web 2.0 represents, the traditional notion of 'application development' is a bust. Instead, we now have to consider the merits and difficulties of progressively assembling shards of software on an undulating environment that is distributed, ever-changing, and (thanks to Greasemonkey) completely unpredictable. It's almost as if we want to encourage programmers to ignore the big picture, and just do incremental, little stuff, as a way of keeping development problems within the limits of human comprehension. We also get to slap the label 'beta' on all our newly web-enabled systems as the universal excuse for why they're still not done yet.
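
In case Greasemonkey has passed you by: it lets any user install a little script that rewrites other people's pages as they arrive in the browser, which is why no assumption about what your page looks like survives contact with its audience. Here is a toy user script, with the target site and the substitution invented for the occasion:

    // ==UserScript==
    // @name        Buzzword Deflator
    // @description Quietly deflates Web 2.0 wherever it appears
    // @include     http://example.com/*
    // ==/UserScript==

    // Rewrite the page after it loads, behind its author's back.
    document.body.innerHTML =
        document.body.innerHTML.replace(/Web 2\.0/g, 'Web 1.1 (beta)');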

Scripting now dominates programming at both ends of the web, but with languages based on quite different models: in the browser Javascript is prototype-based, while on the server Python and Ruby are class-based, and PHP and Perl are drugs-based. Moreover, there is no shared data-representation syntax, so we must talk in poxy XML-mediated Chinese whispers. Meanwhile, RESTafarians are inadvertently working to migrate transactions into the client where they belong, until you want them to work.
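
To see the browser half of that mismatch, here is what passes for a 'class' in prototype-based Javascript: a constructor function plus a prototype object that instances borrow their behaviour from. The names are mine, not anybody's API:

    // A constructor function and a shared prototype stand in for a class.
    function Card(text) {
        this.text = text;
    }
    Card.prototype.describe = function () {
        return '3x5 card: ' + this.text;
    };

    var card = new Card('Write the next Doddery Fodder');
    alert(card.describe());   // "3x5 card: Write the next Doddery Fodder"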

Web 2.0 involves publishing, transforming and merging previously disconnected services in hitherto unexpected ways, with hilarious consequences. This routinely involves writing programs that write programs in other programming languages, generally with incompatible quoting conventions, no useful visualization tools, and sufficiently loose interpretive semantics as to guarantee debugging opportunities until retirement or legal action, whichever comes sooner. And once you get all the web stuff working, then you get to ponder the relative semantics to physicists, programmers and biologists of a term like 'vector', before deciding to become a lion tamer instead.
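
If you want a taste of the quoting problem, try generating even one line of one language from inside another. Here is a scrap of Javascript building HTML that itself contains Javascript; the greet function and the 'links' element are invented for the example:

    // Three layers in one string: Javascript building HTML containing Javascript.
    // Each layer has its own opinion about which quotes and backslashes matter.
    function greet(who) { alert('Hello, ' + who); }

    var name = "O'Reilly";   // an apostrophe, waiting patiently to break something
    var link = '<a href="#" onclick="greet(\'' + name.replace(/'/g, "\\'") + '\')">hello</a>';
    document.getElementById('links').innerHTML = link;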

More research required

Importantly for PPIGlets, Web 2.0 offers lots of opportunities for relevant psychology of programming research. It does this by being fertile ground for the kind of highly entertaining, large-scale software disaster that only copiously over-funded naivete can create. Just think, this could be your once-in-a-career chance to get away from doing research on those student web pages that you happened to find lying around the campus.

If all this frightens you into thinking that using your computer in our evolving, connected world is like driving across some endless metaphorical bridge while it's being re-built by us crazy programmers, don't worry. As long as you drive fast enough, you'll probably be okay. Just wave to us as we argue about which chisel we should use to hammer in the screws that hold the road together.

Of course, this shows the limits of metaphor; everyone knows that you don't hammer in screws with a chisel: you remove them with a chisel, and hammer them in with a screwdriver. In my next article, I will therefore show you how to build a robust metaphor out of index cards, which you can use to explain away the unexpected success of almost any project.

Frank Wales
frank(at)limov.com
