The Super-Geek Top Ten

A human-in-the-machine festive pop chart

20 minute read


Let’s end the year on a lighter note. Working in corporate IT is tough enough without another 2,000-word essay on architecture or the pop-psychology of some management technique over the holiday period.

Recently I’ve been putting together a treatment for a TV documentary on some of the extraordinary computer scientists behind the bodacious awesomeness that powers the best IT today. During some random research I came across a game called “Programming Language Inventor or Serial Killer?”, an amusing distraction that needs no explanation. I got 10/10 and was told my liver was probably safe, because I’d spot Hannibal Lecter at an Open Source convention, though I attribute this to my recent research rather than any innate psychopathic discernment capability.

I suppose the obvious theme is that eccentrics tend to live on the fringes of society, and whether their objectives are benign or murderous, detachment of this sort often leads to an unkempt appearance, crazy hair and a preoccupied look that might be saying “I’m developing the next paradigm of multi-threaded computing” or just “You’d look nice made into a coat. Wibble”.

Sticking safely with the first lot, I’ve always been struck by how fascinating many individual computer scientists are. Applied IT is unlike any other part of business. It’s tough because you are dealing with a science that’s still very much evolving, a desperate desire by many to apply engineering principles to it, a creative process that’s part design and part fine art, and customers who for the most part have no idea what they want until you deliver it (at which point they know it’s not what they want).

Steve McConnell described software writing in his bodaciously awesome book “Code Complete” as:

a love letter from the programmer to the hardware, all of the intimate details known only to partners in an affair

Love letter? Affair? Ninety percent of this planet’s inhabitants think that IT is boring, that software is dull. All but the most enlightened of businesses see IT as an annoying, costly, but necessary inconvenience (both no doubt contributing factors to the decline in the number of IT graduates in recent years).

And yet if you want an interesting career, could you think of a better field than one which is incomplete, contradictory, scientific, artistic, fundamental to modern business, and where book authors casually talk about hardware and love affairs in the same sentence? No sir. Working in IT in the early twenty-first century is better than being a rock star (minus the groupies, the fame, the money… ok… but it’s still better than being an accountant or a lawyer). Great software engineers are to modern life what James Watt was to the industrial revolution. Chances like this come along once every two hundred years. So, as the US Marines say, “embrace the suck”. You stand on the shoulders of super-geeks. And here, for your delectation, is a festive, wholly biased, top-ten countdown of some of the most extraordinary examples of the genre.

10. Seymour Cray

For the latter part of the twentieth century, Seymour Roger Cray’s name was synonymous with super-computing. He opens our festive top ten both because of his genius and notorious unflappability in the face of complex problem-solving and because he sets the scene for a thread of strangeness that all the greats exhibit.

Despite decades pushing the boundaries in hardware design, Cray for most of his work preferred to use only pencil and paper. He didn’t even have a phone in his office, seeing them, and anything else not directly related to his immediate focus, as unnecessary distractions. He’s less well honoured than he might otherwise have been because distractions to him also included publicity - he turned down many awards for this reason.

He had many sporting hobbies, but the oddest of his pastimes has to be digging tunnels. Cray loved the challenge of digging underground passageways and this included a reasonably sophisticated example in his own garden. Mental blocks gave him a reason to continue digging:

I work for three hours, and then I get stumped, and I’m not making progress. So I quit, and I go and work in the tunnel.

Some say that his tunnel obsession was exaggerated to make him sound more quirky, but I like it. What is true is that he designed the Cray-2 to be cooled by a liquid used as a human blood substitute (a substance called Fluorinert). Unfortunately Cray died following a car crash in 1996, though, as an indication of his influence at the time, he was driving a Jeep Cherokee, a vehicle designed on a Cray supercomputer.

There’s a nice tribute to him on the Computer Graphics Lab site at UCSF.

9. Claude Shannon

It’s astonishing to think that as far back as 1948 Claude Elwood Shannon published his paper on Information Theory (“A Mathematical Theory of Communication”) and defined the basis for nearly all formal models that control information today, from how it’s stored and structured to what all search engines strive for - finding what you want, not blindly matching the keywords you type in.
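At the heart of that paper is a single quantity, entropy, which measures how many bits per symbol a message really needs. As a rough illustration (mine, in Python, not anything Shannon wrote), it can be computed in a few lines:

```python
import math
from collections import Counter

def entropy_bits(message: str) -> float:
    """Shannon entropy in bits per symbol: H = sum over symbols of p * log2(1/p)."""
    counts = Counter(message)
    total = len(message)
    return sum((n / total) * math.log2(total / n) for n in counts.values())

# A perfectly predictable message carries no information; varied text needs more bits.
print(entropy_bits("aaaa"))          # 0.0
print(entropy_bits("abab"))          # 1.0
print(entropy_bits("hello, world"))  # roughly 3.0
```

Compression and error correction - much of the “how it’s stored” above - fall directly out of that one measure.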

Not content with mathematical theory alone, Shannon was a rather curious inventor. He created a machine that could solve the Rubik’s Cube, a machine that could juggle, rocket-powered flying discs, a motorized pogo stick, a mind-reading machine, and a trumpet that doubled as a flame thrower. But the invention I like the most is a plain box with a switch on it that he kept on his desk, simply labelled “The Ultimate Machine”. If the switch was pressed, the lid of the box opened, a robotic hand reached out, turned the switch back off, and retreated into the box.

Not curious enough? Shannon was also one of the founders of the Unicycling Society of America and was known to unicycle up and down the corridors of Bell Labs, where he worked, whilst juggling. He’s still highly regarded within the juggling and unicycling communities, who see his circuit design and information work as an interesting hobby he indulged in between bouts of juggling and unicycling.

8. Alan Kay

Alan Kay started out as a jazz guitarist in the 1960s, though this didn’t stop him obtaining a degree in Molecular Biology followed by a Masters in Computer Science and later a PhD. Next time you’re on a train being irritated by the clickety-clack of keys coming from the nerd next to you, you can blame Kay, because he practically invented the laptop in 1968 when he conceived of the Dynabook, a theme he continues to this day with his involvement in the One Laptop Per Child programme.

In the 1970s he joined Xerox and led the team that created Smalltalk - a language that managed to both perplex college students in the eighties and change the face of programming forever. Even if you don’t like Smalltalk in practice (and that would be me), you can guarantee that many features of the language you do love were influenced by it, for Smalltalk introduced objects (though OO traces its roots back to the Norwegians Nygaard and Dahl in the early sixties), clean syntax, message passing, the IDE and the GUI that inspired the look of the Apple Macintosh.

Phil Windley wrote two great and readable articles summarising a talk Alan Kay gave in 2006 at the University of Utah, but I’ll leave the last word to the man himself.

Some people worry that artificial intelligence will make us feel inferior, but then, anybody in his right mind should have an inferiority complex every time he looks at a flower.

7. Edsger Dijkstra

Born in Rotterdam in 1930, Dijkstra is the man who killed the GOTO statement with his 1968 paper “Go To Statement Considered Harmful”, spawning a number of copycat titles for anything else their respective authors didn’t like. This was something he wasn’t too happy about, as he found himself…

frequently referenced, regrettably, however, often by authors who had seen no more of it than its title, which became a cornerstone of my fame by becoming a template

Template or not, he’s probably most famous for inventing the first formal algorithm to work out the shortest distance from A to B on a map, thus laying the foundations for the in-car satnav and putting an end to marital discord on long journeys. For a bit of fun I visually recreated his algorithm in August 2007.
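The idea itself is small enough to sketch. What follows is a minimal illustration in Python (my own toy version, using the priority-queue refinement that came later rather than Dijkstra’s 1959 formulation): keep the cheapest known cost to each node and repeatedly settle the cheapest unexplored one.

```python
import heapq

def shortest_path_costs(graph, start):
    """Dijkstra's algorithm: cheapest known cost from `start` to every reachable node.
    `graph` maps each node to a list of (neighbour, edge_cost) pairs."""
    costs = {start: 0}
    queue = [(0, start)]                         # (cost so far, node), cheapest first
    while queue:
        cost, node = heapq.heappop(queue)
        if cost > costs.get(node, float("inf")):
            continue                             # stale entry; a cheaper route was already found
        for neighbour, edge_cost in graph.get(node, []):
            new_cost = cost + edge_cost
            if new_cost < costs.get(neighbour, float("inf")):
                costs[neighbour] = new_cost
                heapq.heappush(queue, (new_cost, neighbour))
    return costs

# A toy road map: going A -> B -> C beats the "direct" A -> C road.
roads = {"A": [("B", 1), ("C", 10)], "B": [("C", 2)], "C": []}
print(shortest_path_costs(roads, "A"))           # {'A': 0, 'B': 1, 'C': 3}
```

Swap road lengths for travel times, remember each node’s predecessor as you go, and you have, in essence, the route-planner inside a satnav.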

Like many computer scientists he saw the creative aspects of IT as being rooted in art, suggesting that programmers tend towards being either a Mozart (generate fully-formed ideas and code them) or a Beethoven (get something down and tinker until fully-formed). He purportedly had a wry sense of humour (another common theme in this top ten) saying once that teaching COBOL should be made a criminal offence and that anyone in IT who writes a paper that promises salvation and calls their idea ‘structured’, ‘virtual’, ‘abstract’, ‘distributed’, ‘higher-order’ or ‘applicative’ can almost be certain of starting a new cult. Quite what he would think of the bollocks that’s hefted about by consultants these days I don’t know.

Stiff and sometimes rude in public, he was said to be charming and funny in private. As part of the team that built the first ALGOL compiler he made a pact not to shave until the project was done, and in the event never shaved again, keeping the beard until his death in 2002. Like Cray he wrote most of his personal articles with a pen, each one sequentially numbered (his last being 1318, giving an indication of just how prolific he was), and he never owned a TV or a mobile phone, though he did have a car, which he called the Turing Machine.

At his funeral the chairman of his Computing Science Department in Austin, Texas said:

He was like a man with a light in the darkness. He illuminated virtually every issue he discussed.

6. Donald Knuth

Donald Knuth was marked out as someone to watch from early on. In 1960, when he got his degree, his university also gave him a masters as a bonus because he was just so damn smart. He’s in this list primarily for his work on The Art of Computer Programming which is, in my opinion, the greatest single programming support tool of the twentieth century (I am not alone. American Scientist included it in their “One Hundred Books that Shaped a Century of Science”).

Knuth’s dry humour is well known. He used to pay $2.56 (one hexadecimal dollar) to anyone who spotted an error in his books. He’s an accomplished organ player and, to mark the fact that his royalties financed his home organ, the index entry on royalties leads to a page with a picture of an organ. He does not use email, having made a declaration to that effect in 1990.

The Art of Computer Programming isn’t exactly gentle bedtime reading, not least because all its examples are written in assembly language for an imaginary machine, but as a painstakingly researched and argued series of volumes (four are released, three more are planned) it’s hard to beat. Knuth started work on the book in 1962; he’s now 70, and at his present rate he will finish the series around 2042. He may even make it.

The hardest thing is to go to sleep at night, when there are so many urgent things needing to be done. A huge gap exists between what we know is possible with today’s machines and what we have so far been able to finish.

5. Charles Babbage

No top ten would be complete without the man who started it all - Charles Babbage. The web is loaded with details of his work on the (during his lifetime) incomplete Difference Engine and the theoretical Analytical Engine, though he’s less well known for the eccentricities that would enable him to fit right in at a 1960s computer science conference.

He was something of a polymath, contributing to the adoption of decimal currency, the use of wave motion to generate power, the signalling mechanism used by lighthouses and even a kind of precursor to the aircraft black box, though his was designed for railway trains.

His relationship to the real world of “normal” people, though, was strange to say the least. He apparently hated music unless it was of the most refined form, and yet lived in a London whose streets were packed with noisy merry-makers of all kinds. This antipathy, coupled with what seems to have been a pretty credulous view of the social situation at the time (industrialisation in the mid nineteenth century had created a London rife with bawdiness, chicanery and theft), led to some bizarre situations: at one point around a hundred people followed him down the street cursing at him, a brass band played for five hours outside his house just to annoy him, and a neighbour directed his tin-whistle playing at Babbage’s garden for months.

Despite all this he remained obsessed with data, proposing in 1856 the creation of a “Tables of Constants of Nature and Art” which would hold all perceivable measures of, well, pretty much everything. Presumably by way of example, a year later he analysed all the broken windows in a factory and published the “Table of the Relative Frequency of the Causes of Breaking of Plate Glass Windows”, a great coffee-table companion to his “Conjectures on the Conditions of the Surface of the Moon”, I am sure.

Clearly he led a frustrated existence. Having a deeply inquiring mind in an oftentimes hostile environment took its toll. In 1861 he professed to have never lived a happy day and suggested he would give up what remained of his life (though he was seventy by then) if he could come back five hundred years into the future to live a mere three days more. He died ten years later. What he’d do with three days in the twenty-first century I don’t know. By the time he’d set himself up on Facebook, written a blog entry and sent out a few tweets on Twitter there wouldn’t be much time left. He’d have no time to read Donald Knuth’s books.

4. John McCarthy

McCarthy is at number four in the list because, despite the true awesomeness of the contributions of Shannon, Kay, Dijkstra and Knuth to programming, McCarthy’s 1960 paper “Recursive Functions of Symbolic Expressions and Their Computation by Machine” created the basis for places we’ve not even been to yet. Rather than even attempt to summarise the significance of his work I’ll leave that to Paul Graham, who did a much better job than I ever could. If you like Ruby, F#, Groovy, the way even Java’s headed, or any recent “cool” language come to that, you are experiencing ripples that go all the way back to McCarthy’s work in 1960.
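The core trick in that paper - programs written as nested symbolic expressions and run by one small recursive function - still fits in a handful of lines. Here is a toy sketch in Python (my illustration, emphatically not McCarthy’s Lisp):

```python
# A toy flavour of McCarthy's idea: a program is just a nested list (a symbolic
# expression), and one small recursive function evaluates it.
def evaluate(expr, env):
    if isinstance(expr, str):            # a symbol: look it up in the environment
        return env[expr]
    if not isinstance(expr, list):       # a literal, e.g. a number
        return expr
    op, *args = expr
    if op == "if":                       # special form: evaluate only the chosen branch
        test, then_branch, else_branch = args
        return evaluate(then_branch if evaluate(test, env) else else_branch, env)
    values = [evaluate(a, env) for a in args]
    return env[op](*values)              # apply a function taken from the environment

env = {"+": lambda a, b: a + b, "*": lambda a, b: a * b, ">": lambda a, b: a > b, "x": 4}
program = ["if", [">", "x", 3], ["*", "x", ["+", "x", 1]], 0]
print(evaluate(program, env))            # 20
```

Because the program is itself a data structure, programs can build, inspect and rewrite other programs - the ripple that reaches Ruby, Groovy and friends.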

Now emeritus professor at Stanford, McCarthy spends more time on other things, including a sci-fi short story called “The Robot and the Baby” - a curious endeavour for a man who said “language is froth on the surface of thought”.

3. Tommy Flowers

Tommy Flowers makes it into the top three because whatever he lacked in the theoretical brilliance shown by the others on this list, he more than made up for in everything else. Born Thomas Harold Flowers in 1905, the son of a bricklayer in the East End of London, his start in life might seem fairly inauspicious, but Tommy had what used to be called “gumption”, and it was to serve him, and us, well.

He undertook a four-year apprenticeship in mechanical engineering and then funded himself through night school to get a degree in Engineering from London University. After that he worked at the General Post Office research station at Dollis Hill on the electro-mechanical switching gear that would later enable direct dialling of telephone calls (these were the days when you phoned the operator to have your call put through manually). Our story would end here, and Tommy Flowers might have become at best a mention in a corporate history book, but for the intervention of a young man called Alan Turing, who in 1943 suggested Tommy be brought to the government code-breaking HQ at Bletchley Park.

Tommy’s job was to sort out an unreliable decoding system called Robinson. His proposed solution was not to patch it up but to re-design it around valves, a view not appreciated by management, who wanted something more tactical (who’d have thought it… clever guy proposes elegant strategic solution and management would rather go for the quick-and-dirty option… that never happens these days). Undeterred, Tommy and his team built it anyway. They worked around the clock, cannibalised standard telephone exchange components and completed the job in ten months, with Flowers partly funding the work from his own pocket. The new machine - Colossus - was demonstrated on 8th December 1943. It contained 1,500 valves and ran with improved reliability at 5,000 characters per second (the old Robinson ran at about a fifth of that speed). By June 1944 it was in production.

Tommy went on to produce a Mark II that ran at 25,000 characters per second with 2,500 valves (note that’s roughly a 66% increase in valves for a five-fold increase in throughput - scalability most would only dream of today). By the end of the war ten of his machines were operational. It goes without saying that many individuals are alive today because of his work.

Tommy Flowers not only invented the modern electronic computer, he did it against the direction of those who thought they knew better and had to partly pay for it himself. If that weren’t enough, once the war was over he had to disappear into relative obscurity and keep his mouth shut while Turing took most of the glory for what happened during that time. Tommy did as he was told, dismantled his machines and went back to fighting incompetent management at the Post Office (and later International Telephone and Telegraph) until he retired in 1969. It took the Freedom of Information Act in the US for the secret to finally come out.

He has been variously described as quiet, well-spoken, polite and unassuming, making it all the more frustrating that he was not recognised more overtly for his great work. He was given £1000 after the war (not enough to repay what he had spent himself) and later an MBE. There are rumours he was being considered for a knighthood in 1998, the year he sadly died.

Frankly, I would have changed the rules and awarded a posthumous knighthood.

2. Ada Lovelace

Ada Lovelace is probably best known in IT because of the Ada programming language. Before I knew anything else about her I had a vague idea that she’d funded Babbage and thus been rewarded for her patronage by having a dull, highly-structured, strongly-typed language named after her. Not true (the bit about Ada the person, I mean; the bit about the language is true).

Born in 1815, Augusta Ada Byron was the daughter of Anne Isabella Milbanke and Lord Byron (yes, that Lord Byron). Unfortunately her parents’ marriage didn’t last. In 1835 Ada married William King, who became, three years later, the first Earl of Lovelace, making Augusta Ada Byron/King the Ada Lovelace we know today.

Ada Lovelace lived a life of privilege and wealth. Her mother spared no expense on her education. She was tutored by, among others, the great mathematician Mary Somerville. She might have been remembered as nothing more than smart posh totty had it not been for her meeting with Babbage in 1833. It’s clear from her translation of, and extensive notes on, Luigi Menabrea’s 1842 paper on Babbage’s Analytical Engine that she was not the backseat wealthy benefactor of myth (access to funding wasn’t an issue for Babbage; any lack of it was down to his insensitive dealings with others).

Lovelace was at the least Babbage’s intellectual equal. Though he designed the machines, and though they corresponded on her translation, their letters show that she was both an independent thinker and the greater visionary when it came to just what this technology could do. While Babbage saw a machine that could pedantically crank out tables of numbers based on fixed inputs, Ada saw a machine that

might act upon other things besides number, were objects found whose mutual fundamental relations could be expressed by those of the abstract science of operations

going so far as to say

the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent.

And let’s be clear - this was 167 years ago, with no frame of reference on which to base her ideas. It’s not as if she’d dinked with a bit of JavaScript first.

Ada Lovelace makes number two on the list for a stupefying clarity of vision and for being the link that takes Babbage into the twentieth century. Alan Turing in his 1950 paper “Computing Machinery and Intelligence” picks up Lovelace’s thoughts on where the limits of a simulated (or not) intelligence might lie.

And talking of Turing…

1. Alan Turing

Of course Turing had to be number one. He is, in almost all respects, the imaginative and foundational force behind all that is modern computing. A mathematician at heart, he is also regarded as a philosopher, libertarian and wit -

Turing believes machines think
Turing lies with men
Therefore machines do not think

Turing either defined or contributed heavily to fundamental concepts such as what a computer is (the Universal Turing Machine), what can and cannot be computed (computability and Turing completeness), and how we might measure simulated intelligence (the Turing Test). For more details on his life and work the richest source of information has to be the main Turing website run by Andrew Hodges.
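To make the first of those concrete: the machine Turing described in 1936 is nothing more than a tape, a read/write head and a finite table of rules. The sketch below is my own toy illustration in Python (the bit-flipping machine is invented for the example, not one of Turing’s), yet the same scheme, given a big enough rule table, can in principle compute anything your laptop can.

```python
def run_turing_machine(rules, tape, state="start"):
    """A toy one-tape Turing machine.
    `rules` maps (state, symbol) to (symbol to write, head move, next state)."""
    cells = dict(enumerate(tape))                      # sparse tape: position -> symbol
    head = 0
    while state != "halt":
        symbol = cells.get(head, "_")                  # '_' stands for a blank cell
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# A machine that flips every bit it reads, then halts at the first blank.
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine(invert, "10110"))             # 01001
```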

The curiosity with Alan Turing’s nature runs deep, certainly exacerbated by the duality of his death. In June 1954 he was found dead by his cleaner, having died of cyanide poisoning. Next to his bed was a half-eaten apple. One theory is that he deliberately laced the apple with cyanide and took a bite (though, surprisingly, the apple was never tested); without further hard evidence it’s just as possible his death was an accident - he was known to have used and stored cyanide for his work.

The coroner went with the first option and ruled his death a suicide. Conspiracy theorists since have, perhaps inevitably, suggested that he was murdered by a British Secret Service paranoid about his homosexuality (though Turing’s progressive and enlightened attitude to his sexuality would hardly have made him a blackmail risk). It’s all too tempting to appropriate Turing’s death because he’s a well-known figure, when, of course, whatever his contributions to the world, his demise, chosen or not, is none of our business. But I’d like to think the dramatic symbolism that surrounds a man who had difficulty dealing with social situations (he was said to have been obsessed with the apple-poisoning scene in Snow White) is something he would find amusing.

There are many quotes about Turing, but my favourite (paraphrased) is from Tommy Flowers:

You’d be working on a problem and not able to solve it,
and sometimes someone would look over your shoulder and say
‘Have you tried doing it like this?’ and you’d think ‘Of course, that’s how you do it!’.

With Turing, he’d say ‘Have you tried doing it this way?’
and you’d know that in a hundred years you would never have thought of doing it that way.

Summary

Man, that was hard. Of course the list is a reflection of my own subjective interests and views, and the ranking should be taken with a couple of cups of salt. In choosing to do a top ten (the original idea was a top five) I’ve had to leave out Andrew Tanenbaum, Leslie Lamport, Leonid Levin, John Backus, Fred Brooks and a host of more contemporary contributors to the field. Still, there’s always next year.

As I said earlier, working in IT should be seen as a privilege, analogous to taking part in the industrial revolution, though without the subjugation aspects. Alan Kay, in a keynote speech in 2003, said he didn’t think the computer revolution had even happened yet.

Personally, I think it’s just about to begin. Many of the most important questions may already have answers that are decades old, brought to us by a group of individuals with a healthy disrespect for the accepted order of things, who manage to balance single-minded determination, an uncommon ability to focus, and a wide variety of interests well outside their field.

So arise, eccentric social misfits of the world. The fact that you can’t go to a supermarket without redesigning the layout of the aisles, or that you just have to stack the shopping carts neatly into each other, even though nobody else does, marks you out as an heir to the super-geek legacy.

Your time has come.

Whatever your chosen method of spending this time of year, I wish you a pleasant and safe one.
