100-Year Architecture
how the battle for the web was lost and won
Tags: architecture, development

In late 2011, I was asked to talk at a conference on digital trends. I didn’t have anything prepared, so I hastily suggested the topic “100-Year Architecture,” thinking it sounded clickbaity enough to get people to show up and that I could fill in the blanks later. I did, though, have one thought in my head to build on - the 10,000-Year Clock being built by the Long Now Foundation - which I hoped might have enough parallels with software to prove thoughtful and thought-provoking. Twelve years later, this is the write-up of that talk.
In the end, the clock barely figured in it. Instead, it became a cautionary tale of remarkable divergent thinking and practical convergent innovation. And I offer this as the first in a two-part series that will (in this article) look at how the modern web came into being, then (next time) examine the rise, and inevitable demise, of blockchain and Web3.
Background
The 10,000-Year Clock
I came across the story of a clock being built to last 10,000 years in a Wired Magazine article from 1995. In it, Danny Hillis (an MIT grad who had a pretty stellar career in tech, including founding Metaweb, which created the Knowledge Graph info box you see alongside Google search results) said this:
I think of the oak beams in the ceiling of College Hall at New College, Oxford. Last century, when the beams needed replacing, carpenters used oak trees that had been planted in 1386 when the dining hall was first built. The 14th-century builder had planted the trees in anticipation of the time, hundreds of years in the future, when the beams would need replacing.
And it made me think of software architecture and the way we (at least try to) anticipate the needs of future developers to make life a little bit easier. Architecture is just systemic requirements, and systemic requirements are essentially just needs changing over time. We sometimes forget those future needs. And sometimes they tempt us to overthink and over-design. But when we think of time, we are thinking of architecture.
The Many Webs
The 30-Year Web
If you want an example of a ‘good’ architectural design then you need to find something that’s been around for a long time (in software terms) and hasn’t changed its fundamental architecture much.
The web (and the internet technology that underpins it) is one such example.
The World Wide Web has been in existence since 1990 (the first web page was published in December 1990, although it wasn’t until August 1991 that people outside of CERN would start to hear about it). In that time its purpose and societal impact have changed dramatically, but the underlying architecture is pretty much the same.

It seemed to me that exploring the history and design principles of the web would provide some clues as to how we should think about software architectural modelling if we want our products to last.
And this is where things took a strange turn. My expectation was that I’d read Weaving The Web, make some notes about REST, and have a useful talk. And I had this expectation because, like a lot of people, I’d always assumed that Tim Berners-Lee and Robert Cailliau had an information-sharing itch to scratch at CERN and had more or less just invented the web. But oh no.
Read on, dear reader, read on.
Enquire
The late eighties web project we think of as being the genesis of the modern web was actually an iteration of a previous system called Enquire, named after an old almanac that ran for 60 years called “Enquire Within upon Everything.” Tim Berners-Lee later said of it:
With its title suggestive of magic, the book served as a portal to a world of information, everything from how to remove clothing stains to tips on investing money.
Enquire wasn’t a distributed system based on widely accepted standards, however, which meant it needed a lot of maintenance and so wasn’t sustainable. It did, though, embody the concept of links.
In May 1990, the proposal for the modern web said:
We should work toward a universal linked information system, in which generality and portability are more important than fancy graphics techniques and complex extra facilities.
(clearly, we’ve adjusted those priorities a bit since)
So there I am, digging through papers and reading about hypermedia and hyperlinks, and I come across this image from the Hypertext Conference of 1991, which was held in San Antonio.

You can find the proceedings for the event at the ACM library.
Hold up, I thought. It’s the hypertext conference in 1991 and Tim Berners-Lee isn’t the keynote speaker carried in on a palanquin? They didn’t rebrand it the HyperTim conference? He was just an attendee like everyone else? Telling them about his idea on a pinboard?
Well, yes. Because the origins of all these concepts go further back in time.
Way way back.
The 80-Year Web

Back, in fact, to the 1940s and the invention of the atomic bomb.
In July 1945, The Atlantic Magazine published an article entitled As We May Think by Vannevar Bush. Bush had been one of the leaders of the Manhattan Project to create the atomic bomb and had fiercely championed cooperative information management to make it happen.
Seventy-eight years later, it makes for portentous reading:
The human mind … operates by association … With one item in its grasp, it snaps instantly to the next … in accordance with some intricate web of trails
And he predicted that:
Whole new forms of encyclopaedias will appear, ready made with a mesh of associative trails running through them
Due to the limitations of technology at the time, he envisaged the machine that would power this mesh of linked information as interconnected microfilm, but the concept was there. As horrendous as nuclear weapons are, they were born from an environment sufficiently complex that sophisticated semantic knowledge navigation was required.
Xanadu
If Bush put the ideas into the ether, it was Ted Nelson who grabbed them all to create the conceptual building blocks of the web as it is today. Although I am fairly sure Nelson would violently disagree with that statement because Ted Nelson’s building blocks were intended for a quite different future. One we haven’t realised yet.
Nelson was (and still is) a prolific outside-the-box thinker and coiner of words. He coined hypertext and hyperlink in 1963, hypermedia in 1965, softcopy in 1967, virtuality in 1975, micropayments in 1992 and my favourite word of all time: intertwingularity in 1974.
Intertwingularities are exactly what Bush was talking about: the web of trails that represent the complex interlinks in human knowledge.
Nelson saw information processing technology as an opportunity for humans to represent knowledge in very different ways from those we’d been using for thousands of years. He saw all those ones and zeros as a way to break free from paper. Paper is an intrinsically limiting medium: even as you begin to capture knowledge on it, your brain is making leaps it cannot support, which is why we have footnotes, endnotes, indexes, and bibliographies. Our minds are free, but paper keeps our thoughts in a prison. Nelson saw tech as a way to escape that prison.
The four walls of paper are like a prison because every idea wants to spring out in all directions - everything is connected with everything else, sometimes more than others.
He raged against the creation of a digital system that took paper as its metaphor. In this mission, he was building on the ideas of Benjamin Lee Whorf, who is responsible for the Sapir–Whorf hypothesis, which states that the construction of language influences what we think, perceive and believe. A good example of this is Newspeak from 1984, a language designed to imprison the minds of the people.
Once you start talking about files and folders and desktops, you limit the discourse and the potential for technology.
Computing is made up of files and directories and that’s a tradition left behind from the 1940s that no one questions.
Another tradition is that one file equals one document.
Because the real world doesn’t succumb to files and folders and paper. It’s intertwingled.
Everything is deeply intertwingled. In an important sense there are no “subjects” at all; there is only all knowledge, since the cross-connections among the myriad topics of this world simply cannot be divided up neatly.
Hierarchical and sequential structures, especially popular since Gutenberg, are usually forced and artificial. Intertwingularity is not generally acknowledged — people keep pretending they can make things hierarchical, categorisable and sequential when they can’t.
Berners-Lee himself included a representation of his web idea as an intertwingled mind map.

Nelson tried, many times, to deliver on his vision, which he termed Project Xanadu. Demos have been released but alas none of them has been practical enough to topple the web we have today.
Xanadu even had its own file system called ZigZag:
This is not about reality, it’s about imaginary structures that are useful
We should not impose regularity where it does not exist
We have been forced by the conventionally available software to warp and break the information we want to store
Xanadu has had many believers along the way. In 1988 John Walker of Autodesk backed it saying:
In 1964 Xanadu was a dream in a single mind.
In 1980, it was the shared goal of a small group of brilliant technologists.
By 1989, it will be a product.
And by 1995, it will begin to change the world.
Xanadont
And indeed by 1995 the web was starting to change the world, but it was a web derived from paper-based thinking, not Xanadu.
Also in 1995, the same year Wired Magazine published the article on the 10,000-Year Clock, they ran a particularly harsh article called The Curse of Xanadu:
Ted Nelson’s Xanadu project was supposed to be the universal, democratic hypertext library that would help human life evolve into an entirely new form.
Instead, it sucked Nelson and his intrepid band of true believers into what became the longest-running vaporware project in the history of computing.
A 30-year saga of rabid prototyping and heart-slashing despair.
Ouch.
Xanadu and its associated components and actions led to yet more word coinage, like enfilade, ent, flink, clink, transzigular and sworph, but it also came with concepts that highlight features we don’t have in the web.
Like transclusion, where quoted text from a remote document isn’t a copy but a live view that changes when the primary source changes. Xanadu links are also bidirectional and have a type, so if a document links to its author, the reader can see that semantic relationship.
Imagine a web today where 404s couldn’t exist and hyperlinks had meaning beyond “this takes us to another document.”
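To make those two ideas concrete, here is a minimal sketch in Python - a toy illustration, not Xanadu’s actual design - in which links carry a type and are readable from either end, and a quotation is a live view onto its source rather than a copy:

```python
from dataclasses import dataclass


@dataclass
class Document:
    doc_id: str
    content: str


@dataclass
class Link:
    source: str     # identifier of the document at one end
    target: str     # identifier of the document at the other end
    link_type: str  # e.g. "authored-by", "quotes", "refutes"


class HyperStore:
    """A toy store of documents plus typed links visible from both ends."""

    def __init__(self) -> None:
        self.docs: dict[str, Document] = {}
        self.links: list[Link] = []

    def add(self, doc: Document) -> None:
        self.docs[doc.doc_id] = doc

    def link(self, source: str, target: str, link_type: str) -> None:
        self.links.append(Link(source, target, link_type))

    def links_for(self, doc_id: str) -> list[Link]:
        # Bidirectional: a document can enumerate links pointing at it
        # as well as links pointing away from it.
        return [l for l in self.links if doc_id in (l.source, l.target)]

    def transclude(self, doc_id: str, start: int, end: int) -> str:
        # Transclusion: a quotation is served as a live view of the primary
        # source, so editing the source changes every document that quotes it.
        return self.docs[doc_id].content[start:end]


store = HyperStore()
store.add(Document("essay", "Everything is deeply intertwingled."))
store.add(Document("ted", "Ted Nelson"))
store.link("essay", "ted", "authored-by")
print(store.links_for("ted"))            # the author can see what links to them
print(store.transclude("essay", 0, 10))  # "Everything"
```

Because the links live in a shared registry rather than inside one document, nothing can silently 404, and every link tells you why two things are connected.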
Personally, I agree with Jonathan Vos Post’s summary of Xanadu:
Xanadu is not … “total insanity” … someday this amazing hypertext program will be here and will fulfil all the fantasies of its true believers.
But Bill Gates himself couldn’t pay me enough to manage the software development.
100-Year Architecture
Hyperworlds
So where does this leave us in the quest for clues about great stable architectures?
Ted Nelson was right to say the rise of digital technology was an opportunity to think differently. He blames our failure to do so on myopic engineers taking over rather than letting creative, visionary artists lead the way. I think the problem is that the web was always going to be a distributed system, and so the laws of physics and time apply. In a way, that’s why we don’t use Enquire today: it had some Xanadonian features but wasn’t maintainable in the long term.
What did come from Nelson’s brilliance was a hyperworld that fostered years and years of research and debate. And from those conversations we got HTTP in 1990, which became the convention HTTP/1.0 in 1996 (RFC 1945), grew to HTTP/1.1 in 1997 as RFC 2068, was revised as RFC 2616 in 1999, and so on until the present day.
We also got the dissertation “Architectural Styles and the Design of Network-based Software Architectures”, better known as Roy Fielding’s origin document for RESTful design.
REST
REST is interesting because it’s both an architectural approach that underpins the web (“the REST architectural style has been used to guide the design and development of the architecture for the modern web”) and one that is inspired by it (“the REST style is an abstraction of the architectural elements within a distributed hypermedia system”).
Fielding’s aim was as follows:
REST provides a set of architectural constraints that, when applied as a whole, emphasizes scalability of component interactions, generality of interfaces, independent deployment of components, and intermediary components to reduce interaction latency, enforce security, and encapsulate legacy systems.
Fielding worked on the HTTP RFC specifications and was a co-founder of The Apache Group (which went on to found The Apache Software Foundation), but he has always been keen to distinguish the REST architectural style from HTTP itself.
Some people think that REST means “use HTTP”. It doesn’t.
It is very easy to use HTTP in non-RESTful ways and also to use custom protocols other than HTTP to provide REST-based architectures.
What is key to REST, and to our search for longevity in architectural design, is that it decomposes the web into its raw, persisting domain components.
Resources - the conceptual targets of our interaction - each have a Resource Identifier (a URL on the web) that points to a Representation of the resource (on the web these are Nelson’s hypermedia documents, but they can be images, PDFs and so on), Metadata about those representations (the web has media types and modified times; Nelson went much further), and Control Data that lets the system manage representational artefacts (e.g. cache until changed). When a resource changes, we simply update its representation and make its new state available for transfer to interested parties.
This leaves us able to describe the web as simply a collection of locatable hypermedia documents, modified and exchanged over an application layer using verbs that signify the intent of the exchange.
And that definition both constrains us and sets us free.
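As a rough illustration of that vocabulary - a sketch using only Python’s standard library against the placeholder resource https://example.com/, not anything taken from Fielding’s dissertation - a single GET exchanges every one of those elements:

```python
from urllib.request import Request, urlopen

# Resource identifier: names the conceptual resource, not a file on a disk.
url = "https://example.com/"

# The verb signals the intent of the exchange: GET asks for a representation
# without changing the resource's state.
request = Request(url, method="GET")

with urlopen(request) as response:
    representation = response.read()           # a representation of the resource
    metadata = response.headers                # metadata about that representation
    media_type = metadata.get("Content-Type")  # e.g. text/html
    control = metadata.get("Cache-Control")    # control data: how long a cached
                                               # copy may stand in for the resource

print(media_type, control, len(representation))
```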
Constraints
REST works because it’s constrained by these elements and rules. It’s not designed for the convenience of future developers; I’ve seen horrendous implementations of non-RESTful HTTP applications that favoured convenience over robustness. Good RESTful designs aren’t always obvious. It’s easy to break the constraints and add subtle fragility to designs. That is essentially why web services based on SOAP died out.
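To show what favouring convenience over robustness tends to look like, here is a hypothetical contrast (the endpoints are invented for illustration). Both requests travel over HTTP, but only the second treats the order as a resource the rest of the web’s machinery can reason about:

```python
import json
from urllib.request import Request

# Non-RESTful: HTTP used as a tunnel for a remote procedure call. The verb and
# URL tell caches, proxies and future developers nothing; the real intent is
# buried in the body.
rpc_style = Request(
    "https://api.example.com/endpoint",
    data=json.dumps({"operation": "getOrder", "orderId": 123}).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# RESTful: the order is a resource with its own identifier, fetched with a verb
# whose meaning intermediaries can rely on without opening the payload.
rest_style = Request("https://api.example.com/orders/123", method="GET")
```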
After years of building monoliths, and finding they were hard to manage, the entire industry wet its pants about SOA, the Service-Oriented Architecture, and SOAP. The thinking behind both was solid enough: decompose your domain into reusable services and build those services according to interoperable standards. SOA lost its fascination with XML and has been repopularised as microservices, but I can’t help thinking one reason microservices projects run into so many difficulties is that we don’t have the constrained guide rails the web has.
Finalment
The architecture of the web is plain and boring and beautiful. By design, it has many limitations which can be annoying to work with. That’s good architecture. A brick is plain and boring and beautiful but once you have bricks your imagination can take over.
Ted Nelson wasn’t wrong. There are many tantalising glimpses of the future in his work. He just wasn’t able to find his bricks in time. Tim Berners-Lee took a world that had internet protocols like FTP (1971), SMTP (1981) and NNTP (1986) and made HTTP look like it had always been at the party.
It looks easy now, but it was hard (as Ted Nelson said: “Making things easy is hard”).
I think that because it is so hard, we can only make sustainable progress in very tiny increments. We don’t stand on the shoulders of giants; we get small lifts from the tiniest of advances, each one added to the pile we stand on once it’s been proven to work. There are a hundred years of these to come for the web. It will become more semantically rich and distributed, and perhaps even the way we think about hypermedia identifiers will change.
One profound insight can be extracted from the long and sometimes painful Xanadu story: The most powerful results often come from constraining ambition and designing only micro-standards on top of which a rich exploration of applications and concepts can be supported.
Vint Cerf - Chief Internet Strategist, Google
Notes
- The image is from Grave of the Fireflies, Isao Takahata’s masterpiece adaptation of Akiyuki Nosaka’s heartbreaking novel of the same name. Nosaka’s book, like the web, was born in the aftermath of the atomic bomb. You can find a copy here.
- The caption under the image is from the 1980 film Xanadu.
- You can download the original proposal for the web here.