Amdahl’s Law

Not all performance improvements are created equal

2 minute read


Gene Amdahl in 1976

Business software is generally used to optimise processes so that they run faster. Competitive advantage can come from doing the boring stuff better as well as from doing the new things other companies don’t do. But when we look at a process end to end, it can rarely be optimised all the way through. Often we find that certain parts of the process cause disproportionate delays, so we focus on those. At other times there are good reasons why some parts of the process simply cannot be addressed. One way to speed something up is to divide it into independent parts and run those parts in parallel.

When attempting to optimise one or more steps in a process chain, it’s important for cost-benefit reasons to understand what the net result will be for the whole process before starting. Gene Amdahl formulated the basis of what became known as Amdahl’s Law, or Amdahl’s Argument, in 1967 whilst at IBM. Essentially the law states that whilst a process can be decomposed into steps which may then be run in parallel, the time taken for the whole process will still be limited by the steps that remain serialised.

Amdahl’s Law states that when you “parallel up” parts of a business process, any step within that process which remains sequential (or has some other speed-limiting factor) and which takes up a fraction f of the overall process time will prevent the whole process from being speeded up by more than a factor of 1/f.

That means a single unaddressed process step that makes up a mere 10% of the time in the whole process chain will prevent the whole process from being speeded up by more than 10x, regardless of how well the rest of the steps are optimised. The fewer steps a process has, the larger the fraction each one tends to represent, and the lower that ceiling becomes.

To apply this to something like an ecommerce fulfillment process, let’s say there are four equally lengthy steps to delivering customer orders: collect order, take payment, allocate stock, despatch. If the stock allocation is manual and performed by one person (serial, unoptimised), then we could build the best web site on the planet for order collection and payments and use turbo jet robots for despatch, but we can never improve our overall fulfillment capability by more than 4x.

f = 0.25 // each step is 25% of the process
1/f = 4  // 4x is the maximum speed improvement we can attain
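
The same ceiling can be seen with the more general form of the law: if a fraction p of the process is improved by a factor s, the overall speedup is 1 / ((1 - p) + p/s). As a rough illustration (plain JavaScript, with made-up numbers rather than anything from this site's calculator):

function amdahlSpeedup(p, s) {
  // p: fraction of the process that benefits from the improvement (0–1)
  // s: speedup factor applied to that fraction
  return 1 / ((1 - p) + p / s);
}

amdahlSpeedup(0.75, 10);   // ≈ 3.08x when the other three steps get 10x faster
amdahlSpeedup(0.75, 1000); // ≈ 3.99x — approaching the 4x ceiling set by 1/f

However fast the parallelisable 75% becomes, the overall gain creeps towards 4x but never reaches it.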

In process optimisation, Amdahl’s Law can help keep targets within realistic boundaries, and in distributed systems it can help size networks of nodes for a blend of serial and parallel processing.

Here’s a calculator to try it out:
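
The calculator’s actual source is linked in the notes below; as a minimal sketch of the sort of calculation involved, assuming it takes a serial fraction f and a speedup factor s applied to everything else:

function calculate(f, s) {
  // f: fraction of the process that stays serial (0–1)
  // s: speedup factor applied to the remaining (1 - f)
  var overall = 1 / (f + (1 - f) / s);
  var ceiling = 1 / f; // the hard limit, however large s gets
  return { overall: overall, ceiling: ceiling };
}

calculate(0.1, 20); // { overall: ≈ 6.9, ceiling: 10 }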

Notes

  • You can read more about Amdahl’s Law on Wikipedia

  • This article was updated in 2013 when the calculator was simplified and ported to jQuery. The source is available on GitHub.

  • This article was updated again in 2023 when the code was modularised a bit better and ported to webpack and the latest version of jQuery.
