Laws, Principles, and Adages

May 15, 2022

This is a curated list of laws, principles, or adages that I found either helpful, interesting, or funny. I’ll continuously update the list as I learn more. The list is sorted alphabetically. I did not create these laws or their descriptions; for each law, I added a link to the source from which I got it.

If you have an interesting law that would fit this list, don’t hesitate to approach me. Have fun exploring!

Last updated: Jan 3, 2024.

Allen Curve

The exponential drop in frequency of communication between engineers as the distance between them increases.

Rather than the probability of telephone communication increasing with geographical distance as face-to-face probability decays, the use of all communication media decays with distance.

Wikipedia: Allen curve

Atwood’s Law

Any software that can be written in JavaScript will eventually be written in JavaScript.

schegge.de: Atwood’s law

Brandolini’s Law

The amount of energy needed to refute bullshit is an order of magnitude larger than to produce it.

Brandolini’s law is an internet adage which emphasizes the difficulty of debunking bullshit.

Also known as: bullshit asymmetry principle

Wikipedia: Brandolini’s law

Brooks’s Law

Adding manpower to a late software project makes it later.

Adding more people to an already late project, as a desperate attempt by management to finish the project on time, will only make it even later.

Wikipedia: Brooks’s law

Conway’s Law

Any organization that designs a system will produce a design whose structure is a copy of the organization’s communication structure.

In short: You will ship your org chart.

Wikipedia: Conway’s Law

Eagleson’s Law

Any code of your own that you haven’t looked at for six or more months might as well have been written by someone else.

exceptionnotfound.net: 15 Fundamental Laws of Software Development

Gall’s Law

A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over with a working simple system.

Wikipedia: John Gall

Gell-Mann Amnesia

You open the newspaper to an article on some subject you know well. You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward, reversing cause and effect: the “wet streets cause rain” stories. Newspapers are full of them. You then turn the page to national or international affairs and read as if the rest of the newspaper was somehow more accurate about Palestine than the baloney you just read. You turn the page, and forget what you know.

epsilontheory.com: Gell-Mann Amnesia

Godwin’s Law

As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches 1.

If an online discussion (regardless of topic or scope) goes on long enough, sooner or later someone will compare someone or something to Adolf Hitler or his deeds, at which point the discussion or thread often effectively ends.

Also known as: Godwin’s rule of Hitler analogies

Wikipedia: Godwin’s law

Goodhart’s Law

When a measure becomes a target, it ceases to be a good measure.

Example: A car manufacturer optimizes its cars to perform very well in emission tests, but not under real-world conditions. (Volkswagen emissions scandal)

Wikipedia: Goodhart’s law

Greenspun’s 10th Rule of Programming

Any sufficiently complicated C or Fortran program contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp.

Adjusted for Authentication: Any custom developed authentication system contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Kerberos.

Generalized: Any custom developed system contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of the industry standard you refused to adopt.

Signs of Triviality: 10 Software Engineering Laws Everybody Loves to Ignore

Handicap Principle

Costly “signals” must be reliable signals, proving the “fitness” of the signaller, because unfit signallers cannot afford the costly signal.

Example: A peacock’s energetically expensive tail feathers indicate the peacock’s genetic quality and ability to survive despite the handicap of carrying such a conspicuous trait. For humans, expensive cars, watches, costly engagement rings, and luxury products in general can be interpreted through the Handicap Principle: such expensive items can be acquired and maintained by the signaller and are hard to fake; therefore, the signal can be considered reliable.

Also known as: Zahavian Signal

Wikipedia: Handicap principle

Hanlon’s Razor

Never attribute to malice that which is adequately explained by stupidity.

Or in other words: if you have a terrible experience, the cause is more likely stupidity than malicious intent.

Wikipedia: Hanlon’s razor

Hofstadter’s Law

It always takes longer than you expect, even when you take into account Hofstadter’s Law.

Douglas Hofstadter coined this self-referential adage in his book Gödel, Escher, Bach: An Eternal Golden Braid (1979) to describe the widely experienced difficulty of accurately estimating the time it will take to complete tasks of substantial complexity.

Wikipedia: Hofstadter’s law

Hyrum’s Law

With a sufficient number of users of an API, it does not matter what you promise in the contract: all observable behaviors of your system will be depended on by somebody.

Example: It’s impossible to fix a wrong behavior in the API because some consumers depend on exactly that bug; a fix would break a lot of them.
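
A minimal, hypothetical Swift sketch of how this happens (the function, its data, and the caller are invented for illustration): the contract only promises a list of tags, but a consumer quietly starts depending on the incidental ordering.

// Hypothetical library function. The documented contract only promises
// "returns the user's tags"; the ordering is an unspecified implementation detail.
func tags(for user: String) -> [String] {
    let stored = ["swift", "api", "testing"]
    return stored.sorted()   // happens to be sorted today, but nothing promises this
}

// A consumer that silently depends on the unspecified ordering.
// If the library ever stops sorting, this breaks, even though the contract never changed.
let primaryTag = tags(for: "alice").first   // assumes "api" always comes first
print(primaryTag ?? "none")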

hyrumslaw.com

Kuykendall’s Rule

If you get close enough to any organization, you find out it is a total disaster.

Twitter: @CavedaleRhones

Layer 8

Layer 8 is a term used to refer to the “user” or “political” layer on top of the 7-layer OSI model of computer networking.

Example: This sounds like a layer 8 problem.

Wikipedia: Layer 8

Lindy Effect

The longer something has already survived, the longer its remaining life expectancy.

Example: If a book has been in print for forty years, I can expect it to be in print for another forty years. But, and that is the main difference, if it survives another decade, then it will be expected to be in print another fifty years. This, simply, as a rule, tells you why things that have been around for a long time are not “aging” like persons, but “aging” in reverse.

Wikipedia: Lindy effect

Matthew Effect

The rich get richer and the poor get poorer.

The concept is applicable to matters of fame, status, popularity, friends, wealth, but may also be applied literally to cumulative advantage of economic capital.

Also known as: Matthew effect of accumulated advantage, Matthew principle

Wikipedia: Matthew effect

Moore’s Law

The number of transistors in a dense integrated circuit (IC) doubles about every two years.

Wikipedia: Moore’s law

Mosher’s Law of Software Engineering

Don’t worry if it doesn’t work right. If everything did, you’d be out of a job.

IT History: Mosher’s Law of Software Engineering

Occam’s Razor

Different forms:

  • If you have two theories that both explain the observed facts, then you should use the simplest until more evidence comes along.
  • The simplest explanation for some phenomenon is more likely to be accurate than more complicated explanations.
  • If you have two equally likely solutions to a problem, choose the simplest.
  • The explanation requiring the fewest assumptions is most likely to be correct.

Phil Gibbs: What is Occam’s Razor?

Ostrich Algorithm

The ostrich algorithm is a strategy of ignoring potential problems by assuming that they are very rare. It is named after the ostrich effect, which is defined as “to stick one’s head in the sand and pretend there is no problem”. It is used when it is more cost-effective to allow the problem to occur than to attempt its prevention.

Wikipedia: Ostrich algorithm

Pareto’s Fallacy

Misinterpretation of the Pareto Principle that leads to the fallacy: when you’re 80% done, you think you only have 20% left. The critical part that’s overlooked here is that the remaining 20% will require 80% of your time.

Signs of Triviality: 10 Software Engineering Laws Everybody Loves to Ignore

Parkinson’s Law

Work expands to fill the time available for its completion.

Example: A meeting always takes the entire time scheduled. Whether it’s 30min or 2 hours.

Wikipedia: Parkinson’s law

Parkinson’s Law of Triviality

Members of an organization give disproportionate weight to trivial issues.

Example: A fictional committee whose job was to approve the plans for a nuclear power plant spent the majority of its time on discussions about relatively minor but easy-to-grasp issues, such as what materials to use for the staff bike shed, while neglecting the proposed design of the plant itself, which is far more critical and a far more difficult and complex task.

Also known as: bicycle-shed effect, bike-shed effect, bike-shedding, The Bikeshedders’ Blind Spot

Wikipedia: Law of triviality

Paula Principle

Mirrors the Peter Principle, which says that people rise to their level of incompetence: you go on being promoted until you’re doing the job poorly enough not to be promoted any further. While the Peter Principle mostly concerns men, the Paula Principle is the mirror image for women: working women tend to stick at a level below that of their full competence or qualification.

PaulaPrinciple.com: About

Peter Principle

People in a hierarchy tend to rise to their “level of incompetence”.

Example: An employee is promoted based on their success in previous jobs until they reach a level at which they are no longer competent, as skills in one job do not necessarily translate to another.

Wikipedia: Peter principle

Poe’s Law

Without the voice inflection and body language of personal communication, online text communication is easily misinterpreted.

Without a winking smiley or other blatant display of humor, it is utterly impossible to parody a Creationist in such a way that someone won’t mistake for the genuine article. Poe’s law is an adage of Internet culture stating that, without a clear indicator of the author’s intent, every parody of extreme views can be mistaken by some readers for a sincere expression of the views being parodied.

Wikipedia: Poe’s law

Postel’s Law

Be conservative in what you do, be liberal in what you accept from others.

In other words, programs that send messages to other machines (or to other programs on the same machine) should conform completely to the specifications, but programs that receive messages should accept non-conformant input as long as the meaning is clear.
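
As a minimal sketch of the idea (the config format and both function names are hypothetical): the reader tolerates common non-conformant spellings of a boolean as long as the meaning is clear, while the writer always emits the exact form the spec requires.

import Foundation

// Liberal in what we accept: tolerate common non-conformant spellings from others,
// but reject input whose meaning is unclear.
func parseFlag(_ raw: String) -> Bool? {
    switch raw.trimmingCharacters(in: .whitespaces).lowercased() {
    case "true", "yes", "1":  return true
    case "false", "no", "0":  return false
    default:                  return nil
    }
}

// Conservative in what we send: always emit exactly the form the spec requires.
func serializeFlag(_ value: Bool) -> String {
    value ? "true" : "false"
}

print(parseFlag(" YES ") == true)    // true: meaning recovered despite sloppy input
print(parseFlag("maybe") == nil)     // true: unclear meaning is rejected, not guessed
print(serializeFlag(true))           // "true": spec-conformant output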

Also known as: Robustness principle

Wikipedia: Robustness principle

Precautionary Principle

A decision-making approach to innovations with potential for causing harm when extensive scientific knowledge on the matter is lacking. It emphasizes caution, pausing, and review before leaping into new innovations that may prove disastrous. Critics argue that it is vague, self-cancelling, unscientific, and an obstacle to progress.

Example: A government may decide to limit or restrict the widespread release of a vaccine for a pandemic. (Yes, this example is inspired by COVID). The vaccine may help, but until there’s enough evidence, it could also be worse than the pandemic if many people get vaccinated with an unsafe vaccine. The precautionary principle suggests waiting until there is sufficient proof of safety.

Wikipedia: Precautionary principle

Price’s Law

Price’s law says that 50% of the work is done by the square root of the total number of people who participate in the work.
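
Example: In a group of 100 people, about √100 = 10 of them do half the work; in a group of 10,000, only about 100 do.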

Darius Foroux: Price’s Law

Pyramid of Doom

In computer programming, the pyramid of doom is a common problem that arises when a program uses many levels of nested indentation to control access to a function. It is commonly seen when checking for null pointers or handling callbacks.

Example:

if let id = item["id"] {
    if let name = item["name"] {
        if let quantity = item["quantity"] {
            if let origin = item["origin"] {
                // ...
            }
        }
    }
}
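
One common way to flatten this particular pyramid in Swift is early exit with guard (a sketch using the same hypothetical dictionary keys as above):

// guard-based early exits keep the happy path at a single indentation level
func process(_ item: [String: String]) {
    guard let id = item["id"],
          let name = item["name"],
          let quantity = item["quantity"],
          let origin = item["origin"] else { return }
    // ... work with the unwrapped values
    print(id, name, quantity, origin)
}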

Also known as: Callback Hell

Wikipedia: Pyramid of doom

Region-Beta Paradox

The region-beta paradox is the phenomenon that people can sometimes recover more quickly from more distressing experiences than from less distressing ones.

Example: If someone is in a bad but bearable job, they will be less likely to quit and find an ideal job than if their current job were unbearably bad.

Wikipedia: Region-beta paradox

Simpson’s Paradox

For any given statistical result and conclusion there exists a data set that produces the same result but opposite conclusion.

Example: There are two different treatments for kidney stones. Which one is better?

Treatment A: 273 successful out of 350 (78%)
Treatment B: 289 successful out of 350 (83%)

The correct answer is: Treatment A! Wait, what?

Kidney stones can be classified as either large or small. Larger stones are harder to treat. The study for “Treatment A” simply had a larger share of small, easier-to-treat kidney stones in its data set:

              Treatment A                          Treatment B
Small Stones  81 successful out of 87 (93%)       234 successful out of 270 (87%)
Large Stones  192 successful out of 263 (73%)     55 successful out of 80 (69%)
Total         273 successful out of 350 (78%)     289 successful out of 350 (83%)
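
A small Swift sketch recomputing the rates from the tables above makes the reversal explicit (plain arithmetic on the numbers already shown, nothing else assumed):

// Success rate as a fraction.
func rate(_ successes: Int, _ total: Int) -> Double { Double(successes) / Double(total) }

// Within each stratum, Treatment A has the higher success rate...
print(rate(81, 87) > rate(234, 270))     // small stones: true (93% vs 87%)
print(rate(192, 263) > rate(55, 80))     // large stones: true (73% vs 69%)

// ...but pooling the strata reverses the conclusion, because Treatment A
// handled far more of the harder, large-stone cases.
print(rate(81 + 192, 87 + 263) < rate(234 + 55, 270 + 80))   // true (78% vs 83%)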

Explanations and the source of the example:

ForrestTheWoods: My Favorite Paradox

Spolsky’s Law of Leaky Abstractions

In software development, a leaky abstraction is an abstraction that leaks details that it is supposed to abstract away. All non-trivial abstractions, to some degree, are leaky.

Example: Even though network libraries like NFS and SMB let you treat files on remote machines as if they were local, sometimes the connection becomes very slow or goes down, and the file stops acting as if it were local. As a programmer, you have to write code to deal with this, even though the remote machine is supposed to be abstracted away.
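
A hedged Swift sketch of what that leak looks like in practice (the NFS mount point is hypothetical): the file API makes a remote path read like any local file, yet the caller still has to handle network-style failures.

import Foundation

// The abstraction: a path on a (hypothetical) NFS mount looks like any local file.
let remoteLookingPath = "/mnt/nfs/reports/summary.txt"

do {
    let contents = try String(contentsOfFile: remoteLookingPath, encoding: .utf8)
    print(contents)
} catch {
    // The leak: timeouts, stale handles, or a dropped connection surface here,
    // even though "remote vs. local" was supposed to be abstracted away.
    print("Could not read \(remoteLookingPath): \(error)")
}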

Joel Spolsky: The Law of Leaky Abstractions

Sturgeon’s Law

90% of everything is crap.

Example: If you dislike poetry or fine art, it’s possible you’ve only ever seen the crap of it. Go looking!

Also known as: Sturgeon’s Revelation

Wikipedia: Sturgeon’s law

Wirth’s Law

Software is getting slower more rapidly than hardware is becoming faster.

It explains why software doesn’t get faster, and often even gets slower, although hardware keeps getting better and better.

Wikipedia: Wirth’s law

Zawinski’s Law

Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can.

Coined by Jamie Zawinski to express his belief that all truly useful programs experience pressure to evolve into toolkits and application platforms (the mailer thing, he says, is just a side effect of that).

Adjustment for 2021: Every program attempts to expand until it includes a web server. Those programs which cannot so expand are replaced by ones which can.

Also known as: Law of Software Envelopment

Eric S. Raymond’s Home Page: Zawinski’s Law