This is a curated list of laws, principles, and adages that I find helpful, interesting, or funny. I’ll continuously update the list as I learn more. I did not create these laws or their descriptions; each entry links to the source I got it from.
If you have an interesting law that would fit this list, don’t hesitate to approach me. Have fun exploring!
Allen’s Curve
The exponential drop in frequency of communication between engineers as the distance between them increases.
Rather than finding that the probability of telephone communication increases with geographical distance, as face-to-face probability decays, the use of all communication media decays with distance.
Brandolini’s Law
The amount of energy needed to refute bullshit is an order of magnitude larger than that needed to produce it.
Brandolini’s law is an internet adage which emphasizes the difficulty of debunking bullshit.
Also known as: bullshit asymmetry principle
Brooks’s Law
Adding manpower to a late software project makes it later.
Adding more people to an already late project, as a desperate attempt by management to finish on time, will only make it even later.
Conway’s Law
Any organization that designs a system will produce a design whose structure is a copy of the organization’s communication structure.
In short: You will ship your org chart.
Eagleson’s Law
Any code of your own that you haven’t looked at for six or more months might as well have been written by someone else.
Gell-Mann Amnesia Effect
You open the newspaper to an article on some subject you know well. You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward - reversing cause and effect: “wet streets cause rain” stories. Newspapers are full of them. Then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate about Palestine than the baloney you just read. You turn the page, and forget what you know.
Godwin’s Law
As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches 1.
If an online discussion (regardless of topic or scope) goes on long enough, sooner or later someone will compare someone or something to Adolf Hitler or his deeds, at which point the discussion or thread often effectively ends.
Also known as: Godwin’s rule of Hitler analogies
Goodhart’s Law
When a measure becomes a target, it ceases to be a good measure.
Greenspun’s 10th Rule of Programming
Any sufficiently complicated C or Fortran program contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp.
Adjusted for Authentication: Any custom developed authentication system contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Kerberos.
Generalized: Any custom developed system contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of the industry standard you refused to adopt.
Hyrum’s Law
With a sufficient number of users of an API, it does not matter what you promise in the contract: all observable behaviors of your system will be depended on by somebody.
Example: It can be impossible to fix incorrect behavior in an API because some consumers depend on exactly that bug; fixing it would break them.
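A minimal sketch of the effect, with hypothetical names: the documented contract promises the user names in *some* order, but the current implementation happens to return them sorted, and a consumer quietly depends on that.

```python
# A sketch of Hyrum's law with a hypothetical API.
def get_users() -> list[str]:
    """Contract: returns all user names. Order is unspecified."""
    users = {"carol", "alice", "bob"}
    return sorted(users)  # incidental detail, not part of the contract

# Somewhere, a consumer depends on the observable sort order.
# Removing the sort would be contract-compliant, yet break this code:
first_user = get_users()[0]
assert first_user == "alice"
```

Once enough consumers look like this, the sort is effectively part of the API, whatever the documentation says.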
If you get close enough to any organization, you find out it is a total disaster.
Layer 8
Layer 8 is a term used to refer to the “user” or “political” layer on top of the 7-layer OSI model of computer networking.
Example: This sounds like a layer 8 problem.
Matthew Effect
The rich get richer and the poor get poorer.
The concept is applicable to matters of fame, status, popularity, friends, wealth, but may also be applied literally to cumulative advantage of economic capital.
Also known as: Matthew effect of accumulated advantage, Matthew principle
Moore’s Law
The number of transistors in a dense integrated circuit (IC) doubles about every two years.
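A minimal sketch of the doubling, with illustrative (not historical) numbers: after `n` years the count is roughly `start × 2^(n/2)`.

```python
def transistors(start_count: int, years: int) -> int:
    """Project a transistor count forward, doubling every two years."""
    return start_count * 2 ** (years // 2)

# An illustrative chip with 1,000,000 transistors, projected 10 years
# out, i.e. five doublings:
print(transistors(1_000_000, 10))  # → 32000000
```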
Mosher’s Law of Software Engineering
Don’t worry if it doesn’t work right. If everything did, you’d be out of a job.
Occam’s Razor
- If you have two theories that both explain the observed facts, then you should use the simplest until more evidence comes along.
- The simplest explanation for some phenomenon is more likely to be accurate than more complicated explanations.
- If you have two equally likely solutions to a problem, choose the simplest.
- The explanation requiring the fewest assumptions is most likely to be correct.
Pareto Principle
For many outcomes, roughly 80% of consequences come from 20% of causes.
A common misinterpretation of the principle leads to this fallacy: when you’re 80% done, you think you only have 20% left. The critical part that’s overlooked is that the remaining 20% will require 80% of your time.
Parkinson’s Law
Work expands to fill the time available for its completion.
Example: A meeting always takes the entire time scheduled, whether that’s 30 minutes or 2 hours.
Parkinson’s Law of Triviality
Members of an organization give disproportionate weight to trivial issues.
Example: A fictional committee whose job was to approve the plans for a nuclear power plant spends the majority of its time on discussions about relatively minor but easy-to-grasp issues, such as what materials to use for the staff bike shed, while neglecting the proposed design of the plant itself, which is far more critical but also far more difficult and complex.
Also known as: bicycle-shed effect, bike-shed effect, bike-shedding, The Bikeshedders’ Blind Spot
Peter Principle
People in a hierarchy tend to rise to their “level of incompetence”.
Example: An employee is promoted based on their success in previous jobs until they reach a level at which they are no longer competent, as skills in one job do not necessarily translate to another.
Poe’s Law
Without the voice inflection and body language of personal communication, online text communication is easily misinterpreted.
Without a winking smiley or other blatant display of humor, it is utterly impossible to parody a Creationist in such a way that someone won’t mistake it for the genuine article. Poe’s law is an adage of Internet culture stating that, without a clear indicator of the author’s intent, every parody of extreme views can be mistaken by some readers for a sincere expression of the views being parodied.
Postel’s Law
Be conservative in what you do, be liberal in what you accept from others.
In other words, programs that send messages to other machines (or to other programs on the same machine) should conform completely to the specifications, but programs that receive messages should accept non-conformant input as long as the meaning is clear.
Also known as: Robustness principle
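A sketch of the principle for a hypothetical `key=value` line protocol: the sender emits only the canonical form, while the receiver tolerates sloppy but unambiguous input such as extra whitespace and mixed-case keys.

```python
def emit(key: str, value: str) -> str:
    """Be conservative in what you send: exact canonical 'key=value'."""
    return f"{key.lower()}={value}"

def parse(line: str) -> tuple[str, str]:
    """Be liberal in what you accept: trim whitespace, normalize case."""
    key, _, value = line.partition("=")
    return key.strip().lower(), value.strip()

print(parse("  Timeout = 30 "))  # → ('timeout', '30')
print(emit("Timeout", "30"))     # → timeout=30
```

Note that the receiver only normalizes input whose meaning is clear; it does not guess at genuinely ambiguous messages.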
Precautionary Principle
A decision-making approach to innovations with the potential for causing harm when extensive scientific knowledge on the matter is lacking. It emphasizes caution, pausing, and review before leaping into new innovations that may prove disastrous. Critics argue that it is vague, self-cancelling, unscientific, and an obstacle to progress.
Example: A government may decide to limit or restrict the widespread release of a vaccine for a pandemic. (Yes, this example is inspired by COVID). The vaccine may help, but until there’s enough evidence, it could also be worse than the pandemic if many people get vaccinated with an unsafe vaccine. The precautionary principle suggests waiting until there is sufficient proof of safety.
Price’s Law
Price’s law says that 50% of the work is done by the square root of the total number of people who participate in the work.
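A minimal sketch of the formula: for `n` contributors, roughly `sqrt(n)` of them produce half of the output.

```python
import math

def half_output_contributors(n: int) -> int:
    """Per Price's law, the number of people who do 50% of the work."""
    return round(math.sqrt(n))

# In a (hypothetical) 100-person organization, about 10 people
# produce half of all the work:
print(half_output_contributors(100))  # → 10
```

Note how badly this scales: at 10,000 people, only about 100 of them would account for half the output.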
Simpson’s Paradox
For any given statistical result and conclusion there exists a data set that produces the same result but the opposite conclusion.
Example: There are two different treatments for kidney stones. Which one is better?
| Treatment A | Treatment B |
| --- | --- |
| 273 successful out of 350 (78%) | 289 successful out of 350 (83%) |
The correct answer is: Treatment A! Wait, what?
Kidney stones can be classified as either large or small, and larger stones are harder to treat. The study for “Treatment B” simply contained a larger share of small, easier-to-treat kidney stones, which inflated its overall success rate:
| | Treatment A | Treatment B |
| --- | --- | --- |
| Small Stones | 81 successful out of 87 (93%) | 234 successful out of 270 (87%) |
| Large Stones | 192 successful out of 263 (73%) | 55 successful out of 80 (69%) |
| ∑ | 273 successful out of 350 (78%) | 289 successful out of 350 (83%) |
Explanations and the source of the example:
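The numbers from the kidney-stone tables can be verified with a short script: Treatment A wins within every stratum, yet loses on the aggregated data.

```python
# Success counts (successes, total) from the kidney-stone example.
data = {
    "A": {"small": (81, 87), "large": (192, 263)},
    "B": {"small": (234, 270), "large": (55, 80)},
}

def rate(successes: int, total: int) -> float:
    return successes / total

# Treatment A wins within every stratum ...
for size in ("small", "large"):
    assert rate(*data["A"][size]) > rate(*data["B"][size])

# ... yet Treatment B wins on the aggregated data, because B was
# applied mostly to the easy (small) stones.
total_a = rate(81 + 192, 87 + 263)  # 273/350 ≈ 78%
total_b = rate(234 + 55, 270 + 80)  # 289/350 ≈ 83%
assert total_b > total_a
```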
Spolsky’s Law of Leaky Abstractions
In software development, a leaky abstraction is an abstraction that leaks details that it is supposed to abstract away. All non-trivial abstractions, to some degree, are leaky.
Example: Even though network file systems like NFS and SMB let you treat files on remote machines as if they were local, sometimes the connection becomes very slow or goes down, and the file stops behaving like a local one. As a programmer you have to write code to deal with this, even though the remote machine is supposed to be abstracted away.
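A minimal sketch of such a leak, with a hypothetical host and a deliberately simplified protocol: the function pretends the remote file is local, but network failures leak through the abstraction and the caller has to handle them.

```python
import socket

def read_remote_file(host: str, path: str) -> bytes:
    """Pretends a remote file is local (hypothetical, simplified)."""
    with socket.create_connection((host, 80), timeout=5) as conn:
        conn.sendall(f"GET {path}\r\n".encode())
        return conn.recv(4096)

try:
    data = read_remote_file("fileserver.invalid", "/report.txt")
except OSError:
    # The "it's just a file" abstraction leaks: the caller must now
    # think about DNS failures, timeouts, and dropped connections.
    data = b""
```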
Sturgeon’s Law
90% of everything is crap.
Also known as: Sturgeon’s Revelation
Wirth’s Law
Software is getting slower more rapidly than hardware is becoming faster.
It explains why software doesn’t get faster, and often even gets slower, although hardware is getting better and better.
Zawinski’s Law
Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can.
Coined by Jamie Zawinski to express his belief that all truly useful programs experience pressure to evolve into toolkits and application platforms (the mailer thing, he says, is just a side effect of that).
Adjustment for 2021: Every program attempts to expand until it includes a web server. Those programs which cannot so expand are replaced by ones which can.
Also known as: Law of Software Envelopment