John Gall, a pediatrician by profession, authored the influential book "Systemantics: How Systems Work and Especially How They Fail" in 1975. [1][2] The book, later retitled "The Systems Bible," offers a satirical and insightful critique of systems theory, arguing that complex systems are inherently prone to failure and often produce unexpected, counterintuitive results. [3]

Fundamental Principles and The Nature of Systems

  1. Gall's Law: "A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over with a working simple system." [1]
  2. The Universe as a System: "The universe is like a very large system." [5]
  3. The Intrinsic Nature of Failure: "Failure to function as expected is an intrinsic feature of systems, resulting from laws of systems-behavior that are as rigorous as any in Natural Science or Mathematics." [4]
  4. The Seductiveness of Systems: "Systems are seductive. They promise to do a hard job faster, better, and more easily than you could do it by yourself. But if you set up a system, you are likely to find your time and effort now being consumed in the care and feeding of the system itself." [6][7]
  5. Systems Create New Problems: "New systems generate new problems." [4]
  6. The Persistence and Growth of Systems: "Systems are like babies: once you get one, you have it. They don't go away. On the contrary, they display the most remarkable persistence. They not only persist; they grow." [4]
  7. Systems Encroach: "Systems tend to grow, and as they grow, they encroach." [4]
  8. The Generalized Uncertainty Principle (G.U.P.): "Complex systems exhibit unexpected behavior." [1] The reason for this is that no one can understand the real world well enough to predict the outcomes of complex processes. [1]
  9. Systems Display Antics: This playful phrase, the source of the book's title (a pun on "semantics"), captures the concept that systems naturally "act up." [3][4]
  10. Problems Are Not the Problem: "Problems are not the problem; coping is the problem." [5][7]

How Systems Behave and Misbehave

  1. Systems Oppose Their Own Functions: "Complex systems tend to oppose their own proper function." [4] Stated more generally: "Systems tend to oppose their own proper functions." [5]
  2. The System Kicks Back: A fundamental property of systems is their tendency to react in unexpected ways to interventions. [5][7]
  3. The Real World vs. The System World: "People in systems do not do what the system says they are doing." [4]
  4. The System's Stated vs. Actual Function: "The system itself does not do what it says it is doing." [4]
  5. Operational Fallacy: "The function performed by a system is not operationally identical to the function of the same name performed by a man. In general, a function performed by a larger system is not operationally identical to the function of the same name as performed by a smaller system." [4]
  6. The Limits of Human Capacity: "Large complex systems are beyond human capacity to evaluate. (Large systems can't be fully known)." [5]
  7. Infinite Failure Modes: "A complex system can fail in an infinite number of ways." [8][9]
  8. Unpredictable Failures: "The mode of failure of a complex system cannot ordinarily be predicted from its structure." [4]
  9. Failure of Fail-Safe Systems: "When a fail-safe system fails, it fails by failing to fail-safe." [4]
  10. Triumph and Malfunction: "Systems tend to malfunction conspicuously just after their greatest triumph." [5]

The Goals and Purpose of Systems

  1. Systems Have Their Own Goals: "Systems develop goals of their own the instant they come into being. Intrasystem goals come first." [4]
  2. Systems Don't Work for You: "Systems don't work for you or for me. They work for their own goals." [5]
  3. The True Function vs. The Name: "In general, the larger and more complex the system, the less the resemblance between the true function and the name it bears." [4]
  4. Exploitation is Inevitable: "If a system can be exploited, it will be. Any system can be exploited." [5]
  5. The Purpose of a System is What It Does (POSIWID): This maxim, coined by the cybernetician Stafford Beer rather than by Gall, nonetheless captures a core lesson of his work: the actual output of a system reveals its true purpose, regardless of its stated intentions.

On Information and Perception within Systems

  1. The Inaccessibility Theorem: "The information you have is not the information you want. The information you want is not the information you need. The information you need is not the information you can obtain." [5]
  2. Filtered Reality: "People in systems never deal with the real world that the rest of us have to live in but a filtered, distorted, and censored version which is all that can get past the sensory organs of the system itself." [6]
  3. The Reporting Problem: "Things are what they are reported to be." [4]
  4. Wishful Feedback: When a system's responses are inappropriate, the model of the universe it generates bears less and less resemblance to outside reality. The system hallucinates its way to terminal instability. [7]
  5. The Inevitability-of-Reality Fallacy: The belief that "Things have to be the way they are and not otherwise because that's just the way they are." [7]

Anergy and Problem Solving

  1. The Concept of Anergy: "Any state or condition of the Universe, or of any portion of it, that requires the expenditure of human effort or ingenuity to bring it into line with human desires, needs, or pleasures is defined as an Anergy-state." [4]
  2. Law of the Conservation of Anergy: "The total amount of anergy in the universe is fixed." [4]
  3. How Systems Use Anergy: "Systems operate by redistributing anergy into different forms and into accumulations of different sizes." [4]
  4. Systems Don't Solve Problems: "A System represents someone's solution to a problem. The System does not solve the problem." [6]
  5. Creating Two Problems from One: "Trying to design a System in the hope that the System will somehow solve the Problem, rather than simply solving the Problem in the first place, is to present oneself with two problems in place of one." [7]

Practical Advice and Learnings

  1. First Principle of Systems Design: "Do it without a system if possible." [1][4]
  2. Tread Softly: "In setting up a new system, tread softly. You may be disturbing another system that is actually working." [4][7]
  3. Loose Systems Last Longer: "Loose systems last longer and function better." [1]
  4. Design with Human Nature: "Systems run better when designed to run downhill." A corollary is that "Systems aligned with human motivational vectors will sometimes work. Systems opposing such vectors work poorly or not at all." [4][10]
  5. Start Small and Evolve: The most crucial learning is to begin with a simple, working system and allow it to evolve, rather than designing a complex system from the outset. [1][5]
  6. The Difficulty of Getting Out: "Almost anything is easier to get into than out of." [5]
  7. The "Systems-Person" Transformation: Long-term immersion in a system can distort your perspective until you believe the system's flawed output is what you wanted all along. At this point, you have become a "Systems-person." [7]
  8. Systems Attract Systems-People: Individuals who are drawn to the order and logic of systems are often the ones who end up managing them. [4]
  9. The End of Competition is Bizarreness: "The end result of extreme competition is bizarreness." [5]
  10. Perfection as a Symptom of Decay: "Perfection of planning is a symptom of decay." [5]
  11. Focus on Process, Not Just Outcome: Most people are overly concerned with the end result, whereas they should pay more attention to the process that produced it. [4]
  12. Embrace Error as the Norm: In the world of systems, error is our existential situation, and our successes are destined to be temporary and partial. [1]
  13. The Limits of Solutions: "The word 'Solution' is only a fancy term for the Response of System A (ourselves) to System B (the Problem)... it implies something that can be done once and for all. But System B is sure to Kick Back in response to our Response, and then we must respond once again." [7]
  14. The Observer Effect: "There can be no System without its Observer." [7]
  15. The Fundamental Problem: "The fundamental problem does not lie in any particular system but rather in systems as such. Salvation, if it is attainable at all, even partially, is to be sought in a deeper understanding of the ways of systems, not simply in a criticism of the errors of a particular system." [4]

Learn more:

  1. Systemantics: How Systems Work and Especially How They Fail by John Gall | Goodreads
  2. Systemantics: How Systems Work and Especially How They Fail by John Gall | Goodreads
  3. Systemantics - Wikipedia
  4. Systemantics: How Systems Work and Especially How They Fail by John Gall
  5. Systemantics by John Gall - Taylor Pearson
  6. Systemantics: How Systems Work & Especially How They Fail - My Notes - Point by point.
  7. Systemantics - The Systems Bible - Mathias Verraes
  8. Systems Thinking: Failure (10 Quotes) | by Adrian - Medium
  9. A complex system can fail in an infinite number of ways - Lib Quotes
  10. Book Review – Systemantics - offbeattesting