Why can't bugs be completely prevented?

  • What are the chief causes of defect leakage into software despite stringent quality measures?

    Why do bugs still happen and how can they be mitigated?

    I know that with good coding practices and careful quality measures many bugs can be prevented, but I don't understand why they can't all be prevented. It can't be that the programmers aren't good enough because even very good programmers write code with flaws. It can't be that the testers aren't doing their jobs properly because they find many bugs even when the programmers are using the best possible combination of tools and techniques to write the code.

  • Several factors make it impossible to create non-trivial, bug-free software.

    Technical Factors

    Even the simplest software has effectively infinite pathways through it. Consider a very basic calculator application, one that allows addition, subtraction, multiplication, and division. Not only does it need to detect keyboard input and know which calculations to perform, it needs to be able to follow the order of operations rules, handle nested brackets, and generate the correct answers.

    Then, on the user interface side, it also needs to handle undo or cancel operations at any time. It has to maintain the full list of user input and not calculate anything until the user triggers calculation (generally with the = key), because otherwise input such as 1 + 2 * 3 would return 9 instead of 7.
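    The order-of-operations point can be sketched in a few lines. This is an illustrative example, not code from any real calculator: it contrasts naive left-to-right evaluation against evaluation that respects operator precedence.

```python
def left_to_right(tokens):
    """Evaluate a flat token list immediately, ignoring precedence
    (the way a calculator that computes on every keypress would)."""
    result = tokens[0]
    for op, value in zip(tokens[1::2], tokens[2::2]):
        if op == "+":
            result += value
        elif op == "-":
            result -= value
        elif op == "*":
            result *= value
        elif op == "/":
            result /= value
    return result

# Immediate evaluation: (1 + 2) first, then * 3
print(left_to_right([1, "+", 2, "*", 3]))  # 9

# Deferred evaluation with precedence, as the rules require
print(eval("1 + 2 * 3"))                   # 7
```

    Deferring calculation until the full expression is known is what lets the second version apply precedence correctly.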

    If that were not enough, the app is using the underlying architecture to perform its operations. In the calculator example, it might be using libraries from the language it was programmed in, but those libraries are calling on the machine's CPU to actually perform the calculations. The notorious Pentium FDIV bug comes to mind here.

    User Expectations

    What a developer or tester sees as a bug is not necessarily what a user sees as a bug. If you're writing, say, point-of-sale software, users will want the process of selling something and collecting payment to be quick and easy. They'll accept a clumsier product set-up process because they'll spend most of their time in the software making sales transactions.

    Imagine a word processor where the process of writing or editing a document was the most complicated part of using it - users would consider that a defect even though technically the software is doing what it's designed to do. Similarly, today's users would likely consider using anything other than an x icon to indicate a close button a bug because twenty-some years of personal computing has taught them that x means close.

    Interactivity

    Modern software does not run in isolation. Even embedded device software is typically communicating with something. That means that the software has to be able to manage its communications, handle its memory requirements with grace, and not leave resources locked after it quits.
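    The "not leave resources locked after it quits" discipline is commonly enforced with a try/finally or context-manager pattern. A minimal sketch, with illustrative names that are not from the original answer:

```python
from contextlib import contextmanager

@contextmanager
def acquired(resource_name):
    """Acquire a (simulated) resource and guarantee its release."""
    log = []
    log.append(f"lock {resource_name}")
    try:
        yield log
    finally:
        # Runs on normal exit *and* on exception, so the resource
        # is never left locked when this block is abandoned.
        log.append(f"release {resource_name}")

with acquired("printer") as log:
    log.append("use printer")

print(log)  # ['lock printer', 'use printer', 'release printer']
```

    The same shape applies to sockets, file handles, and memory: release must happen on every exit path, not just the happy one.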

    General Comments

    As a rule, any software is trying to solve at least one problem for at least one person. Often software attempts to solve multiple related problems for multiple people with similar problems.

    Every interaction between modules, every potential configuration setting, every problem, every potential user forms an "edge", a place where some part of the software interacts with something else. Each new edge multiplies the complexity of the software exponentially, because it has to interact in some way with (usually) every other edge.
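    The growth the paragraph describes is easy to put numbers on. A small back-of-the-envelope sketch (the counts are standard combinatorics, not figures from the original answer):

```python
def configurations(switches):
    """Distinct states from n independent binary settings."""
    return 2 ** switches

def pairwise_interactions(modules):
    """Potential module-to-module interaction pairs: n choose 2."""
    return modules * (modules - 1) // 2

print(configurations(10))         # 1024 configurations from 10 switches
print(configurations(30))         # 1073741824 -- already untestable
print(pairwise_interactions(50))  # 1225 pairs among just 50 modules
```

    Exhaustively testing every edge stops being feasible after only a few dozen of them, which is why coverage is always partial.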

    That level of interaction quickly becomes impossible for human minds to manage. We can write software that has millions of interaction points, but we can't truly understand it. The best we can do is keep a small subset in our minds at a given time.

    This means that even if all the technical, user-related, and interactivity-related potential problems can be prevented (which usually isn't possible), we're still going to miss things because there's simply too much in any non-trivial application for any human to handle.

    The Short-Short Version

    By its nature, software works in binary logic: something is, or it is not. Even the work towards AI eventually decomposes to binary, in the form of something being above a probability threshold or not.
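    That collapse to binary can be shown in one line of code. The 0.5 cutoff here is an illustrative assumption, not a claim about any particular system:

```python
def classify(probability, threshold=0.5):
    """Even a probabilistic score becomes a hard yes/no decision
    the moment a threshold is applied."""
    return probability >= threshold  # True or False -- nothing in between

print(classify(0.51))  # True
print(classify(0.49))  # False
```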

    Reality is complex and often fuzzy or analog, where something can be part of something else, or partway through a process, or both.

    While this remains true, there will always be bugs because it's impossible for software to completely handle the kinds of things people do all the time instinctively, and people will always see the gaps as bugs.
