
Bollosoft

Engineering Leadership & Software Design


Why Broken Windows Lead to Broken Software

January 11, 2026 by Chris Bollerud

[Image: Young boy seen from behind throwing a rock at a decaying house in a neglected neighborhood during late afternoon light.]

A single unpatched vulnerability. One function that violates your architecture. A test suite nobody trusts. These small compromises don’t stay small.

The Broken Windows Theory, originally developed to explain urban decay, offers a powerful lens for understanding why some codebases remain clean while others descend into chaos. The principle is simple: visible signs of neglect invite more neglect. Leave one window broken, and soon the entire building is vandalized. The same dynamic plays out in software systems, security programs, and engineering culture.

The Origin of the Theory

In 1982, criminologists James Q. Wilson and George L. Kelling published “Broken Windows” in The Atlantic, introducing a theory that would reshape urban policing. Their core argument: visible disorder signals that social norms are unenforced. A broken window left unrepaired tells potential vandals that nobody is watching, that nobody cares.

Stanford psychologist Philip Zimbardo tested this idea experimentally. He left identical cars in two neighborhoods: the Bronx and Palo Alto. The Bronx car was vandalized within minutes. The Palo Alto car sat untouched for over a week, until Zimbardo smashed part of it with a sledgehammer. Within hours, passersby (described as “primarily respectable whites”) had joined in destroying it. The lesson wasn’t about bad neighborhoods or bad people. It was about signaling. Once disorder becomes visible, it becomes permissible.

The theory has faced legitimate criticism for its application in policing, particularly regarding disparate impact and over-enforcement. But the underlying psychological mechanism remains sound: people calibrate their behavior to their environment. We unconsciously ask, “What are the standards here?” and adjust accordingly.

How Software Rots

The Pragmatic Programmer, which recently passed its 25th anniversary, applied this theory to software development: “Don’t leave broken windows unrepaired. Fix each one as soon as it is discovered.”

This isn’t just wisdom from experience. A 2024 study by Spinellis and colleagues, published by the IEEE, provides empirical validation. Examining large C and Java codebases, they found that developers tailor the quality of their commits to the quality of the file they’re modifying. History matters. Developers working in clean files write cleaner code. Developers working in messy files add to the mess.

The study’s conclusion is striking: current quality practices directly influence future code quality. This isn’t a character flaw in developers. It’s human nature responding to environmental cues.

Consider what happens when you open a file full of inconsistent naming, magic numbers, and commented-out code. Your mental calculation shifts. The bar has been set low. Matching that bar feels acceptable, even appropriate. After all, maintaining consistency has value, right? The rationalization writes itself.

Where Broken Windows Appear

The theory applies across multiple dimensions of software work.

Code Quality: A function that violates your naming conventions. A class that mixes responsibilities. Dead code nobody removes. These small violations accumulate. Each one makes the next violation easier to justify. Eventually, you’re not maintaining a codebase; you’re managing decay.

Architecture: Clean architecture depends on clear boundaries between layers. When a developer bypasses the service layer to call the repository directly (just this once, for performance), they’ve broken a window. Others will follow. Soon, your architecture diagram describes an aspiration, not reality.
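The layering can be made concrete. Here is a minimal sketch in Python; the OrderService and OrderRepository names are invented for illustration, not taken from any real system:

```python
class OrderRepository:
    """Data access layer: raw reads and writes, no business rules."""
    def __init__(self):
        self._orders = {}

    def get(self, order_id):
        return self._orders.get(order_id)

    def save(self, order_id, data):
        self._orders[order_id] = data


class OrderService:
    """Service layer: the only sanctioned entry point.
    It enforces invariants the repository knows nothing about."""
    def __init__(self, repo):
        self._repo = repo

    def place_order(self, order_id, data):
        if not data.get("items"):
            raise ValueError("order must contain at least one item")
        self._repo.save(order_id, data)


repo = OrderRepository()
service = OrderService(repo)

# The clean path: every caller goes through the service.
service.place_order("A-1", {"items": ["widget"]})

# The broken window ("just this once, for performance"):
# repo.save("A-2", {})  # skips validation; invalid orders slip in
```

Once one caller takes the shortcut, the validation in the service layer is no longer a guarantee, only a suggestion.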

Test Suites: Flaky tests are particularly insidious broken windows. Once teams learn to ignore test failures, they stop trusting the suite entirely. The suite exists, but nobody believes it. You’ve lost the feedback loop that makes automated testing valuable.

Vulnerability Management: This is where broken windows become genuinely dangerous. According to IBM’s 2024 X-Force Threat Intelligence Index, 78% of data breaches were traced back to known but unpatched vulnerabilities. Not zero-days. Known issues with available patches that organizations simply hadn’t applied.

Security debt compounds like interest. Every unpatched system signals to both attackers and internal teams that vulnerabilities are tolerable. Mean time to remediate stretches. Backlogs grow. As of 2025, nearly 58% of global organizations still run at least one system beyond its vendor-supported lifecycle. That’s not a technical problem. It’s a cultural one.

The Compounding Effect

What makes broken windows so dangerous is the compounding nature of the damage.

Technical debt isn’t linear. Each shortcut makes the next shortcut more likely and more costly. Messy code makes refactoring harder, which means less refactoring happens, which means code gets messier. Architecture violations make boundaries unclear, which makes future violations harder to even recognize.

The Consortium for Information & Software Quality estimates that the cost to fix technical debt globally has ballooned to $1.52 trillion. Much of that cost wasn’t created by major failures. It accumulated through thousands of small compromises, each one reasonable in isolation, catastrophic in aggregate.

This extends beyond code. Teams that tolerate missed deadlines learn to expect them. Organizations that accept security exceptions develop cultures where exceptions are normal. The standard you walk past is the standard you accept.

Breaking the Cycle

The good news: the mechanism works in both directions. Just as disorder invites disorder, care invites care.

Fix issues immediately. When you see a problem, address it. If you can’t fix it properly (time constraints, dependencies, risk), board the window up. Add a TODO with a ticket number. Throw an exception. Do something that signals attention and intention. The worst response is silent acceptance.
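Boarding up a window might look like this in code. A hedged Python sketch; the function and the ticket number BX-1042 are hypothetical:

```python
def export_report(fmt: str) -> str:
    """Export a report in the requested format."""
    if fmt == "csv":
        return "id,name\n1,widget"
    # TODO(BX-1042): PDF export broke after the layout refactor.
    # Failing loudly signals attention and intention; silently
    # returning an empty string would be the broken window.
    raise NotImplementedError(
        f"export format {fmt!r} not supported yet, see BX-1042"
    )
```

The explicit exception and ticket reference tell the next reader that someone noticed, someone cares, and there is a plan.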

Make standards visible. Code reviews, linters, and static analysis tools serve as automated enforcement. They catch violations before they become patterns. More importantly, they communicate expectations to every developer on every commit.
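Enforcement like this doesn’t have to come from a heavyweight tool. As one hypothetical example (not from the article), a small check that fails the build when a TODO lacks a ticket reference keeps boarded-up windows tracked instead of letting them rot silently:

```python
import re

# Convention assumed here: tracked TODOs look like TODO(ABC-123).
TODO_OK = re.compile(r"TODO\([A-Z]+-\d+\)")

def untracked_todos(source: str):
    """Return the line numbers whose TODO carries no ticket reference."""
    bad = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        if "TODO" in line and not TODO_OK.search(line):
            bad.append(lineno)
    return bad

sample = "x = 1  # TODO fix this\ny = 2  # TODO(BX-1042): tracked\n"
print(untracked_todos(sample))  # → [1]
```

Wired into CI, a check like this communicates the standard on every commit, exactly when it matters.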

Prioritize ruthlessly. You can’t fix every broken window at once in a legacy system, and attempting to do so will likely make things worse. Focus on high-traffic areas: code that changes frequently, systems that handle sensitive data, components that new team members will touch first. These are the windows people see.

Measure and track. What gets measured gets managed. Track your vulnerability backlog, your mean time to remediate, your test suite reliability. Make the health of your systems visible to leadership. Broken windows thrive in darkness.

Model the behavior. Culture flows from leadership. If senior engineers cut corners under pressure, junior engineers learn that cutting corners is acceptable. If architects violate their own patterns, those patterns become suggestions rather than standards.

Beyond Software

The broken windows principle operates wherever standards exist to be maintained or violated. How clean you keep your house. How consistently you exercise. How promptly you respond to emails. Each small choice either reinforces or erodes the standard.

My daughter once asked why I bother making my bed every morning when nobody sees it. The answer is the broken windows principle in miniature. The made bed sets a tone. It signals to myself that standards matter here, that this is a space worth maintaining. An unmade bed invites clothes on the floor, which invites dishes on the nightstand, which invites decay.

If you’re in shape, you want to stay in shape. The standard reinforces itself. If you’ve let things slide, getting started feels harder. Not because the workout is different, but because you’re working against momentum.

The Culture Connection

Organizations are built from the inside out. Technical excellence, security discipline, and operational efficiency only create lasting value when they’re embedded in culture rather than imposed through process.

Policies and tools matter, but they’re not sufficient. A patching policy doesn’t fix windows. People fix windows. And people respond to the signals around them. If the codebase is clean, they’ll keep it clean. If security matters to leadership, it will matter to teams. If quality is celebrated, quality will emerge.

The broken windows theory is ultimately about culture: what behaviors a community accepts, what standards it enforces, what signals it sends about what matters. Software is no different.

Look around your codebase, your security program, your architecture. What windows are broken? What signals are you sending about what’s acceptable here?

The first broken window is always the hardest to fix. But it’s also the most important.

© 2026 Chris Bollerud, Bollosoft. Unauthorized reproduction prohibited.

Filed Under: Culture, Software Design
