Vulnerability As Engineering Practice

January 19, 2024

You know this person. Perhaps you've tried to become them. The engineer who never admits confusion, who stays three steps ahead of everyone else, who responds to complexity with unwavering confidence. The one who, according to the unwritten rules of our profession, should never show weakness.

This archetype is killing us from the inside. It's time we put it to rest.

The Vulnerability Gap

During a particularly grueling project in 2018, our team was implementing a complex distributed transaction system. Deadlines tightened. Sleep diminished. In our daily standups, a ritual of false confidence emerged—each of us reporting progress while privately drowning in uncertainty.

I maintained my invincible facade until the night before launch when my carefully constructed solution collapsed spectacularly during final testing. Hours before deployment, I finally broke: "I don't know how to fix this."

The relief was immediate. Within minutes, three colleagues were collaboratively debugging with me. By morning, we had a solution none of us could have developed alone. It wasn't my technical knowledge that had failed me—it was my inability to access our collective intelligence because I feared looking incompetent.

This experience mirrors what organizational psychologist Amy Edmondson found in her seminal work on psychological safety. Her research demonstrated that teams performing complex, interdependent work were significantly more effective when members felt safe to admit mistakes and gaps in knowledge (Edmondson, 1999). The highest-performing teams weren't those with the most technically brilliant individuals, but those where vulnerability was treated as a resource rather than a liability.

The Science of Engineering Vulnerability

The need for vulnerability isn't just anecdotal—it's backed by robust research:

Google's extensive Project Aristotle study analyzed more than 180 teams to identify what made some excel while others faltered. The surprising conclusion: the differentiating factor wasn't collective IQ, experience, or even complementary skills. It was psychological safety—the shared belief that the team was safe for interpersonal risk-taking (Duhigg, 2016).

Similarly, researchers at MIT's Human Dynamics Laboratory found that the best predictor of team success wasn't individual excellence but rather patterns of communication that enabled equal participation and social sensitivity—both facilitated by vulnerability (Pentland, 2012).

As Nicole Forsgren's work with DevOps Research and Assessment (DORA) demonstrates, high-performing engineering organizations consistently prioritize learning culture over blame culture. Her research shows that teams where failure is seen as an opportunity for growth consistently outperform those where failure is hidden (Forsgren et al., 2018).

The Empathy Engineering Imperative

I've come to understand that technical problems are rarely just technical. Behind every bug, stalled feature, or system outage is a human element that our industry's focus on technical excellence often obscures.

When a junior engineer on my team recently struggled with a memory leak, my first instinct was to jump to technical solutions. Instead, I asked, "What's making this problem particularly challenging for you?" His response revealed he was comparing his debugging process to a senior colleague's and feeling inadequate. The technical issue remained, but by addressing the human dimension first, we created space for actual problem-solving rather than a performance of competence.

This approach aligns with what Richard S. Lazarus termed "cognitive appraisal theory"—the idea that our emotional responses to challenges significantly impact our cognitive ability to solve them (Lazarus, 1991). By acknowledging the emotional dimensions of technical work, we don't detract from excellence; we enable it.

Vulnerability As Engineering Practice

In my current role leading a platform engineering team, we've operationalized vulnerability in several ways:

Blameless Postmortems: Drawing from John Allspaw's work at Etsy, we've implemented postmortem processes that focus on systemic factors rather than individual mistakes (Allspaw, 2012). When an outage occurs, the first question isn't "who caused it?" but "what enabled this to happen?"

Learning Circles: Monthly sessions where team members present not their successes but their struggles. These structured vulnerability spaces have become our most valuable knowledge-sharing mechanism.

Calibrated Confidence: When estimating work or making technical assertions, we explicitly state our confidence level. "I'm 60% confident this approach will work" communicates information that "This will work" obscures.

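To make calibration concrete, here is a minimal sketch in Python (the predictions and numbers are invented for illustration) of how a team might check itself: group past claims by the confidence that was stated, then see whether the hit rate in each group roughly matches it.

    from collections import defaultdict

    # Each record pairs the confidence we stated with whether the claim held up.
    # The data is hypothetical, for illustration only.
    predictions = [
        (0.6, True), (0.6, False), (0.6, True),
        (0.9, True), (0.9, True), (0.9, False),
        (0.3, False), (0.3, False), (0.3, True),
    ]

    # Group outcomes by the confidence level stated at the time.
    buckets = defaultdict(list)
    for confidence, came_true in predictions:
        buckets[confidence].append(came_true)

    # Calibration check: in each bucket, the hit rate should track the stated confidence.
    for confidence in sorted(buckets):
        outcomes = buckets[confidence]
        hit_rate = sum(outcomes) / len(outcomes)
        print(f"stated {confidence:.0%} -> actual {hit_rate:.0%} ({len(outcomes)} claims)")

If the claims we tag at 60% come true far more or far less than 60% of the time, that gap tells us how to adjust the way we talk about certainty.
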
Documentation of Confusion: We maintain "confusion logs" alongside our code, recording where engineers get stuck. These logs have proven more valuable for onboarding than our carefully crafted knowledge base.

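The log itself is deliberately low-tech. The sketch below shows one hypothetical shape it could take (the file name, fields, and helper are illustrative, not our actual tooling): a dated entry kept next to the code it concerns, recording where someone got stuck and what finally unblocked them.

    from datetime import date
    from pathlib import Path

    def log_confusion(component: str, stuck_on: str, unblocked_by: str,
                      log_path: str = "CONFUSION.log") -> None:
        """Append a dated confusion entry to a log kept alongside the code."""
        entry = (
            f"{date.today().isoformat()} | {component}\n"
            f"  stuck on:     {stuck_on}\n"
            f"  unblocked by: {unblocked_by}\n"
        )
        with Path(log_path).open("a") as log:
            log.write(entry)

    # The kind of entry a new engineer might add during onboarding
    # (component and file names are made up for the example).
    log_confusion(
        component="payments retry worker",
        stuck_on="why retries are safe for card transactions but not refunds",
        unblocked_by="the invariants documented at the top of retry_policy.py",
    )
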
The results have been measurable. Our team's mean time to recovery from incidents has decreased by 40%. Onboarding time for new engineers has been cut nearly in half. Most importantly, engineering satisfaction scores have improved significantly.

The Neuroscience of Engineering Safety

Underlying these practices is a neurological reality: the brain's threat-detection system doesn't distinguish between physical threats and social ones. As Matthew Lieberman's neuroimaging research shows, the pain of social rejection activates the same brain regions as physical pain (Lieberman, 2013).

When an engineer fears looking incompetent in front of peers, their brain enters a threat state. Executive function diminishes. Creativity narrows. The very cognitive resources needed to solve complex problems become less accessible.

By creating environments where vulnerability is normalized, we're not just being "nice"—we're optimizing for cognitive performance. We're ensuring that engineers can access their full intellectual capabilities rather than spending mental resources maintaining a facade of invincibility.

Practical Vulnerability

For individual engineers looking to practice vulnerability:

  1. Start with questions, not answers: Frame your contributions as investigations rather than conclusions. "I'm exploring this approach because..." rather than "We should do this."
  2. Calibrate your confidence: Explicitly communicate your certainty level when making technical assertions.
  3. Document your learning journey: Share not just what you learned but how you learned it, including the misconceptions and detours.
  4. Distinguish between identity and output: Your code, designs, and technical decisions are not you—they are products of your current understanding, which is always evolving.

For engineering leaders:

  1. Model vulnerability deliberately: Share your own uncertainties, mistakes, and learning processes explicitly.
  2. Create structured vulnerability spaces: Dedicate time specifically for discussing challenges, confusions, and failures.
  3. Reward learning narratives: Celebrate stories of growth and adaptation as much as stories of flawless execution.
  4. Measure psychological safety: Regularly assess whether team members feel safe taking interpersonal risks; a survey-scoring sketch follows this list.

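For the last point, one lightweight option is a short, anonymous Likert survey that includes a few negatively worded statements. The sketch below is an assumption on my part, modeled loosely on the style of Edmondson's published survey items rather than a validated instrument: reverse-scored statements are flipped before averaging so that a higher score always means more safety.

    # Likert responses: 1 = strongly disagree ... 7 = strongly agree.
    # Statements marked reverse=True are negatively worded, so a high raw
    # answer indicates low safety and must be flipped before averaging.
    SURVEY = [
        ("If I make a mistake on this team, it is held against me.", True),
        ("It is safe to take an interpersonal risk on this team.", False),
        ("It is difficult to ask other members of this team for help.", True),
        ("My unique skills and talents are valued and used on this team.", False),
    ]

    def safety_score(responses: list[int], scale_max: int = 7) -> float:
        """Average one respondent's answers, flipping reverse-scored items."""
        adjusted = [
            (scale_max + 1 - answer) if reverse else answer
            for answer, (_, reverse) in zip(responses, SURVEY)
        ]
        return sum(adjusted) / len(adjusted)

    # Score one anonymous respondent; track the team average over time,
    # never individual answers.
    print(safety_score([2, 6, 3, 5]))  # -> 5.5 on a 1-7 scale
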
The Future of Engineering Excellence

The most powerful lesson I've learned in two decades of engineering isn't about algorithms, languages, or architectures. It's that our industry's greatest untapped resource is our humanity—our capacity to connect authentically, to learn collaboratively, and to build on each other's partial knowledge.

The future belongs not to invincible engineers but to vulnerability-capable ones who can navigate the complex human dimensions of technical work. As Brené Brown's research consistently shows, vulnerability isn't weakness—it's our most accurate measure of courage (Brown, 2012).

In the words of Atul Gawande reflecting on medical practice (equally applicable to software): "The successful professional is not someone who's never failed; it's someone who's learned to manage their failures effectively" (Gawande, 2007).

So let the invincible engineer die. In their place, let's nurture engineers who have the courage to be seen—with all their brilliance and all their limitations. Our code will be better for it. Our systems will be more resilient. And perhaps most importantly, we will finally be able to bring our whole selves to work that matters.


References

Allspaw, J. (2012). "Blameless PostMortems and a Just Culture." Code as Craft, Etsy Engineering Blog.

Brown, B. (2012). Daring Greatly: How the Courage to Be Vulnerable Transforms the Way We Live, Love, Parent, and Lead. Gotham Books.

Duhigg, C. (2016). "What Google Learned From Its Quest to Build the Perfect Team." The New York Times Magazine.

Edmondson, A. (1999). "Psychological Safety and Learning Behavior in Work Teams." Administrative Science Quarterly, 44(2), 350-383.

Forsgren, N., Humble, J., & Kim, G. (2018). Accelerate: The Science of Lean Software and DevOps. IT Revolution Press.

Gawande, A. (2007). Better: A Surgeon's Notes on Performance. Picador.

Lazarus, R. S. (1991). Emotion and Adaptation. Oxford University Press.

Lieberman, M. D. (2013). Social: Why Our Brains Are Wired to Connect. Crown.

Pentland, A. (2012). "The New Science of Building Great Teams." Harvard Business Review, 90(4), 60-69.