Cognitive Biases in the World of IT
In the technology industry, dozens of decisions are made every day, from architectural choices and implementation details to estimates and broader product decisions. Many of these decisions are not based purely on knowledge or experience, but are influenced by psychological mechanisms that shape how problems are perceived, risks are evaluated, and consequences are anticipated.
These mechanisms are known as cognitive biases. They are simplified patterns of thinking that help us function efficiently, but at the same time can lead to poor decisions and serious design mistakes. In IT, their impact is especially visible. Code, architecture, and processes are unforgiving, and every decision leaves a trace – in how easy or difficult a system is to maintain, how well it scales, how much it costs to develop and run, and how much technical debt it accumulates over time.
Below are some of the most common cognitive biases that influence the daily work of developers and the way technical teams operate.
The Curse of Knowledge
The curse of knowledge occurs when someone who is very familiar with a topic struggles to see it from the perspective of someone who is not. Once something becomes obvious to us, we forget that it may be completely new or confusing to others.
In IT, this bias often appears when explaining complex topics to junior developers, clients, or non-technical stakeholders. Experienced developers may assume that concepts like APIs, commits, or algorithmic complexity are obvious, even when they are not. This can lead to misunderstandings, frustrating discussions, and mistakes caused by missing or unclear explanations.
The curse of knowledge also affects documentation, which is often written from an expert’s perspective and skips steps that are only obvious to insiders. It can even impact code reviews, where experienced developers may not notice that their mental shortcuts are hard for others to follow.
This bias is not caused by bad intentions, but by the natural way the brain simplifies thinking. Reducing its impact requires slowing down, asking clarifying questions, and checking whether others are actually following the reasoning. Overcoming the curse of knowledge leads to better communication and smoother collaboration in diverse teams.
The Dunning–Kruger Effect
The Dunning–Kruger effect is a cognitive bias where people with low levels of knowledge or skill tend to overestimate their abilities. This happens because they lack the tools needed to recognize their own gaps. At the same time, more competent individuals often underestimate themselves, because they are aware of the complexity that beginners do not yet see.
As a result, confidence is often highest among the least experienced, while the most experienced tend to be more cautious and humble.
This effect was described in 1999 by David Dunning and Justin Kruger and has been observed across many fields – from programming and science to everyday skills.
In IT, it is particularly visible because technologies are complex and beginners often feel confident until they face real-world problems. As knowledge grows, awareness of one’s own limitations grows as well, which can reduce confidence even as actual competence increases.
The Dunning–Kruger effect is not about arrogance or bad intent. It is a natural mental shortcut that confuses limited knowledge with certainty. Understanding it helps build humility, plan growth more realistically, and assess one’s skills more accurately.
The Halo Effect
The halo effect is a bias where one positive trait influences the overall perception of a person’s competence or character. If someone appears confident, intelligent, or likable, additional qualities are often attributed to them automatically, without real evidence.
This happens subconsciously and distorts judgment based on first impressions. In IT, the halo effect is commonly associated with seniors, tech leads, or highly confident individuals. Their ideas may be accepted without deeper analysis simply because they are perceived as experts. This can lead to weak solutions being implemented because no one felt comfortable questioning them.
The halo effect also appears in recruitment, where one strong attribute can overshadow gaps in other areas. To reduce its impact, teams need clear evaluation criteria, review processes, and a culture where questions are encouraged regardless of seniority. Awareness of this bias helps teams make more balanced decisions and avoid over-reliance on authority.
Confirmation Bias
Confirmation bias is the tendency to seek out information that supports existing beliefs while ignoring evidence that contradicts them.
In IT, this often shows up during debugging, when a developer looks for the cause of a problem only where they believe the bug must be, overlooking other possibilities. It also appears in technology choices, when someone favors a specific tool or framework and selectively collects arguments that support that preference, leading to poor architectural decisions.
This bias can also influence recruitment, where initial impressions guide how a candidate is evaluated. Reducing confirmation bias requires consciously looking for evidence that challenges assumptions and basing decisions on data rather than intuition alone.
The IKEA Effect
The IKEA effect is a cognitive bias where people overvalue things they created themselves, even if those things are not objectively better than alternatives.
In IT, this often means developers overestimate the quality or importance of a solution simply because they built it. This can make it difficult to accept criticism, let go of personal code, or replace it with a simpler or more efficient approach.
The IKEA effect can lead to unnecessary complexity and resistance to refactoring. It is also visible during code reviews, when an author defends a solution not because it is the best one, but because a lot of effort went into creating it.
Being aware of this bias makes it easier to accept feedback and evaluate solutions more objectively.
Authority Bias
Authority bias is the tendency to trust and accept opinions from perceived authorities, regardless of their actual quality. In practice, this means that ideas from those with senior roles, strong reputations, or higher positions are often accepted without critical analysis.
In IT, this is especially visible in architectural or technology decisions. If a senior developer, tech lead, or architect suggests something, the rest of the team may hesitate to question it. Authority bias can lead to poor decisions, limit diverse perspectives, and suppress constructive feedback.
Reducing its impact requires building a culture where ideas can be questioned regardless of role, and where decisions are based on arguments and evidence rather than hierarchy.