Understanding How the Human Brain Shapes — and Sometimes Sabotages — Operational Problem Solving
If you’ve worked in manufacturing long enough, you’ve probably faced it: the same downtime keeps reappearing, the same quality defect creeps back, or productivity never improves beyond a certain point—despite multiple meetings and corrective actions.
Ever wonder why that happens?
It’s easy to blame tools, machines, or even procedures, but research in human behaviour suggests that our brains themselves might be part of the problem.
When we solve recurring issues, our thinking patterns — shaped by cognitive shortcuts and biases — often nudge us down familiar but flawed paths.
In this post, we’ll explore:
- How the human brain actually approaches problem solving.
- How problem solving bias subtly affects engineers and managers on the shop floor.
- And how you can recognise, question, and reduce those biases to solve problems faster and smarter.
So as you read, ask yourself:
“How do I typically approach a recurring issue?” Do you jump to a known fix, or pause to re-frame the problem?
Scientific Thinking: Navigating Between Statistical Fact and Human Psychological Bias
Scientific thinking is one of the principles in Toyota’s 4P model that places the strongest emphasis on problem solving. As a lean practitioner, it is a term I began using in problem-solving seminars, and I have led several of them over the past decade of my career. One thing I have noticed is that the ability to solve a problem does not depend on logical capability alone; it takes much more than that. We also need the wisdom and the mental openness to embrace what the data tells us.
When faced with an issue, we tend by default to form an opinion based on past experience. That is why, in many of the problem-solving workshops I have led, people are wired to choose a solution that has worked before. It is not because they are weak in logical reasoning; it is simply how the human brain works.
Below, I will break down some of the causes of bias in our brains and how to navigate them during problem solving.
How the Brain Processes Information — Fast vs. Slow Thinking
One of the most prominent biases arises because we do not realise how our brain processes information before it starts making decisions. Cognitive scientists Daniel Kahneman and Amos Tversky describe two key thinking systems:
- System 1 — fast, automatic, intuitive
- System 2 — slow, deliberate, analytical
For example, in daily life, System 1 is what makes you act quickly when you smell smoke in your house: you head straight to the kitchen to check the oven or the stove. Because the smell of smoke signals danger and demands a speedy response, your experience kicks in automatically, with little conscious thought.
System 2 is what you use for decisions that take time. When you bake bread and it does not turn out well, you analyse why it failed to rise: was it the yeast, the flour, the eggs, or did you overbake it? In this mode, your brain is able to assess information slowly and deliberately.
But here’s the catch: under pressure, your brain prefers the fast lane. It reuses known solutions, even when the context has changed. This “shortcut” behaviour is what researchers call a problem solving bias — a tendency to rely on habitual pathways that feel familiar and efficient, but may be misleading.
According to cognitive neuroscience studies (see De Neys & Goel, Heuristics and Biases in the Brain, Taylor & Francis, 2011), these two systems compete for control. Under time constraints or stress (a common factory condition), the heuristic, “fast” brain wins.
Bias in Perception: Hierarchical and Counterfactual Pathways
Recent research from MIT and Columbia University found that the brain uses hierarchical reasoning (breaking a large task into sub-tasks) and counterfactual reasoning (asking “what if I had done X instead?”) when solving complex problems.
This sounds very basic, yet it is where most problem-solving mistakes lie. In the majority of the root-cause analyses I have coached, the mistake happens at the confirmation step: checking that the suspected cause actually correlates with factual data. Confirmation backed by data can feel like a luxury, because we rarely have time for a comprehensive analysis.
So instead of trying several scenarios and running a Design of Experiments to confirm the cause scientifically, we often just confirm it through the accumulated knowledge of the group, backed by stakeholder approval.
In manufacturing, this resembles how engineers use a 5-Why or Fishbone diagram—decomposing problems into smaller pieces, or simulating alternative scenarios.
But when biases dominate, we often skip these deeper pathways. Instead of examining causes at the system level, we jump to surface fixes.
Example: increasing inspection frequency instead of addressing the variation source in upstream machining.
Your brain’s natural shortcutting may explain why that same defect reappears six months later.
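The decomposition that a 5-Why performs can be made explicit. Here is a toy Python sketch (the function name and the example defect chain are illustrative inventions, not a real investigation) that walks a chain of why-answers from the symptom down to a system-level cause:

```python
# Toy 5-Why walker: each entry in `causes` answers "why?" for the step before it.
# The defect chain below is made up for illustration.

def five_whys(symptom, causes):
    """Walk a chain of why-answers and return the deepest (root) cause."""
    print(f"Problem: {symptom}")
    current = symptom
    for depth, cause in enumerate(causes, start=1):
        print(f"  Why {depth}: {current} -> because {cause}")
        current = cause
    return causes[-1] if causes else symptom

root = five_whys(
    "High defect rate in assembly",
    [
        "parts arrive misaligned at the fixture",
        "material feed varies between lots",
        "upstream machining tolerance drifted",
        "tool-change interval is based on elapsed time, not tool wear",
    ],
)
print(f"Countermeasure should target: {root}")
```

Notice that the countermeasure falls out of the last link, not the first: stopping at “parts arrive misaligned” would produce exactly the surface fix (more inspection) described above.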
2. Tips for Navigating the Biases That Creep Into Manufacturing Problem Solving
2.1 Mental Set Bias — “We Fixed It This Way Before”
In manufacturing, success can create a trap. When a previous fix worked once, the brain tags it as a “go-to” pattern.
This is known as mental set bias — the tendency to reuse an old solution even when a new one is required.
A 2025 study in Cerebral Cortex (Zhang et al.) showed that repeating a familiar strategy strengthens specific brain networks, making switching harder over time.
So, every time your team reuses an old solution template, it becomes mentally “cheaper” to repeat it again — even if the conditions have changed.
Ask yourself: When was the last time your line-stop solution was truly new?
2.2 Functional Fixedness — Tools and Processes as Blind Spots
Functional fixedness means we see tools only for their traditional purpose.
Example: You may think of the torque wrench only as a tightening tool, not as a measurement device for detecting operator variation.
Or you may think of a “daily meeting” as a reporting ritual rather than a live problem-solving lab.
This bias limits innovation. It’s not a lack of intelligence; it’s the brain’s way of saving energy.
Next time, when you’re stuck with a recurring issue, ask: “Am I constrained by how I view this tool or process?”
2.3 Additive Bias — The “More Action Items” Trap
As reported in Scientific American (2021), researchers discovered that humans tend to add rather than subtract when solving problems.
In operations, this shows up as:
- Adding new checklists.
- Adding new SOP steps.
- Adding another meeting.
Sound familiar?
While it feels productive, it often increases complexity and fatigue. True improvement may come from removing — a redundant approval, an unnecessary inspection, a double-entry form.
Ask: “What can I remove to make this process flow easier?”
2.4 Confirmation Bias — The “We Already Know the Cause” Syndrome
Once we believe we know the cause, we subconsciously seek data that supports it.
That’s confirmation bias — one of the strongest cognitive traps in technical teams.
It makes us ignore data that contradicts our first hunch.
In a line audit, you might see evidence that doesn’t fit your hypothesis — but your brain quickly rationalises it away.
That’s why structured tools like DOE (Design of Experiments) or Pareto Analysis work: they counteract confirmation bias by forcing objective data checks.
Before your next analysis, ask: “What would convince me I’m wrong?”
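A Pareto check is easy to script. The sketch below (defect categories and counts are made up for illustration) ranks causes by frequency and reports the “vital few” that account for 80% of occurrences, letting the data rather than the first hunch pick the focus:

```python
from collections import Counter

def pareto_vital_few(observations, threshold=0.80):
    """Return the smallest set of categories covering `threshold` of occurrences."""
    counts = Counter(observations).most_common()  # sorted by frequency, descending
    total = sum(n for _, n in counts)
    vital, cumulative = [], 0
    for category, n in counts:
        vital.append(category)
        cumulative += n
        if cumulative / total >= threshold:
            break
    return vital

# Made-up defect log from one shift
defects = (["misalignment"] * 42 + ["scratch"] * 23 + ["missing screw"] * 18
           + ["wrong label"] * 9 + ["dent"] * 8)
print(pareto_vital_few(defects))  # → ['misalignment', 'scratch', 'missing screw']
```

If your pet theory’s category is not in the vital few, that is the objective check doing its job.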
3. The Problem-Solving Pathway in Real Operations
Let’s map a typical case:
- Framing the problem: A team labels the issue “High defect rate in assembly.” But they may mis-frame it; perhaps the issue is actually variation in material feeding, not the assembly method itself.
- Generating initial solutions: Because they’ve seen this before, they tighten inspection, add training, or adjust torque values.
- Evaluating results: The issue improves briefly, then returns. Fatigue sets in.
- Re-framing / counterfactual thinking: A mature team eventually asks, “What if the issue starts earlier? What if we change the supplier material spec?” This step often happens too late, if at all.
- Implementing and monitoring: The cycle repeats unless the mental model changes.
At each step, problem solving bias can intervene:
- Mental set at framing,
- Confirmation bias during analysis,
- Additive bias in solutions,
- Status-quo bias in implementation.
4. How to Recognise and Reduce Problem Solving Bias
4.1 Ask Meta-Questions
Train your team to think about their own thinking.
Before approving a countermeasure, ask:
- “What assumptions are we making?”
- “What would prove this idea wrong?”
- “Have we subtracted options as well as added?”
This activates your brain’s monitoring system (the anterior cingulate cortex) which detects conflicts between automatic and deliberate thinking — helping you shift from “reflex” to “reflection.”
4.2 Visualise the Biases
Create a “Bias Board” in your problem-solving room — a simple visual with common traps:
- Mental Set
- Functional Fixedness
- Additive Bias
- Confirmation Bias
Each time you review a project, tick which bias might have influenced the team’s decision.
Turning bias into something visible helps neutralise it.
4.3 Rotate Investigators
Biases grow when the same people always lead the analysis.
Rotate facilitators or include cross-functional observers (e.g., maintenance in a quality project).
Fresh eyes break habitual reasoning.
4.4 Reward Subtraction
Recognise teams that simplify rather than complicate.
For instance, a supervisor who eliminates a redundant approval step should get the same recognition as one who implements a new digital checklist.
It rewires cultural reinforcement away from “add more” toward “think smarter.”
4.5 Build “Cognitive Cooling Time”
Fast decisions save time but increase error.
Research shows pausing even a few minutes before finalising a decision activates analytic processing networks.
Encourage teams to sleep on complex decisions — or review them at another shift meeting.
Ask: “Are we deciding fast because it’s urgent — or because it’s comfortable?”
5. What Neuroscience Says About Repeated Operational Mistakes
A 2025 neuroscience study (Neuroscience News, 2025) found that when people solve repeated problems, their brains rely on “habitual neural circuits.”
Unless a strong error signal interrupts, the brain will reuse the same mental pathway even if it no longer fits.
This explains why even with Kaizen and Lean tools, recurring issues persist: the mind’s default circuits resist change.
To trigger a switch, teams must experience cognitive conflict — something that makes the brain notice “this strategy isn’t working.”
In practice, that could mean:
- Showing clear data that contradicts expectations.
- Rotating problem owners.
- Simulating extreme “what-if” conditions (counterfactual thinking).
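One way to generate that error signal is to let a simple statistical check speak. The sketch below (all counts invented for illustration) computes a two-proportion z-statistic for defect rates before and after a countermeasure; a small |z| is exactly the kind of data that contradicts a team’s expectation that the fix worked:

```python
import math

def two_proportion_z(defects_a, n_a, defects_b, n_b):
    """z-statistic for the difference between two defect proportions."""
    p_a, p_b = defects_a / n_a, defects_b / n_b
    p_pool = (defects_a + defects_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Made-up counts: 30 defects in 1000 units before the "fix", 26 in 1000 after
z = two_proportion_z(30, 1000, 26, 1000)
print(f"z = {z:.2f}")  # |z| ≈ 0.54, well below 1.96: no real evidence of change
```

Confronting the team with a non-significant result like this creates the cognitive conflict that habitual circuits need in order to switch strategies.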
In essence: you can’t fix recurring plant issues without fixing recurring thinking loops.
6. Turning Awareness into a Habit
It’s not enough to understand bias — the real goal is to embed bias-awareness into daily problem solving.
Here’s a checklist you can apply at your next production meeting:
- Have we framed the problem correctly?
- Are we using a previous fix without testing its relevance?
- What have we added unnecessarily?
- What could we remove?
- What evidence challenges our assumption?
- Are we rotating perspectives?
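If you want to make the checklist hard to skip, it can even be encoded as a lightweight approval gate. This is a hypothetical sketch (the gate, its questions’ wording, and the function name are my own, not an established tool): a countermeasure only passes once every bias-check question has an explicit “yes”:

```python
# Hypothetical pre-approval gate for a countermeasure review.
CHECKLIST = [
    "Have we framed the problem correctly?",
    "Have we tested whether the previous fix is still relevant?",
    "Have we asked what could be removed, not just added?",
    "Have we looked for evidence that challenges our assumption?",
    "Have we rotated perspectives?",
]

def open_checks(answers):
    """Return the questions not yet answered 'yes'; empty means ready to approve."""
    return [q for q in CHECKLIST if not answers.get(q)]

remaining = open_checks({CHECKLIST[0]: True, CHECKLIST[3]: True})
print(f"{len(remaining)} checks still open")  # → 3 checks still open
```

Whether on a whiteboard or in a script, the point is the same: the questions get answered before the action list grows.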
By asking these repeatedly, you’re training your team’s System 2 thinking. Over time, it becomes part of the operational culture — the same way safety consciousness became second nature after years of reinforcement.
7. Reflection: The Human Factor Behind Continuous Improvement
Lean, Six Sigma, TPM — all these systems assume humans can think clearly and act rationally. But reality shows otherwise.
Biases are the hidden variables in the equation of continuous improvement.
That’s why Toyota’s philosophy of “hansei” (reflection) is so powerful — it’s not just process review, but self-review.
It’s about asking, “How did I think about this problem?” not only “What did I do?”
The next time a recurring issue resurfaces, resist the urge to reopen the same countermeasure file.
Instead, open your mind:
- Which bias led us here?
- How can we re-frame the problem differently?
- What would it look like if we simplified rather than added?
Because ultimately, improving the process starts with improving the processor — the human brain.
Conclusion
For operational managers, supervisors, and engineers, solving production problems is a daily mission. Yet, the biggest obstacle isn’t always the machine — it’s the mindset.
By understanding how the brain naturally shortcuts decisions, you can identify where problem solving bias is hijacking your efforts.
Train your teams to slow down, re-frame, subtract, and question assumptions.
Use reflection and diversity of thought as built-in checks against bias.
So next time that same defect, delay, or downtime reappears — don’t just fix the symptom.
Fix the thinking that keeps recreating it.
References & Recommended Reading
- De Neys, W., & Goel, V. (2011). Heuristics and Biases in the Brain: Dual Neural Pathways for Decision-Making. Taylor & Francis.
- Zhang, Z. et al. (2025). The Induction of a Specific Mental Set for Problem Solving … Cerebral Cortex. PubMed 40474500
- Our Brain Typically Overlooks This Brilliant Problem-Solving Strategy, Scientific American, 2021.
- Ahn, K. (2022). Bias on the Brain: Yale Psychologist Examines Common Thinking Problems, Yale News.
- Problem-Solving in the Brain: Neuroscience of Hierarchical Reasoning, Neuroscience News, 2025.






