When Your Values Walk into the Room Before You Do…and Create Havoc

You're about to interview a 47-year-old male accused of sexually abusing multiple children over several years. The allegations are detailed. The victims are young. The evidence suggests a pattern of manipulation and grooming. You've reviewed the file. You know what he's accused of. And you are about to walk into that interview room.

Let me be clear about why we're starting here, with something this unpleasant: because it is in these unpleasant situations that we most often fail. This is where well-trained interviewers revert to their worst instincts. This is where your training goes out the window, and you don't even realize it's happening. It is in these situations that your brain is going to try to fail you…and you need to understand what is going on in your brain to bring things back into focus.

The problem isn't that you don't know how to conduct an interview. You've been trained and you've done hundreds of interviews. You understand rapport-building, active listening, open-ended questions, strategic use of evidence, preserving autonomy, adaptation, empathy and so forth. You know that your goal is to gather accurate information, not to punish or moralize. You know all of this.

But when you sit down across from this particular subject, something else walks into the room with you: your values and beliefs. And unless you understand what those are and how they operate, they are ready to sabotage all of your plans for this interview.

What I hope to outline here is the mechanism by which your brain will hijack your methodology — and the only defense that actually works against it: awareness and practiced control.

Values, Beliefs, and Why the Order Matters

Most people say "beliefs and values" as though they're interchangeable, or as though beliefs come first. They don't. The order matters, and I'm deliberate about putting values first because values are the surface level — the things we care about, the principles we hold. Values are things like: protecting children, seeking justice, honesty, fairness, and respect for authority.

Beliefs lie underneath values. Beliefs are older. Beliefs are the deeper architecture — the foundational assumptions about how the world works that make our values feel necessary and right. If your value is "protecting children," your beliefs might include things like "children are innocent and vulnerable", "adults who harm children are fundamentally different from other people", "sexual abuse causes irreparable damage", and "people who commit these acts are beyond redemption".

Your beliefs justify your values. And both of them — values and beliefs together — drive your behavior (Brosch & Sander, 2014). Behavior is what other people can see: your tone, your questions, your body language, whether you lean in or pull back, whether your voice stays level or gets sharp. Your behavior is the only thing visible in that interview room. The subject can't see your values. They can't see your beliefs. They can only see what you do.

And just as they cannot see your values and beliefs, you cannot see theirs. You are both operating blind. But this can change. If you are aware of your own values and beliefs, you are only 50 percent blind, and now you have the advantage. But we'll get to that.

The Filtering Mechanism

When a subject across from you starts talking — explaining what happened, denying allegations, providing his version of events — you think you're hearing what he's saying. You are not.

Every word that comes out of his mouth is being filtered through the framework of your values and beliefs before it reaches your conscious awareness. Research on motivated reasoning demonstrates that we process information in ways designed to arrive at conclusions we're already motivated to reach (Kunda, 1990; Epley & Gilovich, 2016). You are not hearing his words; you're hearing your interpretation of his words, processed through your belief system about child abusers, innocence, guilt, deception, redemption.

He says: "I care about those kids."

You hear: "He's trying to manipulate me by claiming to care about his victims."

He says: "I never intended to hurt anyone."

You hear: "He's minimizing and deflecting responsibility."

This phenomenon — biased assimilation — has been well documented. Lord, Ross, and Lepper (1979) demonstrated that when people with opposing beliefs are shown the same evidence, each side interprets that evidence as supporting their own position. More recently, Dan Kahan's work on cultural cognition has shown that people with different values will look at identical factual information and reach opposite conclusions based on their worldviews (Kahan, 2013).

He might be doing exactly what you think he's doing. Or he might not be. The point is, you have no idea which it is, because you're not actually hearing him — you're hearing your filtered interpretation. And you're completely unaware this translation is happening. It feels like objective perception. It's not.

The Accumulation

Here is the essence of the problem. Each statement that conflicts with your belief system registers as a small threat signal. One statement might not trigger much. He minimizes the impact on the victims — okay, you notice that, but you stay steady. He suggests the allegations are exaggerated — another signal, your gut tightens, but you're still functioning.

But these signals accumulate. He denies elements you believe are corroborated. He suggests the children misunderstood his intentions. He frames himself as the real victim. Each statement that contradicts your deeply held beliefs about child abuse, innocence, and responsibility stacks up.

You're not consciously counting these contradictions. Your nervous system is tracking them. Research shows that challenges to our beliefs are experienced not as abstract intellectual disagreements, but as threats to our sense of self and our way of life (Kahan, 2013). Each contradiction challenges your worldview, your sense of how things should be, your identity as someone who protects children.

Antonio Damasio's somatic marker hypothesis provides insight into this mechanism. When we experience situations with negative outcomes, our bodies create physiological associations — somatic markers — that activate when we encounter similar situations in the future (Damasio, 1994). If you've had previous encounters with subjects accused of similar crimes — encounters that were frustrating, threatening, or that violated your values — your body has encoded those experiences. When this subject's statements echo those patterns, your somatic markers activate, flooding you with the feelings you experienced before, often without your conscious awareness.

The Threshold

At some point — and you won't know exactly when — you cross a threshold. The accumulation of challenges to your belief system tips you from information-gathering mode into defense mode. Your sympathetic nervous system activates. Stephen Porges' Polyvagal Theory describes how our autonomic nervous system shifts from a state of social engagement (where learning and complex communication are possible) to sympathetic activation (the fight-or-flight response) when we perceive threat (Porges, 2022).

This doesn't feel like fight-or-flight because you're sitting in a chair in a controlled environment. It feels like clarity. It feels like appropriate anger. It feels like finally cutting through his bullshit.

Your heart rate increases. Your breathing shallows. Your muscles tense. Adrenaline flows. And your behavior changes. But let's be clear. Just because this feels like an appropriate response does not mean that it is.

Your questions shift from open-ended exploration to pointed challenges. Your tone hardens. You interrupt more. You present evidence not strategically, but confrontationally — to prove he's lying, to corner him, to make him admit what you already believe is true. You're no longer gathering information. You're defending your belief system. You're prosecuting. You're punishing.

And you don't realize any of this is happening…and you don't realize your goal of gathering accurate information has just left the room.

From your perspective, you're responding appropriately to a manipulative subject who's clearly lying. You think you're still in control, still professional, still doing your job. What you don't see is that you've abandoned your methodology. The training is gone. You're running on autopilot, and the autopilot program is: "Defend your beliefs about what child abusers are like and how they should be treated."

Why This Destroys Your Goal

Everything going on inside your body and brain is making it less likely that you'll get accurate information.

The subject feels the shift. Even if he's guilty of everything alleged, he's not stupid. He feels you move from curious to hostile. He feels the judgment. He feels the attack. And he responds the way humans respond when they feel attacked: he gets defensive, he shuts down, he tells you what he thinks will end the pain, or he digs in harder on his denials.

If he was going to provide information that contradicts his initial story — details that might actually advance your investigation — he's not going to do it now. If there's nuance to what happened, if there are partial admissions possible, if there's information about other victims or subjects, you've just made it vastly less likely he'll share it.

And if — and this is a scenario we must consider — if he's not guilty, if there's been a misunderstanding or false allegations or mistaken identity, you've just conducted an interview that confirms your pre-existing beliefs rather than discovering what actually happened. Research on confirmation bias consistently shows that once we form a hypothesis, we seek information that confirms it while dismissing or minimizing contradictory evidence (Nickerson, 1998; Hill et al., 2008).

Either way, you've failed at your actual goal: gathering accurate information.

The Solution

There's no technique that fixes this. No interview method, no matter how well-designed, can override your unconscious defensive response to threats against your core belief system. The Strategic Use of Evidence doesn't help if you're deploying it in attack mode. Rapport-building doesn't work if you're radiating judgment. Open-ended questions don't gather information if they're really just opportunities for the subject to incriminate himself in ways you've already decided are true.

The only solution is self-knowledge.

You need to know what your values and beliefs actually are — not the ones you wish you had, but the ones that actually drive your behavior. You need to know what it feels like in your body when those beliefs are challenged. Does your jaw tighten? Does your breathing change? Does your inner monologue get sharper? You need to recognize the accumulation happening before you cross the threshold. And you need to catch yourself in that moment — to notice "I'm feeling defensive, my belief system is activated, I'm about to abandon my methodology" — and make a different choice.

Without self-knowledge, you're just hoping you'll stay calm and professional. Hope isn't a strategy. When you're sitting across from someone accused of something that violates your deepest values, hope fails.

Self-knowledge is what keeps you in the room as an interviewer instead of a prosecutor. It's what lets you hear what's actually being said instead of your interpretation. It's what allows you to gather information instead of confirming what you already believe.

And in the end, that's your job. Not to defend your worldview. Not to punish people who violate your values. But to gather accurate information in service of finding out, as best we can, what actually happened.

Everything else is just you, working out your own issues at the expense of your mission.

References

Brosch, T., & Sander, D. (2014). Appraising value: The role of universal core values and emotions in decision-making. Cortex, 59, 203-205.

Damasio, A. R. (1994). Descartes' Error: Emotion, Reason, and the Human Brain. Putnam.

Epley, N., & Gilovich, T. (2016). The mechanics of motivated reasoning. Journal of Economic Perspectives, 30(3), 133-140.

Hill, C., Memon, A., & McGeorge, P. (2008). The role of confirmation bias in suspect interviews: A systematic evaluation. Legal and Criminological Psychology, 13(2), 357-371.

Kahan, D. M. (2013). Ideology, motivated reasoning, and cognitive reflection. Judgment and Decision Making, 8(4), 407-424.

Kunda, Z. (1990). The case for motivated reasoning. Psychological Bulletin, 108(3), 480-498.

Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098-2109.

Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175-220.

Porges, S. W. (2022). Polyvagal theory: A science of safety. Frontiers in Integrative Neuroscience, 16, 871227.
