The Illusion of Objectivity
What happens when truth isn’t denied outright, but quietly edited out?
That question cuts to the heart of one of the most powerful—and least recognized—forms of influence: the control of information by selective omission. In my Influence & Control Meta-Framework, I’ve emphasized how manipulation rarely begins with bald-faced lies. More often, it begins with careful pruning. Remove one fact here, downplay another there, and eventually a narrative emerges that feels complete precisely because the reader never sees what’s missing.
This post looks at a striking example involving Hamas propaganda and the complicity—sometimes reluctant, sometimes ideological—of major Western media outlets. It’s a case that exposes something uncomfortable: bias isn’t always what is said. Sometimes bias is what is never allowed to be said at all.
Related: this post is part of my Influence & Control Meta-Framework series, which explores the psychological tools, logical distortions, and social mechanisms that shape how influence and undue control operate.
And that’s the problem. The audience has no clue what was cut out. Mainstream reporting is a one-way conversation. You and I can’t raise our hands and ask, “Why didn’t you include this detail?” We can’t interrogate the reporter, challenge assumptions, or request the missing context. Instead, we receive a polished narrative that appears objective only because its rough edges have been sanded away outside our view.
This is influence through omission: a method that shapes perception not by fabricating facts but by curating reality. As we’ll explore in this post, the result isn’t simply biased reporting—it’s narrative control. And when militant groups like Hamas apply direct pressure, the problem becomes something far more serious: propaganda laundered through the language of journalism.
But before diving in, let me say this up front: journalism always involves selection. Truth is messy, complex, and impossibly large. No article can capture the full experience of everyone involved. Every reporter has to use a scalpel. But what happens when the scalpel is replaced with a hacksaw? Or when the surgeon is being watched—and threatened—by the people they’re supposed to report on?
That’s where discernment becomes not just a skill, but a civic responsibility. Because in the age of curated information, it isn’t enough to ask, “Is this true?” We must also ask, “What would I think if I knew what was left out?”
The Mechanism — Omission as Influence
When people think of propaganda, they often imagine blatant lies or state-run media pumping out slogans. But the most effective form of propaganda isn’t the lie—it’s the half-truth. It’s the carefully curated story that omits the inconvenient details, the contradictory data, or the context that would complicate the narrative being delivered.
Omission is powerful precisely because it is invisible. A lie, once detected, shatters trust. But an unreported fact? The audience never knows it existed. And because mainstream media is a one-way conversation, the reader can’t raise questions or probe for what’s missing. The journalist speaks; the public receives.
In the framework we’ve been building throughout this series, this is classic System 1 exploitation.
System 1 wants coherence:
- It wants a simple, emotionally satisfying story.
- It wants good guys and bad guys.
- And it fills in gaps automatically, without ever noticing the gaps exist.
When reporters leave out key information, the audience’s mind completes the narrative—smoothly, intuitively, and uncritically.
This is also where framing effects and bias cascades show their teeth. The first emotional impression—the first frame—sets the stage for all later interpretations. If the initial report frames a conflict as one of disproportionate suffering or aggression, every subsequent detail is filtered through that frame. A bias cascade follows:
- An emotional trigger (sympathy, outrage).
- Availability bias (the vivid images define the story).
- Confirmation bias (we seek what reinforces the frame).
- Groupthink (shared interpretation solidifies).
- Polarization (nuance disappears).
All of this can unfold without a single false sentence. Influence doesn’t require fabricating reality; it only requires curating it.
And when a militant organization like Hamas directly controls what information can be reported from within Gaza—through threats, intimidation, or censorship—then omission stops being a side effect of journalism and becomes a deliberate weapon. Journalists aren’t just selecting what’s newsworthy—they’re navigating what’s allowed. And when dangerous actors set the boundaries of what can be said, the reporting that emerges, no matter how polished or professional, is structurally shaped by coercion.
This is influence by subtraction: a narrative sculpted not by what is declared, but by what is deleted.
Case Study — Matti Friedman and the Associated Press
To understand how omission becomes influence, we need more than theory—we need to see how it plays out in real reporting. Few examples are as stark or as well-documented as journalist Matti Friedman’s account of censorship and narrative control inside the Associated Press (AP) during his years covering Gaza.
Friedman served in the AP’s Jerusalem bureau from 2006 to 2011. In 2014 and again in later talks—including a 2025 AJC forum—he went public with his experiences, describing how the global press corps in Gaza operated under direct and indirect Hamas censorship. What he revealed was not a conspiracy theory, not speculation, but firsthand testimony: the story the world receives from Gaza is not a neutral product of journalism—it is a negotiated narrative shaped by fear, pressure, and ideology.
The Detail That Disappeared
The pivotal moment came in 2008. A Palestinian AP reporter in Gaza—someone Friedman deeply respected as an honest and capable journalist—filed a story noting a crucial fact:
Hamas fighters were disguising themselves as civilians and being counted as civilian casualties.
This wasn’t an opinion. It wasn’t political positioning. It was fact.
And then the phone rang.
The reporter told Friedman:
“You have to take that detail out.”
It was clear what had happened. Someone had spoken to him. Someone who could threaten his life. The threat didn’t need to be spelled out. It was understood.
Friedman removed the detail.
He suggested that AP add an editor’s note acknowledging Hamas censorship. AP leadership refused.
From that moment on, he says, AP worked under de facto Hamas control. Not because AP wanted to support Hamas, but because the reporters on the ground were operating in an environment where telling the whole truth could get you killed. Every story that came out of Gaza after that point was, by necessity, a curated narrative—shaped by what Hamas would allow.
The Information Pipeline
Friedman explained that:
- Reporters in Gaza are all Palestinians.
- They fall into three groups:
  - Those who sympathize with Hamas.
  - Those who fear Hamas.
  - Those who belong to Hamas.
This is the entire information pipeline.
There are no Western reporters working independently inside Gaza. Every photograph, every casualty figure, every quote, every “on the ground” detail is filtered through Hamas’s apparatus.
And then the kicker—the part that most readers never realize:
Casualty numbers presented as neutral statistics come from Hamas’s own “Health Ministry.”
Once those numbers appear at the top of a story—“50 Palestinians killed, 1 Israeli”—the emotional framing is set. The context, the causes, the combatant status—it doesn’t matter. System 1 has already decided how to feel.
As Friedman put it:
“It settles the story before any other information is presented.”
This is influence by framing, accomplished through omission.
The Scale of the Problem
Friedman noted that during his time at AP:
- The organization had 40 full-time reporters covering Israel—more than China, India, or all of Sub-Saharan Africa.
- Stories unflattering to Hamas were suppressed.
- Stories portraying Israel negatively were prioritized.
The issue isn’t that one side is always right or wrong. The issue is this:
When only one side of a story is allowed to be told, journalism becomes propaganda.
And the audience, unaware of what was removed, accepts the curated narrative as objective truth.
This is not an indictment of individual reporters. Friedman, himself a journalist, repeatedly stresses that many were doing their best under impossible circumstances. It’s an indictment of a structure where militant groups can coerce, intimidate, and shape the flow of information—and where global media, out of fear or ideological sympathy, plays along.
“All warfare is based on deception.” – Sun Tzu
Why It Matters — Perception Management at Scale
The stakes here are far bigger than one censored detail or one compromised article. What we’re dealing with is perception management—the shaping of how entire populations understand a conflict, assign moral weight, and form political convictions. And this shaping often occurs before the audience even realizes a narrative battle is happening.
Once a single emotional frame is established, System 1 takes over. The vivid image, the casualty number, the dramatic headline—these hit first, fast, and hard. System 2, which should evaluate nuance and seek additional context, either never fully engages or gets recruited into rationalizing what System 1 already decided.
This is the same dynamic we explored in earlier posts on bias cascades: one emotional trigger cascades into an entire architecture of belief.
- Availability bias exaggerates the significance of vivid images.
- Framing effects shape moral interpretation.
- Confirmation bias filters incoming information.
- Group polarization hardens the narrative further.
What begins as a single omission becomes a self-reinforcing worldview.
This isn’t new. It’s the psychological engine behind propaganda campaigns, identity-based movements, and ideological possession. In my earlier post Possessed by Ideology, I explored how a worldview can become so totalizing that it begins to think through a person rather than being thought about by them. That phenomenon doesn’t emerge in a vacuum. It’s cultivated—often unintentionally, sometimes deliberately—through repeated exposure to a curated set of facts.
This is where Steven Hassan’s Influence Continuum becomes a crucial lens. Healthy environments encourage:
- Transparency
- Open inquiry
- Multiple viewpoints
- Accountability
Unhealthy environments rely on:
- Information control
- Emotional manipulation
- Us-vs-them thinking
- Suppression of dissent
Media operating under coercion—even indirectly—slides toward the unhealthy end of the continuum. When militant groups shape what can be seen or reported, the audience receives not journalism but a perception residue—a filtered reality optimized to evoke a specific emotional and political response.
This is exactly what Sun Tzu warned about:
“Engage people with what they expect… It settles them into predictable patterns of response.”
Emotionally predictable populations are influence-ready populations. You don’t need to lie to shape perception. You only need to control what the audience sees first, most, and most vividly.
And when a story is settled by the opening frame—“50 civilians killed in Gaza, 1 Israeli”—almost no amount of later nuance will overcome the emotional template that System 1 locked in place. The framing becomes the story. The omitted details become invisible. And the resulting worldview feels not manipulated but self-evident.
That is the quiet power of curated information: it doesn’t force belief—it guides it.
The Reader’s Responsibility — Thinking Beyond the Feed
So what do we do with all of this?
We can’t personally send reporters back into Gaza. We can’t reverse years of editorial pressure, political agendas, ideological sympathies, or the coercive reach of militant groups. But we can control how we, as readers, approach the information that reaches us—even when that information is incomplete.
This is where my Influence & Control Meta-Framework moves from abstraction to practice: the first defense against manipulation is discernment. Not cynicism, not distrust of everything—but disciplined, thoughtful discernment rooted in truth, not emotion.
Mainstream media is, by nature, a one-way conversation. It speaks; we listen. There is no cross-examination. No follow-up questions. No ability to ask, “What was left out?” That asymmetry doubles the responsibility of the reader.
Here are a few habits that strengthen your defenses against curated narratives and influence by omission:
1. Look for What’s Missing
Every report has a frame. Ask:
- What would change if I knew more context?
- Whose voices are missing here?
- What details might have been excluded because they complicate the story?
These simple questions shift your brain from System 1 intuition to System 2 analysis. They interrupt passive consumption and force active engagement.
2. Compare Across Sources—Especially Those You Disagree With
Bias isn’t neutralized by reading “both sides” but by triangulation:
- Compare outlets with different editorial slants.
- Follow independent journalists who challenge mainstream narratives.
- Read across national boundaries—foreign reporting often exposes what local outlets miss.
The goal isn’t to find who is “right” but to uncover the pieces missing from each narrative.
3. Trace the Chain of Custody
Before accepting an emotionally powerful statistic or image, ask:
- Where did this come from?
- Who is allowed to report from this location?
- Is the source independent or controlled?
As the Matti Friedman case shows, the chain of custody matters. If all Gaza casualty figures come from Hamas’s “Health Ministry,” they are not neutral data—they are curated inputs.
4. Slow Down Your Thinking
Information designed to provoke emotional urgency—fear, outrage, tribal loyalty—bypasses System 2.
Pause.
Ask questions.
Let the emotional impulse settle.
In earlier posts, I described how bias cascades begin with a fast emotional trigger. In this context, slowing down is not merely intellectual—it is a moral act of resistance.
5. Embrace Intellectual Humility
This is not about having the perfect take. It’s about acknowledging:
- We never have all the facts.
- No outlet tells the whole story.
- Our perspective is always partial.
But humility is not paralysis. It’s a commitment to align with truth—even when that truth is complex or uncomfortable.
The philosopher in you knows this instinctively: truth corresponds to reality, not to preference. And reality is always more complicated than the narratives curated for public consumption. If we want to avoid being possessed by an ideology, or swept into a bias cascade, or manipulated by curated information, then we must cultivate discernment—not as a hobby, but as a civic duty.
Truth Requires Effort
The Matti Friedman case isn’t just a media critique. It’s a demonstration of how easily perception can be shaped when information is curated, censored, or quietly removed. Influence doesn’t always arrive with grand speeches or ideological slogans. Sometimes it shows up as a missing sentence—a detail erased before you ever had the chance to weigh it.
And that’s what makes this type of influence so potent. You can’t question what you don’t know exists. You can’t challenge a narrative formed from half-truths if the other half never reached you. By the time your System 1 processes the emotional frame, the impression has already settled. The cascade begins.
This is why discernment matters—because truth is rarely simple, and it is never effortless. It requires us to resist the comfort of tidy narratives, to see ambiguity where others demand certainty, and to acknowledge that our own perspectives are incomplete. But the reward is integrity. And integrity is the only shield strong enough to protect us from manipulation, whether the source is a narcissist in our living room, a cult leader on a stage, or a militant group shaping international headlines.
“Engage people with what they expect… It settles them into predictable patterns of response.” – Sun Tzu
Propagandists, politicians, and ideologues all rely on that principle. They exploit what we are predisposed to see, feel, and believe. They use curated information to guide perception, emotional framing to bypass critical thought, and omission to create a reality that feels self-evident—because the reader never sees the scaffolding beneath it.
The world is full of narratives competing for your attention and allegiance. Some are earnest, some are manipulative, and some are crafted by people who would gladly let you mistake propaganda for truth. Your task isn’t to trust nothing—it’s to trust wisely. To read carefully. To slow down. To question what is absent as much as what is present.
Because truth is not a perspective, nor a feeling, nor a narrative. Truth is what corresponds to reality—and finding it requires effort.
In an age of information abundance and discernment scarcity, the discipline to seek truth is not just a philosophical virtue. It is a form of freedom.
Excerpt
Media influence isn’t always about lies—it’s often about what gets left out. Using Matti Friedman’s firsthand account of Hamas censorship, this post explores how omission, framing, and emotional bias shape public perception. Understanding these tactics is essential to resisting manipulation and safeguarding truth.
Resources
- Jay Engelmayer, “Hamas Controls the Headlines. A Former AP Reporter Explains How,” The Judean (Explainer), August 5, 2025. https://thejudean.com/index.php/opinions/39-explainer/3988-hamas-controls-the-headlines-a-former-ap-reporter-explains-how
- “Journalist Matti Friedman Exposes Media Bias Against Israel,” AJC, July 3, 2025. https://www.ajc.org/news/podcast/journalist-matti-friedman-exposes-media-bias-against-israel
- Samuel Short, “Watch This Ex-AP Reporter Directly Confirm That the AP ‘Collaborates with Hamas’ – And He Doesn’t Stop There,” The Western Journal. https://www.msn.com/en-us/politics/government/watch-this-ex-ap-reporter-directly-confirm-that-the-ap-collaborates-with-hamas-and-he-doesn-t-stop-there/ar-AA1K6BDr