Media Literacy

The information environment we live in is noisy, fast, and often deliberately misleading. This section gives you the tools to navigate it — so you can tell the difference between what's real and what's designed to manipulate you, and make decisions based on what's actually true.

In this section you'll learn:

  • How to evaluate whether a source is trustworthy
  • How algorithms decide what you see — and what you don't
  • The tactics people use to manipulate your beliefs and emotions
  • Practical methods for fact-checking claims before you share them
  • How to step back from outrage-driven media cycles

When something matters, go to the source directly. Your county's website, the actual text of a bill, public records. These are boring and sometimes hard to read but they are the actual thing. Headlines about laws are often written by people who didn't read them fully or have an interest in making them sound a certain way.

Cross-referencing matters too. If one outlet is reporting something and nobody else is, that's worth pausing on. Significant events get covered by multiple outlets independently. If the same story is everywhere but every version traces back to one original report, you're still looking at a single source.

A real example: when West Virginia passed its permitless carry law, headlines ranged from "WV Becomes Most Dangerous State" to "WV Restores Constitutional Rights." The law did one specific thing. Reading it took ten minutes. Most people who argued about it online had done neither.

Try this: The next time a law is in the news that affects you, find the actual text and read the first page. Notice how different it feels from the headline.

Social media platforms are not in the business of informing you. They are in the business of keeping you on the app as long as possible because your attention is what they sell to advertisers. Everything else — the content, the news, the outrage — is just the mechanism for keeping you scrolling.

The algorithm learns what keeps you engaged and feeds you more of it. If you engage with content that makes you angry, you get more content designed to make you angry. Not because anyone planned it maliciously but because anger drives engagement and the system optimizes for engagement above everything else.
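The feedback loop described above can be sketched in a few lines of Python. Everything here is invented for illustration; the posts, topics, and ranking rule are assumptions, not any platform's actual code:

```python
# Illustrative sketch of an engagement-optimized feed. The ranking rule
# is deliberately simple: show more of whatever the user engaged with.
from collections import Counter

def rank_feed(posts, engagement_history):
    """Rank posts by how often the user engaged with each topic before."""
    topic_counts = Counter(engagement_history)
    # More past engagement with a topic -> higher score -> shown first.
    return sorted(posts, key=lambda p: topic_counts[p["topic"]], reverse=True)

posts = [
    {"title": "Local park cleanup", "topic": "community"},
    {"title": "Politician X is destroying everything", "topic": "outrage"},
    {"title": "New library hours", "topic": "community"},
]

# The user clicked outrage content twice and community content once...
history = ["outrage", "outrage", "community"]
feed = rank_feed(posts, history)
# ...so the outrage post now leads the feed, which invites another click,
# which raises its score again. The loop feeds itself.
```

Notice that nothing in the code "wants" to make anyone angry. The bias emerges from a single neutral-sounding rule: optimize for engagement.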

A real example: Facebook's own internal research showed their algorithm was actively amplifying divisive content because it generated more engagement. They knew. The documents were leaked by a whistleblower. They kept the algorithm running anyway.

Try this: For one day, notice every time you feel a strong reaction to something online. Ask yourself — did this make me feel informed or did it make me feel activated? Those are different things.

A lot of media — not all, but a lot — isn't trying to inform you. It's trying to keep you afraid, confused, and dependent on them for the truth. Fear keeps people watching. Confusion keeps people from trusting anything except what feels familiar. That's not an accident. It's a model.

This isn't a conspiracy theory. Fox News settled a defamation lawsuit for $787 million after internal messages showed their hosts privately didn't believe what they were saying publicly. The media industry has a structural problem that crosses political lines.

Watch for: false urgency that makes everything feel like a crisis requiring your immediate reaction. Headlines that contradict the actual article. Emotional language designed to trigger rather than inform. Anonymous sources used to make unverified claims sound credible.

Try this: Find a news story that made you emotional this week. Read it again and notice every word designed to make you feel something rather than convey information.

Fact-checking has gotten more complicated. There are AI tools, browser extensions, and dedicated organizations like Snopes and PolitiFact. All of them are useful. None of them replaces your own thinking.

The most reliable process is still the oldest one: go to primary sources, find multiple independent accounts, and look for context. A statistic without context is almost meaningless. "Crime is up 40%" — up from what? Over what time period? Which crimes? Numbers can be technically accurate and deeply misleading at the same time.
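The base-rate problem behind "up 40%" is easy to see with made-up numbers. The towns and figures below are purely illustrative:

```python
# Illustrative arithmetic only: the places and incident counts are invented.
small_town_before, small_town_after = 5, 7      # incidents per year
big_city_before, big_city_after = 1000, 1050    # incidents per year

# Percentage change = (new - old) / old
small_increase = (small_town_after - small_town_before) / small_town_before
big_increase = (big_city_after - big_city_before) / big_city_before

print(f"Small town: up {small_increase:.0%}")  # up 40% -- two extra incidents
print(f"Big city:   up {big_increase:.0%}")    # up 5% -- fifty extra incidents
```

"Crime is up 40%" is technically true for the small town, where the change is two incidents. The same headline describes very different realities depending on the base it's measured against.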

Real situations are complicated. A policy can be both well-intentioned and harmful. "True" and "false" are often not enough to capture what's actually going on. Nuance is where most fact-checking fails.

Try this: Find a viral claim you've seen recently and spend 15 minutes trying to find the original source. Not an article about it — the actual original. Notice how far back the trail goes.

Not everything requires your immediate reaction. Most things don't. The outrage cycle works by making everything feel urgent so you stay engaged, share things, argue in comment sections, and come back tomorrow for more. The energy that could go into building something real gets burned up in arguments that change nothing.

You don't owe anyone an immediate response. You don't owe a stranger online your anger just because they won't listen to reason. You are allowed to disengage. That's not giving up; it's protecting your energy for things that actually matter.

The people who make the most real difference are rarely the loudest ones online. They got off the phone, went outside, and did something. Outrage without direction is just noise. Outrage with a plan is how things actually change.

Try this: Pick one issue you've spent emotional energy on online this month. Ask yourself honestly — did any of that energy produce a real world result? If not, what would have?