
Philosophy of Mind Part 6: Intentionality

May 1, 2026 · Wasil Zafar · 15 min read

Mental states have a peculiar property: they are about something. A belief is about a fact; a desire is for an outcome; a fear is of a possibility. Brentano called this aboutness "the mark of the mental." How a chunk of brain matter can be about the Eiffel Tower or the number seven is one of the deepest puzzles in philosophy.

Table of Contents

  1. Brentano's Mark of the Mental
  2. Twin Earth & Wide Content
  3. Intrinsic vs Derived Intentionality
  4. Fodor's Mentalese
  5. Dennett's Intentional Stance
  6. Naturalizing Content

Brentano's Mark of the Mental

The Austrian philosopher Franz Brentano, in Psychology from an Empirical Standpoint (1874), revived a medieval scholastic notion. Every mental phenomenon, he claimed, is characterized by what the scholastics called intentional inexistence — by a direction toward an object. In love there is something loved; in hate, something hated; in judgment, something accepted or rejected.

This is unlike anything in the physical world. A rock does not refer to anything; the planet Mercury is not about Venus. But the thought of Mercury is about Mercury. Brentano concluded that intentionality is the irreducible mark of the mental — and on its irreducibility, he hung an argument for the autonomy of psychology from physics.

Note on terminology: "Intentionality" in this technical sense has nothing to do with intending. It means directedness or aboutness. Almost all mental states are intentional in this sense — beliefs, desires, fears, hopes, perceptions, memories.

Twin Earth & Wide Content

Hilary Putnam's Twin Earth thought experiment, introduced in his 1975 paper "The Meaning of 'Meaning'", overturned a long internalist tradition: the assumption that meaning is fixed entirely by what is inside the head.

The Twin Earth Scenario

(Putnam, 1975)

Imagine a planet identical to Earth except that what fills its lakes, rivers, and faucets is not H₂O but a chemically distinct substance, XYZ, indistinguishable from water by ordinary observation. In 1750, before modern chemistry, an English speaker on Earth (call her Mary) says "water is wet"; her molecule-for-molecule duplicate on Twin Earth says "water is wet." Their internal brain states are identical. Are their thoughts identical?

Putnam: no. Earth-Mary's word "water" refers to H₂O; Twin-Mary's refers to XYZ. Same neural state, different content. "Meanings just ain't in the head." Mental content is partly determined by the environment (wide content), not solely by internal states.

This forced a distinction between narrow content (what is shared by physical duplicates) and wide content (what depends on environmental relations). Most mental content seems to be wide.

Intrinsic vs Derived Intentionality

John Searle distinguished two kinds of aboutness. The marks on this page are meaningful — but only because we, their readers and writers, treat them so. They have derived intentionality, parasitic on minds. Likewise, computer programs, signs, photographs, and (Searle insists) AI systems have at most derived intentionality.

By contrast, our beliefs and desires have intrinsic intentionality — they are about things "all on their own," not because some external interpreter takes them to be. The Chinese Room argument (Part 4) was Searle's attempt to show that derived intentionality cannot bootstrap into the intrinsic kind.

Fodor's Mentalese

Jerry Fodor proposed in The Language of Thought (1975) that cognition occurs in a structured mental language — Mentalese — with vocabulary and syntax. Beliefs are sentences in Mentalese stored in the "belief box"; desires are sentences in the "desire box." Reasoning is computational manipulation of these sentences.

The view explains productivity (we can think indefinitely many novel thoughts) and systematicity (anyone who can think "John loves Mary" can think "Mary loves John") via the same trick that explains them in language: combinatorial structure. Mentalese became a foundational hypothesis of classical cognitive science — and a perpetual target of connectionist alternatives.
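The combinatorial trick can be made concrete in a few lines of code. The sketch below is my own toy illustration, not Fodor's formalism: the names `Thought`, `RELATIONS`, and `NAMES` are invented for the example. The point is only that a finite vocabulary plus one composition rule yields systematicity automatically — a system that can form LOVES(JOHN, MARY) can form LOVES(MARY, JOHN) for free.

```python
from dataclasses import dataclass
from itertools import product

# Toy sketch (illustration only, not Fodor's formalism): a miniature
# "Mentalese" whose sentences are built from a finite vocabulary
# by a single composition rule.

@dataclass(frozen=True)
class Thought:
    relation: str   # e.g. "LOVES"
    subject: str    # e.g. "JOHN"
    obj: str        # e.g. "MARY"

RELATIONS = {"LOVES", "FEARS"}
NAMES = {"JOHN", "MARY"}

def all_thoughts():
    """One combinatorial rule generates every well-formed sentence."""
    return {Thought(r, s, o) for r, s, o in product(RELATIONS, NAMES, NAMES)}

thoughts = all_thoughts()

# Systematicity: the capacity to compose LOVES(JOHN, MARY) brings
# with it, automatically, the capacity to compose LOVES(MARY, JOHN).
assert Thought("LOVES", "JOHN", "MARY") in thoughts
assert Thought("LOVES", "MARY", "JOHN") in thoughts
print(len(thoughts))  # 2 relations x 2 names x 2 names = 8 sentences
```

Productivity shows up the same way: add one name to the vocabulary and the space of expressible thoughts grows multiplicatively, with no new machinery.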

Dennett's Intentional Stance

Daniel Dennett took a deflationary line. There is no fact of the matter about whether a system "really" has beliefs. Instead, there are three predictive strategies — three stances one can adopt toward a system:

  • Physical stance: Predict using physical laws. Tedious but always available.
  • Design stance: Predict from how a system was designed to function (a thermostat, a clock).
  • Intentional stance: Predict by attributing beliefs, desires, and rationality. We use this for chess programs, ant colonies, and our spouses.

If the intentional stance predicts well, the system has the beliefs we attribute — there is nothing more to having beliefs than this. Dennett's view dissolves Searle's distinction: there is no "intrinsic" intentionality, only stances that succeed.
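The intentional stance is, at bottom, a prediction procedure: attribute a goal and some beliefs, assume rationality, and predict the action that best serves the goal by the system's own lights. The sketch below is my own toy example, not Dennett's; the agent, its `beliefs` dictionary, and the payoff numbers are all invented for illustration.

```python
# Toy sketch (illustration only, not Dennett's): predicting a system
# from the intentional stance. Attribute beliefs and a desire, assume
# rationality, and predict the option that best serves the desire.

def intentional_stance_predict(beliefs: dict, desire: str) -> str:
    """Predict the action a rational agent would take: the available
    option that, by the agent's own beliefs, best satisfies its desire."""
    options = beliefs["available_actions"]
    return max(options, key=lambda a: beliefs["payoff"][desire][a])

# A thermostat-like system, described intentionally: it "wants" comfort
# and "believes" the room is cold, so heating pays off more than not.
beliefs = {
    "available_actions": ["heat_on", "heat_off"],
    "payoff": {"comfort": {"heat_on": 1.0, "heat_off": 0.2}},
}
print(intentional_stance_predict(beliefs, "comfort"))  # -> heat_on
```

Note what is absent: nothing about wiring, firmware, or physics. That is Dennett's point — the stance earns its keep purely by predictive success, whatever the underlying mechanism.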

Naturalizing Content

If we want intentionality to fit a physicalist worldview, we need to explain how content arises from non-mental ingredients. Two leading projects:

  • Information-theoretic / indicator semantics (Fred Dretske): A mental state's content is what it reliably indicates, as smoke indicates fire. The view needs refinement to handle misrepresentation: if a state's content is simply whatever reliably causes it, then a token triggered by the wrong thing seems to broaden the content rather than count as an error (the disjunction problem).
  • Teleosemantics (Ruth Millikan): Content is fixed by what a representation was selected to do by evolution or learning. A frog's snap-at-fly response is "about flies" because that's what it was selected for — even if it sometimes snaps at BBs. Misrepresentation becomes possible because selection establishes a norm.
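The contrast between the two projects can be put in a few lines. The sketch below is my own toy model, not Dretske's or Millikan's formal apparatus: the frog detector, the stimulus set, and the `SELECTED_FOR` constant are invented for illustration. Pure indication assigns the detector the disjunctive content "fly-or-BB," leaving no room for error; the selectional norm is what makes a BB-triggered snap a misrepresentation.

```python
# Toy sketch (illustration only): a frog-style detector that fires on
# anything small, dark, and moving -- flies and stray BB pellets alike.

def detector_fires(stimulus: str) -> bool:
    return stimulus in {"fly", "BB"}   # both trigger the snap

# Pure indication: content = everything the firing reliably co-occurs
# with. The content comes out as "fly-or-BB", so no firing is ever wrong.
indicator_content = {"fly", "BB"}

# Teleosemantics: content = what the response was selected to detect.
SELECTED_FOR = "fly"   # catching flies is what fed ancestral frogs

def misrepresents(stimulus: str) -> bool:
    """A firing misrepresents when triggered by something other than
    what the mechanism was selected for."""
    return detector_fires(stimulus) and stimulus != SELECTED_FOR

assert not misrepresents("fly")   # correct representation
assert misrepresents("BB")        # error: the norm selection makes possible
```

The asymmetry is the whole point: only once a norm is in place can a representation fail, and failure is something any theory of belief must allow.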

Neither project is universally accepted, but both move the question of intentionality from mystery to research program — which is, perhaps, the most that any naturalistic philosophy can hope for.

Next in the Series

In Part 7: Personal Identity, we ask what makes you the same person you were ten years ago — Locke's memory criterion, Parfit's teletransporter, and the Buddhist parallel of no-self.