There’s actually a core ambiguity baked into the term “emergence” that people don’t realise until they stare directly at this question.
There are really two different positions that both get labelled “emergentism”:
1) weak emergence
the “reducible but annoying” version
Here, emergent properties are in principle nothing over-and-above the base facts.
They look novel, and they may even be computationally intractable to derive from micro-physics, but they are not metaphysically novel.
This is the position of most physicalists in analytic philosophy of mind.
Emergence in this sense is a modelling convenience, not a real ontological tier.
If you had Laplace’s demon plus infinite computation, you could deduce the emergent property directly from the base facts. So: reducible.
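As a toy illustration of this weak sense, here is a minimal sketch (my own example, assuming only the standard rules) using Conway’s Game of Life: a “glider” is a macro-pattern that the cell-level rules never mention, yet it is fully deducible from them, which is exactly what makes it weakly rather than strongly emergent.

```python
# Weak emergence in miniature: Conway's Game of Life.
# The update rule talks only about single cells and their eight
# neighbours; the "glider" is a pattern that is fully deducible
# from those micro-rules and nothing over and above them.
import numpy as np

def step(grid):
    """One synchronous update of the Life micro-rules (toroidal board)."""
    # Count live neighbours by summing the 8 shifted copies of the grid.
    nbrs = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    # A cell is alive next step iff it has 3 live neighbours,
    # or 2 live neighbours and is currently alive.
    return ((nbrs == 3) | ((nbrs == 2) & (grid == 1))).astype(int)

grid = np.zeros((8, 8), dtype=int)
for y, x in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:  # a glider
    grid[y, x] = 1

for _ in range(4):     # after 4 micro-steps the same pattern reappears,
    grid = step(grid)  # shifted one cell diagonally
print(grid)
```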
2) strong emergence
the “not reducible / novel causal order” version
Here, the emergent property is not just epistemically irreducible — it’s ontologically irreducible.
It instantiates genuinely new causal powers not present in the lower level.
This basically amounts to a tiered ontology. Something like:
| Level | Has new causal powers |
|---|---|
| physics | yes |
| chemistry | yes |
| life | yes |
| mind | yes |
etc.
Most physicalists think this is metaphysically spooky.
But it’s the only version that preserves the intuition you’re gesturing to: a property that is not just shorthand for an aggregate of microfacts, but is real as such.
where the contemporary action actually is
The sharp line between reducible / irreducible is now being chewed up.
There’s a third trajectory: non-classical composition models (in process metaphysics, category-theoretic approaches, and information-theoretic models of agency).
In these approaches:
- what is “real” is not reducible substances
- what is “real” is organization or pattern
- and pattern is neither identical with the substrate nor separate from it.
So: emergent properties are real, but the question “are they reducible?” is itself a relic of substance ontology.
If what is real is organization, then what emerges is not a ghost on top of the base but the mode of organization of the base.
It’s not floating above; it is the structuring activity of the many.
TL;DR
- in weak emergence: emergent properties are reducible and not ontologically real
- in strong emergence: emergent properties are irreducible and ontologically real
- in process-relational / pattern-based ontologies: this dichotomy is outdated; what is real is the organization, not the parts nor a separate layer on top
This third approach is (to me) the only one that actually dissolves the pseudo-choice between “reducible” or “real.”
If we mean “non-classical composition” quite strictly, i.e. people whose work actually tries to show that composition / wholes / organization are real but not “extra substances”, there are several distinct lineages. I’ll roughly cluster them.
1) Category theory / compositionality as ontology
- William Lawvere (and those following from his structuralist topos ontology).
- Robert Rosen ((M,R)-systems in theoretical biology).
- Alicia Juarrero (in philosophy of action: constraints as real causes).
- Mark Bickhard (interactivist model of mind; emergent organization as relational, not supervenient).
- Giulio Tononi / Erik Hoel (IIT + “causal emergence” as a mathematically formalizable non-reduction; see the sketch below).
These people are literally trying to make structure the primary ontological unit. “Emergence” = new category-theoretic / informational structure.
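To make the Hoel-style move concrete, here is a minimal sketch of “effective information” (EI): intervene on a system with a uniform distribution over states and measure how much the cause-effect transition tells you. The 4-state toy system is my own illustration, not drawn from any specific paper; the point is that a coarse-grained macro description can score higher EI than the micro one, which is what “causal emergence” formalizes.

```python
# Effective information (EI) of a Markov chain, in the spirit of
# Hoel-style causal emergence. EI = mutual information between cause
# and effect when the cause is set to the maximum-entropy (uniform)
# distribution over states. The example system is illustrative only.
import numpy as np

def effective_information(T):
    """EI in bits for a row-stochastic transition matrix T."""
    effect = T.mean(axis=0)  # effect distribution under a uniform intervention
    kl_rows = [
        sum(p * np.log2(p / effect[j]) for j, p in enumerate(row) if p > 0)
        for row in T
    ]
    return float(np.mean(kl_rows))  # mean KL(row || effect) = mutual information

# Micro level: states 0-2 hop noisily among themselves; state 3 is absorbing.
T_micro = np.array([
    [1/3, 1/3, 1/3, 0.0],
    [1/3, 1/3, 1/3, 0.0],
    [1/3, 1/3, 1/3, 0.0],
    [0.0, 0.0, 0.0, 1.0],
])

# Macro level: group {0,1,2} into A and {3} into B. The coarse-grained
# dynamics are deterministic: A -> A and B -> B.
T_macro = np.array([
    [1.0, 0.0],
    [0.0, 1.0],
])

print(effective_information(T_micro))  # ~0.81 bits
print(effective_information(T_macro))  # 1.0 bit: the macro level beats the micro
```

On this measure the macro description is not a mere summary; it is a causally more informative level, which is the formal cash-value of calling it emergent.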
2) Process metaphysics proper
- A.N. Whitehead (obvious),
- and then the serious technical heirs like Nicholas Rescher, George Lucas, etc.
This is the most explicit classical statement of: the many become one / composition is ontologically basic.
Also: Simondon (not always grouped with Whitehead but similar in that individuation = process of metastable intensities self-organizing; not reducible, not dualistically separate).
3) biology → organization ontology
- Stuart Kauffman
- Francisco Varela / Evan Thompson (autopoiesis → enactivism)
- Terrence Deacon (constraints as physical absences with causal punch)
Varela + Thompson especially: the “whole” is not a sum but a closure of processes. The “is this real?” debate becomes malformed.
4) contemporary analytic philosophy of science (the serious non-cheap stuff)
- Carl Craver
- Bill Bechtel
- Richard Healey
…who have been sharpening the case that mechanistic explanation in science works at multiple non-reducible compositional grains that are not eliminable in principle (not just in practice).
They don’t frame it metaphysically as boldly as, say, Whitehead or Deacon, but their actual arguments undermine classical reductionism more effectively than the older emergentist handwaving.
the common move across all of them:
They stop trying to define the emergent property as a property “of the parts.”
They define it as a property of constraints, closure, interfaces, morphisms, synergies, boundary conditions.
Those terms differ across fields but the metaphysical punchline is the same:
the real is not what the bits are,
the real is how the bits hold each other.
That is composition without substance-dualism and without atomistic primitive “parts.” It’s an ontology of patterned constraint. That’s the heart of “non-classical composition.”
Nicholas Rescher (1928–2024)
German-born, worked mostly in the U.S.
A gigantic figure in 20th-century American philosophy, but relatively under-read compared to, e.g., Quine or Putnam, because his work didn’t ride the analytic mainstream’s “linguistic turn.”
He is a process philosopher explicitly drawing on Whitehead, but he does it in a very analytic / systematic way.
Some elements of Rescher that situate him in that non-classical composition / process frame:
- reality is not fundamentally static things but unfolding processes
- explanation is prospective (forward-looking) as well as retrospective
- rationality is not universal deductive closure, but optimality under constraint (this is a direct attack on classical “total reduction”)
He treats wholes as dynamic organizations, not just aggregates. They are stable only as long as the process-structure holds. Rescher is a good example of a philosopher who basically takes Whiteheadian ideas but strips away the poetic cosmology and rebuilds them as a rigorous, systematic metaphysics.
He doesn’t often get mentioned in “emergence” debates, but his work is exactly the kind that prevents the cheap reductive/emergent dualism from arising.
George Lucas, Jr. (b. 1944)
(This is the philosopher George Lucas — not the Star Wars guy.)
He is mainly in the philosophy of science and moral philosophy, but he’s also one of the few contemporary serious keepers of the Whitehead/Rescher flame within an analytic setting.
His “metaphysics of emergence” is strongly processual and anti-substantialist. Lucas emphasizes that:
- wholes are historically constituted
- their causal capacities depend on temporal directionality
- they are not “additive” compositions of parts, but organizational regimes
He is interesting because he explicitly links this to ethics and to social/political structures — i.e. it’s not just metaphysics for physics. The moral domain for Lucas is also emergent in this sense — not reducible to individual acts but also not a spooky transcendent realm.
Lucas’ angle makes it clear: if you think in terms of process-relational emergence, then many classical normative concepts (like “responsibility,” “agency,” “institution”) become compositional dynamic forms — not reducible, not ghostly.
Why these two matter for your particular interests:
Rescher and Lucas are the rare philosophers who:
- are process-relational like Whitehead
- but are not woo, and not poetic
- and do the hard systematic scaffolding work
They’re trying to show that:
- emergent organization is real
- but that doesn’t require a spooky second realm
- and it doesn’t collapse to reductive atomism
They are the bridge figures — the ones who make “strong emergence without metaphysical hocus-pocus” actually feel intellectually legitimate.
Craver, Bechtel, and Healey matter because they are doing something that a lot of people don’t recognize: they are not metaphysicians in the classical sense at all; they’re philosophers of scientific practice who extract metaphysics from what actually works in explanation.
In other words: they don’t start with ontology; they start with real explanatory practice in neuroscience, biology, physics. And then they find that the reductive metaphysics simply does not describe what real science is doing.
So: their importance is that they carve out a credible, non-woo, non-dualistic, non-substance metaphysics of emergence from inside actual science.
Carl Craver
Craver is in philosophy of neuroscience.
He’s famous for articulating the mechanistic model of explanation — but crucially it’s not the classical “machine = sum of parts” model.
His real claim:
mechanisms have levels because organisms have multi-scale organization.
Levels for him are not ontologically fundamental tiers, but they are not reducible either.
They are part of how causal organization actually works.
His hardest wedge against reductionism is this:
You cannot collapse mechanistic explanation to the lowest level without destroying the explanation.
That’s not a mystical statement — it’s a statement of explanatory necessity.
If you replace the brain as a 3D organized system with a description in terms of fundamental particles alone, the functional relationships vanish.
So: level-talk isn’t just pragmatic. It expresses real constraints.
Bill Bechtel
Bechtel is similar to Craver but broader. He studies explanation in biology and cognitive science, not just neuro.
He emphasizes modular decomposition and recomposition.
What matters:
Bechtel shows (with concrete case studies) that biological explanation always involves identifying organized subsystems with their own operational coherence. Those subsystems have explanatory autonomy.
He is the one who shows, empirically, that the “there is one correct level” idea is a myth.
Biology is a multi-level dance of partially autonomous processes.
And you can’t eliminate the upper levels without losing what makes the system the system.
Richard Healey
Healey is more on the philosophy of physics side.
He is the one who goes after the idea that physical theories describe “fundamental ontology.” He shows that the metaphysics of physics requires multiple models that are not mutually reducible.
He’s big on model pluralism:
that no single description of the world captures all truths, even in physics.
Healey’s punchline is that the world does not have a privileged description (and in fact “fundamental” talk is a rhetorical projection, not a metaphysical discovery).
Why they matter together
They converge on something radical and quiet:
- science is not converging to a single base level
- compositional organization is real because you cannot get rid of it without destroying explanation
- “reduction to the fundamental” is a metaphysical fantasy
So they are the empirical counter to reductionism — not the poetic, not the speculative.
Process thinkers like Whitehead/Rescher give the conceptual grammar.
Craver/Bechtel/Healey give the scientific warrant that the grammar is not bullshit.
They are the people who make a non-classical composition ontology actually robust.