Friday, January 22, 2010

Cognition and Digestion

Digestion is a way that an organism turns part of the world external to itself into a part of the world internal to itself. In the case of human beings, this process involves bringing the external bit into contact with the organism, and acting on the external bit in various ways that make it apt for absorption into the organism. There's no clear boundary between the time at which the external bit is fully external and the time at which it is fully absorbed. The boundary is fuzzy--but there are clear areas on either side of the boundary. The food on my plate is clearly external to me. The protein molecules doing their work inside my cells are clearly internal.

That's how humans digest. Amoebas digest a little differently, moving a part of themselves closer to the external object rather than bringing the external object closer to themselves.

Both approaches are ways to rearrange the environment so that external objects can be appropriated, their parts being made parts of the organism.

If beehives can be considered organisms in their own right (a notion taken seriously by a few biologists, google "superorganism") then they digest in yet a third way. Rather than reaching parts of themselves (still connected to themselves) out toward external objects, and rather than bringing external objects wholesale into themselves, they send out pieces of themselves as "agents" to digest the external object where it stands. The agents then bring the digested product back to the organism itself--the beehive.

I visualize extended cognition as working something like this third mode of digestion.

(For "organism," read "cognitive system." Right now I think the significance of Extended Cognition is that the boundary of the cognitive system is not the same as the boundary of the organism. In other words, "organism" is a good natural kind, "cognitive system" is a good natural kind, and instances of the latter are not always inside or coextensive with the latter. They sometimes merely overlap organisms. Extended cognitive systems would be examples. And they could potentiall exist overlapping no organism at all. Some form of artificial intelligence instantiated purely on electronic hardware would count as an example of that.)

The analogy isn't exact. My cognitive system doesn't send disconnected "agents" out to the environment to alter it in a way that makes it part of my cognitive structure. But neither does my cognitive system, like an amoeba, "envelop" the external. Rather, it sends out agents that remain connected to me--for example, my hands as they manipulate a pencil or a set of blocks. And the "manipulation" performed by my "agents" to make the external part of my cognitive system can be pretty subtle--having perhaps no physical effect on the external object at all, but altering the norms that govern that object. (Which in turn entails, on my account of norms, which I don't have space to write about here, that the external object now has different dispositions in virtue of its relations to its environment--so it's not as though there's no physical cash-out to the account of manipulation I'm describing here.) A structure I'm using as an external memory store may never be physically contacted by me, but my cognitive acts in relation to that structure alter the norms governing it--i.e., make it the case that the structure should accurately reflect the state of something else, perhaps--in a way that incorporates the structure into my own cognitive structure.

Like digestion, cognition can be fuzzy. Food in my mouth is sort of outside me, sort of inside me. The structure I just mentioned, similarly, is "sort of" part of my cognitive structure, "sort of" not. Or put differently, it's only weakly part of my cognitive structure--not very robustly incorporated into me at all. But it's somewhere on the line, not all the way at the "not cognitive" end of the scale. It's moved over a little bit, on account of my cognitive manipulations of the environment designed to make that structure part of my cognitive structure and, I think, part of myself. Sort of! (Fuzzily, just a little.)

Otto's notebook is much further into the cognitive end of things. An implant on my skull directly connected to my brain is even further. And so on.

That's what I think right now. Cognition (more properly, "cognitive incorporation") and digestion are both ways of making my environment part of myself. Digestion is a way of making my environment part of my metabolism, and chemical properties are probably the ones most relevant to this process. Cognitive incorporation is a way of making my environment part of my thinking (my information-guided, goal-directed, norm-governed, purpose-unifying structure), and informational/representational properties are probably the ones most relevant to this process. Things can be more or less digested, and things can be more or less cognitively incorporated. Things can be digested at a distance, and things can be cognitively incorporated at a distance.

Not an argument of course! I'm just describing and analogizing.

I wrote the above in response to something in Rupert's new book, but I've strayed a bit from the point I was responding to, and this is already long enough, so I'll say something more substantive, and more directly about Rupert, in a future post.

3 comments:

  1. Interesting. I like the use of analogy, as I consider analogy (and by extension, storytelling) a primary way of thinking about and conveying information and ideas.

    I wonder how closely the analogy might hold as we gain understanding of the mechanics of thought and thinking?

    Yours is an interesting place. I shall follow.

    Mike

  2. I'm curious about your claim that a cognitive system is a good natural kind. I realize there's precedent for this going back to the first cybernetics movement. We also find it in Pylyshyn's "Computation and Cognition": "...what I have been calling cognitive phenomena are a 'natural kind' explainable entirely in terms of the nature of the representations and the structure of programs running on the cognitive functional architecture...". I do not believe that cognition is a natural kind, and I am wondering if you know of any literature that treats the issue explicitly, or even if there are other recent key articulations of the idea (or if, after Pylyshyn and Fodor, it just became received wisdom).

  3. I'll admit I don't know the literature that addresses that question. It seems there are plenty of people who assume it, though--for example, the cognitive scientists. (Inasmuch as scientists think at all about concepts like "natural kind"...)

    For some leads you can read the book by Rupert (Cognitive Systems and the Extended Mind), as well as most of Andy Clark's recent books on the topic of extended cognition. They deal with foundational issues in the philosophy of cognitive science, so you're very likely to find leads in their bibliographies if you're looking for works explicitly asking the question whether "cognitive system" is a natural kind. Rupert and Clark themselves assume it is one, and to my recollection don't spend any time arguing for the point.

    Why do you think it's not a natural kind? (And which account of natural kinds are you relying on here?)
