An earlier version of this paper was presented on Oct. 21, 1984 as The Thoughts of Armchairs and Children at Middle Atlantic States Philosophy of Education Society Fall Conference, Rutgers University

"Thinking" Like Computers Do
2004 Edward G. Rozycki, Ed. D.


361. The armchair is thinking to itself... Where? In one of its parts? Or outside of its body in the air around it? Perhaps not anywhere? But what is then the difference between the internal speech of this armchair and that of another standing by it?
How is it then with a human being? Where does she talk to herself? Why is it that this question seems senseless -- no specification of place is necessary except to say that the human is talking to herself? On the other hand the question as to where the armchair is talking to itself seems to demand such an answer.
The reason is this: we want to know how the armchair is supposed to be like a human being, whether, for example, its head is the upper back of the chair, etc.
What process is occurring when one talks to oneself inwardly? How should I explain it?
Only in such a way as you could teach the meaning of the expression "talking to yourself". We certainly learn this meaning as children. However, let not anyone say that whoever teaches it to us must tell us "what is occurring in the process." --- Ludwig Wittgenstein (1)

Some Thoughts on Thinking

Should we teach children to think like computers? Professor Seymour Papert says yes; Professor Jeffrey Kane (2), no. I suggest we not take sides too hastily. Certain questions need answers before the dispute makes sense. Indeed, the very search for answers might dissuade one from partisanship.

The first question is: "What is thinking?" We might expand this to ask the following:

a. What does one claim when one says, "I know how I think"? Is this a claim to know how to describe a process?

b. What does one claim when one says, "I know how he thinks"?

c. On what basis does one make interpersonal comparisons? Interspecies comparisons?

Somewhere (I have lost the citation) Wittgenstein talks about two bushes trimmed in the shape of elephants. He comments that even though we can recognize that both have legs and trunks and ears, we would not imagine that the structure of twigs and leaves of one bush, cut to the shape, say, of an elephant's ear, would be identical to the structure of twigs and leaves in the other bush cut to the same shape.

You think of chocolate and type "chocolate". I think of chocolate and type "chocolate". My computer types "chocolate". It does not follow that a process has occurred in you which is isomorphic to a process which has occurred in me. A fortiori it does not follow that a process has occurred in my computer which is isomorphic, even similar, to what has occurred in either of us. (See Isomorphism: program, structure and process.)

In computerese we would say, "From similar --even identical-- outputs one cannot infer similar inputs or similar processing."
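The point can be put in code. What follows is a hypothetical illustration of my own (the function names and implementations are invented for the example, not drawn from any actual program): three routines whose outputs are identical, though what "occurs" inside each is entirely different.

```python
# Three different "processes," one identical output.

def typed_directly():
    # The "processing" is nothing but a bare literal.
    return "chocolate"

def assembled_from_parts():
    # The same output, built up by joining fragments.
    return "".join(["ch", "oco", "late"])

def decoded_from_numbers():
    # The same output again, recovered from ASCII character codes.
    return bytes([99, 104, 111, 99, 111, 108, 97, 116, 101]).decode("ascii")

# Identical outputs -- yet nothing follows about identical,
# or even similar, internal processes.
assert typed_directly() == assembled_from_parts() == decoded_from_numbers() == "chocolate"
```

From the word "chocolate" on the screen, no inspection of the output alone could tell the three apart.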

Can computers think? Germans can, because they denken. And when they wonder, they fragen sich. And when they consider, they überlegen. And when they dream, they träumen. Sometimes they don't think; things merely occur to them, sie fallen ihnen ein.

Or their thoughts, like The Lorelei, persist in memory, Es kommt ihnen nicht aus dem Sinn. Germans can think because their language provides them with a usefully contrastive set of terms that can be employed in remarkably parallel fashion to our terms like think, wonder, consider, dream, occur to ... .

Armchairs don't think; not even German armchairs, Sessel, denken. This is not because we have not discovered internal electronic processes occurring in them, but because useful, contrastive descriptions of their behavior do not parallel the think, consider, dream, ... contrasts that we make for humans.

We tend to reify the very idiosyncratic communication behavior we observe among people into languages -- English, German, French, etc. -- inventing grammars which hardly begin to describe even a part of these communication phenomena. "Thinking" is a reification at an even higher level of abstraction: the postulation of process identities based on certain assumptions of structural similarity and similarity of outcome, usually linguistic outcomes. Thinking becomes something to be compared across persons and groups and species. Computers come to be said to think.

What is at stake here? What does it matter if we say computers can or cannot think? This is not a matter of evidence, although it is often argued as though it were. Rather, we are making theoretical decisions about the extension of our terminology.

We can distinguish three concerns:

a. We may be creating minds and torturing or killing them inadvertently, say, by turning computers on or off. In general, this is a concern for the rights of possibly intelligent beings. Someone so concerned might well opt to include automata among thinking beings, for caution's sake.

b. We may undermine certain considerations which tend to protect humans from exploitation. If machines can think then being a thinker is no criterion for distinguishing the exploitable from the non-exploitable. Such a concern presses one to opt for computers not being able to think.

c. If machines think, then illuminating their thinking processes, which we can often control, illuminates our thinking processes as well.

None of the arguments embedded in these concerns is overly coherent. Professor Kane, via Weizenbaum, articulates a variant on the "fear of exploitation" concern. I call this the "demise of culture" concern. A question relevant to this concern is: "What desirable social relations are not enhanced, or may even be harmed, by interaction with computers?"

I wonder, however, if the desire to maintain "culture" indicates little more than the desire to exact certain traditional kinds of deference as the prerequisite for access to knowledge. Is the threat that computers pose really a threat to liberate potential learners from subjugation to certain traditions of authority?

Let me pose this question more positively if in a somewhat more complicated manner: to the extent that it is possible to distinguish knowledge from the social matrices in which it is pursued, does the computer offer better access to such knowledge?

Certain knowledge cannot be "packaged" because it results from personal experience of social interaction. But so far as "packaged knowledge" is concerned, I would opt for the computer: it delivers more economically and with less pain than other modes of delivery.(3) Much classroom instruction might be reasonably replaced by computer lessons. This perhaps touches another source of anxiety about computers.

Threats to the Culture

The essence of a culture is found in its traditions of deference. It is these traditions of deference which restrict access to knowledge. For developing certain kinds of skills and experiencing the power of their employment, the computer can provide access to knowledge heretofore purchased only by an emotionally costly submission to norms irrelevant to the use of that knowledge. The computer can minimize the organizational impediments to learning, e.g. minimal teacher attention, lack of practice time, lack of variety, and also minimize social impediments, e.g. lack of manners, lack of "interest", non-conformity with the agendas of "hidden curricula".

The computer is a "threat to culture" in that it puts the lie to the easy equation of authority of rank with authority of knowledge. With a computer, the socially humble can best their "superiors" in a competition where knowledge counts. The security agencies of the government ought to offer a ten-thousand dollar prize to every "hacker" who manages to break into their systems, instead of trying to hide their own lesser competence by prosecuting them. Software "pirates" perform a social good in an ancient and honorable tradition: they break restrictions to knowledge and power that only serve narrow economic interests. No teacher who has ever photocopied an article, taped a program, or for that matter, memorized a poem can take seriously the pious fulminations of software companies against copying software. The computer can help to undermine important parts of our cultural heritage: those traditions of dominance based on wealth, luck, and social position.

Professor Kane's Arguments

My lengthy prefatory remarks over, what I have to say may seem less paradoxical or perverse. I agree with Professor Kane that Education, when it is responsive to the breadth and richness of human experience, concerns itself with expanding the mind rather than filling it. Children do not merely become informed but are transformed; they seek a sense of coherence and order, creating energizing visions of themselves and their purpose.(4)

(Professor Kane speaks of their purpose, singular, and I don't know whether to see this as a commitment to a certain kind of monism, or to inquire whether purposes would be an allowable substitution.)

Though I agree with Kane's general conception of education, I would, were it in my power, require every teacher, administrator, parent and especially philosopher to learn to program a computer in two very different languages. It would have a prophylactic effect.

Much of what purports to be discussion of issues of mind, education and computer technology is either the hubris of technocratic sophistry or the ramblings of romantic ignorance. We are experiencing a resurrection of the Burris-Frazier "debate" from Walden Two. Turing(5), for example, is willing to concede thinking to any machine whose output he is unable to distinguish from that of a possible human interlocutor. A similar enthusiasm exists in the area of Artificial Intelligence research. Mechanisms can do interesting and surprising things. Perhaps it appeals to our desire for apotheosis to refer to the works of our own hands as intelligent.

I believe Professor Kane's concerns are ill-served by his arguments. He invokes a distinction between "machines" and "vital, living beings", and appeals to "our primary intuitive insight into our 'more than mechanical nature'". Such intuitions often turn out to be cultural blindnesses; in any case, such arguments would only persuade those already convinced. Kane worries that thinking has become conceptualized as "mere" information processing; ideas, as "mere" data; judgment, as "mere" calculative power. There is nothing "mere" about these concepts. They are unproblematic only to simple minds. The so-called information sciences have appropriated certain terms and given them substantially different meanings. Information, for example, need not have anything to do with someone's being informed as opposed to importuned, told, shown, ... etc. Rather, a fluctuation in the strength of an electric current beyond certain limits might count as "a bit of information."

Data is an obscure concept. It generally signals a theoretical boundary beyond which analysis will not be pursued. What is generally confused with judgment is some notion of decision. Now, decisions, in mechanisms, are not the results of decidings but rather the resultants of interacting forces or processes. The calculative power of a mechanism is determined by the number of such resultants obtained under standard conditions. What has happened is not that human concepts of thinking, ideas and judgment have been reduced; rather, they have been replaced, and few have noticed the gradual substitution.(6) Professor Kane invokes what he cannot: the "ineffable context of human experience." This can no more serve his argument than an unthinkable thought or an unspeakable word.

Baldly asserted are the claims that

a. the information-processing concept of thinking cannot grasp the subtleties and nuances of the human being, and

b. Papert's program will not do more than reify the mechanistic thinking undergirding much of modern education.

Again, I fear that only true believers will be convinced.

Seymour Papert's program

Professor Kane fears that Papert's attempts to have children master the art of deliberately "thinking" like a computer, in a supposedly step-by-step, literal, mechanical fashion, will foreclose on their thinking in more meditative, less exploitative ways. I don't believe this need be a concern. I have taught children and adults to reconceptualize activities in terms of sequences of contingencies. Some of these sequences lend themselves more readily than others to rough translation into a programming language and a computer can be employed to achieve desired outcomes.

What I labor to teach along with programming logic is this:

a. the relativity of the "simples" which constitute the programming language -- we create our basic set of terms to suit our own convenience and the requirements for communicability;

b. the indeterminate nature of the heuristic or algorithm -- we should not confuse our habits of interaction with "what is really there"; we choose the approach, we decide what is an acceptable outcome.

To Papert I would reply:

a. Computers don't think;

b. Terms are only relatively literal;

c. Step-by-step is not uniform;

d. Mechanical in his context means no more than single-minded. That is, no adjustments to the process are permitted while the process is being undergone. The idea is to develop skill in process-building. Socially, this is very artificial, because we do not usually allow the loss of valued things just to maintain the purity of process.

Logo is great for kids; for adults, too. Moving that turtle around gives one a real sense of the power of one's own mind. Even jaded adults like myself are astonished by the beauty of patterns based on simple recursions and iterations. There are dangers, and I will mention some later. But they are avoidable. Rather than close the mind, an intelligent approach to the use of computers can raise all the ancient problems of philosophy and sensitize the user to them.
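The astonishing patterns come from rules of just this sort. What follows is a minimal sketch of my own, not actual Logo: the names are invented, and the turtle's "forward and turn" is reduced to bare coordinate arithmetic so that the single recursive rule generating a spiral is visible on its own.

```python
import math

def spiral(length, angle, min_length, x=0.0, y=0.0, heading=0.0):
    """Recursively produce the vertices of a Logo-style spiral:
    move forward, turn, shrink the step, and repeat until the
    step becomes too small to matter."""
    if length < min_length:
        return [(x, y)]
    # "Forward length": advance along the current heading.
    nx = x + length * math.cos(math.radians(heading))
    ny = y + length * math.sin(math.radians(heading))
    # Turn by angle, shrink the step, and recurse -- the whole
    # pattern falls out of this one self-referential rule.
    return [(x, y)] + spiral(length * 0.95, angle, min_length,
                             nx, ny, heading + angle)

# Ninety-odd vertices of a spiral, from three numbers and one rule.
points = spiral(100, 89, 1)
```

Fed to any plotting or turtle-graphics routine, the list of points traces the kind of spiral that so surprises first-time programmers: nothing in the rule "looks like" a spiral.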

Calculative and Meditative Thinking

A major problem with Professor Kane's argument is his conflation of Heidegger's notion of calculative thinking with the notion Papert invokes of computational thinking. They are not the same. Computational thinking is basically what Kuhn(7) would call thinking within "normal science". That is, thinking restricted within the boundaries of a set of more-or-less well defined concepts according to a set of procedures accepted as valid within a research community. Now, while it may be true, as Professor Kane musters up Heidegger to proclaim, that calculative thinking never does "contemplate something simply for what it is, only how it can be used", one cannot assert that scientific thinking of even the most pedestrian, normal sort, i.e. computational thinking, is always utilitarian and pursued out of no intrinsic interest. There are libraries of books spinning out the consequences of particular systems of computational thinking; mathematics, as a discipline, does not justify its pursuits by their utility.

It is not systems of concepts or heuristics that inhibit contemplation or the appreciation of the Good, the Beautiful and the True, but rather the values we inculcate to be pursued. Kane would promote "the search for harmonies in nature" as part of the child's introduction to various aspects of the world, suggesting that such would be approached "with an unexpectedly high degree of enthusiasm and rigor." Is this not what religious education undertakes? Are we to begin with any assumed or postulated natural "harmonies"? I would question Kane's Realistic outlook that enables him to talk about the "squares and circles in" the child's world. There are windows and blocks and tables and people in a child's world, but not circles and squares, until children are inducted into a heuristic that enables them to perceive them -- ecce Sesame Street!

Computer Knowledge: the Good and the Bad

I would like to close this already overlong reply by listing what I perceive to be the real benefits and dangers of computer knowledge. (Note I do not use the phrase "computer literacy", as this is the latest of our institutional dilettantisms masquerading in the trappings of knowledge.) Computers can be used as demystifiers. Mathematics, for example, can be made accessible to persons -- like myself -- who struggle with the standard algorithms that serve interests more traditional than enlightening.

But the same programming that enlightens can also indoctrinate. "Simulations" can be devised that subtly inculcate attitudes in the guise of drilling fact. There exists, for example, a game called "Hammurabi" where one must decide how many people's lives to risk to pursue other interests as the ruler of a kingdom. The equations that determine the outcomes -- that serve as "laws of nature" -- are not brought to the player's attention. The belief may be subtly developed that the continued health of a community must always require the sacrifice of lives. This is a simple simulation. I shudder to think what our think-tanks and military strategists are doing.

Computers are deference-neutral. Nor can they be inveigled to concede rewards unearned. A student need neither fear humiliation nor expect mercy at the computer keyboard. This is a salutary situation for many kinds of student. This deference-neutrality is, at the same time, a limitation. Some kinds of knowledge are inextricably embedded in social situations and can only be obtained through proper rituals of deference. The standard "normalities" of a culture cannot be obtained via computer. Neither can a great deal of what is called "academic" knowledge in those subject areas where what I would call "reflexive" rather than "causal" teaching occurs, e.g. philosophy, history, literature, music, art. The aim of such teaching is not so much to bring about outcomes in students as to exemplify scholarship within the domain of the discipline. These are the "educative" disciplines. Yet even the most "objective" of sciences -- you pick one, I am not sure -- contains a substantial amount of acculturative training.

Computers can give a readily understandable experience of the notion of paradigm and of the relativity of paradigms and paradigmatic thinking, if the instructor takes care to make the point. On the other hand, there is the danger of paradigmatic blindness, such as that suffered by a great many educational researchers, if the limitations of the particular computational system of concepts are overlooked.

I believe the distinctions among storage, processing and input-output variables provide a useful perspective for certain kinds of learning. One ought not treat theoretically what is most efficiently acquired by memorization or other means, e.g. vocabulary by play-acting, morphemic variants by memorization. Grammar goes out the window: it is talk about language, and talk about language generally does not improve the production of language. (I find entirely perplexing Professor Kane's suggestion that students can study great writers and through them be brought to see how syntax can be used to bring tone, beauty, power and precision to human thinking. There is a category-confusion here.) The ultimate peril of computers is that of anything that provides us certain intrinsic satisfactions, e.g. music, sex, art. We may pursue it to the exclusion of other desirable things. On the whole, the computer presents us dangers neither novel nor overwhelming. Well-used, we risk losing not our souls, but our chains.


1.Wittgenstein, Ludwig. Philosophische Untersuchungen. Frankfurt-am-Main: Suhrkamp, 1967. Page 143. (My translation.)

2. Kane, J. "Computer Education and the New Model Us". Paper presented at the 1984 Fall Conference of the Middle Atlantic States Philosophy of Education Society, Rutgers, New Brunswick, NJ, October 21, 1984. p.4

3. Kulik, J.A & Kulik, C.C. "College Teaching" in Peterson, P. and Walberg, H. (eds.) Research on Teaching Berkeley: McCutchan, 1979. Pages 70-93.

4. Kane, J. op.cit. p.2.

5. Turing, A.M. "Can a Machine Think?" in The World of Mathematics, Vol. 4. New York: Simon & Schuster, 1956. pp. 2099-2123.

6. Rozycki, Edward G. "The Functional Analysis of Behavior" Educational Theory 25, 3 (Summer '75) pp.278-302.

7. Kuhn, T.S. The Structure of Scientific Revolutions Chicago: U. of Chicago Press, 1970.