Dusk Somewhere

Why are ideas from computer science absent from Marxist discourse?

Posted Aug 6, 2024 by Izzy Meckler

The introduction of the cipher 0 or the group concept was general nonsense too, and mathematics was more or less stagnating for thousands of years because nobody was around to take such childish steps

— Grothendieck, in a 1982 letter to Ronald Brown

Computer science is the study of precisely-specified discrete¹ processes, their capabilities and limits. Such processes are called algorithms.

Algorithms can be implemented on the devices we call computers, but the conclusions of computer science are equally applicable to Euclid’s constructions on wax tablets. The basic model for what constitutes an “algorithm” is a mathematical object called a “Turing machine,” although there are many extensions of the basic object that give it additional powers: the ability to use randomness, parallelism, embeddedness in space (cellular automata), or quantum resources.

All of these extensions allow computer scientists to study fine-grained aspects of algorithmic processing (e.g., resource usage, time efficiency) but do not expand the limits of what it is possible to compute.

The basic Turing machine is already capable of computing anything these extended versions can compute. In the 90 or so years since Turing’s definition, no one has been able to devise a device capable of performing a “computation” not simulable by a Turing machine.
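The basic model is simple enough to sketch in a few lines. The following is an illustrative simulator, not anything from the original text: the machine and its transition table (here, one that flips every bit of a binary string and halts) are assumptions chosen for brevity.

```python
def run_turing_machine(transitions, tape, state="start", blank="_", max_steps=10_000):
    """Run a one-tape Turing machine.

    transitions: dict mapping (state, symbol) -> (new_state, write_symbol, move),
                 where move is -1 (left) or +1 (right).
    The machine halts when no transition is defined for the current
    (state, symbol) pair.
    """
    cells = dict(enumerate(tape))  # sparse tape, unbounded in both directions
    head = 0
    for _ in range(max_steps):
        symbol = cells.get(head, blank)
        if (state, symbol) not in transitions:
            break  # halt: no rule applies
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += move
    if not cells:
        return ""
    # read off the non-blank portion of the tape
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

# Example machine: invert a binary string (0 -> 1, 1 -> 0), then halt on blank.
FLIP = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
}

print(run_turing_machine(FLIP, "0110"))  # -> 1001
```

Everything in the extended models (randomness, parallelism, and so on) can, in principle, be reduced to a table of rules like `FLIP` operating on a tape, which is what gives the basic model its universality.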

In fact, it is a somewhat widely accepted conjecture among computer scientists and physicists that any physical process can be approximately simulated by a Turing machine! Of course, for a complicated enough process, actually constructing such a Turing machine to perform the simulation may be hopelessly beyond our technological capacity.

Social processes have an algorithmic aspect

Social processes are highly algorithmic — that is, they are not closely dependent on small-scale physical details. Objects are not encountered directly as physical objects but as representatives or “tokens” in a socially defined logic. From the point of view of what is relevant for the evolution of social processes, the physical details of an object are typically not that important. Of course physical properties do matter to an extent (e.g., rubber bullets vs. metal ones, the nutritive content of food, heavy metals in water, etc.). But a door is a door whether it is made of wood or metal, regardless of the specific configuration of atoms in the material. That is, it is something you push or pull on to enter a space.

The object itself is actually much more than that — we could use it as a surface to paint on, hit it with an axe, use it as a percussion instrument, melt it down (if it’s made of metal), etc. But we usually do not do these things.

To summarize: we encounter objects in a heavily compressed form, one which throws away most of the physical information associated with them, allowing us to operate on abstracted versions that carry with them the range of socially permissible interactions.
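This kind of compression is exactly what programmers call an interface or abstraction. The sketch below is a loose analogy of my own, not the author’s formalism: a `Door` class that retains the physical detail (material) but exposes only the socially permitted operations.

```python
class Door:
    """A door as a social token: only the permitted operations are exposed."""

    def __init__(self, material):
        self._material = material  # physical detail: retained, but irrelevant
        self.is_open = False       # to the interface the social process uses

    def push(self):
        self.is_open = True

    def pull(self):
        self.is_open = False

# A wooden door and a steel door are interchangeable at the interface level.
wooden = Door("wood")
steel = Door("steel")
wooden.push()
steel.push()
print(wooden.is_open == steel.is_open)  # -> True
```

One could, of course, melt the steel door down or drum on the wooden one, but those interactions simply do not appear in the interface — which is the point of the compression.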

Given this, it is clear that the study of algorithms has relevance for those who wish to analyze the functioning of the present social organism, and who wish to restructure it. For example, perhaps computer science can suggest paths for such restructuring, or point towards difficulties we may encounter.

Algorithmic-ish ideas in actually existing Marxism

The algorithmic aspect of social reality has been recognized in left discourse through the ideas of class-structure, ideology, reification, and at times through the language of semiotics. The strength of these approaches vis-a-vis computer science is attention to the specificity of the algorithmic processing performed by human beings. That is, the way that the algorithmic aspects are related to social reality, class structure, our specific material existence, etc.

The weakness is that they have been unable to move beyond extremely rudimentary high-level descriptions of these phenomena, due to the extreme complexity of the objects of study and an unwillingness to make precise but incomplete (or “toy”) models. In a word, they have been unable to see the forest for the trees. Or even to invent a word for forest.

Computer science tries to study algorithms divorced from social context or specificities of implementation. Of course that makes it vulnerable to the unrecognized introduction of ruling-class ideology (the obsession with time-efficiency, or the model of rationality in game theory, for example). But it also enables it to use mathematical reasoning, which is extremely powerful due to its precision and potential generality, but impossible to use if one insists on describing everything directly in terms of social reality.

Examples of what computer science can give Marxism

Here are some examples of what computer science can give to the movement toward communism:

What accounts for this absence?

It is possible that the knowledge generated by computer science is simply not useful for moving toward communism, although I think there are very good theoretical reasons to believe this is not the case.

Moreover, if this were the primary reason for the absence, we would probably have seen Marxists discussing or trying to apply computer-scientific ideas and then rejecting them.

Marxist discourse has, for some time now, been mostly produced in humanities departments, by people who have very little contact with developments in computer science or the physical sciences generally.

Moreover, the integration of computer science with the technology industry plays a big role. The disastrous application of computer science toward capitalist ends has created instinctive distrust of the field in general among many of those interested in communism.


  1. Usually anyway — continuous processes are typically studied by the fields of dynamical systems, differential equations, and physics. In practice there is some overlap between all these things; I am just speaking generally. ↩︎