tis.so

Untitled piece on discursive warfare in the history of computing (or: the piece before the piece titled “Whitewashing the MIT death machine”)

by SQCU

If we look at the ancient history of computers, there was a lot of craft, intuitionistic reasoning, and spectacular visions far preceding any analysis or theory predicting their behavior. Jacquard machines used punched paper cards to physically control weaving robots for decades before some weird old fellow called Babbage talked up a calculator that never hit production. Jigs, machine tools, and physical tokens encoding measurements as tools to control mechanical systems are omnipresent in the American 19th century, again nearly a hundred years before lay histories of technology imagine the industrial robot to be invented.

Even in a world not yet graced by a theory of ‘control systems’, we cannot help but trip and stumble over pre-paradigmatic systems of ‘machine control’. Weirdly, those systems of machine control seemed a lot better in the days before the paradigm of the computer.

The mathematics that informs systems like aircraft bomb sights, mechanical looms, or radar-station-guided anti-aircraft missile systems is pretty light on ‘computational complexity’ as we understand it today. Most of the design challenge we’ll discover if we stare at the Nike missile system’s implementation lies in physically building machines that can simulate the computational processes needed for the entire system, as an aggregate, to steer a very large bottle rocket.
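To make concrete how light that arithmetic is, here is a minimal sketch in Python of a proportional-navigation guidance update, the textbook sort of law behind command-guided interceptors. It is purely illustrative and not the Nike system’s actual fire-control law; every name and gain below is an assumption for the example. The point is that one guidance step costs a few subtractions, one arctangent, and a couple of multiplies; the hard part was building hardware that could do this continuously out of servos, resolvers, and vacuum tubes.

```python
import math

# Minimal, illustrative proportional-navigation step (not the Nike system's
# actual guidance law; the names and the gain of 3 are assumptions for the sketch).
def pn_command(missile_xy, target_xy, prev_los_angle, closing_speed, dt, nav_gain=3.0):
    """Return (lateral_accel_command, new_los_angle) for one guidance update."""
    dx = target_xy[0] - missile_xy[0]
    dy = target_xy[1] - missile_xy[1]
    los_angle = math.atan2(dy, dx)                # line-of-sight angle to the target
    los_rate = (los_angle - prev_los_angle) / dt  # sight-line rotation rate (ignores wrap-around)
    return nav_gain * closing_speed * los_rate, los_angle

# One radar sweep's worth of work: a handful of arithmetic operations.
accel, los = pn_command((0.0, 0.0), (8000.0, 6000.0),
                        prev_los_angle=0.64, closing_speed=900.0, dt=0.1)
```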

What we see, instead, if we focus on the latter half of the 20th century is remarkably dissociated from the triumphs and horrors of the early computer. There is no beautiful swords-to-plowshares mythopoetic closure, where the ancient analytical engine of death powering a million bombs, ten billion anti-aircraft cannon shots, and a hundred thousand hand-crushing sweatshops is defanged and given a new identity to upbeat corporate filler music and Frutiger Aero visuals.

Rather than a symbolic and meaningful transition of power, where a generation of military and industrial analysts and designers cedes its field, allowing a new generation of peace-scholars to wrap their death-machines in a theoretical ‘general purpose’ computer, something else happens. A pure, abstract, and inadequate theory of computing emerges, driven by a bunch of hucksters who cannot build self-stabilizing systems with the detail or complexity of Ross Ashby or Isaac Newton Lewis.

This ‘new school’, which we can presume to be centered on Marvin Minsky in particular, never takes over the project of designing and controlling mechanical systems driven by feedback and by abstractions over direct measurements in the world. Instead, the ‘new school’ mostly… does nothing? Instead of doing analysis, building theory, or encapsulating previous fields into a greater field responsible for previously disparate concerns, the new ‘programmers’ litigate conceptual and social divisions which weaken the responsibility and breadth of the computer. Internal combustion engines, themselves incredibly cybernetic machines in their operation and design, are considered a totally different field from ‘computers in the abstract’, even while they are increasingly making use of integrated circuits to solve chemical reaction optimization problems. The design, implementation, and healing of computer networks is split off into a discipline of ‘information technology’, even and especially as network design abstracts away the individual computer and develops greater theoretical strength. In fact, ‘information technology’ is cultivated as a social stratification, inventing a social class coded as dullards unfit to handle or program any single computer in particular, suitable only to do ‘maintenance’ over ‘appliances’.
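To ground the claim that the dismissed machinery is itself computational, here is a minimal sketch, assuming a drastically simplified exhaust-oxygen feedback loop, of the kind of fuel-trim controller an engine control unit closes many times a second. The names, gains, and target ratio are illustrative placeholders, not any real ECU’s interface.

```python
STOICH_AFR = 14.7  # stoichiometric air-fuel ratio for gasoline (illustrative target)

# Minimal, assumed PI fuel-trim loop: richen the mixture when the exhaust reads
# lean, lean it out when it reads rich. Gains are placeholders for the sketch.
def fuel_trim_step(measured_afr, integral, kp=0.02, ki=0.005, dt=0.01):
    """One control update: return (fuel_correction_fraction, new_integral)."""
    error = measured_afr - STOICH_AFR        # positive when the engine runs lean
    integral += error * dt
    correction = kp * error + ki * integral  # fraction by which to adjust injected fuel
    return correction, integral
```

A PI loop like this is ordinary feedback control; the chemical-reaction optimization the paragraph mentions is what emerges when many such loops and lookup tables are layered over the engine’s sensors.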

We can no doubt discover hundreds of examples of automatic control systems and computers used to implement, govern, or interrogate feedback control systems. What we will notice with distressing consistency is that any use of a computer-in-the-world is déclassé, profaned, less serious than the academic nexus of ‘high-prestige MIT type guys’. This is especially so for applications that are empirically rigorous or taxing, or that expose more challenging mathematical limits.

I cannot speak to the specific history of what must have played out in the systems of social power that determine what is/is not (academically) meritorious, or those which determine what sort of problems are to be funded/centered, or defunded/marginalized. But we are afforded some certainty that the invention of the indexical term ‘artificial intelligence’ marks a period in history where ‘abstract’, ‘pure’, or ‘artificial intelligence’ programming is gentrified. Thereafter, the ‘general’, ‘universal’, or ‘artificial’ intelligence is treated as a more serious or fundamental concern than all of the other ways matter can be brought to life and made to carry a human’s will past the human’s dying breath. Coincidentally, everything that Marvin Minsky was personally bad at also seems to have withered.

In place of a conclusion, a TL;DR:

It sounds like something happened in the politics of resource distribution in the sciences. This ‘something’ seems structured to deprive the ‘Ford guys’ (engine designers, airplane designers, machine tool programmers) of citations, honor, and relevance to history. Students memorize the name of Babbage, a man who didn’t build a computer, instead of Jacquard, a man who did. However this resource-distribution politics played out, the winners seem to have won really hard, establishing stifling norms that create a mysterious discontinuity in how computing and information systems are understood by laymen. In the 21st century, it feels like common sense that computers must have been invented only in the 1990s and are about to change everything.

Imagine how disastrous it would be if someone were to finally wire up one of those computers to a gun or a bomb! It could mean the end of all war.