Replication without a partner, self-replication, a concept first modeled in the late 1940s by mathematicians John von Neumann and Stanisław Ulam. Given a grid of cells—cellular automata—and a number of simple instructions, one can create an abstract machine that copies each of its parts to a new location along with the original set of replication instructions. The original designs also include more literal versions of these machines, and later scientists expand on the concept, proposing self-replicating spacecraft and factories, prototyping machines and robots. Working from these ideas, in 1961 Robert Morris Sr., Victor Vyssotsky and M. Douglas McIlroy create a program at Bell Labs called Darwin, in which dueling computer programs vie for control over a sector of memory called an arena. Eventually an unbeatable predator program emerges, an apex to the pyramid (as well as, perhaps, the eventual inspiration for Tron’s Master Control Program). Several variations on Darwin and a decade later, in 1971, Bob Thomas writes both the first computer virus and the first computer worm, Creeper. Like much in computing, fiction leads science: the term worm itself comes from one of the earliest examples of cyberpunk fiction, John Brunner’s The Shockwave Rider, in which a fugitive phone phreaker, Nick Haflinger, uses a computer tapeworm to protect information from snooping corporations and governments. Outside of sci-fi political allegories, a worm is a variation on a virus, though in addition to being able to replicate, a worm can also transport itself across networks—no need for floppy-to-floppy transmissions. Like Brain, Creeper is harmless, but it is also a pest, and quickly a second program, Reaper, is written in order to eradicate it.
The dynamic between Reaper and Creeper can be modeled along the same lines as participants in the Darwin game—emergent predator-prey dynamics—and, as the name Darwin suggests, programmers were fully aware that these automated programs had qualities of life—however life is defined.
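Von Neumann’s trick, a machine that carries a description of itself and copies both machine and description, has a compact software analogue in the quine: a program whose only output is its own source code. The two executable lines below are an illustration of the principle, not von Neumann’s actual construction, which used a 29-state cellular automaton:

```python
# A quine: the string s is the program's "tape" of instructions,
# and printing s % s reproduces both the tape and the machinery
# that copies it. Run it, and the output is the program itself.
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Feed the output back into the interpreter and it produces the same output again; like von Neumann’s automaton, each copy carries the instructions for making the next copy. (The comment is not part of the replicating core; the two executable lines alone are the fixed point.)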
Brain virus, 1986
When reading the literature on biological viruses, it is striking to see that the scientific community is divided as to whether a biological virus is alive at all. A biological virus is not a cell; it lacks the cell’s organelles that make proteins and generate energy. A biological virus is a protein wrapping DNA or RNA; a design, if that is the word, so simple that scientists cannot reach a consensus as to whether a virus is alive or dead. Instead of living thing, instead of nonliving thing, in some scientific literature viruses are described as on the edge of life. A vague phrase, the edge of life, raising images of shuffling undead, a twilight Interzone. But biological viruses are not so romantic, not so unknown. They are closer to what Descartes thought of animals: machines, clockwork things that can only remake themselves. Since viruses are not cells, since they lack many of the properties of life, the virus must bind to a host cell’s surface, and if by luck the receptor on the surface of the host cell can be opened by the virus and the virus is welcomed in, then the host cell’s machinery is available for the taking. The rest is coding: with a cell’s genome factory hijacked by the virus, the cell inadvertently produces more viruses, sometimes in such quantity that the host cell itself may die, broken up by its new multiplying guests. The process is fast, with viruses reproducing in hours and days, and each quick generation brings mutations in the virus’s genetic sequencing. Many of these mutations are harmful to the virus and many do nothing at all. But some may help, giving the virus accidental selective advantages, such as defenses against immunity or more virulent reproduction capacities, and the virus thrives and moves on. It is difficult to say whether or not biological and computer viruses are analogous, whether Darwin the game and the process of natural selection are operating under the same principles.
Or, to put it another way, it is difficult to say whether or not viruses, biological or electronic, are simply two kinds of automated interactions, and whether life, however it is understood, is little more than mechanical theater.
Corruption of blood: an English legal term describing the inability to inherit property, usually due to some capital crime committed by the inheritor. A defunct concept, now abolished in both the US and the UK, one finds corruption of blood unexpectedly in World of Warcraft, a high-tech game whose sole content is fantastic pre-modernity. Released in 2004 by Blizzard Entertainment, World of Warcraft (WoW) is a massively multiplayer online role-playing game (MMORPG), in which millions of men and women play one another for pleasure and profit. WoW is a typical fantasy landscape, with touches of science fiction and steampunk, where druids, priests, rogues, warriors and other classes undertake quests and battles with complex and not always predictable outcomes. As is the case when a WoW deity named Hakkar the Soulflayer is introduced as the leader of Zul’Gurub dungeon. Controlled by the game’s artificial intelligence, Hakkar is a vampire who, among other talents, drains attacking players of their blood to replenish his own health. The game’s programmers also provide Hakkar with a spell, or debuff, called corruption of blood, which temporarily contaminates an avatar’s blood, sapping it of some life. This spell can also provide protection against Hakkar’s vampiric algorithm. Infect one’s avatar with the spell, and Hakkar will harm himself when drawing the avatar’s blood. The spell has a second property, new to the WoW universe: it can be spread from avatar to avatar through proximity, thus taking on the properties of a virus. If a healthy avatar walks close enough to an infected avatar, there is a one hundred percent chance that the healthy avatar will become infected. At first, as planned, Zul’Gurub dungeon acts as a quarantine, containing the spread of the virus to players and their pets, but it is these pets, hunter pets, that provide a viral inter-species link to the outside world.
The pets—like Hakkar, also algorithmic creations—can be dismissed by players back to cities outside of the dungeon, where the pets, asymptomatic like real-world vermin, spread corruption of blood to thousands of players. Worse, players’ avatars can also teleport from the dungeon back to main cities, carrying corruption of blood to innocent populations. Very quickly a virtual, worldwide pandemic is born. This event is unintended by the game’s designers, and the spell quickly spreads to other AI-controlled characters and weaker, less healthy players. The weaker characters immediately begin to die, and after a short time some players resign, while others maliciously help the disease spread, and still others attempt to help infected avatars.
Since WoW is a virtual world and resurrection is possible, the virus becomes more of a nuisance than an apocalypse. Blizzard resets the WoW servers, instantly ending the plague, but soon several epidemiologists take interest. In The Lancet Infectious Diseases, Eric T. Lofgren and Nina H. Fefferman propose using MMORPGs to study the behavior of citizens when faced with an epidemic. As they write in their article, computer software like Transport Analysis Simulation System and Epidemiological Simulation System rely on cellular automata, historical data, and predictive modeling to guess behavior in an epidemic situation. Given the need for large populations of participants and geographic scale—plus inherent ethical limitations—actual, real-time reactions of a population to a disease outbreak cannot be modeled with any accuracy, unless one were to use an already existing community like WoW’s. The authors note that the corruption of blood incident marks the first time that a virtual virus has infected a virtual human being in a manner even remotely resembling an actual epidemiological event.
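What such a study might measure can be gestured at with a toy compartmental model. The sketch below is entirely hypothetical: infection on contact is certain, as it was in the corruption of blood incident, but the contact, death, and resurrection rates are illustrative guesses, not Blizzard’s actual game logic or the simulation systems Lofgren and Fefferman describe.

```python
import random

def simulate_outbreak(population=1000, carriers=5, contacts_per_tick=8,
                      p_infect=1.0, p_death=0.1, p_resurrect=0.5,
                      ticks=50, seed=1):
    """Toy model of a corruption-of-blood-style outbreak.

    Each avatar is 'healthy', 'infected', or 'dead'. Infection
    spreads on contact with certainty (p_infect=1.0), as in the
    WoW incident; the other parameters are invented for illustration.
    """
    random.seed(seed)
    state = ['healthy'] * (population - carriers) + ['infected'] * carriers
    history = []
    for _ in range(ticks):
        for i, s in enumerate(state):
            if s == 'infected':
                # Each infected avatar brushes against random neighbors.
                for _ in range(contacts_per_tick):
                    j = random.randrange(population)
                    if state[j] == 'healthy' and random.random() < p_infect:
                        state[j] = 'infected'
                # Weaker characters begin to die...
                if random.random() < p_death:
                    state[i] = 'dead'
            elif s == 'dead' and random.random() < p_resurrect:
                # ...but in a virtual world, resurrection is possible.
                state[i] = 'healthy'
        history.append({k: state.count(k)
                        for k in ('healthy', 'infected', 'dead')})
    return history

history = simulate_outbreak()
print(history[-1])
```

With certain infection and a high contact rate, the plague saturates the population within a few ticks, and resurrection only feeds new hosts back into it — roughly the dynamic that made a server reset, rather than any in-game remedy, the only cure.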
The stated purpose of danooct1’s YouTube channel, Computer Viruses in Action, is to entertain users with the effects of (mainly older) pieces of malware, while educating them as to how they work. To date, the channel has 22,292 subscribers, and its videos have been watched 7,466,520 times. According to a Twitter account with the same handle, danooct1’s real name is Daniel White. He writes 8-bit music and tests computer viruses for fun. danooct1 cannot send you any actual malware because this would be against YouTube’s terms of service. He also does not accept unsolicited Steam or Skype friend requests. In each of his videos, Daniel White demonstrates a malware infection on an operating system or program. The operating systems are usually obsolete, and the malware is, as expected, difficult to obtain. Most of the videos are titled with the same labeling system: first the type of malware (worm, virus, Trojan, etc.), followed by a period; then the target operating system or program (DOS, Win32, Microsoft Word, etc.), followed by a period; then, finally, the name of the malware. Names include: Savior, Rigel, Gruel, Color Bug, Gigger, Melissa, Prizm, Selectronic, Phrase, Ari, Fagot, Prolin, Artic Bomb, Apparition, and Acid 2. The most popular species of malware is the virus, and DOS is among the most popular targets. Like an enthusiast of American Civil War rifles or Soviet space gear, White devotes his YouTube channel to the morphological variations within a category of technology. There is little or no discussion of code, almost no technical jargon. Instead, the infected desktop is shown as a kind of proving ground—an Aberdeen for the security enthusiast. White triggers virus after virus, with each example announcing its arrival in a paroxysm of 8-bit graphics and bitmapped clipart. Viruses now are fully weaponized, malicious.
This is because viruses have become profitable, and as their creators use them to steal data and surveil hosts, it’s no coincidence that viruses improve drastically in technical variety and quality. Monetized and lethal, their development mirrors that of predatory capitalism, where shorter product cycles are used as a way to increase profits and to drive technological innovation. Build the most innovative phone, or at least convince the public you have done so, and sell a new edition of that phone every year to the same consumers, and you become the richest company in the world. Build the fastest and most lethal virus, and you will make the most money as well. There is, for example, ransomware, software that holds a user’s data hostage until a fee is paid to a hacker. And there are better-known Trojans, software that hides and waits on a computer’s hard drive until called upon to act as part of a botnet, a distributed network of compromised machines rented out by criminals to the highest bidder. Malware’s evolution, cultural and perhaps Darwinian, has led to a great variety of types and techniques, and when looking through the online databases of McAfee or Norton or danooct1, one can’t help but think that these warehouses of malware species constitute a kind of museum. Not unlike a natural history or military museum, these catalogs are concerned with the variations of a life form or tool; their aesthetic complexity is as specialized as a breed show. And to compare them to American Civil War rifles is not a loose analogy. Malware has become a category of munitions, designed by intelligence agencies, mafias, terrorist organizations, mercenaries, and governments not only to steal credit card information and computing cycles, but also to destroy enemy machines and workers.
The best example of this is Stuxnet: malware most likely developed by American and Israeli intelligence agencies that targets the Siemens industrial software used by Iran’s nuclear program. Stuxnet, another worm, is the most sophisticated piece of malware developed to date. Discovered in 2010, Stuxnet’s travel itinerary involves Iran, and its operation is intended, most likely, to be contained to nuclear-related facilities. But after it is released, an error in the worm causes it to spread beyond these specialized host machines, leading to its discovery by Iranian engineers. Stuxnet is designed to destroy the centrifuges used by Iran to enrich uranium; the worm spins the centrifuges out of control, damaging or exploding them, thus making the worm one of the few examples of malware that could cause harm to human beings. An order of magnitude larger than anything previously known, Stuxnet is, perhaps, a program on the edge of life, one that can cross from the symbolic logic of computer software to the murderous logic of international politics.