It would appear that a significant thing has happened—an act of war with extraordinary consequences—without anyone getting visibly upset. Perhaps I am understating: but the cyber worm attack on the Iranian nuclear facility at Bushehr may well have put it out of commission as effectively as any cruise missile strike by the Israeli or U.S. air force.
We cannot know the extent of damage. Iran is a mostly closed society; the Russians who designed and largely built that reactor have been operating no differently than they did during the Cold War; Pentagon sources seem themselves puzzled; Israelis, if they know something, are not telling; and so on. We can infer that something very bad has happened to the Bushehr reactor from scattered reports, but cannot be sure what that was, or if the cyber worm used to make the bad thing happen was (as various computer security experts have speculated) “Stuxnet.”
Computer security experts became aware of Stuxnet several months ago, and immediately began trying to reverse engineer it, to discover how it works and what it is meant to do. They have also been mapping reports of its appearance as sleeping “malware” in thousands of the world’s major industrial computer control systems. (A peculiar concentration was found in Iran, with secondary concentrations in Pakistan and India.)
They now know that it is an unprecedented, very powerful “cyber weapon,” one that cannot have been designed for routine purposes of industrial espionage; that it can be loaded into a computer system by a single infected memory stick, with no need for an Internet connection; that it “fingerprints” the control systems for the entire industrial operation, and monitors them closely in both space and time.
And then it just waits, perhaps indefinitely. Alternatively, without human intervention, it detects the coalescence of certain conditions, and broadcasts a very simple code—“DEADF007”—whereupon it takes over the control system, overriding commands from all individual computer terminals, and all embedded safety mechanisms. It uses the plant’s own capabilities: twirling the dials to contrive a meltdown of one kind or another.
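The behaviour described in the last two paragraphs amounts to a simple state machine: lie dormant, monitor, trigger on a coalescence of conditions, then override. A purely illustrative sketch follows. Nothing here is actual Stuxnet code; the fingerprint, the readings, and the trigger conditions are all invented for illustration, and only the code “DEADF007” comes from the reports cited above.

```python
# Illustrative toy model -- NOT actual malware code. It caricatures the
# reported behaviour: stay inert on non-matching systems, watch plant
# readings, and only when specific conditions coalesce emit the trigger
# code and thereafter override every operator command.

TRIGGER_CODE = "DEADF007"  # the code the worm reportedly broadcasts


class DormantWorm:
    def __init__(self, fingerprint):
        # 'fingerprint' stands in for the profile of the targeted
        # control system; the worm does nothing on any other system.
        self.fingerprint = fingerprint
        self.active = False

    def observe(self, system_id, readings):
        """Watch one cycle of readings; activate only when the targeted
        system shows the awaited combination of conditions."""
        if self.active or system_id != self.fingerprint:
            return None
        # A hypothetical trigger: two conditions must coalesce at once.
        if readings.get("rotor_speed", 0) > 1000 and readings.get("valve_open", False):
            self.active = True
            return TRIGGER_CODE  # broadcast, then take over
        return None

    def filter_command(self, operator_command):
        """Once active, override commands from every operator terminal."""
        if self.active:
            return "OVERRIDDEN"
        return operator_command


worm = DormantWorm(fingerprint="plant-A")
print(worm.observe("plant-B", {"rotor_speed": 2000, "valve_open": True}))  # None: wrong system
print(worm.observe("plant-A", {"rotor_speed": 500, "valve_open": True}))   # None: still waiting
print(worm.observe("plant-A", {"rotor_speed": 2000, "valve_open": True}))  # DEADF007
print(worm.filter_command("shutdown"))  # OVERRIDDEN
```

The point of the sketch is the asymmetry it makes visible: the dormant phase requires no communication at all, so the weapon is undetectable by its traffic and its author untraceable by it.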
If the paragraphs above strike gentle reader as resembling the opening of a science fiction novel, let me assure him they strike me in the same way. I am not a computer expert of any kind, and can understand what is being discussed only in the broadest outlines. My interest is in the implications of what is being discussed.
When planes first appeared in the skies, dropping modest little bombs on battlefields during the First World War, men began having visions of their destructive potential. We forget, now, but one of the leading inspirations to pacifism in the 1930s was the popular belief that “modern bombers can obliterate entire cities.” Their capacities were somewhat overestimated, but those who lived through the Blitz, through the destruction of Dresden, or the firestorms of Tokyo, formed lasting impressions of the impersonality of modern warfare.
From the moment the world’s first atomic bomb was dropped, on Hiroshima, and even before the last was dropped on Nagasaki, we began to envision a vast nuclear war, in which the potential of those bombers was fulfilled. I grew up in the era of “Ban the Bomb,” yet with the instinctive knowledge that the threat could not be banished, only contained. It served my generation as a reminder that human life is fleeting.
Arguably, this latest “technological advance” carries a different message: that the computer operating systems upon which the global economy depends are, in their nature, insecure. And while there is no weapons system against which some kind of defence cannot eventually be devised, we must see that this one confers advantages on the attacker that extend all the way to anonymity.
I am not criticizing or condemning anyone, here, any more than Walter M. Miller, Jr., was doing in A Canticle for Leibowitz. Bad and good will persist in human affairs.
Moreover, the Bushehr reactor needed taking out, and if this was the means to do it, I would not have hesitated to order the strike any more than Harry Truman hesitated to drop “Little Boy”: a terrible thing, but necessary to prevent something even more terrible. We might even rejoice, in a limited way, to see this new advance in the art of precise targeting—for the great majority in any war are bystanders.
What I’m saying, instead, is that even people as humble as ourselves should draw the obvious conclusion. For if there is nothing secure about “high technology,” we could easily find ourselves in a market where the smallest raw potato commands a higher price than what’s left of the most advanced computer system. And that, in turn, is why we have always needed very low-tech back-up systems.