Are there any conditions in which what we understand as politics--which drives war--might theoretically change so much that the General Theory of War becomes obsolete? Shlok Vaidya is, to my knowledge, one of the few people who has really thought this through in much detail with an eye toward far-off technology (i.e., the transhuman and posthuman eras):
- Small-Scale Networks vs Network
- Advanced information flows decrease mass requirements and increase decentralization.
- Trend continues until post-human age.
- Small-Scale Network vs Small-Scale Network
- Individual vs. Small-Scale Network
- Individual vs. Individual
- Post-human vs. Individual
- When the difference between man and machine is negligible.
- ? vs Post-Human
*Acceleration really takes off when the network barrier is broken.
*Technology is a type of information.
What Shlok is doing with this graph is extrapolating his view of military history into the future, using an interpretation of David Ronfeldt and Philip Bobbitt's work. However, you could still potentially apply the General Theory to most of these categories. Yet there's another wrinkle here, from one of Shlok's older entries:
5GW is the point where the human body becomes the limitation (our technology is designed to interact with our nervous system through the combined effects of the rest of the body's systems--but it could do a lot more if we ignored those limitations). To get into 6GW we're talking about technology replacing everything in the human body except for the core (the brain). 6GW looks like the brain leveraging technology directly, without the rest of the human systems in play. And then we progressively get to a point where machines are "better" than the brain. (7GW is when brains are made obsolete by machines.)
Irrespective of the analytical utility of the xGW or generations-of-war framework, which is an entirely different (and extremely debatable) issue, I think that if you take the letters "GW" out of each sentence and think about it, you reach the outer limits of our understanding of war and politics: the point at which you begin to alter the human element.
Obviously, even by the wildest conventional predictions of computing (except for those of Kurzweil, which approach a somewhat religious dimension in their unrealistic nature and promise of rapture), this is not a concern we will be worrying about for a good deal of time--if ever in our lifetimes. Super-empowered robots are not a matter for SECDEF Gates to factor into force planning, even if we know he could probably roll up his sleeves and kick some robot behind. The Singularity is not really a concern for anyone living today. But what of generations in the far future?
Which brings us to Neon Genesis Evangelion, an interesting visualization of the deeper issues behind the idea of posthumanity, even if it is not specifically about technology. Recounting the whole plot is not terribly important (and in some ways its convoluted nature can be a barrier to understanding the show's deeper themes), but the basics are that a group of people, each for their own reasons, seek to speed up human evolution and change humanity into something else entirely. Each has a different idea of how this process will occur.
The key element is the process called "Instrumentality": giving up one's self and completely breaking down every psychological barrier that comes with having an individual personality. Much like the Singularity's dream of conquering death, one can throw off the pains of being human to reach a supposedly more advanced state of being. And much like the ideas of cybernetic totalism that Kojima attacks in the Metal Gear Solid games, this also means forgoing entirely the trappings of what we understand as being human--not just in the mundane sense of bioenhancements, but our entire way of experiencing the world.
Politics is essentially a means of dividing power, a scarce resource, and its source lies in the differences and commonalities that come from being human. Aristotle's concept of politics, for example, issues from human social activity. To embrace Instrumentality in Evangelion means to reject politics entirely. This is not the same as a 1984 world in which thought is completely controlled; it is to forgo altogether what we would understand as thought or cognition (and the politics that spring from it). When the choice is offered to Shinji, the main character, he ends up rejecting it. Those who follow the series know of Shinji's psychological suffering, as well as the series' dim view of human nature, which locates the source of human problems in the very psychological barriers and failings that Instrumentality seeks to erase. Shinji is very tempted to give in. But he ends up choosing an imperfect world of pain that holds the possibility of promise over a utopian reality that would mean compromising his own humanity.
It is clear that Instrumentality would mean an end to politics--just as it would mean an end to human life as we understand it. Whether or not you feel Shinji made the right choice comes down to your view of human nature and the possibilities for an end to human conflict. Hans Morgenthau, Raymond Aron, and Reinhold Niebuhr, however, would probably say Shinji made the right choice.