I’ve kept uncharacteristically silent during a week with events that matched issues I’ve talked about more than once: milblogs, embeds, CNN’s continued aid and comfort in the guise of objectivity, the discovery of a single Al Qaeda media strategy document, the sentencing of an American who committed traitorous acts by knowingly passing information, and the unconfirmed identification of one potential leaker for a New York Times story revealing classified information that had a strategic effect.
Yes, it is indeed an information war, and we’re stuck with it. (I would like to credit myself for getting Instapundit saying “it’s an information war”–I think I was the first he linked to saying that–but it’s a pretty common idea nowadays.)
A while ago I wrote a small thesis proposal which was much harder to complete than I would like to admit. (I know the reasons; I completed the task; I am not happy it took so long.) In the process of making the proposal I talked to several folks who work in academia on topics related to “information warfare”. I’ve learned that the overwhelming majority of those folks work in terms of electronic and computer warfare; I have yet to find a marketing guru, for instance, hanging out with the geeks. I’ve also learned that if there’s an unclassified document in English that talks about information warfare in terms of ideas and thoughts and decisions, and is built like a doctrine manual or tactics, techniques and procedures (also known as TTPs), then nobody I’ve talked to knows about it.
So. I’ve asked one university if they’ll let me pay them lots of money to work on this problem in my free time, and if they like it perhaps I could get a doctorate out of it.
Wish me luck, and if you know of a backup school that allows one to complete a degree in this area while physically not at the university let me know…it’s not as if they accept just anyone, and I might be above the quota for “loud annoying guys”. So far every person I’ve talked to doing information warfare has been interested in the subject and is supportive–either this group of people is very very polite, or I might be on to something. If I’m rejected I don’t think it’ll be because of my choice of subject.
The “more” tag hides a longer explanation of the same thing, with citations and footnotes not added because I forget how to convert those from Word without a lot of pain, and just “saving as” in Word breaks the browser. Comment or email if you’d like a source I didn’t link. There may be a risk in mentioning this subject so that someone else can write what I wish to write more quickly, but it’s an important subject–and I think perhaps that the odd way one has to think to get deeply into this subject may act as a barrier to entry.
1. Preliminary Title:
War with Ideas:
Using Information, and Transmitting Concepts, To Promote Peace and Win Wars
2. Research Questions:
1. How specifically have information and ideas been used in warfare as tools to advance the political goal? Who are the actors that make these actions happen? Is it controllable, and if so to what extent?
2. What are some informational methods that have been used to advance a warfighting goal? Were they employed on purpose, or were they accidental?
3. After placing those examples into categories from grand strategic to tactical effect, does a pattern appear? Are there commonalities?
4. How effective were those methods in the short and long term? Are they reproducible?
5. Of these methods, which could be used today? Are there more general rules of fighting war with information and ideas that can be posed from this analysis?
3. Discussion of proposed approach, including sources, survey of known sources, and current debate within field:
Information is manipulated in war. When people make decisions affecting a war, their decisions are colored by their way of seeing the world and changed by the information they know, the information that has been suppressed, the mindsets they have developed, and their emotional responses. History, internal communications and propaganda, communications from external parties, and taboo concepts all affect how a decision maker wages war. For an open society with a twenty-four-hour news cycle, changes in this stream of information change how that society supports the actions of its leaders. It is possible to affect that information stream. It should also be possible to characterize what sorts of effects particular actions to change the information stream can produce.
Writers on guerrilla warfare and insurgency have addressed the theory of propaganda and the importance of message to the cause. Western thinkers such as Colonel Harry Summers, whose On Strategy describes the failure of the United States to recognize and effectively counter attacks aimed directly at American public opinion as a center of gravity, recognize the problem for an open society but do not address how to counter such a campaign. Opponents see the relative advantage they enjoy in the information arena; both national actors, as in the Chinese concept of “political warfare”, and transnational ones, as in Al-Qaeda’s messaging to convert and persuade, reflect how important the control of information is to controlling decision making by their competitors.
A society engaged in conflict has a center of gravity, public opinion of and sensitivity to the conflict, that is particularly important in open, democratic societies. This support matters in less free societies as well, but it is hidden by suppression and by the prevention of preference cascades: an externally apparent solidarity of opinion rapidly collapses when people in the society are finally able to say in the public square what they really think. For a less free society, the manipulation of information is essential to the actions of a government, and if a story about how a people see themselves supports what the leaders want to do, so much the better.
Counters to this sort of information warfare are slow in coming. Experts such as Thomas Marks of National Defense University and Cori Dauber at the University of North Carolina-Chapel Hill have analyzed aspects of this kind of force multiplier. Robin Brown’s 2002 effort to place the media-military relationship in a Clausewitzian framework has evoked discussion on weblogs and mailing lists. The cadre of information warfare experts centered around the Journal of Information Warfare (edited by William Hutchinson of Edith Cowan University in Western Australia) is beginning to describe the problem in terms of the philosophy of perception, network analysis, and game theory.
Information warfare (not merely electronic warfare but a war of ideas and knowledge) can be described and characterized. There are tactics, techniques and procedures that can be effective from the tactical to the grand strategic level, and there are possible effective countering actions. A study of efforts to control, manipulate, and disseminate information during wartime might reveal general observations or rules that could contribute to the existing theory of information warfare.
Such a study focusing on actions by actors in a free society would have a slightly different emphasis than one focusing on actors in a less free society. Countries that depend on preference cascade prevention and suppression of dissent to survive have thought about information warfare in a slightly different manner. A nonstate actor may also have an imperative to suppress adverse communication, so that taboo thoughts are suppressed by the threat of violence. These differences in approach change how freer societies can use tactics, techniques and procedures to counter a competitor’s attempt at changing an information stream (sometimes termed an information fire). For example, Chinese propaganda units (political warfare units) are integrated into the People’s Liberation Army. Public affairs units in the United States are prohibited by law, custom, and culture from many of the actions considered permissible in a society such as China’s or Saudi Arabia’s.
Over the last few decades various militaries have built definitions of terms such as “information warfare” or “information operations”, and cultivated experts in those fields. Unfortunately, the field appears to be several rather divergent disciplines cobbled together. The Americans, for example, lump together people who work in bits, packets and electrons (electronic and computer warfare), people who work in mass communications (public affairs), and people who work in tactical and operational psychological operations (PSYOP), and call the whole thing Information Warfare (IW). Each discipline attracts different types of people; less helpfully, each discipline has limitations and restrictions that prevent one part of the IW community from assisting another. Buried in the sea of computer experts and press agents and leaflet distributors is a very small community of people who think in terms of ideas as information fires, fighting memes with memes. For this group, information war isn’t necessarily attacking computer networks, or spycraft, or public relations. Rather, this narrower use of the term “information war” is concerned with how a person thinks and makes decisions in favor of a particular aim in a conflict. The other disciplines have useful tools for this effort, and can contribute, but they are not doing the same job. Fighting with information isn’t new, but it is only beginning to be codified and described as a discipline in a networked and connected world.
Others have written about this definition of IW, either directly or while discussing other aspects of conflict. In the purely military field of study, the writing about information manipulation as a part of insurgent warfare is extensive, ranging from the previously mentioned writings of Marxist leaders and their opponents to the counterinsurgency (COIN) writing coming from operators in the field today. The work appears interdisciplinary, and lessons can be drawn from such disparate fields as journalism and media ethics, marketing, psychology, history, conflict analysis, network analysis, sales, and game theory. A paper that provides examples of efforts to manipulate information and ideas, shows what the manipulation was and who did it, reports the effects, and draws general conclusions about the efficacy of those efforts would be a step toward answering the questions posed at the beginning of this document.
In a search for general rules of fighting war with information, a possibly useful structure for thinking about information war and information fires is to divide examples into three types of effort: internal propaganda, external message manipulation, and history manipulation.
How a people see themselves, their ideas of identity and belonging and their understanding of the world as members of the group, depends upon common and shared ideas. If the group’s shared stories of identity and history can be changed, then that group will see the world differently. It will make different decisions, self-select new information that fits the new mindset, and support a different level of violence in a conflict.
Several interesting reports on the Yugoslavian conflict from journalists, sociologists, and historians provide a recent and well-documented view of how a group’s self-description can change due to changing information streams, and of what kinds of effort were necessary to escalate the conflict. The collections Balkan Holocausts and The Kosovo News and Propaganda War provide examples of how the conflict shifted character based on a distortion of history. UNESCO has been struggling with this aspect of warfare and, in reaction to the collapse of Yugoslavia, now has rules in effect about the teaching of history; similar provisions appear in documents such as Lebanon’s Taef Agreement.
Countries are aware of the importance of conforming history to their worldview, as can be seen in UNESCO’s history standards effort or the arguments over Japanese textbooks in the Republic of Korea and China. History itself becomes fought over, as parties struggle to include land claims or justify past behavior, as is currently being done in southern Sudan. This historical framing has a strong effect on the amount of animosity between belligerents, but it is not at all clear that the process is directly the result of people actively pushing a long term strategy to incite hate.
A more clearly attributable example of cultivating hate over the long term (and of the resultant military benefits) is how a group brings up its children. From propaganda videos to schoolbooks to parades to children’s organizations, a long-term inculcation of values supporting a long war against another can be achieved. Sometimes this process works, sometimes not. Analysis of several different periods, where children have grown up prepared to fight a lifelong enemy, may reveal common aspects and vulnerabilities.
Acculturation helps to define the terms of reference for negotiations that build into war, and for the support of that war. Cultural influence on the acceptability of certain behaviors also affects that support. If terms are defined so that someone can publicly say the enemy “kills babies” but cannot easily say the enemy “is just like us”, decision patterns begin to match the language. Research on the psychology of decision making, such as the characterization of selection bias and affiliation, can provide insight into why people decide that a group is an enemy and how this might be interrupted. One of many examples would be Milan Vego’s network analysis demonstrating how association, self-selection and the isolation of groups help to explain how a public becomes polarized and ignores information sources that run against the group’s bias.
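The self-selection dynamic described above can be made concrete with a toy simulation. What follows is my own illustrative sketch, not drawn from Vego’s analysis or any cited work: a simple bounded-confidence opinion model in which agents ignore any opinion too far from their own. Every function name and parameter here is invented for illustration.

```python
import random

def simulate_polarization(n_agents=100, tolerance=0.2, steps=20000, seed=42):
    """Bounded-confidence opinion model: agents update only when the other
    opinion falls within their tolerance, modeling self-selection of
    information sources; messages outside the tolerance are ignored."""
    rng = random.Random(seed)
    opinions = [rng.random() for _ in range(n_agents)]  # opinions in [0, 1]
    for _ in range(steps):
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        if i == j:
            continue
        if abs(opinions[i] - opinions[j]) < tolerance:
            # Agents within tolerance of each other converge toward their
            # mutual midpoint: association reinforcing shared views.
            mid = (opinions[i] + opinions[j]) / 2
            opinions[i] += 0.5 * (mid - opinions[i])
            opinions[j] += 0.5 * (mid - opinions[j])
        # Opinions outside tolerance are ignored entirely: the isolation
        # step that lets distinct, mutually deaf camps persist.
    return opinions

def count_camps(opinions, gap=0.1):
    """Count clusters of opinion separated by more than `gap`."""
    xs = sorted(opinions)
    camps = 1
    for a, b in zip(xs, xs[1:]):
        if b - a > gap:
            camps += 1
    return camps
```

Run with a low tolerance and the population typically settles into several camps that no longer hear one another; raise the tolerance and the very same interaction rule produces consensus. That contrast is one crude way to see why widening the set of sources a public will listen to matters to the polarization process described above.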
If a group of people grows up learning to hate another, or gets taught to do so, then war is much more likely. A description of how this happens may indicate future trouble, and suggest a solution. Is there a conscious reason for growing haters that can be thwarted? Is there a decision-making process in the long-term development of hate and discontent that can be interrupted? Once in progress, how does this process get stopped–and how long does it take before those feelings are calmed to a level where people will not make war against each other? Is there someone who incites on purpose, or is this always a self-organizing process or the same structure as a fad?
Controlling what a public sees, learns and cannot say determines to a great extent what that public will think. Is it possible for a person or small group of people to consciously control these information structures? What ways to counter these effects are useful, and are there situations where the long-term process has been reversed on purpose? Were active changes in a country’s culture, such as Turkey’s experience post-World War One in becoming a different state under Ataturk, or Japan’s post-World War Two experience, merely accidental, or are there common lessons that can be applied to peacekeeping?
Different ethnic conflicts appear to have similar internal messages, promoted in similar ways. Mary Kaldor’s description of Yugoslavian ethnic border advertising in her book New and Old Wars matches journalist Michael Totten’s contemporary reports of border billboards in Palestine and south Lebanon. Selecting a language and a common story of identity and grievance supports and sustains hostilities; Meiji restoration-era classroom propaganda is not so different from that of Eastern European classrooms, where the cult looks similar but the personalities differ.
There are possible strong parallels between the internal propaganda that supports and sustains violence in a conflict and that which sustains an oppressive regime. It will be a challenge to distinguish the information manipulation necessary for violent conflict with an external enemy from the manipulation needed to sustain a regime. It would also be worthwhile to find the best ways for a free society to interrupt those processes and allow other messages, language, and thoughts to propagate. Wartime actions such as the changes to the BBC World Service in the Second World War had a definite effect on the internal messaging of the Axis countries. Captured documents from the U.S. Army’s Harmony database reveal that the conflict in Syria against three Islamist groups from 1976 to 1982 was aided by information fires based in Iraq, but those fires weren’t sufficient, and the campaign ended with the Hama massacre, a massacre that was itself a powerful message.
External Message Manipulation
For a group trying to impose its will on another by using information to influence mindset and decision making, there is a body of literature that discusses the subject from an insurgent’s point of view or a marketer’s point of view. Much less has been written on resisting persuasion and attacks on mindset, with only scattered examples, such as advice on countering resistance while persuading, or Marxist insurgent texts exhorting resistance to counterrevolutionary thought. Even so, successful resistance to external memes has been achieved on purpose. A study of the texts focused on persuading the Other may reveal countertechniques that have worked or failed in the past, and some commonalities in those techniques. These may well turn out to operate on longer timescales than a tactical psychological operations campaign can achieve; it may be that a long-term change in mindset is what is needed to achieve a goal, which may explain the “extremely slow observe/orient/decide/act (OODA) loop” between attacks that seems to work for terrorist organizations.
The long-term methods of framing an argument or a mindset require the ability to define terms of reference for a group, to define what is acceptable and what is not, and to suppress and enhance information. This builds a public mindset that makes some ideas more acceptable and others less so. How ideas succeed or fail is studied in psychology and in market research.
Some ideas manage to get killed off in the public mindset pretty quickly. Some ideas that make sense to one party just don’t get traction with the other. A recent example is the leaflet drop in southern Lebanon by Israel directing the public to leave the contested area, an idea that, if acted on completely, would remove one of Hezballah’s most effective informational counters: having Lebanese babies “killed by Israel” paraded on television. The leaflet drop wasn’t effective, but Israel thought it would be. Conversely, the Hezballah information fires in parading those babies around partially backfired when the crude nature of some of those efforts became known.
Some techniques can already be characterized. The “cheat and retreat” cycle takes advantage of the lie going halfway around the world while the truth is still getting its boots on, of selection bias, and of the tendency to correct press reports in a tiny box on page A-17. Others will require more digging to reveal and describe.
Once a successful framework is built, the collected examples should allow some useful generalizations. Although ideas are not perfectly controllable, the spread of concepts and ideas and common knowledge can be influenced–who really needed to buy a “pet rock”, for instance–and actors and attempts to influence the information stream can be detected. More difficult will be identifying effective counters to that manipulation. The timeframe for such counters may be counterintuitive. It’s certainly a problem worth investigating.