Disinformation


Graphic showing differences between (deliberate) disinformation, unintentional misinformation, and hoax according to Wikimedia Research

Disinformation is false information spread deliberately to deceive.[1][2][3] It is a subset of misinformation, which may also be unintentional.

The English word disinformation is a loan translation of the Russian dezinformatsiya,[1][2][3] derived from the title of a KGB black propaganda department.[4] Joseph Stalin coined the term, giving it a French-sounding name to claim it had a Western origin.[1] Russian use began with a "special disinformation office" in 1923.[5] The Great Soviet Encyclopedia (1952) defined disinformation as "false information with the intention to deceive public opinion".[1][2][6] Operation INFEKTION was a Soviet disinformation campaign intended to convince the world that the U.S. invented AIDS.[1][6][7] The U.S. did not actively counter disinformation until 1980, when a fake document reported that the U.S. supported apartheid.[8]

The word disinformation did not appear in English dictionaries until the late 1980s.[1][2] English use increased in 1986, after revelations that the Reagan Administration had engaged in disinformation against Libyan leader Muammar Gaddafi.[9] By 1990 the term was pervasive in U.S. politics,[10] and by 2001 it referred generally to lying and propaganda.[11][12]

Etymology and early usage

The English word disinformation, which did not appear in dictionaries until the late 1980s, is a translation of the Russian дезинформация, transliterated as dezinformatsiya.[2][6][1] Where misinformation refers to inaccuracies that stem from error, disinformation is deliberate falsehood promulgated by design.[4] Misinformation becomes disinformation when known falsehoods are purposefully and intentionally disseminated.[13] Front groups are a form of disinformation, as they fraudulently mislead as to their actual controllers.[14] Disinformation tactics can lead to blowback: unintended negative consequences of the strategy, such as defamation lawsuits or reputational damage.[14] Disinformation is primarily prepared by government intelligence agencies.[15]

The tactic was used as early as the long Roman–Persian Wars; examples include the Battle of Mount Gindarus, the Battle of Telephis–Ollaria, and Heraclius' assault on Persia.

Usage of the term in connection with a Russian tactic began in 1923, when Józef Unszlicht, Deputy Chairman of the KGB precursor the State Political Directorate (GPU), called for the foundation of "a special disinformation office to conduct active intelligence operations".[5] The GPU was the first organization in the Soviet Union to use the term disinformation for its intelligence tactics.[16] William Safire wrote in his 1993 book Quoth the Maven that disinformation was used by the KGB predecessor to mean "manipulation of a nation's intelligence system through the injection of credible, but misleading data".[16] From this point on, disinformation became a tactic of Soviet political warfare known as active measures.[17][5] Active measures were a crucial part of Soviet intelligence strategy involving forgery as covert operation, subversion, and media manipulation.[18] The 2003 encyclopedia Propaganda and Mass Persuasion states that disinformation came from dezinformatsia, a term used by the Russian black propaganda unit known as Service A, which referred to active measures.[17] The term was also used in 1939, in reference to a "German Disinformation Service".[19][20] The 1991 edition of The Merriam-Webster New Book of Word Histories defines disinformation as a probable translation of the Russian dezinformatsiya.[20] The dictionary notes that the English and Russian versions of the word may have developed independently, in parallel, out of ongoing frustration over the spread of propaganda before World War II.[20]

Former Romanian secret police senior official Ion Mihai Pacepa exposed disinformation history in his book Disinformation (2013).[6]

Ion Mihai Pacepa, a former senior official of the Romanian secret police, said the word was coined by Joseph Stalin and used during World War II.[6][1] The Stalinist government then used disinformation tactics in both World War II and the Cold War.[21] Soviet intelligence used the term maskirovka (Russian military deception) to refer to a combination of tactics including disinformation, simulation, camouflage, and concealment.[22] In Disinformation, the book he co-authored with Ronald J. Rychlak, Pacepa wrote that Stalin gave the tactic a French-sounding name in order to put forth the ruse that it was actually a technique of the Western world.[1] Pacepa recounted reading Soviet instruction manuals while working as an intelligence officer; the manuals characterized disinformation as a strategy of the Russian government with early origins in Russian history.[6][1] According to the manuals, disinformation traced back to the phony towns constructed by Grigory Potyomkin in Crimea to impress Catherine the Great during her 1783 journey to the region, subsequently known as Potemkin villages.[6][1]

In their book Propaganda and Persuasion, authors Garth Jowett and Victoria O'Donnell characterized disinformation as a cognate of dezinformatsia, taken from the name of a KGB black propaganda department.[4] That division was reported to have formed in 1955 and was referred to as the Dezinformatsiya agency.[20] Former Central Intelligence Agency (CIA) director William Colby explained how the agency operated: it would place a false article in a left-leaning newspaper.[20] The fraudulent tale would make its way to a Communist periodical before eventually being published by a Soviet newspaper, which would attribute it to undisclosed sources.[20] By this process a falsehood was proliferated globally as a legitimate piece of reporting.[20]

According to Oxford Dictionaries, the English word disinformation, as translated from the Russian dezinformatsiya, began to see use in the 1950s.[23] The term gained wider currency as a form of Soviet tradecraft, defined in the official 1952 Great Soviet Encyclopedia as "the dissemination (in the press, radio, etc.) of false information with the intention to deceive public opinion."[2][6] During the most active period of the Cold War, 1945 to 1989, the tactic was used by multiple intelligence agencies, including the Soviet KGB, the British Secret Intelligence Service, and the American CIA.[19] The word disinformation saw increased usage in the 1960s and wider purveyance by the 1980s.[6] Former Soviet bloc intelligence officer Ladislav Bittman, the first disinformation practitioner to publicly defect to the West, described the official definition as differing from the practice: "The interpretation is slightly distorted because public opinion is only one of the potential targets. Many disinformation games are designed only to manipulate the decision-making elite, and receive no publicity."[2] Bittman was deputy chief of the Disinformation Department of the Czechoslovak Intelligence Service and testified before the United States Congress on his knowledge of disinformation in 1980.[17]

Disinformation may include the distribution of forged documents, manuscripts, and photographs, or the spreading of dangerous rumours and fabricated intelligence. A major disinformation effort in 1964, Operation Neptune, was designed by the Czechoslovak secret service, the StB, to defame West European politicians as former Nazi collaborators.[24]

Defections reveal covert operations

Chief of Russian foreign intelligence Yevgeny Primakov confirmed in 1992 that Operation INFEKTION was a disinformation campaign to make the world believe the U.S. invented AIDS.[6][7]

The extent of Soviet covert disinformation campaigns came to light through the defections of KGB officers and officers of allied Soviet bloc services from the late 1960s through the 1980s.[25][10] Disorder during the fall of the Soviet Union revealed archival and other documentary information that confirmed what the defectors had revealed.[25] Stanislav Levchenko and Ilya Dzhirkvelov both defected from the Soviet Union, and by 1990 each had written a book recounting his work on KGB disinformation operations.[10]

In 1961, a pamphlet titled A Study of a Master Spy (Allen Dulles), highly critical of then-Director of Central Intelligence Allen Dulles, was published in the United Kingdom.[8] The purported authors were Independent Labour Party Member of Parliament Bob Edwards and reporter Kenneth Dunne; in fact the author was KGB Colonel Vassily Sitnikov, a senior disinformation officer.[8]

An example of successful Soviet disinformation was the publication in 1968 of Who's Who in the CIA, which was quoted as authoritative in the West until the early 1990s.[26]

According to senior SVR officer Sergei Tretyakov, the KGB was responsible for creating the entire nuclear winter story to stop the deployment of Pershing II missiles.[27] Tretyakov says that from 1979 the KGB wanted to prevent the United States from deploying the missiles in Western Europe and that, directed by Yuri Andropov, they distributed disinformation, based on a faked "doomsday report" by the Soviet Academy of Sciences about the effect of nuclear war on climate, to peace groups, the environmental movement and the journal AMBIO: A Journal of the Human Environment.[27]

Cover of Deception, Disinformation, and Strategic Communications, illustrating propaganda from Operation INFEKTION

During the 1970s, the U.S. intelligence apparatus paid little attention to countering Soviet disinformation campaigns.[8] This posture changed in September 1980, during the Carter Administration, when the White House was targeted by a Soviet intelligence propaganda operation concerning relations between the U.S. and South Africa.[8] On 17 September 1980, White House Press Secretary Jody Powell acknowledged the existence of a falsified Presidential Review Memorandum on Africa, which reportedly stated that the U.S. endorsed the apartheid government in South Africa and was actively committed to discrimination against African Americans.[8] Prior to this revelation by Powell, an advance copy of the 18 September 1980 issue of the San Francisco-based publication the Sun Reporter, which carried the fake claims, had been disseminated.[8] The Sun Reporter was published by Carlton Benjamin Goodlett, a Presidential Committee member of the World Peace Council, a Soviet front group.[8] U.S. President Jimmy Carter was appalled at the lies, and the Carter Administration subsequently displayed increased interest in CIA efforts to counter Soviet disinformation.[8]

In 1982, the CIA issued a report on active measures used by Soviet intelligence.[28] The report documented numerous instances of disinformation campaigns against the U.S., including planting the notion that the U.S. had organized the 1979 Grand Mosque seizure and forging documents purporting to show that the U.S. would use nuclear bombs on its NATO allies.[28]

Operation INFEKTION was an elaborate disinformation campaign, begun in 1985, to influence world opinion to believe that the United States had invented AIDS.[6][7] This included the allegation that the purpose was the creation of an 'ethnic bomb' to destroy non-whites.[7] In 1992, the head of Russian foreign intelligence, Yevgeny Primakov, admitted the existence of the Operation INFEKTION disinformation campaign.[6][7]

In 1985, Aldrich Ames gave the KGB a significant amount of information on CIA agents, and the Soviet government swiftly moved to arrest those individuals.[29] Soviet intelligence feared this rapid action would alert the CIA that Ames was a spy.[29] To reduce the chances of the CIA discovering Ames's duplicity, the KGB manufactured disinformation about the reasons behind the arrests of U.S. intelligence agents.[29] During the summer of 1985, a KGB officer who was a double agent for the CIA, returning home from a mission in Africa, traveled to a dead drop in Moscow but never reported in.[29] The CIA heard from a European KGB source that their agent had been arrested.[29] Simultaneously the FBI and CIA learned from a second KGB source of their agent's arrest.[29] Only after Ames had been exposed as a KGB spy did it become apparent that the KGB had known all along that both of these agents were double agents for the U.S. government, and had played them as pawns to send disinformation to the CIA in order to protect Ames.[29]

Post-Soviet era Russian disinformation

In the post-Soviet era, disinformation evolved to become a key tactic in the military doctrine of Russia.[30]

The European Union and NATO saw Russian disinformation in the early 21st century as such a problem that they both set up special units to analyze and debunk fabricated falsehoods.[30] NATO founded a modest facility in Latvia to respond to disinformation,[31] and, following agreement by heads of state and government in March 2015, the EU created the European External Action Service East Stratcom Task Force, which publishes weekly reports on its website "EU vs Disinfo".[32] The website and its partners identified and debunked over 3,500 pro-Kremlin disinformation cases between September 2015 and November 2017.[32]

Russia's methods during this period included the use of Kremlin-controlled mouthpieces: the news agency Sputnik News and the television outlet Russia Today (RT).[30] When explaining the 2016 annual report of the Swedish Security Service on disinformation, representative Wilhelm Unge stated: "We mean everything from Internet trolls to propaganda and misinformation spread by media companies like RT and Sputnik."[30]

Later in the 21st century, as social media gained prominence, Russia began to use platforms such as Facebook and Twitter to spread disinformation. Facebook believes that as many as 126 million users saw content from Russian disinformation campaigns on its platform. Twitter has said that it found 36,000 Russian bots spreading tweets related to the 2016 American election.[33] Elsewhere, Russia has used social media to destabilize former Soviet states such as Ukraine, as well as Western nations such as France and Spain.[34]

English language spread

How Disinformation Can Be Spread, explanation by U.S. Defense Department (2001)

The United States Intelligence Community appropriated the term disinformation from the Russian dezinformatsiya in the 1950s, and began to use similar strategies[5][35] during the Cold War and in conflict with other nations.[6] The New York Times reported in 2000 that during the CIA's effort to replace then-Prime Minister of Iran Mohammad Mossadegh with Mohammed Reza Pahlavi, the CIA placed fictitious stories in the local newspaper.[6] Reuters documented how, after the 1979 Soviet invasion of Afghanistan during the Soviet–Afghan War, the CIA placed false articles in newspapers of Islamic-majority countries, inaccurately stating that Soviet embassies had held "invasion day celebrations".[6] Reuters noted that, according to a former U.S. intelligence officer, the CIA would attempt to gain the confidence of reporters and use them as secret agents to influence a nation's politics through its local media.[6]

In October 1986, the term gained increased currency in the U.S. when it was revealed that two months earlier the Reagan Administration had engaged in a disinformation campaign against then-leader of Libya Muammar Gaddafi.[9] White House representative Larry Speakes said reports of a planned attack on Libya, first broken by The Wall Street Journal on August 25, 1986, were "authoritative", and other newspapers, including The Washington Post, then wrote articles saying this was factual.[9] United States Department of State representative Bernard Kalb resigned from his position in protest over the disinformation campaign, saying: "Faith in the word of America is the pulse beat of our democracy."[9]

The executive branch of the Reagan Administration kept watch on disinformation campaigns through three yearly publications by the Department of State: Active Measures: A Report on the Substance and Process of Anti-U.S. Disinformation and Propaganda Campaigns (1986); Report on Active Measures and Propaganda, 1986–87 (1987); and Report on Active Measures and Propaganda, 1987–88 (1989).[5]

The word disinformation first appeared in dictionaries in 1985, specifically in Webster's New College Dictionary and the American Heritage Dictionary.[36] In 1986, the term was not yet defined in Webster's New World Thesaurus or the New Encyclopædia Britannica.[1] After the Soviet term became widely known in the 1980s, native speakers of English broadened it to mean "any government communication (either overt or covert) containing intentionally false and misleading material, often combined selectively with true information, which seeks to mislead and manipulate either elites or a mass audience."[3]

By 1990, use of the term disinformation had fully established itself in the English lexicon of politics.[10] By 2001, disinformation had come to be seen as merely a more civil way of saying that someone was lying.[11] Stanley B. Cunningham wrote in his 2002 book The Idea of Propaganda that disinformation had become pervasively used as a synonym for propaganda.[12]

Analysis

The authors of Snakes in Suits, a 2006 book about psychopathy in the workplace, describe a five-phase model of how a typical workplace psychopath climbs to and maintains power. In the third phase, manipulation, the psychopath creates a scenario of "psychopathic fiction" in which positive information about themselves and negative disinformation about others is spread, casting others as pawns or patrons to be used and groomed into accepting the psychopath's agenda.[37]

Responses from cultural leaders

Pope Francis criticized disinformation in a 2016 interview, after being made the subject of a fake news website during the 2016 U.S. election cycle that falsely claimed he supported Donald Trump.[38][39][40] He said the worst thing the news media could do was spread disinformation, and that doing so was a sin,[41][42] comparing those who spread disinformation to individuals who engage in coprophilia.[43][44]

Ethics in warfare

In a contribution to the 2014 book Military Ethics and Emerging Technologies, David Danks and Joseph H. Danks discuss the ethical implications of using disinformation as a tactic in information warfare.[45] They note that there has been a significant degree of philosophical debate over the issue as it relates to the ethics of war and the use of the technique.[45] The writers describe a position whereby the use of disinformation is occasionally allowed, but not in all situations.[45] Typically the ethical test is whether the disinformation was performed in good faith and is acceptable according to the rules of war.[45] By this test, the World War II tactic of placing fake inflatable tanks in visible locations on the Pacific Islands, in order to give the false impression of a larger military force, would be considered ethically permissible.[45] Conversely, disguising a munitions plant as a healthcare facility in order to avoid attack would fall outside the bounds of acceptable use of disinformation during war.[45]

Research

Consequences of exposure to disinformation online

There is broad consensus among scholars that disinformation, misinformation, and propaganda are widespread online; however, it is unclear what effect such disinformation has on political attitudes among the public, and therefore on political outcomes.[46] This conventional wisdom has come mostly from investigative journalists, with a particular rise during the 2016 US election: some of the earliest work came from Craig Silverman at Buzzfeed News.[47] Cass Sunstein supported this view in #Republic, arguing that the internet would become rife with echo chambers and informational cascades of misinformation, leading to a highly polarised and ill-informed society.[48]

However, research on this topic points less clearly in that direction. For example, internet access and time spent on social media do not appear to be correlated with polarisation.[49] Further, misinformation appears not to significantly change the political knowledge of those exposed to it.[50] Users seem to be exposed to a greater diversity of news sources on Facebook and Twitter than conventional wisdom would suggest, and to a higher frequency of cross-spectrum discussion.[51][52] Other evidence has found that disinformation campaigns rarely succeed in altering the foreign policies of the targeted states.[53]

Strategies for spreading disinformation

There are four main methods of spreading disinformation recognised in academic literature:[46]

  1. Selective censorship
  2. Manipulation of search rankings
  3. Hacking and releasing
  4. Directly sharing disinformation

References

  1. ^ a b c d e f g h i j k l Ion Mihai Pacepa and Ronald J. Rychlak (2013), Disinformation: Former Spy Chief Reveals Secret Strategies for Undermining Freedom, Attacking Religion, and Promoting Terrorism, WND Books, pp. 4–6, 34–39, 75, ISBN 978-1-936488-60-5
  2. ^ a b c d e f g Bittman, Ladislav (1985), The KGB and Soviet Disinformation: An Insider's View, Pergamon-Brassey's, pp. 49–50, ISBN 978-0-08-031572-0
  3. ^ a b c Shultz, Richard H.; Godson, Roy (1984), Dezinformatsia: Active Measures in Soviet Strategy, Pergamon-Brassey's, pp. 37–38, ISBN 978-0-08-031573-7
  4. ^ a b c Garth Jowett; Victoria O'Donnell (2005), "What Is Propaganda, and How Does It Differ From Persuasion?", Propaganda and Persuasion, Sage Publications, pp. 21–23, ISBN 978-1-4129-0898-6, In fact, the word disinformation is a cognate for the Russian dezinformatsia, taken from the name of a division of the KGB devoted to black propaganda.
  5. ^ a b c d e Martin J. Manning; Herbert Romerstein (2004), "Disinformation", Historical Dictionary of American Propaganda, Greenwood, pp. 82–83, ISBN 978-0-313-29605-5
  6. ^ a b c d e f g h i j k l m n o p Taylor, Adam (26 November 2016), "Before 'fake news,' there was Soviet 'disinformation'", The Washington Post, retrieved 3 December 2016
  7. ^ a b c d e United States Department of State (1987), Soviet Influence Activities: A Report on Active Measures and Propaganda, 1986–87, Washington D.C.: Bureau of Public Affairs, pp. 34–35, 39, 42
  8. ^ a b c d e f g h i Waller, J. Michael (2009), Strategic Influence: Public Diplomacy, Counterpropaganda, and Political Warfare, Institute of World Politics Press, pp. 159–161, ISBN 978-0-9792236-4-8
  9. ^ a b c d Biagi, Shirley (2014), "Disinformation", Media/Impact: An Introduction to Mass Media, Cengage Learning, p. 328, ISBN 978-1-133-31138-6
  10. ^ a b c d Martin, David (1990), The Web of Disinformation: Churchill's Yugoslav Blunder, Harcourt Brace Jovanovich, p. xx, ISBN 978-0-15-180704-8
  11. ^ a b Barton, Geoff (2001), Developing Media Skills, Heinemann, p. 124, ISBN 978-0-435-10960-8
  12. ^ a b Cunningham, Stanley B. (2002), "Disinformation (Russian: dezinformatsiya)", The Idea of Propaganda: A Reconstruction, Praeger, pp. 67–68, 110, ISBN 978-0-275-97445-9
  13. ^ Golbeck, Jennifer, ed. (2008), Computing with Social Trust, Human-Computer Interaction Series, Springer, pp. 19–20, ISBN 978-1-84800-355-2
  14. ^ a b Samier, Eugene A. (2014), Secrecy and Tradecraft in Educational Administration: The Covert Side of Educational Life, Routledge Research in Education, Routledge, p. 176, ISBN 978-0-415-81681-6
  15. ^ Goldman, Jan (2006), "Disinformation", Words of Intelligence: A Dictionary, Scarecrow Press, p. 43, ISBN 978-0-8108-5641-7
  16. ^ a b Senn, Ann (1995), Open Systems for Better Business: Something Ventured, Something Gained, Van Nostrand Reinhold, p. 25, ISBN 978-0-442-01911-2
  17. ^ a b c Nicholas John Cull; David Holbrook Culbert; David Welch (2003), "Disinformation", Propaganda and Mass Persuasion: A Historical Encyclopedia, 1500 to the Present, ABC-CLIO, p. 104, ISBN 9781610690713
  18. ^ Ostrovsky, Arkady (5 August 2016), "For Putin, Disinformation Is Power", The New York Times, retrieved 9 December 2016
  19. ^ a b Henry Watson Fowler; Jeremy Butterfield (2015), Fowler's Dictionary of Modern English Usage, Oxford University Press, p. 223, ISBN 978-0-19-966135-0
  20. ^ a b c d e f g "disinformation", The Merriam-Webster New Book of Word Histories, Springfield, Massachusetts: Merriam-Webster, Inc, 1991, pp. 143–144, ISBN 978-0-87779-603-9
  21. ^ Mendell, Ronald L. (2013), "Disinformation", Investigating Information-based Crimes, Charles C Thomas Publisher Ltd, p. 45, ISBN 978-0-398-08871-2
  22. ^ Hy Rothstein; Barton Whaley (2013), "Catching NATO Unawares: Soviet Army Surprise and Deception Techniques", The Art and Science of Military Deception, Artech House Intelligence and Information Operations, Artech House Publishers, pp. 189–192, ISBN 978-1-60807-551-5
  23. ^ "disinformation", English Oxford Living Dictionaries, Oxford University Press, 2016, retrieved 9 December 2016
  24. ^ Bittman, Ladislav (1972), The Deception Game: Czechoslovak Intelligence in Soviet Political Warfare, Syracuse University Research Corporation, pp. 39–78, ISBN 978-0-8156-8078-9
  25. ^ a b Holland, Max (2006), "The Propagation and Power of Communist Security Services Dezinformatsiya", International Journal of Intelligence and CounterIntelligence, 19 (1): 1–31, doi:10.1080/08850600500332342
  26. ^ United States Information Agency (1992), "Crude, Anti-American Disinformation: 'Geheim' and 'Top Secret' Magazines: Purveyors of Crude, Defamatory Disinformation", Soviet Active Measures in the 'Post-Cold War' Era 1988–1991: A Report Prepared at the Request of the United States House of Representatives Committee on Appropriations by the United States Information Agency, Washington, D.C.: United States Government Printing Office
  27. ^ a b Earley, Pete (2007), Comrade J: The Untold Secrets of Russia's Master Spy in America After the End of the Cold War, Penguin Books, pp. 167–177, ISBN 978-0-399-15439-3
  28. ^ a b Goulden, Joseph (2012), "Disinformation (dezinformatsiya)", The Dictionary of Espionage: Spyspeak into English, Dover Military History, Weapons, Armor, Dover Publications, p. 64, ISBN 978-0-486-48348-1
  29. ^ a b c d e f g Johnson, Loch K., ed. (2012), "Counterintelligence as Disinformation Operations", The Oxford Handbook of National Security Intelligence, Oxford Handbooks, Oxford University Press, pp. 548–550, ISBN 978-0-19-992947-4
  30. ^ a b c d MacFarquharaug, Neil (28 August 2016), "A Powerful Russian Weapon: The Spread of False Stories", The New York Times, p. A1, retrieved 9 December 2016
  31. ^ Anne Applebaum; Edward Lucas (6 May 2016), "The danger of Russian disinformation", The Washington Post, retrieved 9 December 2016
  32. ^ a b "EU vs Disinfo". EU vs Disinfo. European External Action Service East Stratcom Task Force. Retrieved 3 December 2017.
  33. ^ "Russia Using Disinformation To 'Sow Discord In West,' Britain's Prime Minister Says". NPR.org. Retrieved 20 February 2018.
  34. ^ "How Russia's Disinformation Campaign Could Extend Its Tentacles". NPR.org. Retrieved 20 February 2018.
  35. ^ Murray-Smith, Stephen (1989), Right Words, Viking, p. 118, ISBN 978-0-670-82825-8
  36. ^ Bittman, Ladislav (1988), The New Image-Makers: Soviet Propaganda & Disinformation Today, Brassey's Inc, pp. 7, 24, ISBN 978-0-08-034939-8
  37. ^ Babiak, Paul; Hare, Robert D. (2007), Snakes in Suits: When Psychopaths Go to Work, HarperCollins, p. 240, ISBN 978-0061147890
  38. ^ "Pope Warns About Fake News-From Experience", The New York Times, Associated Press, 7 December 2016, retrieved 7 December 2016
  39. ^ Alyssa Newcomb (15 November 2016), "Facebook, Google Crack Down on Fake News Advertising", NBC News, NBC News, retrieved 16 November 2016
  40. ^ Schaede, Sydney (24 October 2016), "Did the Pope Endorse Trump?", FactCheck.org, retrieved 7 December 2016
  41. ^ Pullella, Philip (7 December 2016), "Pope warns media over 'sin' of spreading fake news, smearing politicians", Reuters, retrieved 7 December 2016
  42. ^ "Pope Francis compares fake news consumption to eating faeces", The Guardian, 7 December 2016, retrieved 7 December 2016
  43. ^ Zauzmer, Julie (7 December 2016), "Pope Francis compares media that spread fake news to people who are excited by feces", The Washington Post, retrieved 7 December 2016
  44. ^ Griffin, Andrew (7 December 2016), "Pope Francis: Fake news is like getting sexually aroused by faeces", The Independent, retrieved 7 December 2016
  45. ^ a b c d e f Danks, David; Danks, Joseph H. (2014), "The Moral Responsibility of Automated Responses During Cyberwarfare", in Timothy J. Demy; George R. Lucas Jr.; Bradley J. Strawser (eds.), Military Ethics and Emerging Technologies, Routledge, pp. 223–224, ISBN 978-0-415-73710-4
  46. ^ a b Tucker, Joshua; Guess, Andrew; Barbera, Pablo; Vaccari, Cristian; Siegel, Alexandra; Sanovich, Sergey; Stukal, Denis; Nyhan, Brendan (2018). "Social Media, Political Polarization, and Political Disinformation: A Review of the Scientific Literature". SSRN Electronic Journal. doi:10.2139/ssrn.3144139. ISSN 1556-5068.
  47. ^ "This Analysis Shows How Viral Fake Election News Stories Outperformed Real News On Facebook". BuzzFeed News. Retrieved 29 October 2019.
  48. ^ Sunstein, Cass R. (2017), #Republic: Divided Democracy in the Age of Social Media, Princeton University Press, ISBN 9780691175515, OCLC 958799819
  49. ^ Boxell, Levi; Gentzkow, Matthew; Shapiro, Jesse M. (3 October 2017). "Greater Internet use is not associated with faster growth in political polarization among US demographic groups". Proceedings of the National Academy of Sciences. 114 (40): 10612–10617. doi:10.1073/pnas.1706588114. ISSN 0027-8424. PMC 5635884. PMID 28928150.
  50. ^ Allcott, Hunt; Gentzkow, Matthew (May 2017). "Social Media and Fake News in the 2016 Election". Journal of Economic Perspectives. 31 (2): 211–236. doi:10.1257/jep.31.2.211. ISSN 0895-3309.
  51. ^ Bakshy, E.; Messing, S.; Adamic, L. A. (5 June 2015). "Exposure to ideologically diverse news and opinion on Facebook". Science. 348 (6239): 1130–1132. doi:10.1126/science.aaa1160. ISSN 0036-8075.
  52. ^ Wojcieszak, Magdalena E.; Mutz, Diana C. (1 March 2009). "Online Groups and Political Discourse: Do Online Discussion Spaces Facilitate Exposure to Political Disagreement?". Journal of Communication. 59 (1): 40–56. doi:10.1111/j.1460-2466.2008.01403.x. ISSN 0021-9916.
  53. ^ Lanoszka, Alexander (2019). "Disinformation in international politics". European Journal of International Security: 1–22. doi:10.1017/eis.2019.6. ISSN 2057-5637.
