
1998 United States Senate election in Missouri

From Wikipedia, the free encyclopedia

1998 United States Senate election in Missouri

Election date: November 3, 1998 (previous election: 1992; next: 2004)

Nominee        Kit Bond     Jay Nixon
Party          Republican   Democratic
Popular vote   830,625      690,208
Percentage     52.7%        43.8%

County results map:
Bond:  40–50%  50–60%  60–70%  70–80%
Nixon: 40–50%  50–60%  60–70%

U.S. Senator before election

Kit Bond
Republican

Elected U.S. Senator

Kit Bond
Republican

The 1998 United States Senate election in Missouri was held on November 3, 1998. Incumbent Republican U.S. Senator Kit Bond won re-election to a third term.[1]
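As a quick sanity check, the raw totals and percentages in the results table are mutually consistent. The short Python sketch below (a hypothetical illustration, not part of the article; variable names are my own, and the total vote is inferred from Bond's reported share since only the two major candidates are listed) verifies this:

```python
# Hypothetical consistency check on the reported election results.
bond_votes = 830_625   # Kit Bond (Republican)
nixon_votes = 690_208  # Jay Nixon (Democratic)
bond_pct = 52.7        # Bond's reported share of the total vote

# Infer the total vote from Bond's reported share, then recompute
# Nixon's share and the winning margin from the raw counts.
total_est = bond_votes / (bond_pct / 100)
nixon_pct = 100 * nixon_votes / total_est
margin_pct = bond_pct - nixon_pct

print(f"Nixon share: {nixon_pct:.1f}%")        # 43.8%, matching the table
print(f"Bond margin: {margin_pct:.1f} points") # about 8.9 points
```

The recomputed 43.8% for Nixon matches the table, so the figures hang together.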

YouTube Encyclopedic

  • Misinformation in American Politics
Transcription

>> Let's go ahead and get started. We've got actually a lot to get through today. All right, so this is the first event for the new Project for an Informed Electorate. It's a new organization on campus that seeks to inform the electorate, just like it sounds like. We have a bunch of upcoming events this fall, all of them having something to do with learning about voting, learning about politics, elections, and so forth. So for example we have debate viewings. There's one on October 3rd, the First Presidential Debate here on campus. What we do there is we watch the debate together and then we have a discussion. There'll be some political scientists in the audience and we'll have a discussion afterwards of what went on in the debate. We'll have some pre and post surveys to see if people changed their minds. So there's one for the First Presidential and one for the Vice Presidential. For the students out there I should mention, there will be pizza at these events. So free food is available so please come for that. Then we also have a couple of initiative explainers. There are 11 initiatives on our ballot this year and they can be very confusing. There are some that are actually duplicates, almost, and so they're especially confusing because of that. And so what we'll do at those two events is explain what the initiatives are in a non-partisan way-- the pros, the cons, who's endorsing what, what money's being spent on each side. All that kind of good stuff. So if you bring your voter pamphlet that you get in the mail, your sample ballot then you can, you know, just check off what you think and it'll be, you know, done with and you can help your neighbors understand how better to vote too. So there's one here on campus and then there's another one that's downtown at the library main branch downtown in the evening. For those of you who find it less convenient to come to campus, you can go to that one downtown. 
Those are basically going to be the same thing, so pick one or the other if you want to come. And then we have a lecture on who will win the presidential election coming up on the 24th which ought to be very good. It's by David Barker, a political scientist at Pitt who is coming to Sacramento State, so that should be fantastic. And then a recap a week after the election where we'll talk about what happened during the election, and I have a panel of experts that will be fantastic. So all those events are coming up. Our website is listed here at the top. We also have a lot of voter information. The website is set up so that you can click through to all kinds of issues, initiative explanations, all kinds of content that you might need to vote better. So check out that website. It also has a schedule and all the details here too. So that's the Project for an Informed Electorate. I want to thank the office of public affairs in particular for helping with the publicity for this event and for putting together the website as well. All right, the topic for today-- I guess I'll use this microphone. It sounds better, right? All right the topic for today is misinformation in American politics. You probably, if you follow politics at all can think of all kinds of examples of misinformation. When people think they understand something but they don't. So I want to distinguish between misinformation and ignorance. So ignorance is when people don't know much about something. They just basically don't have the information. Ignorance is a bad thing in a democracy when we're expected to vote on important issues, but we know what the solutions are. We give people the facts. We give people the information. They learn it and, you know, that's why we have classes like introduction to government that we make students take here so that you'll learn that basic information and leave the status of ignorant and become informed. 
Misinformation, on the other hand, is when people think they understand the facts but they're wrong. So they have misinformation. And that's much more of a problem for American politics or any democracy, because if people think they understand something already, they're not seeking out new information and they may be resistant to corrections, as we'll see later. So misinformation is much more of a problem for our political world. The other problem with misinformation is that it's factually wrong and also often strongly held. So people think they know, and they feel strongly about their opinion that is actually based on false information. So today what we'll do is go through several high-profile examples of misinformation in recent American politics and some current events. I know there are plenty of other examples you could come up with, and those are probably valid too. These are chosen because they are verifiable and also because they illustrate something that's more important and, you know, give us some insight into this phenomenon of misinformation. After we talk about these examples we'll talk about some of my research that's more California-based, which tells us something about who holds misinformation and sadly will actually be even a little more depressing than these original examples. After that we'll talk about what it means, and then at the end will be the hopeful section where we'll talk about some possible solutions, ways to prevent it in the first place and things we can do about it if it does happen. So the first example: the Affordable Care Act, which is the official title for it. It's been nicknamed Obamacare by opponents on the right, and recently the Obama administration has actually said, that's fine, we'll call it that. It sounds like Obama cares. So we can call it Obamacare at this point, I guess. But there's been lots of opposition to it, clearly. So this is a chart that shows the public opinion.
Each of these dots is a different poll, and you can see that before people knew much about it, when they first heard about it, there was more approval and very little disapproval. This is, you know, before the passage of the bill, and here, where you have lots of polls on it, is before the political debate really started. Once the debate really started, you ended up with more opposition than support. So the red line is the opposition. The black line is the support for it. So as it's debated, the opposition gets higher as we're hearing lots of pundits and politicians talk about it. And as it passes, we still have majority oppos- or plurality opposition, and as we go through time to the most current data, that continues. Overall, more people are opposed than support the Affordable Care Act. However, if you break it down and you look at what's actually in the Affordable Care Act, the actual details, there's overwhelming support for what's in the bill. So for example, create an insurance exchange pool that makes it affordable for people because we're buying in bulk. 79% overall approve of that. Obviously democrats more than republicans, but nevertheless you have strong approval and a majority among republicans. And that's true with all of these examples: there's a majority among republicans as well as democrats, and overall in the electorate as well. So that children can stay on their parents' insurance until the age of 26. Strong support for that. Subsidy assistance to individuals who can't pay for their insurance completely on their own, huge support. Banning of pre-existing condition exclusions. So previously, if people had a pre-existing medical condition they could be kept off of health insurance. Big support for changing that. Eliminate the lifetime cap on insurance payments, and requiring that companies with 50 or more employees insure their employees.
So as you can see, this is a really interesting pattern. Majorities support these items in the bill, and there are many other items in the bill-- now the law-- that have similar numbers and majority support. The only thing that doesn't is the mandate that requires that everyone actually buy health insurance, but the rest of it doesn't work without that, because that's what allows the insurance companies to do all these new things that cost them more money: they get more people who are on the insurance rolls. So this is really curious. If you look at what's in the bill, people like it. If you look at the opinion of the overall law, they don't like it. And that's because of misinformation. It's likely because there was a lot of publicity around ideas about this bill that were not true. The idea of a government takeover of healthcare, which it was not-- it's very much still with private insurance. And of course the infamous death panel idea. This is a phrase that was coined by Sarah Palin. It's not based in any reality within the bill. There's nothing like this, but this misinformation, this idea, got planted in a lot of people's minds. So the question is, if Obama's plan became law-- this is before it passed-- do you think senior citizens or seriously ill patients would die because government panels would prevent them from getting the medical treatment they needed? So 41% thought yes, people would die because of Obamacare. Well, if you believe that, of course you're going to be opposed to the idea of the bill. It's just not true, though. And so this misinformation is driving the overall disapproval of it. When they know what's in the bill, they like it. But when they don't, they don't. Next example: birthers. Birthers are the people who are under the misimpression that President Obama was not born in the United States.
Now, our constitution actually requires that the president be a natural born citizen, so that's a really serious charge. If you believe he wasn't born in the United States, you believe that he is constitutionally prohibited from holding the office that he holds. Nevertheless, in spite of all sorts of factual evidence, people still hold that belief. So originally, the information that was out there: a birth certificate was produced; if you go back to 1961, several Hawaiian newspapers had printed announcements of his birth at the time; and Hawaiian officials certified that in fact he was born in Hawaii. Nevertheless, people still believe this. So this is a question that was taken before the release of the long form birth certificate. There was this big debate about, well, you know, he hasn't yet released the long form birth certificate, and that, you know, that's why we know that he's not really born in the United States. So before that, 55% thought he was born in the United States. 15% thought he wasn't. And 30% were not sure. So that's, you know, 45% who are not convinced that he's legitimately president of the United States. That's a pretty big deal. So then they released the long form birth certificate. There's a copy of it if you want to look at it for yourself. It's the final piece of possible evidence that one could produce to prove that he was born in the U.S. The good news is that people did correct their opinions. There was change. People did come around to the reality and the facts in that particular situation. So here are the before numbers and here are the after numbers, and we go from 55 saying it's true to 67 saying it's true. You have fewer people who aren't sure, and fewer people who think it's false. So that's good. The correction worked, right? We got the new information that helps us understand the truth. Sadly, the misinformation came back over time.
So here are the most recent numbers, from July 2012, just a couple of months ago. You can see before the release of the long form birth certificate, after it, and then it bounces back. And in fact it actually gets a little bit worse. There are fewer people who don't know, and more of them have moved into the he's-not-born-in-the-United-States category. So it actually backfired to a small extent, and the new information certainly didn't persist. That's especially strong among republicans, of whom only 31%-- not even a third-- think he's legitimate as the president of the United States. And that's probably to be expected, because they have more of an ideological reason to want to believe that he's not legitimate and shouldn't be president of the United States. But that's disturbing, right? The misinformation gets corrected in the best way possible. People believe it at first, but then they bounce back. Third example. This is a study by the Program on International Policy Attitudes of issues that were prominent in the 2010 election. A lot of those are still prominent today. What they did is break it down by ideology and partisanship. So which people from which side have these pieces of information-- misinformation-- in their minds? So: my income taxes have not gone down during the Obama administration. Huge majorities of both democrats and republicans had that misinformation. Taxes did in fact go down for 95% of Americans. There were a lot of tax cuts in the stimulus, so that number, if it were to reflect reality, would be 5%. But instead it's 92 and 82. Most economists who have studied the stimulus estimated it saved or created only a few jobs or caused job losses. It's the opposite. They estimate that it saved millions of jobs, but still you have huge majorities of both parties thinking otherwise. The bailout of GM and Chrysler was not something that took place under both Presidents Bush and Obama.
Again, the misinformation by both parties. Then we have the breakdown by party. There are some things that republicans were more likely to believe and some things democrats were more likely to believe that were not true. The American economy is getting worse. Technically it was getting better on the recovery by then. So republicans more likely to believe that because their ideology would lead them to do that. Among economists who estimate the effect of the healthcare law, more think it will increase the deficit. It's the other way around, they think it'll decrease the deficit. So you can see a huge disconnect between democrats and republicans. The stimulus legislation did not include tax cuts. It did. That was a large chunk of it. There's not agreement amongst most scientists that climate change is occurring. There is consensus. And then it is not clear the birther thing as well. Big differential, right? This is also true for democrats. So when there are questions wherein they would prefer a different answer, they also are more likely to hold the misinformation. So it was proven that the US Chamber of Commerce spent large amounts of money raised from foreign sources to support republican candidates. 57% of democrats thought that was true even though it was not. The TARP, which was the bank bailout, was voted on. When it was voted on, democrats did not mostly favor it. They did. So again, a reason that they would want a different answer. And Obama has not increased troop levels in Afghanistan, which he did. So again, the ideology, the partisanship drives people to want to think things are true that would confirm what they already believe. So now we're moving onto some of my own research on this question of misinformation. Now misinformation is bad. All these examples we've looked at would lead people to make poor choices in voting, form opinions that might be misinformed. It could be dangerous for democracy. 
We assume, though, that people who are politically sophisticated-- that's the term political scientists use-- the people who follow politics, who are educated, who pay a lot of attention, who are higher income, who are more attentive in general, you would think those people would have less misinformation, right? Like they would be the people who could sort through it, who could pick out the things that weren't true. Sadly and depressingly enough, my research shows the opposite: that in fact there are examples wherein the people who should know the most know the least. So I have several examples. One is proposition 13, some California government questions that are on a more recent poll, and then the example of tea party activists. So tea partiers are a new group on the right that are very concerned about big government issues, very conservative. You would expect people who are activists, who are very engaged in politics, would also be the people who would know a lot about politics. My numbers do show that tea party activists, the strongly ide- people who strongly identify with the tea party, do know more than other comparison groups on questions of basic civics. So, you know, how many years do you serve in the house and the senate, those kinds of questions. They do very well. But the misinformation comes out in that group too. So what we know from political science is that certain people should know more: high income. People who are older. People with higher education. People who pay more attention to politics. People who know other things about politics and government. And then people who participate, so people who are registered to vote or who vote. And then the information environment, meaning what kind of news sources they're gathering the information from. If they have good news sources, then they should know more. This is the question on prop 13. So proposition 13, for those of you who don't know, was an initiative passed back in 1978.
What it did was put a cap on property taxes. That cap now means that people who bought their properties in 1978 pay a much lower tax rate than people who buy their properties now. So you could have two houses right next to each other. One homeowner who bought in 1978 would pay a much lower rate of taxes on the property than the person next door who bought more recently. It has been blamed for a lot of the lack of funding for education and all sorts of other things in California, and it's been something that's discussed year after year after year because it is so important for the funding and because there are these possible inequities. And also because homeowners love it, right, because it keeps their property taxes low. So homeowners don't want to change proposition 13. So the question was: as you may know, in 1978 California voters approved proposition 13, which reduced local property taxes. To the best of your knowledge, did prop 13's tax reduction apply only to residential, only to commercial, or both? Do you know the real answer? How many people think it's residential only? Commercial only? Both? Oh, you guys are smart. Look at that. So you guys actually bucked the trend that we found in the data. You actually do the same sort of, you had the same answers as the high school dropouts, so you should be proud of yourselves. So what we found in this data is actually the opposite of what you would expect. The people with-- the high school dropouts were more likely to get it right, that it's both. And the people with advanced degrees were more likely to get it wrong. So it runs opposite of what you would expect, which is really bizarre.
And that's why I asked the question in 2005 on the statewide field poll, and got these results and thought there's got to be something wrong with the data, with the coding, something, and waited until I had another chance to get it on the poll in 2009 and saw the same pattern and then I felt like, okay, I can publish off of this because it's so bizarre. All right so by education it's the opposite. By income-- you would think that people with lower income who maybe can't afford a house aren't thinking about property, would do worse. No, they do better. And the higher income people do worse. Political attentiveness. So how much do you pay attention to politics? So, it would make sense the people who pay a lot of attention would do better, but no it's the opposite. The people who do the best are those who don't pay a lot of attention. The people who do the worst are the people who do pay a lot of attention. Age. You would, you could forgive people who weren't even born in 1978 for getting this wrong. You would think the people who were around would get it right. No, the opposite. Knowledge. So this is an index of knowledge questions, other knowledge questions, how well they did. You'd think people who did well on other questions would do well on this one. No, the opposite thing is true here too. And then voter registration. People who are registered to vote should be more attentive, care more about politics. Same pattern. Home ownership. This is about property taxes so you'd think people who own homes would at least understand it. No, same pattern. People who don't own homes, people who are renters are more likely to get it right, and people who own homes are more likely to get it wrong. So that's weird, right? It's the opposite of what we'd expect. People have this misinformation. 
There's a pattern in the Prop 13 data, which is that the people who pick it wrong, the people who should be politically sophisticated, pick residential only, probably because that's what they care about, right? They're more likely to be homeowners or want to be homeowners, and so that's what they focus on with this law, in spite of the fact that it applies to businesses too and there are lots of potential problems with that, including the non-competitiveness of businesses who've owned their property since 1978 versus ones that bought their property more recently, right? So I went looking for this same phenomenon-- I thought, okay, so it happens with prop 13. That can't be the only place. There must be other examples where the people who should know more don't, and the opposite is true. Sure enough, on a few of these questions asked last year, it was also true. So: the average CalPERS recipient-- that's the public employees' retirement system here in California-- now gets less than $30,000 a year in retirement pay. That's true; they get $27,000 on average as of this poll. People who paid the least attention to the news were more likely to get it right. The people who paid the most attention to the news, more likely to get it wrong. Same with education. The high school dropouts, more likely to get it right. People with advanced degrees, more likely to get it wrong. Income and age, same pattern. So the opposite pattern in this example as well. Same with this question: California has fewer state workers per citizen than all but five other states. That was true at the time of the poll. If you think about it, we have more citizens in general, so that probably makes sense if you do it per capita. But again, people who pay the least attention to the news, more likely to get that right. People who pay the most attention to the news, more likely-- or less likely to get it right. Same thing with education, income, and age.
So this opposite pattern, where the people who should know more, don't. Another example-- so let's talk about that for a second. The people who should know more don't, probably because they're paying attention to news sources and to information that confirms what they already believe. So they're picking out the information that resonates with them and they're ignoring the information that doesn't. So the tea party example. The tea party, again, is an activist group. We would expect them to be engaged in politics and therefore know more. Again, they did do better on civics-book questions, but they did worse on some other questions where they might prefer a different answer than the true answer. So this one: in 2008, when same sex marriages were legal in California, religious organizations like churches were required by law to perform same sex weddings. That's not true. We do have the first amendment to the US Constitution, which would never allow that. Religious organizations can choose to marry people or not based on whether they're a member of that religion, or, you know, if they've been divorced before, or whatever their criteria might be based on their religious principles. However, a substantial percentage of people got that wrong. And the people who identified most with the tea party were more likely to get that question wrong than these other comparison groups. We don't have the data here on, like, occupy protesters or something that might be a sort of similar group on the left. The best I could do with this data set was strong liberals. So they do better on that question, and so does the sample overall. Another example with the tea party is that PERS question, how much public employees get. The tea party is very much about small government and reducing taxes and so forth, so it makes sense that more of them get that one wrong as well. And then the state workers per citizen, same thing. Small government would motivate them, and they're more likely to get that wrong.
You can see that the percentages of misinformation are pretty high here overall, right? In the whole sample, 61% are getting that wrong, but it's a higher incidence amongst those who are really into the tea party, as you might imagine. And then: the state of California spent more on prisons last year than it did on K through 12 education. The reality is we spent about four times more on K through 12 education, but in this example the strong liberals are more likely to get it wrong than anybody else. And that's because they're the ones who are more likely to be concerned about, you know, wanting more spending in public education and being resentful of prison spending. So, you know, even though we're breaking it out by tea party, ideology is really driving all of these answers. The other thing is confidence. It's also a problem that people are confident in their wrong answers, that they're pretty sure about what they think. They're very sure. And we did find that the tea partiers in general were more confident in their responses to all these questions. So these are five questions. They were very confident in 2.3 out of 5, compared to only 1.8 in the entire sample. So they were more confident in general. But they were also more likely to be confident and wrong on these five questions. To be fair, these five questions were ones their ideology would lead them to get wrong, but they were confident and wrong at the same time. And that's more of a problem, that people are confident and wrong. So what have we learned from all of these examples? We've learned that misinformation exists on all kinds of important political issues. We've also learned that it has a partisan and ideological dimension, right: it has a lot to do with what we already believe and what we want to be true.
We also find that misinformation is held not just by the uninformed or the inattentive, but also by the most motivated and supposedly knowledgeable. So that's a problem. And also, the corrections don't always last. So that birther example: there was a factual correction, it worked for a little while and then bounced back. And then the misinformation can be strongly held. People can be confident that they are right even when they're not. So why is that? This is sort of a strange phenomenon. Why? Well, one explanation is basically our mass media environment. We have what's called divided audiences, and what I mean by that is that it's possible today, because of the internet and because of the way the cable news networks operate, to tune into only information that confirms what you already believe. So if I'm a liberal, I could watch MSNBC and look at Daily Kos and only look at liberal sites on the internet and get my news that way. If I'm conservative, I could be watching Fox News all the time and then also look online at Red State and the Drudge Report. And if I'm doing that, I'm not getting any information that might counter my mistaken impressions, and I'm probably getting fed mistaken impressions. If there's any misinformation that's being distributed that way, I'm not going to be dissuaded from believing it. It's only going to be reinforced. So that's one problem: the divided audiences in our media environment. Another reason is that the people who are more likely to receive information that's wrong are the people who are paying attention. The most attentive citizens-- the ones who are into it, who are reading the most, who are watching the most news-- they're the ones who are most susceptible to this misinformation if it's coming from pundits, from elected officials, from candidates. They're the ones who will hear it, and they'll hear it often and absorb it.
So it's actually the people who aren't paying attention who are less likely to collect this misinformation in the first place. And then another reason it happens is just self-interest and myopia. So for the prop 13 example, people were paying attention to what applied to them. People who are homeowners or potential homeowners are paying attention to that aspect of it and not so much to the business aspect, because most of them probably didn't own their own business property. That's actually fairly rare, right? So some of it is just self-interest, where you pay attention to what affects you. Also there's this phenomenon called motivated reasoning. Motivated reasoning is when you want to come to a particular conclusion. You're hoping that a particular thing is true, and so you only collect the information that's relevant to that outcome and you dismiss the information that's not. And that's related to this idea called selective perception, wherein we just take in the information that confirms what we already believe and we ignore the information that doesn't. So we're biased by our partisanship, so we see what we want to see. We pick out the sources that confirm what we already want to believe, and then we have an affective screen, meaning that we pay attention to what we like, you know? If you are a conservative and you watch Bill O'Reilly, you'll get a warm fuzzy feeling in your chest and you'll just love it, and if you watch Rachel Maddow it'll feel awful, and so we go with what feels good, right? And it feeds that tendency to take in the information we already wanted to be true. It also means that when we get a correction and we don't like it-- so if somebody tells us, you know, actually the numbers are going the opposite direction, or your candidate lied in this case-- it doesn't feel good to think something negative about your side, and so you reject that information even if it might be true.
Another problem that we've sort of covered is that we trust the experts and the pundits who are also wrong. So we're more likely to take in information when it comes from a trusted source. If those sources are giving out information that's not true, then we'll receive it and we'll also be wrong. A further problem is that the corrections can be dismissed. So we saw that with the birther example, but sometimes they actually backfire. There have been laboratory studies done where they bring people in, give them information and give them corrected information and see what happens. And it turns out that sometimes getting the new information actually makes you dig in your heels more with the original wrong information. So you believe something was true in the first place that was wrong but that fed your ideology, you get the correction and what do you do psychologically? You try to protect yourself against that new information and so you dig in even more and get even more extreme in your support of the original belief that was untrue. That's a problem right? That's how we continue to believe things even after they've been debunked. Another reason we have this misinformation is because a lot of times it's repeated and repeated and repeated. So you hear it in different news cycles, in different news outlets. You see it on the internet. You read it on Facebook if your friends are posting it. So you see it lots of different places and there are studies that show that repetition gives us the misimpression of truth. So if you hear something enough times, if you keep hearing people say, you know, we never landed on the moon. That's fake. They set it up in a studio in Hollywood and nobody ever landed on the moon. And you hear that, you know, on TV and then you hear it on the radio, then you see it on the internet and then one of your friends posts it on Facebook, you're going to soon start to think, yeah that is pretty ridiculous of course we never went to the moon.
How could people [inaudible] all the way to the moon right? So, the repetition makes it seem true even when it's not. Even when it's untrue, we start to believe it if we hear it enough times. So what are the implications? Well, one of the problems might be that a lot of our opinion polling that gets reported in the press, that we rely on, may be affected, because most of the public opinion polling isn't about what's true or not true, right? What it's about is if you like or don't like something. If you approve of the President. If you approve of Congress. If you like a Bill, if you don't like a Bill. Well that may be skewed if it's based on misinformation. So if people think they know what's in a Bill but in fact they're wrong, they may think they disapprove when in fact, they approve. So our public opinion polling on approval may be skewed because of this misinformation. And then worse than that, we may be voting wrong, right? So you might make choices when you vote based on something that you think is true, but it's not. And you may inadvertently vote for the wrong candidate. So if you really understood the facts, you might pick the other side. So that would be a big problem for the functioning of democracy if people are making choices that are against their own interests because they believe something that is not true. And that could be a problem for direct democracy. So for ballot measures, initiatives, if people are voting directly on policy and they believe things that aren't true about those ballot measures or about the state of the world or about what's going on with the budget, they may be voting incorrectly on things that actually become [inaudible]. So in this example it may just be that they're electing the wrong people, but maybe it's sometimes worse if they're voting directly for policy based on something that's not actually true. So what can we do? So this is the hopeful part at the end where we [inaudible] kind of like [inaudible] it's hopeless.
Not exactly. So there's some ways in which we can think about fixing it or at least inoculating ourselves against this misinformation in the first place. So one is more responsible journalism where they don't report the false claims as fact in the first place. So it turns out from a lot of research that once the claim is planted in your head, it's very hard to eradicate it. So the better way is to prevent it from getting in your head in the first place. So if reporters started out saying today Sarah Palin made the false claim that she could see Alaska from her-- see Russia from her house. Right? If they say in the sentence that it's false. So your initial reveal of the information-- she never actually said that, that's actually misinformation too. That's not exactly what she said. And don't give fringe sources credibility. So sometimes when they're trying to keep balance, there's this false equivalence and so they'll try to get experts on each side but sometimes, you know, all the actual experts are on one side and there are very few people who are kind of fringy types that are on the other side, and yet reporters present them each as if they're both credible. And that can create the impression that the incredible information is possibly true. So don't give that credibility in the press to people who are very fringe and to ideas that aren't confirmed. Reduce the media repetition of the incorrect information. So there's lots of research that shows that it's cemented in our minds if we hear it a lot of times. So if the press is continually repeating information that's not true, even if they're talking about it, you know, this side said this, this side said that, if you hear it that way, eventually you'll start to believe [inaudible] repetition so just don't continue to repeat that information. And then repeat the corrections enough to overcome the misinformation.
So a lot of times what happens is that the false claim-- you'll see the false claim 30 times and then you'll see the correction once, and that's supposed to fix it. That doesn't. What we would need is more than 30 corrections, right? So that the actual, true information repeated as many or more times than the false information, and you almost never get that. And then don't phrase the corrections as a negation. So it turns out there's research showing that if you say I am not a crook, people just think about the original sentence I am a crook. So they're going to think of you as a crook when you say I am not a crook. So this is good to use just in your personal life. Don't say like, I didn't, I'm not a cheater. I wouldn't cheat on you. Say something completely different right? Because that's going to implant that in the person's mind. So I'm not a crook doesn't work. You would have to use something else that doesn't use that negative terminology and say I am honest instead. And so when you're doing any sort of correction in the press or if the people working for campaigns want to be smart about it, don't use the original statement, use a rephrasing that doesn't use the negative information that you're trying to correct. And then we know that those corrections don't always work, but there are certain circumstances that research shows are more likely to have them work. So corrections tend to work better with face to face deliberation. So if you sit down with people and you talk it over and you offer the new information with credible documentation, people are more likely to come around. That doesn't happen a whole lot in the real world though. We don't tend to sit down with our neighbors and bring out file folders with all kinds of information and do corrections. So that one's not really that practical. But one that is is when they come from an unexpected source. So if Mitt Romney says entitlement spending is too high. 
We're spending way too much on social security and welfare and these things, you know, people are going to think, well of course he would say that. You know he's a republican. He wants smaller government. Of course he would say that. If Bill Clinton said entitlement spending is too high and we should reduce it, then people might say oh, he's a democrat and if he says so, maybe there's something to it. So if it's from a source that you don't expect to have that point of view but they do, people are much more likely to take it in and believe that source. It also works better-- corrections work better if they're done along with the original information. So at the same time or very soon afterwards. So the more times it's repeated in the first place, the more likely people will be to believe the wrong information. But the sooner they get the correction-- so preferably for news outlets if they could do it immediately with the original statement. But if they don't know immediately, if they could do it as quickly as possible afterwards, it cuts down on the negative effects. And then fixes are also better with fewer partisan cues. So if you hear the correction like republicans say that the smear by the Obama administration isn't true, you'll think oh that's just partisan politics and you'll cue into whatever your partisanship is. If, however, the correction comes from someplace else that's not partisan, that's an objective source, you're more likely to believe it. So the less we rely on the partisan cues, the more likely we are to absorb it. Another fix would be to increase ad watches, fact checks, and this idea of naming and shaming. So if political figures are using incorrect information or pundits for that matter are saying a lot of things that are untrue, the idea is to get that information out there. This person cannot be trusted. Here are the examples wherein they're saying things that are not true.
The idea is that eventually that results in people not wanting to say the wrong things in the first place because they know that they will be excoriated for it. This last one is a little bit depressing but just accept that misinformation is going to happen, but try to limit the damage. So for example, it's a big problem with our direct democracy where we're voting on ballot measures and initiatives, so maybe we make it harder to, in California you can pass a constitutional amendment with a simple majority. Make that harder so that we can't have these changes as quickly done by people who might not know what they're talking about. Pay as you go where anytime we make changes through initiative you have to explain where the money will come from so that citizens are actually thinking about the trade-offs and can't employ that sort of misinformation as well. So maybe some structural ways to at least head off the worst consequences of it. And then lastly we just need more research on how it works and what we can do about misinformation because it clearly is a problem in our politics if we have a lot of citizens out there who believe things that are just not true. All right, so we have time for questions and answers. We have a microphone up here if anybody has questions or comments. [ Pause ] It might need to be turned on. Is there a button somewhere? [ Pause ] Yeah. [ Inaudible audience question ] >> Okay so the question is, in those examples that I used from my own research that less informed people were more likely to get it right, and is it possible that they're less sure? Yeah and actually I didn't, I should also say I didn't present all the like statistical information and all the citations. I do have those if anybody wants to ask questions. So yes we actually did find that more of those people said they didn't know than any of the other categories, and with the prop 13 example, I'm not imagining that they actually know. 
I think that what they did was that they guessed right. So essentially, you know, if you're thinking is it residential, commercial or both? Both sounds like a good middle option. So I know from public opinion research that if you give people a range of five answers from strongly agree to strongly disagree, there's one in the middle that's sort of neutral, they'll pick the neutral one when they don't know, because you don't want to look dumb and not answer the question, but you do want to, you know, say something that's not completely wrong. So you pick the middle option. So that's what I think those people did. I don't think they really knew, I think they guessed the most simple answer. Does that answer your question? Okay thanks. [ Pause ] Sadly I don't know how these work either. >> I just wanted some clarification on where you said that some parties may produce [inaudible] information that's unexpected. So what, should we believe them more if it's unexpected or what should our take be? >> Yeah that's a good question. So the question is, if sometimes we get information that's unexpected and we're more likely to believe that, should we trust the unexpected information more? Is that it? >> Yes. >> Okay, well the studies show that people do trust the unexpected information more and I think there's reasons behind that, right? Because if somebody says something that seems contrary to their self interest or to what they might be motivated to do based on their partisan beliefs, then it's more likely that it's true. There's a reason that people trust that more. So yes, if it's unexpected from that source, it's probably more likely to be true. Other questions? [ Pause ] [ Inaudible audience question ] >> Okay so the question is, are sources like National Public Radio more credible than something like cable news outlets and other sources? Is that it? Yes.
So there are studies that actually break it down by which media outlet you look at, and actually in my data I broke it down as well, and it turns out that public radio listeners and public television watchers are less likely to have misinformation than people who are looking at partisan sources. The people with the most misinformation in some of these studies are people watching Fox News in particular, but there's misinformation coming from, you know, all the cable news sources. And so it's sort of on a scale of most misinformation to least misinformation. And the public sources, which we would hope are doing good journalism and not motivated by, you know, selling their product, they do in fact have less misinformation that they transmit to their audiences. [ Pause ] [ Inaudible audience question ] >> Yeah the question is, was there a regulation that changed having to do with whether the press had to report things that were true or not? There was a lot of deregulation that took place especially during the Reagan administration actually that, you know, sort of took the teeth out of a lot of our expectations for public airwaves. So the idea is that the broadcast networks, the public airwaves are owned by the citizens, and that would apply to radio as well. And so there was a requirement that they should provide educational information, that they would have this fairness idea where if they brought somebody on to give one point of view they had to have balance and bring somebody else with the other point of view. Most of that has gone by the wayside because of deregulation starting in the 1980's. Technically some of it still exists, it's just not acted upon very much. I mean things like the educational requirement, they'll use, you know, cartoons that are mostly selling toys and, you know, things like that. Other questions? Yeah? [ Inaudible audience question ] >> Yeah that, that's a tough one.
So the question is, if you're having a conversation and somebody brings up misinformation, how do you know if it's, you know, correct information or misinformation? And the answer is not easy. We don't necessarily know. I mean we do rely a lot on media outlets to do fact checks if you're watching the debates. Hopefully you come to our debate events for the Project for an Informed Electorate, but we, a lot of times we have to rely on outside sources. You could go and do your own research and come back to the person. There are more credible sources and less credible sources. I'm going to put in another plug for the Project for an Informed Electorate, but we do have a, on our web page we have a misinformation section where we have fact checks and all kinds of information that can verify some of that. But I guess the bottom line is, be wary. Right? Don't just accept the information right off because the chances are there's something that's not true in there. I wish I had a better answer like, the magic answer is, but. Yeah? [ Inaudible audience question ] >> Yeah so the question is, okay so you have social networks like Facebook and Twitter and all that stuff-- >> [Inaudible] >> Right. >> [Inaudible] >> Right so it ends up being polarized because we pick our friends partially because they agree with us, right? And we defriend people-- if you have a friend who's you know saying all, you're a conservative and they're saying all this crazy liberal stuff, then pretty soon you defriend them and, you know, you don't have to hear anything but things that make you feel good inside. So how do we deal with that and how do you get better information? I mean, the answer would be put up with some discomfort, right? So you would have to actually, you know, leave those people from high school who have gone to the dark side on your page and listen to what they're saying. But better than that really is to go to the more objective sources for information. 
So like your public sources we were just talking about. NPR, PBS put out good, objective information. And use websites like the Project for an Informed Electorate to get the good information. But I mean there are studies that show that we're more persuaded by our own friends through those social network sites. There's a recent study that just came out that actually said that liberal persuasion was more common on social networking than conservative persuasion. So maybe the conservatives are, you know, just stuck with where they are and the liberals were more likely to be persuasive with their friends. So, there's that. Others? Yeah? [ Inaudible audience question ] >> Okay so the question is, in newspaper sources, are we getting misinformation coming from newspaper reporters? >> There's a lot of research on the bias coming from various news organizations and various sorts of reports. Most mainstream press newspaper front page reporting actually is very credible. Studies of bias show there's actually very little of that. We perceive the bias when we don't like what they're saying even if it's true, right? I run this experiment with my media class all the time where I take away the headlines so they don't know what the source is and just show them the story and have them analyze it for a liberal or a conservative bias. And people who are liberal find conservative bias. People who are conservative find liberal bias-- in the same article. And so that just shows that we notice it when we don't like it, and when we do like it, we don't notice it. So an objective article will feel biased to us if we, you know, sort of object to some of the things that are being said. But objectively those mainstream newspaper reporters, there's not, especially around elections, there's not a lot of bias.
You do find, of course, opinionated sections at the end of the paper in the Op Ed section but most of that is pretty solid and pretty trustworthy information from, you know, the Wall Street Journal front page, or the New York Times front page, or the Washington Post front page. The Sac Bee is fairly balanced. Other questions? All right well thank you so much for showing up. It was a great audience. [ Applause ]

Major candidates

Democratic

Jay Nixon

Republican

Kit Bond, incumbent U.S. Senator

Results

General election results
Party         Candidate        Votes      %
Republican    Kit Bond         830,625    52.68%
Democratic    Jay Nixon        690,208    43.77%
Libertarian   Tamara Millay     31,876     2.02%
Constitution  Curtis Frazier    15,368     0.98%
Reform        James Newport      8,780     0.56%
Majority                       140,417     8.90%
Turnout                      1,576,857
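As a quick sanity check of the table's arithmetic, the candidate totals sum exactly to the reported turnout, and the Majority row is Bond's total minus Nixon's. A minimal Python sketch (vote counts taken verbatim from the rows above; candidate labels are just illustrative keys):

```python
# Vote totals from the results table above.
votes = {
    "Kit Bond (Republican)": 830_625,
    "Jay Nixon (Democratic)": 690_208,
    "Tamara Millay (Libertarian)": 31_876,
    "Curtis Frazier (Constitution)": 15_368,
    "James Newport (Reform)": 8_780,
}

# Candidate totals sum exactly to the reported turnout.
turnout = sum(votes.values())
print(turnout)  # 1576857

# Majority (margin of victory) = Bond's votes minus Nixon's votes.
majority = votes["Kit Bond (Republican)"] - votes["Jay Nixon (Democratic)"]
print(majority)  # 140417

# Leading shares as percentages of turnout, matching the table's rounding.
print(round(100 * votes["Kit Bond (Republican)"] / turnout, 2))   # 52.68
print(round(100 * votes["Jay Nixon (Democratic)"] / turnout, 2))  # 43.77
```

The majority percentage in the table likewise follows: 140,417 / 1,576,857 rounds to 8.90%.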

This page was last edited on 12 July 2019, at 11:13
Basis of this page is in Wikipedia. Text is available under the CC BY-SA 3.0 Unported License. Non-text media are available under their specified licenses.