Episode 31 - Polling, Politics, And Predictions with Patrick Murray, founding director of the Monmouth University Polling Institute
What are the polls saying? Patrick Murray, founding director of the Monmouth University Poll, talks about his 20+ year career conducting media polling and custom-designed research projects for a variety of clients, including what he was seeing in polls pre-pandemic versus now, during these unprecedented times, and how the 2020 election might be affected.
Patrick Murray is an expert not only at designing and conducting polls, but also at explaining how opinion polls work. Using examples, Mr. Murray describes the polling process and how the media often mistakenly use polls to predict outcomes. The 2016 presidential election polls are reviewed, and Mr. Murray discusses whether the pollsters got it wrong. (They didn't.)
TIME STAMPS:
2:30 - Who started professional opinion polling in America?
5:20 - How can a pollster be sure the person being interviewed is telling the truth? How does social desirability bias affect opinion poll accuracy? How does partisanship affect polling?
8:30 - What lessons have the media, pollsters, and the public learned from the 2016 presidential election?
15:00 - What factors will affect opinion polls during the 2020 pandemic?
18:00 - How have the Black Lives Matter movement and protests changed society, as reflected in the polls?
19:00 - Can polls effect change?
22:10 - People act on their perception of reality - not reality. Polls can measure people’s perception.
28:50 - What are the most pressing issues as we move toward the 2020 elections?
-------------------
Guest: Patrick Murray, founding director of the Monmouth University Polling Institute.
Hosts: CurtCo Media's CEO, Bill Curtis; Pulitzer Prize-winning historian Ed Larson; and international trade attorney and Malibu Democratic Club president Jane Albrecht
Produced by: Mike Thomas
Sound Engineering by: Michael Kennedy
Theme Music by: Celleste & Eric Dick
Transcript
PLEASE NOTE: TRANSCRIPTS ARE GENERATED USING A COMBINATION OF SPEECH RECOGNITION SOFTWARE AND HUMAN TRANSCRIBERS, AND MAY CONTAIN ERRORS. PLEASE CHECK THE CORRESPONDING AUDIO BEFORE QUOTING IN PRINT.
00:00:00
Automated: From CurtCo Media.
(singing)
00:00:07
Bill Curtis: We've all heard, seen and read a thousand times about how this or that politician or candidate is polling, pre-debate, post-debate, about this issue, that issue, and we watch incessantly while news announcers give us their organization's reflection of, well, our collective opinion, or at least today's collective opinion. It seems that our society's opinions change about as often as my socks do. Yes, Mike, I mean daily. But, where do these statistics come from? How are they sourced, counted, kept honest? Today we're going to deep dive into our own opinions with the help of one of the most respected pollsters in the country. I can't think of a more appropriate subject for this episode of Politics: Meet Me in the Middle.
I'm Bill Curtis. Our panel. Firstly, our cohost, Pulitzer Prize-winning historian, best-selling author, worldwide lecturer, and the widely quoted socially distant and zoomed-in authority on everything historical and constitutional, Professor Ed Larson. How're you doing, Ed?
00:01:07
Ed Larson: Glad to be back with you and glad to be with Jane and Patrick Murray. What a treat.
00:01:13
Bill Curtis: Also zooming in, Jane Albrecht. She's an international trade attorney who represented US interests all over the world. She has worked with high-level government officials in many countries, and she's been involved in several US presidential campaigns. Hey, Jane, nice to remotely see you, too.
00:01:29
Jane Albrecht: It's always delightful to be here and honored to be here with Patrick Murray as well.
00:01:34
Bill Curtis: So, as you've heard, Monmouth University is home to one of the most respected polling institutes in the country. We're fortunate that its founding director, Patrick Murray, is here with us in the middle today. You'll recognize his voice, of course, because you've heard him so many times doing exit poll analysis and commentary on CNN, MSNBC, Fox News, PBS, NPR, and all the major networks. And since 2005 he has focused Monmouth's polling on everything you can imagine, from presidential, state, and local elections to business studies, nonprofits, TV viewer segmentation, even something close to my heart, magazine reader surveys. Hopefully podcasts next. He's a commentator on politics and public opinion, and that's where we'll be focusing our discussions here on Politics: Meet Me in the Middle. So, Ed, is polling a new thing or did our founders take voters' temperature back in the formation of America?
00:02:30
Ed Larson: Good politicians have always had a knack for knowing what the voters think and what they want. Aaron Burr was a master of it, but formal polling is really relatively new, at least within the last century. It started with the growth of the modern newspapers. It was only in the 1930s that Elmo Roper and George Gallup tried to develop a scientific method of polling where you'd try to get a representative sample of people. Now, those representative samples weren't very good. Blacks were almost entirely excluded. You can look back now and it's almost comical how bad their collections were and how biased they were. Now there are literally hundreds of organizations taking thousands of polls with every election cycle, and with having Patrick Murray on, we've got one of the best.
00:03:22
Bill Curtis: Well, speaking about one of the best, Patrick, how did you get started in this racket?
00:03:26
Patrick Murray: Well, the story that my grandmother tells is that I started when I was about four years old, riding a bus into Philadelphia. I sat at the front of the bus and I asked everybody who got on the bus if they liked the bus. But I learned my first lesson about how you bias a poll by following up that question, before they answered, with saying, "I like the bus." So I'm telegraphing to them what the correct answer is. So, automatically from the age of four I was learning how to ask questions and how not to ask questions. But seriously, the most formative experience that I had was, I was doing a semester in Washington, D.C. as an undergraduate, and I saw this ad in the city paper there and walked in and it was a pollster. Peter Hart was the Democratic pollster.
I didn't know whether Democrat, Republican, I had no idea. What I did know was that I was calling and talking to voters in Hawaii and Michigan and Wisconsin and Arkansas, and a whole host of interesting places, and asking them questions. And I realized I was pretty good at that. And I went to Rutgers University where they had one of the foremost state-level polls at the time, the Eagleton Poll, which started in 1971. And I walked over there one day and just said, "I'm interested in practical politics. I'm interested in this stuff when I read it in the academic literature. You got anything for me to do?" And they said, "Yeah, we've got this little project we just need some help with, if you want to do it."
And that was it. From that point on, I was not going to be a political science professor, I was going to be a pollster. What happened was, as I progressed as a pollster, that experience that I had as an interviewer, talking to people and understanding the interaction that you have when you're trying to get people to tell you their honest opinion, informed me much more than any of the academic work in many ways that I did along the way.
00:05:15
Bill Curtis: How could you tell at the time, Patrick, that you were getting an honest opinion as opposed to the opinion they thought they should give you? How do you create a control for people that are not actually giving you honest answers when you realize that you're getting kind of a load from someone because they're telling you what they think you should be hearing rather than what they're thinking?
00:05:32
Patrick Murray: That social desirability bias is important. That's one of the things where, as I said, you really need to develop an ear to understand that a question that you ask may not be as innocuous as you think. I'm going to give you an example from a poll that we just released, which is: before COVID hit, were you planning to take a trip for summer vacation? It seems innocuous, right? A yes or no answer. So we got a number, 63%, that was in line with numbers that we had gotten from past years. And of course when we ask follow-up questions we find that fewer people are actually going to take that vacation. That was the purpose of asking the question.
When we actually looked at how the responses were given by party, Democrats were significantly more likely to say yes: 76% or so of Democrats said that they were planning a vacation, which was more than we'd seen for Democrats in the past. But by the same token, only 47% of Republicans said they were planning a vacation, which would mean that before COVID hit, 2020 was going to be the lowest year in history for Republicans taking a vacation. Now, there's no way that that's true.
What happened was, we were asking that question within a series of other questions asking about the impact of COVID. This is a huge problem that we've been facing, and it has been growing over the past decade: almost everything now is viewed through a partisan lens, so that when you get a question you first are thinking, "Well, what does this say about my belief system?" rather than simply, "Am I going to do this or not going to do this?" And so, Republicans who want to defend President Trump want to say, "Hey, I wasn't planning a vacation," to let you know that COVID hasn't changed their plans, that COVID has not had a big impact, whereas Democrats are saying, "I did plan a vacation, and COVID and the response of the Republicans and President Trump are what caused me not to be able to take this vacation."
Now, when we actually drilled down, we had a bunch of follow-up questions. By the time we got to the follow-up questions about what you actually are going to do, that partisanship disappeared because we were now anchoring it in real behaviors that they said they were going to do tomorrow.
00:07:34
Bill Curtis: Interesting.
00:07:35
Patrick Murray: One of the things I think distinguishes me from other pollsters is that I go out there and I actually talk to people. I listen in on conversations. This is how you understand how people talk about things: not by imposing your academic view on how the world should work, but by listening to how people actually talk about them, the vernacular used. And I found that when I go out to places like Iowa and New Hampshire in the throes of these presidential primaries, I am able to get people to come out of their shell because they don't know what I think.
I'm able to present in a way that, whatever you're about to tell me, I don't have a judgment on. Or maybe you even think that I probably will agree with you. I found people saying things to me in those situations that they probably would not say if I had walked up with a TV camera, where they were automatically going to say, "Well, I have to defend President Trump," or, "I have to knock President Trump and defend the Democrats," whatever it happened to be.
00:08:27
Bill Curtis: So, let's dive into a polling situation that we all remember. What lessons did we learn from the 2016 Clinton/Trump election?
00:08:38
Patrick Murray: Well, one of the things that I learned is that the media doesn't really understand the error associated with polling. And one of the things that I looked at is the total error, particularly in the states that were competitive. And so, let's say we have 15 states that are the most competitive states. Well, the error in 2016 across those states was no different than the error was in 2012. And overall the error in 2016 at the state level was only slightly higher than it had been on average.
What happened was, the error was off enough in a few states that it changed the electoral vote outcome, whereas in 2012 it did not. So the errors that are inherent in the polling did not change; what changed was how they lined up against our expectations going into the election. That was the key: the same amount of error was there, but in 2012 our expectations held up even with the error, and in 2016 they were not met.
00:09:35
Bill Curtis: When you polled for 2016, did you poll based on the Electoral College or did you poll based on popular vote?
00:09:44
Patrick Murray: We polled based on popular vote. If you're going to do the Electoral College you do a 50-state poll, which means you have to have a large enough sample size in all 50 states. So you're focused on those 15 states, but what happens is that when those 15 states that are most competitive are close, the potential errors are going to be exacerbated. And that was the problem that we found. The public had shifted in terms of how they voted based on their educational level. In the past the difference between voters with a college degree and voters without a college degree didn't matter all that much. Starting in 2016, it mattered, and because we didn't have a proper way in our voter lists to weight education, a lot of pollsters didn't weight by education, but that only accounted for about one or two points of the total error.
We're talking about a four-point error overall. We found that more of our likely voters who said they were going to vote for Hillary Clinton decided to stay home than Trump voters, and that accounted for a point or two. These are things that you can't predict in a poll. And my big problem with the polling error in 2016 was not so much about the polling, it was about the number of articles out there that used the word predict. Polls don't predict anything. Polls tell you what things are at the time you take the poll. Now, the fact that polls are fairly accurate in terms of elections is because very little usually changes between the time a poll is taken and the election. If you're talking about a poll that's taken within a week of the election, pretty much the die is cast.
And that's why polls are accurate. Not because they're predicting what will happen; it's because the lay of the land on the day the poll was taken didn't change by the time we got to the election. And that's why polls aren't in and of themselves predictive; they just tell you what is at the time, and as long as things aren't volatile in the last few days, you're not going to get changes. That certainly was not the case in 2016. We had a volatile election, we had enough people moving around, and we had a number of polls that had the race in these key states within five points, and all that said was, "Well, this is going to be a close race and it looks like Hillary Clinton's ahead, but you shouldn't put all your money on that, because we know that things are going to be changing between now, the time we took the poll, and election day."
00:11:58
Bill Curtis: Is it possible that in 2016 people weren't really willing to admit out loud that they were thinking of voting for Trump?
00:12:06
Patrick Murray: There were some of those people, and I had looked at my poll, particularly in Pennsylvania. What we discovered was, in urban areas and suburban areas, which made up about two thirds of Pennsylvania, we had the results dead on. When we compared the actual results in those counties versus what we had in the poll, they were dead on. Where we were off was in the rural part of Pennsylvania. And what we found is not that people were lying to us about how they were going to vote. It's that the Democrats who were going to vote for Trump, or the lean Democrats who were going to vote for Trump, weren't talking about it. So they were less likely to answer a poll than they had been in the past.
And what we found from doing our follow-up work was, it wasn't just about answering poll questions. They actually weren't talking to their family members about how they were going to vote. They didn't want to hear it. We're already seeing that some of that still exists today. We have to factor that in. But as I said, that's only 1%. Now, if we're talking about a couple of different factors that are one or two percent each and they add up to four or five percent, but you don't know which ones are at play at which particular time, the key thing that we need to do is to get the media, when a four or five point poll or a bunch of four or five point polls come in, to say: there's still error around this. While it looks like it's leaning towards Biden or leaning towards Trump, there still is enough error around this that we can only characterize this as a close election.
There are unknowns. There's error, and there are unknowables inherent in polling, and we need to be more cognizant about that and talk about that a little bit more. Look, if we see Joe Biden is ahead by 10 points in every poll in Michigan, then yes, Joe Biden is ahead. And if he loses, the polls were definitely wrong. But if we see Joe Biden ahead by four points on average, and Donald Trump ends up being able to squeak out a win by 10,000 or 15,000 votes in Michigan, then the polls were not necessarily wrong. It's only the depiction of the polls that was wrong: the media were saying, "Joe Biden's definitely going to win this," based on a bunch of polls that only had him up by three or four points.
00:14:06
Bill Curtis: We're going to take a quick break and when we return, I'd like to talk to you about that particular subject and how you're dealing with 2020 when the people who actually vote are going to be a little up in the air. We'll be right back.
00:14:28
Speaker 6: On Medicine, We're Still Practicing, join Dr. Steven Tabak and Bill Curtis for real conversations with the medical professionals who have their finger on the pulse of healthcare in the modern world. Available on all your favorite podcasting platforms. Produced by CurtCo Media.
(singing)
00:14:50
Bill Curtis: We're back with Patrick Murray of the Monmouth University Polling Institute, and Ed Larson and Jane Albrecht. So, Patrick, we were talking about what happened in 2016. And now as we get to 2020, there are a lot of interesting factors that you've probably not been dealing with before, like the pandemic: how it's going to affect people actually leaving their homes and voting, where you can have a mail-in ballot and where the mail-in ballots might not happen. How are you controlling for that?
00:15:20
Patrick Murray: We don't know yet because, I'll be honest with you, we stopped our state-level polling as the pandemic hit. We were polling in the Democratic primary. We polled Michigan in early March and then Arizona, and in Michigan our poll was great. But in Arizona, what happened was, between the time we polled and the time the election happened, which was only a couple of days, there was a huge shift in people not showing up to vote in person because of the unfolding pandemic.
And more people voted by mail or by drop-off than had ever voted that way before. But the people that we had in our poll, many of whom were going to vote in person, just simply did not vote. And so we dropped our polling at the state level because of that. So the larger question is, "Okay, so what are you going to do about this?" And I said, part of it is, we don't know yet, because the states haven't told us exactly how they're going to run the election in November. But we already know from our past polling in March that this is going to be a big issue.
00:16:20
Bill Curtis: So, over the weekend I think you tweeted, "Rarely does a poll result surprise me because I am open to whatever the data will reveal. Change is usually either incremental or momentary, but there's something fundamentally different in these results." And you were reflecting upon results where almost 60% of the people that you polled said that police officers facing a difficult or dangerous situation would be more likely to use excessive force if the culprit is black, compared to the one third who say that police are just as likely to use excessive force against a black or white culprit. You were pretty surprised by that.
00:16:59
Patrick Murray: Yes, I was, because it was a significant sea change. Just four years ago, only about a third of the public said that. It's now close to six in 10 who say that. That's a big shift. And I have been asking that question for a couple of years, and there were other incidents, Eric Garner and so on, where opinion didn't change all that much. It was incremental. And opinions are changing among whites. And to me, as somebody who's been measuring public (inaudible) for 25 years, there are certain things that stand out to you where you say, "Well, this is different than I've ever seen in a poll before in terms of a shift." And that was one of them. And that's where your understanding of sociology and psychology comes into play.
00:17:41
Bill Curtis: So what are some of the other ways that the Monmouth University Polling Institute is trying to understand the Black Lives Matter movement, its resilience or lack thereof?
00:17:49
Patrick Murray: We were doing that polling just as the protests were starting after the George Floyd murder. And what we were finding was that with the initial violence, people were saying, "Oh, I don't like the violence." And we've gotten that all along. And usually what will happen is, particularly among white respondents, they'll see the violence and say, "Well, that undercuts the validity of the cause." And what we were finding in our questions was, if you ask a separate question, they said, "Well, we don't like the violence, but we fully understand where that anger is coming from." And that's a key difference, one I never saw before.
00:18:24
Bill Curtis: With the Black Lives Matter movement, have we actually gotten to the point where we're going to take it seriously and it'll live long- term, or could it qualify for that momentary concept?
00:18:33
Patrick Murray: I put it this way: as I said, I've seen something different in this polling number, something that is a harder change, a more permanent change. But just because I see it as a permanent change now doesn't mean it will remain permanent. What I can say is, I've seen a window open to a discussion about race and systemic racism that we haven't seen in the past. And so the question is, does that window stay open? It depends on how the conversation develops, but the potential for that window staying open, for that conversation continuing, is at a level that it's never been at before.
00:19:10
Bill Curtis: What can we learn from the polls? How can we use what you learn as a way to change our actions so that we can make sure that this is more of a long- term change rather than this week's fad?
00:19:24
Patrick Murray: Well, what we do know about how people behave is that they close off their willingness to engage in new discussions when fear is involved. That has been the case in every past situation: yes, this is a problem, but I need to protect myself. Well, the president's rhetoric plays on that fear. It has actually had the opposite effect, because he hasn't done what other past politicians have done, which is to acknowledge that there's some sort of ephemeral problem out there and promise to do something unnamed to address it, while warning that the violence that these people are using to express their point of view will undermine your safety and security in the neighborhoods where you live.
What Trump is saying is, "Their behavior is bad and everybody's behavior who's supporting them is bad," so he's putting all these white people in the same boat with the black people who are protesting and other people of color who are protesting, and so white people are now saying, "Wait, he's calling me the enemy." So he's not having the effect of promoting fear among them. He's actually pushing them into solidarity.
Ironically, probably one of the things that can happen here is for Trump to continue what he's doing in his rhetoric, because it's pushing those people who have said that they're willing to engage in this conversation to continue to engage in it. Now, the thing that you don't want to do is divert attention. As I mentioned, you don't want to divert attention and dilute what this is all about. I mean, this is about systemic racism, which has been a scourge for our country since slavery. And so you want to make sure that you continue to concentrate on that.
00:21:07
Jane Albrecht: Going back to what you just talked about, Patrick, the fact that what's happened has opened a window to discussion in a way that hadn't been opened before, but people tend to close off such discussions when fear is involved. Do you have any sense yet for how the moniker Defund The Police could feed into that? It's not just Trump playing on the fear of the violence of the protests; I don't think that will get that far. But Defund The Police, if you know what's behind it, is really about deep reform of policing. The moniker they chose, though, is radical and seems extreme, and I could see people saying, "Hey, I really think we should address this situation of systemic racism and police brutality, but I don't know if I want to trust the Democrats to do this because they would defund the police." So, how is that going to play into this ability to move forward?
00:22:01
Patrick Murray: Jane, you've read my mind, because that's the key. That's what I'm talking about in terms of keeping this conversation open. It's not whether the public fully understands what Defund The Police means. It's what their perceptions are. People act based on their perceptions of reality, not what reality is. And that's something that polling can measure. We can't measure in-depth analysis of people thinking about solutions for racism. What we can measure is the things that are going on out there in the world.
How are they reacting to them? And is that going to potentially have an impact on not only their attitudes, but their behaviors? Is Defund The Police something that can stoke fear, or has the conversation moved enough that people feel, "Oh, I know what that really means"? Even if they don't know exactly what it means, their perception is, "Yeah, I know it doesn't fully mean that." So then, do we get to a tipping point where that pushes things too far? We don't know, and that's why you need polling.
00:23:02
Bill Curtis: Can you tell us how your polls have gauged the effect of the pandemic on the next elections? And have you been able to test that at all?
00:23:10
Patrick Murray: It's less about predicting what's going to happen in an election than about what we actually saw in terms of moving the needle. Trump got an initial bump in his approval rating in March because of this rally effect. People want to be able to rally around their leader. Again, this goes back to fear. When there's an attack on us, and this pandemic is an attack on our security and our safety, you want a strong leader to be able to deal with that.
What's interesting was the size of the bump he got. He got nowhere near the bump that our state governors got, that other foreign leaders got in their own countries, because opinion about Trump is baked in. So what we found is there was a lot of polling out there that said, "Oh, older people who are more susceptible to the virus are turning against Trump because of his response to COVID." When I looked at the polls, I said, "Well, no. These differences existed before COVID." All it's doing is reinforcing what people already thought about President Trump. Whether you like him or dislike him, it had a reinforcing effect.
00:24:13
Bill Curtis: Let's talk about some of the more detailed aspects of the pandemic and see if you've polled for them. For example, the opening of the economy versus the health risks and the potential for a bump in cases and hospitals having more issues. Have you tested those points?
00:24:28
Patrick Murray: Yeah, by about a two-to-one margin, people are more concerned about opening too quickly because of the health impact than they are about opening too slowly because of the economic impact. And that's been pretty stable throughout this.
00:24:40
Bill Curtis: Well, that's interesting. Is that nationwide or is that state by state?
00:24:43
Patrick Murray: That's nationwide.
00:24:45
Bill Curtis: The impression I think a lot of us have is that funding can affect a poll's outcome. Does it matter who's paying for the poll?
00:24:55
Patrick Murray: I guess it does. I mean, we don't do paid polls, so it's not so much that clients bias the results when they pay for a poll, but when you're dealing with a client, it actually comes out in the questions that you ask and the questions that you choose not to ask. That is where I tend to see the bias. It's not in the results themselves; the "let's avoid this part of the issue" tends to be the bigger bias.
00:25:19
Bill Curtis: That's interesting that you said you don't charge for your polling. So how does Monmouth University get its funding for this?
00:25:28
Patrick Murray: So, Monmouth University is doing this as a public service. This is one of those areas. We have a number of other research institutes, something called the Urban Coast Institute, for example, that does research on the urban environment, the intersection of public policy and science. And we do that in order to take the expertise that we have inside the university and share it outside the university. So this is one of the things that Monmouth does. Now obviously it also helps to give Monmouth publicity, and people hear the Monmouth name, and that's always good because every college spends money on marketing and communications. So this is one of the ways that we do some of that as well.
00:26:05
Bill Curtis: Interesting. So being that we've only got a few minutes left, I can't help but ask you if right now you had to lay down a bet based on the polls that you've put out there and the trends you've watched over the last two decades, can you call our next election?
00:26:18
Patrick Murray: No, absolutely not.
00:26:18
Bill Curtis: Presidential election?
00:26:21
Patrick Murray: Absolutely-
00:26:22
Bill Curtis: You're not even going to take a stab at it.
00:26:24
Patrick Murray: ... no question, no way in hell. And in fact, and don't take this personally, Bill, but I'm offended by that question because this is my bugaboo, is that polls do not predict. I don't predict. I don't have a crystal ball. I don't know what's going to happen. I get these questions from reporters all the time. " Well, does what happened yesterday mean for four months down the line?" I have absolutely no clue because if I did, I'd be using that money to go bet on the horses or play the lottery, not to do polling.
00:27:00
Jane Albrecht: Patrick, can you explain a little bit of the difference between what Monmouth does and the pollsters that the presidential campaigns employ? My understanding is that at this stage, the presidential campaigns will have very capable pollsters and they will be polling down to the district. But if you can explain the difference between what you do and what someone, even a top pollster working for a presidential campaign, would do these days.
00:27:27
Patrick Murray: Right. I've said that our polling industry is pretty open in terms of sharing information with us. One area that's not as open is the campaign pollsters. They play everything close to the vest, because what they're doing at this particular stage of the race is testing messages. They say, "Oh, I have this information about my opponent, and I have 16 different negative pieces of information about my opponent. Which one do I think is going to be most effective, and with which group of voters?" Because you're only going to target that group of voters with online ads or TV ads or whatever it is, with these messages. So which message is going to be most effective to do that? That's called message testing. So that's what these pollsters are doing.
The other thing that they're doing is looking at these different groups and saying, "Okay, if we can move this group over here, and we can move X percentage of them, how does that impact our likelihood to win that state?" We're not doing that because that's not our mission. Our mission is to simply say, "This is what was on the minds of the voters. This is what they think is the lay of the land today. This is what they care about." Their mission is to say, "How do I help win the next election for my client? And I do that by figuring out what messages are going to work best, where do we spend the resources, and also where do we spend the resources getting out the vote," in the hopes that this will push them over the top in close elections.
00:28:50
Bill Curtis: Patrick, do you mind if we do just a rapid-fire round, asking your opinion of the following subjects? I'd like you to rank them one to 10, 10 being highest, on whether your polls have revealed that these issues may affect state or federal elections.
00:29:08
Patrick Murray: I'll try.
00:29:09
Bill Curtis: Race, Black Lives Matter.
00:29:11
Patrick Murray: I think that's about an eight right now.
00:29:14
Bill Curtis: Handling of COVID-19.
00:29:15
Patrick Murray: It's either a five or a 10, depending on how you look at it. People won't react to it specifically, but it's the undercurrent of what they think is going on in the world.
00:29:26
Bill Curtis: Supreme Court.
00:29:28
Patrick Murray: Two, except in Maine. In the Maine Senate race I think the Supreme Court could play out with Susan Collins. Other than that, it's going to be a two.
00:29:37
Bill Curtis: How about women's rights to choose?
00:29:39
Patrick Murray: Again, one or two specific Senate races, Maine being one of them. Other than that, not an overarching issue. Not an issue that's going to change minds. So I would say a three or four.
00:29:50
Bill Curtis: When you say it's going to be an issue, and it could be an issue in Maine, the Supreme Court, would it help Susan Collins or hurt her? Which way does it cut?
00:29:57
Patrick Murray: Hurt her now, because of her response and her dealing with the Kavanaugh appointment. That's going to hurt her because even though she personally is pro-choice, the steps that she's taken along the way and her explanation, her very weak explanation for what she did in the Kavanaugh hearing, keep undermining her. So, for example, Kavanaugh was one of the dissenting votes on the LGBT ruling of the Supreme Court, right? So that's going to all just feed into the perception of Susan Collins. Not that Susan Collins is anti-gay or that Susan Collins is anti-abortion, either one of those, but that Susan Collins really is not effective and is being played.
00:30:38
Bill Curtis: Which leads us to the next one, LGBT.
00:30:42
Patrick Murray: Again, I take this from the 40,000-foot view. For many individuals, that is a very important issue. In terms of affecting this election and changing this election, it's a two or a three.
00:30:57
Bill Curtis: Okay. Lies.
00:31:00
Patrick Murray: Hmm. A two, because you believe what you're going to believe.
00:31:05
Bill Curtis: History of womanizing. Abuse. Me Too.
00:31:08
Patrick Murray: That's going to be a two or a three, in terms of changing the outcome.
00:31:13
Bill Curtis: International relations, China and Mexico?
00:31:16
Patrick Murray: Two or a three, barring something happening.
00:31:20
Bill Curtis: Pro-business at all costs, oil and so on.
00:31:23
Patrick Murray: That could be a six or a seven.
00:31:25
Bill Curtis: Environment at all costs.
00:31:27
Patrick Murray: I think before the events of the past couple of weeks, that could've been a six or a seven. I think it's down to a four or five now.
00:31:36
Bill Curtis: Economy versus economy because of a pandemic.
00:31:41
Patrick Murray: The economy is going to be a nine or a 10, but not necessarily in the way that you might think.
00:31:48
Bill Curtis: Oh, well, then you got to give me a little color.
00:31:50
Patrick Murray: Okay. All right. So a lot of people are looking at the economic issues in terms of overall employment rates or GDP or the stock market, and those don't matter as much as perceptions of how people feel they are doing relative to everybody else. And right now people are feeling that, relative to everybody else, they're doing okay even if they're suffering from short-term layoffs and so forth, because they believe those layoffs are going to be short-term. If those layoffs become long-term in November, then that's going to shift the equation.
00:32:23
Bill Curtis: Interesting. Healthcare for all.
00:32:27
Patrick Murray: As of right now, it doesn't look like it's going to be as much of an issue as it would have been a month ago. So maybe I'll say a seven or an eight.
00:32:34
Bill Curtis: Okay. How about presence? Ability to appear presidential?
00:32:39
Patrick Murray: 10. And it's not that it's changing anybody's mind, but that's why people think what they do about Donald Trump right now. And as long as Joe Biden doesn't show himself to be unpresidential, I think he's going to hold on to that as well. But that is going to be extremely important.
00:32:57
Bill Curtis: Interesting. Patrick, this has been a pleasure and enlightening for me. I hope you'll come back and join us a few times before the election, because I got through about a third of my questions. You've been a good sport. Thank you.
00:33:13
Patrick Murray: Oh, my pleasure. Anytime. Take care.
00:33:16
Bill Curtis: Jane Albrecht, thank you very much, and Ed Larson. This is Politics: Meet Me in the Middle. Come back and see us again.
If you like what you hear, please tell your friends and let us know how we're doing by leaving a comment. It really helps if you give us a five-star rating and we really appreciate it. You can also subscribe to the show on Apple Podcasts, Stitcher, or wherever you listen to your favorite podcasts. This episode was produced and edited by Mike Thomas, audio engineering by Michael Kennedy, and the theme music was composed and performed by Celleste and Eric Dick. Thanks for listening.
00:34:01
Automated: (singing)
From CurtCo Media. Media for your mind.