Nearly eighty years ago, smarting from sharp criticisms of the still-young opinion polling industry, America’s pre-eminent pollster devised a “quintamensional plan” to improve the quality of its work. The polls, said George Gallup, should start with a “filter” question, “designed to find out whether the respondents [had] given any attention or thought to the issue” they were about to be asked about. Next should come a series of information questions to determine what respondents know about the issue. “Opinions on any issue could then be confined to those who have some idea what the problem is.”
That remaining group of respondents would then be asked an “open” or “free-answer” question, allowing them to voice viewpoints that might otherwise have been overlooked. Only then, he suggested, should they be asked a question that required them to answer “yes,” “no” — or “no opinion.” Their answers to the earlier questions would allow pollsters to see whether a “no opinion” response was based on ignorance, indifference, confusion, or the respondent’s wrestling with conflicting considerations. A “yes” or “no” would be followed by questions designed to discover why a respondent held this view.
Finally, respondents would be asked about the intensity with which they hold their views. In Gallup’s original polling manifesto, intensity hadn’t mattered: if a poll was a “sampling referendum” designed to guide legislators, as Gallup argued, then every “vote” should count equally, regardless of how strongly or weakly held. But if voters were indifferent to an issue, they would be unlikely to turn out for a referendum. And in the absence of a referendum, legislators would be less likely to be guided by opinions that were wishy-washy than by opinions that were firm — opinions that might induce a change of vote.
In a preface, Gallup also emphasised the importance of the “split ballot” technique as a way of monitoring the impact of different ways of writing the survey questions (though not different kinds of response alternatives). The American Institute of Public Opinion had deployed the split-ballot, he said, since 1938.
Against Gallup’s five-point plan, how does Australian issues polling — as opposed to voting-intention polling — measure up? Pollsters are asking issues questions all the time, but the comparison is more revealing when they all pose questions about the same issue at roughly the same time. The question of Palestinian recognition is the most recent — and, as it turns out, quite revealing — Australian example.
Depending on which poll you look at, the federal government’s decision last month to recognise Palestine at the September meeting of the United Nations was supported by almost half of the Australian public, or by no more than a third, or by just a quarter.
A nationwide poll taken by DemosAU on 31 July suggested that Australia’s recognition of “a Palestinian state in the present circumstances” would be supported by 45 per cent of respondents. But a Resolve poll, conducted on behalf of the Sydney Morning Herald and the Age, that went into the field on the day of the government’s announcement (11–16 August), found the move to recognise Palestine was supported by just 24 per cent. Then came an Essential poll, conducted a week or two after the announcement (20–26 August), which reported that the decision was supported by a third of its respondents — fewer than the DemosAU poll but more than Resolve’s.
None of the polls — all of them conducted online — were based on a sample drawn from a list of all voters. Instead, they drew from panels into which potential respondents had mostly opted in. So, the quality of the sampling offers no obvious reason for preferring one set of results over either of the others.
At the May election, Resolve’s estimate of the parties’ vote shares was more accurate than the estimates ventured by Essential or DemosAU by around a percentage point. But on the Palestine question we are not talking about a difference of one or two percentage points, as we usually are with election forecasts; we are talking about differences of as much as twenty-one points.
Resolve’s sample for the Palestinian poll (1800 people) was bigger than either DemosAU’s (1079) or Essential’s (1034), but that’s unlikely to explain much of the difference either. The DemosAU poll, conducted in a single day, was what the British call a “quickie”; the Resolve and Essential polls were conducted over six or seven days, which is one way to boost very low (though undisclosed) response rates.
What, then, explains the differences in the levels of support and opposition across the polls? Most likely, the pollsters’ introductions or preambles (when these were used) to their question, the wording of the question, and the range and nature of the response options.
DemosAU: Conducted in late July, the poll that produced the biggest proportion in favour of recognition introduced its question thus: “The recognition of a Palestinian state is a topic of international debate. Some countries have formally recognised a Palestinian state as a step towards supporting a two-state solution to the Israel–Palestinian conflict. Others believe such recognition should only occur as part of a negotiated peace agreement between the parties.” Then came the question: “Would you support or oppose Australia formally recognising a Palestinian state in the present circumstances?” The response options, as published, were “support,” “oppose,” “don’t know.” As presented to survey respondents, they appear to have been ordered differently: “support,” “don’t know,” “oppose.”
Formally, the introduction was “balanced”: reasons were given — one for recognition, one against. Whether the respondents themselves considered the two reasons to be balanced — and how one might establish whether they did — is another matter.
While each of the reasons was drawn from the current debate, they were not necessarily the most powerful each side had to offer, nor were they necessarily of comparable importance. In September 2022, the Australian Palestinian Advocacy Network listed at least ten reasons for recognising a Palestinian state “now.” And following the government’s announcement, Peter Jennings, formerly of the defence department, listed nine reasons not to do so. As it happens, recognising “a Palestinian state as a step towards supporting a two-state solution” was not on APAN’s 2022 list, and recognition “only occur[ring] as part of a negotiated peace agreement” doesn’t appear to be on Jennings’s list.
Since the argument in favour of recognition was the first consideration mentioned, and online polling can generate a “primacy effect” — the tendency for respondents to choose the first option they are offered — the introduction might (unwittingly) have encouraged support for recognition. Moreover, if we assume that most respondents preferred peace to war, the reason for supporting recognition may have seemed more constructive, hence more attractive, than the reason for opposing it, the immediacy of “a step” being especially attractive to those who placed some distant future (“should only occur”) at a discount.
Note, too, that by the standard Flesch–Kincaid measure of intelligibility, the introduction was “very difficult to read.” The difficulty of comprehending the introduction may have encouraged “satisficing”: respondents struggling with the cognitive load, ticking the first box (“support”), and moving on. The cognitive challenge (and the placement of the “don’t know” option) may also help to account for the fact that nearly a third of the sample (32 per cent) responded by saying “don’t know.”
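For readers who want to check such verdicts, a score below 30 on the Flesch Reading Ease scale is conventionally labelled “very difficult to read” and 30–50 “difficult.” The score is a simple function of average sentence length and syllables per word. The sketch below is only an approximation: its syllable counter is a crude vowel-group heuristic rather than the dictionary lookup used by dedicated readability tools.

```python
import re

def count_syllables(word: str) -> int:
    """Crude heuristic: count vowel groups, trimming a silent final 'e'."""
    word = word.lower()
    groups = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and groups > 1:
        groups -= 1
    return max(groups, 1)

def flesch_reading_ease(text: str) -> float:
    """206.835 - 1.015 * (words/sentences) - 84.6 * (syllables/words)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

# The first two sentences of the DemosAU preamble quoted above.
intro = ("The recognition of a Palestinian state is a topic of "
         "international debate. Some countries have formally recognised "
         "a Palestinian state as a step towards supporting a two-state "
         "solution to the Israel-Palestinian conflict.")
print(round(flesch_reading_ease(intro), 1))  # lands well below the 50 mark
```

Long sentences stuffed with polysyllabic words (“recognition,” “negotiated,” “two-state solution”) drag the score down quickly, which is exactly the problem with preambles of this kind.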
Having answered the question, those who had expressed a view one way or the other were “engaged in a chat-style discussion with a research AI specially programmed by the DemosAU team to probe their opinions and attitudes in more detail.” From these discussions “key themes” emerged.
For those who supported recognition these were: “Historical Land Claims and Right To Self-Determination” (to which 39 per cent of respondents were said to have referred); “Humanitarian Concerns” (34 per cent); “Path to Conflict Resolution” (26 per cent); and “Response To Israel’s Recent Military Actions” (15 per cent). What proportion expressed other views, or no view, is not disclosed.
For those opposed to recognition, the themes that emerged were: “Concern Over Legitimising Hamas” (40 per cent); “Not Australia’s Business” (20 per cent); “Preconditions for Peace as Conflict Resolution” (18 per cent); and “Belief that Israel Holds Exclusive Sovereignty for Historical/Biblical Reasons” (15 per cent). Again, the proportion offering other reasons or no reason at all was not stated.
Both sets of reasons suggest that while the considerations mentioned in the introduction to the question may have primed some — bringing considerations to the fore that might not otherwise have entered their thinking — they certainly didn’t prime all. “Path to Conflict Resolution,” the only consideration that approximated the reason referred to in the preamble, was mentioned by no more than a quarter of those in favour of recognition; “Preconditions for Peace as Conflict Resolution,” the only consideration against recognition that approximated the consideration to which the preamble had referred, was mentioned by less than a quarter.
What of those respondents who registered as “don’t know” when asked about recognition? Were they unable to decide between conflicting considerations, or did they know nothing, having never considered the question?
Resolve: The question asked by Resolve and the response options it offered were much more problematic; so problematic that the question shouldn’t have been asked — or, if asked, not published. It’s an example of poor polling and woeful oversight by newspapers that commission polls and find themselves ill-equipped to distinguish good polling from bad.
“The prime minister has recently said Australia, along with several other countries, will recognise Palestine at the UN assembly in September,” Resolve told respondents. This introduction lacked formal balance; by referring only to the prime minister and countries that supported him, it was weighted (however inadvertently) in favour of his position. At the same time, it ran together things about which respondents might have had separate views — the position of the prime minister, and the position (as DemosAU had done) of “several other countries.”
Following this introduction, respondents were asked, “Which is your preference on this question of recognition: Australia should recognise Palestine in September’s UN meeting regardless of who is in power” (24 per cent ticked this box); “Australia should wait until Hamas is replaced and/or when Palestine considers recognising Israel’s right to exist first” (32 per cent); or “No change” (44 per cent)?
While there was one box respondents could tick to indicate their support for the government’s move, they were offered two to indicate their opposition. To what, in the first box, was the phrase “regardless of who is in power” supposed to refer? To Labor and the Coalition in Australia? To Likud or some other party in Israel? Or to Hamas or some other entity in Palestine? On the Flesch–Kincaid scale, this option was “difficult to read.”
The second box — “wait until Hamas is replaced and/or when Palestine considers recognising Israel’s right to exist” — rolled two considerations into one, separated by “and/or.” Even if we assume that “Hamas” was an entity with which respondents were familiar, that they also understood that an entity called “Palestine” had yet to recognise “Israel’s right to exist,” and that they understood these things in roughly the same way, the use of “and/or” made this a daunting choice. On the Flesch–Kincaid scale, this statement, too, was “difficult to read.”
What about the third box, “no change,” referencing the fact (nowhere spelled out) that Australia didn’t recognise Palestine at the time? The second and third alternatives should have been mutually exclusive; respondents, after all, could only choose one of the two. Yet those who ticked the second box might just as easily have ticked the third, since the belief that “Australia should wait until Hamas is replaced” and the belief that “Australia should wait until Palestine considers recognising Israel’s right to exist” (the second option) were not alternatives to supporting the status quo but reasons for supporting the status quo (the third option), unless, of course, we assume that “no change” meant “no change, whatever happens.”
“Don’t know” was not a response the question allowed. Preventing a “don’t know” may encourage those with an opinion to express it, but only at the cost of forcing those without an opinion either to skip the question (creating a potential non-response problem) or to select an option they don’t necessarily support — here, most likely, in large numbers, “no change.” That “no change” was the most popular of the three alternatives may also be explained by the fact that it was the only alternative “easy to read.”
If DemosAU exaggerated the level of support for the government’s move to recognise a Palestinian state, the Resolve poll surely underestimated it.
Essential: “To what extent do you support or oppose the Australian government’s move to recognise the state of Palestine?,” Essential asked. There was no introduction. This was the question that produced an answer in favour of recognition mid-way between the result obtained by DemosAU and the result obtained by Resolve. If the other two questions were “very difficult to read,” this one was simply “difficult to read.”
Respondents were offered a five-point scale: “strongly oppose” (20 per cent); “somewhat oppose” (10 per cent); “neither support nor oppose” (37 per cent); “somewhat support” (16 per cent); “strongly support” (18 per cent). Responses were evenly spread — some might say randomly spread: 34 per cent in favour, 30 per cent opposed — a difference of just four points — with 37 per cent saying “neither support nor oppose.”
The way the results were published might have suggested that “strongly support” was the first option, but “strongly oppose” came first. So, any “primacy effect” would have boosted opposition to the proposal, not support. Essential, the only one of the three polling organisations to be a member of the Australian Polling Council, was the only one of the three to publish its questionnaire.
How do these polls measure up against Gallup’s “quintamensional plan”? None sought to establish whether respondents had “given any attention or thought to the issue,” much less what interest in or information about the issue they brought to the interview. The closest we can get to estimating how many had given little or no “attention or thought to the issue” is the proportion registered by the polls as “don’t know” or as “neither support nor oppose.” These numbers are high.
Had the polls drilled down into the numbers to discover how many had “given any attention or thought to the issue,” the findings might have been sobering. Asked in January 2021, in a probability-based poll, “how interested are you in the Middle East and the Israel-Palestinian conflict,” two-thirds of the respondents said they were either “not at all interested” (35 per cent) or only “a little bit interested”; only a quarter (24 per cent) said they were “somewhat interested” and hardly any (8 per cent) said they were “very interested.” Asked “How much do you know about Australia’s foreign policy towards Israel and Palestine,” most (57 per cent) said they knew “virtually nothing”; while 27 per cent said they knew “a little” and 14 per cent said they knew “something,” almost no one (2 per cent) said they knew “a great deal.”
Polls like this are almost certain to reflect the times in which they are taken; this one was taken a few months before the Israel–Gaza conflict of May 2021. Given the events of 2023 — October 7 — and their consequences, both the level of interest in “the Israel–Palestinian conflict” and the level of knowledge about “Australia’s foreign policy towards Israel and Palestine” are very likely to have increased. How much they might have increased, we cannot say.
Given the likely low level of engagement, the questions and introductions — and/or, we should add, the response alternatives — invited the use of split-ballots, even if this meant enlarging the number of respondents. This never happened. Pollsters (and academic survey researchers) have always been keen to maintain that public (polled) opinion largely exists independently of how it is measured. So, experimental techniques — with the risk that different techniques will yield quite different results — are rarely used, no matter the insistence of the experts in questionnaire design that they should.
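Mechanically, a split ballot is nothing more than random assignment: each respondent sees one of two (or more) wordings, and any sizeable gap between the groups’ answers can then be attributed to the wording rather than to sampling. A minimal sketch, with variant wordings invented purely for illustration and not drawn from any of the polls discussed:

```python
import random

# Hypothetical variant wordings, invented for illustration only.
VARIANTS = {
    "A": "Do you support Australia recognising a Palestinian state?",
    "B": ("Do you support Australia recognising a Palestinian state "
          "only as part of a negotiated peace agreement?"),
}

def assign_ballots(respondent_ids, seed=0):
    """Randomly assign each respondent to one wording, with equal probability."""
    rng = random.Random(seed)
    keys = sorted(VARIANTS)
    return {rid: rng.choice(keys) for rid in respondent_ids}

ballots = assign_ballots(range(1000))
counts = {k: sum(1 for v in ballots.values() if v == k) for k in VARIANTS}
# The two groups come out roughly equal in size, so comparing their
# answers isolates the effect of the wording itself.
```

The catch, as noted above, is cost: each variant needs a large enough sub-sample to detect a wording effect, which is why split ballots usually mean enlarging the number of respondents.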
In the era in which he proposed it, Gallup’s suggestion that pollsters include an open-ended question “to bring to light general attitudes” was largely impractical. Until the 1970s, interviews were conducted almost entirely face-to-face; the idea of interviewers faithfully recording open-ended responses, of market research organisations putting the time into coding these responses, and of media outlets footing the bill for the effort involved, was mostly fanciful.
AI might change all that. DemosAU’s poll included no open-ended question before respondents were asked whether they would “support” or “oppose.” But in asking for their reasons for supporting or opposing “Australia formally recognising a Palestinian state in the present circumstances,” DemosAU was able to harness the power of AI both to record respondents’ “verbatims” and to code them.
Gallup’s final suggestion, that the polls measure the intensity of opinion, has always been something pollsters could do. On the Palestine question, however, only Essential tried to do it. As we have seen, it presented respondents with a five-point “Likert” scale with “strongly oppose” at one end and “strongly support” at the other. On this measure, 38 per cent had strong opinions: 18 per cent in favour, 20 per cent against. In 2021, slightly fewer (32 per cent) said they were either “somewhat interested” or “very interested” in “the Middle East and the Israel-Palestinian conflict,” even if many fewer (16 per cent) said they knew either “something” or “a great deal” about “Australia’s foreign policy towards Israel and Palestine.” Even allowing for an increase in the levels of interest and knowledge since then, the polls suggest (though they certainly don’t prove) that intensity of opinion on the recognition of Palestine may be closely related to interest in the conflict but not closely related to (self-assessed) knowledge about Australia’s position on it.
Of the three polls, the Essential poll is the one most likely to represent the state of public opinion on the recognition issue. This is not because it is the most recent, taken after the public had had more time to digest the news and think about it. Nor is it because it represents a reading that is midway between the other two — though, as a rule of thumb, the idea that the truth is likely to lie in the middle offers some reassurance. Rather, it’s because the Essential poll is the least problematic of the three.
Essential’s is the only poll to have respected the injunction to “Beware of asking respondents about solutions to complex problems.” It avoided the use of an introduction that might bias as well as inform a response. It avoided the use of alternatives that might have been loaded or incomprehensible. And it allowed respondents to say they had no opinion even if this also allowed respondents to conceal views they held.
Does that mean that support for recognition split 34–30, as the Essential poll showed? Not necessarily. A poll, after all, is no more (if also no less) than the product of a sample survey; even if the sampling is probability-based, there are other kinds of possible error: coverage error (a sampling frame, such as the internet, that doesn’t cover everyone), measurement error (from asking the wrong question to asking a question that solicits meaningless answers), non-response error (those asked to participate not responding, now very common, with those who do respond having different views from those who don’t). Each of these may be more important than sampling error — though sampling error is the only source of error most pollsters are prepared to acknowledge.
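The sampling error pollsters do acknowledge is easy to compute for a simple random sample; for the sample sizes reported above it comes to roughly plus or minus two to three percentage points at 95 per cent confidence, nowhere near enough to account for a twenty-one-point gap. (Opt-in online panels are not simple random samples, so even these figures flatter the polls.) A sketch of the standard calculation:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Margin of error for a proportion p from a simple random sample
    of size n, at the confidence level implied by z (1.96 for 95%)."""
    return z * math.sqrt(p * (1 - p) / n)

# Sample sizes as reported for the three polls.
for name, n in [("Resolve", 1800), ("DemosAU", 1079), ("Essential", 1034)]:
    print(f"{name}: ±{100 * margin_of_error(n):.1f} points")
# Resolve comes out near ±2.3 points, the other two near ±3.0.
```

The worst case (p = 0.5) is used by convention; any other split would make the margins smaller still, which only sharpens the point that wording, not sampling, drove the differences.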
More striking than the difference between the level of support for a Palestinian state and the level of opposition is the level of non-commitment — the number of respondents who chose “don’t know” or “neither support nor oppose.” In American polls that offer a “don’t know” as an option, the uptake across all manner of topics normally ranges from “about an eighth to a third.” In these Australian polls, the level of non-commitment ranges from about a third (32 per cent) to about a half (46 per cent) — both “don’t know” and “neither support nor oppose.” How many who responded with a “yes” or a “no” should really have registered as uncommitted?
For an issue that appears not to have generated widespread engagement — with low interest in it, and little knowledge about it, if only one of the polls had had the wit to measure it — might the intensity of opinion be a better measure of public opinion than its overall distribution? If intensity is our best guide, then support for recognition (as measured by Essential, the only poll that measured it) appears to be evenly split: 18 per cent in favour, 20 per cent against.
Has there been a shift in the polls over the last eighteen months or so? If there has been — either in the distribution of opinion or in its intensity — then the findings of Essential and DemosAU in April and May last year respectively suggest it was small. (DemosAU’s head of research declared that the July 2025 figures represented a “jump” in support of ten percentage points since last May, with the gap between support and opposition increasing from thirteen points to twenty-two. But changes in the questions and/or response options mean the polls don’t allow us to infer anything much.)
Are polls the only measure of public opinion? They are not. The rise of polled opinion, the American political scientist Benjamin Ginsberg argued nearly forty years ago, transformed earlier understandings of public opinion: from being understood as views citizens volunteered to views survey researchers solicited; from views inferred from particular kinds of behaviour to the expression of particular kinds of attitudes; from the expression of groups, their leaders and so on, to the expression of random individuals; and from assertions of an agenda for public debate to responses to a pollster’s agenda of public concerns.
While the rise of polled opinion may have marginalised earlier understandings of public opinion, it hasn’t eliminated them — especially when it comes to public policy — as the recent pro-Palestinian and pro-Israel protests show. Political leaders are usually less concerned with whether a proposal has widespread public support than with whether falling into line with — or defying — calls of certain kinds is likely to shift votes. •