Most polling stories during the campaign have focused on the horse race — or horse races, given that polling is being done in particular seats as well as nationally, and in some cases for the Senate. But one story, at the very beginning of the contest, focused on what publishers ought to know about pollsters before commissioning their work, what the public has a right to know, and what respondents should know if they are to give informed consent.
According to the story, James Chessell, group executive editor of the Sydney Morning Herald and the Age, had said that his papers would “no longer commission uComms” to carry out polling now that he had become aware of who owned it. The story, which was published by ABC Investigations on the day parliament was prorogued, also reported that “some other uComms clients now intend to stop using the company after being made aware of its ownership.” Clients mentioned in the report included GetUp!, Greenpeace, the Australian Youth Climate Coalition, the Australia Institute, the Animal Justice Party, and a number of political candidates — none of them conservative — in the federal electorates of Higgins and Wentworth, and in the state electorate of Sydney.
Who owns uComms (or rather UComms, as it appears in the company registry)? Effectively, three shareholders: Sally McManus, ACTU secretary; Michael O’Connor, national secretary of the CFMMEU, an affiliate of the ACTU; and James Stewart, a former executive of another polling operation, ReachTEL. Both McManus and O’Connor are “listed as shareholders on behalf of their organisations,” said the ABC. It also noted that “[b]efore being contacted by the ABC, uComms’s business address was listed in company documents and on published polls as being the same Melbourne CBD office building as the ACTU.” Subsequently, its listed address had changed “to a nearby Melbourne CBD address.”
The ABC had discovered much of this by searching UComms’s records, lodged with the corporate regulator ASIC. Even then, it had been forced to dig “deep,” since UComms’s records made “no explicit reference to the ACTU or the CFMMEU.” An initial search revealed only one shareholder, a company called uPoint Pty Ltd. McManus, O’Connor and Stewart were the (non-beneficial) shareholders of uPoint, making them owners of UComms only indirectly.
UComms styles its polls — or it did until shortly after the story broke — as “UComms/ReachTEL” or “UComms Powered by ReachTEL.” This is not because UComms is the polling company ReachTEL, renamed, but because it uses ReachTEL’s original robo-polling and SMS technology. ReachTEL, founded by Stewart and Nick Adams in 2008, was acquired in September 2015 by Veda, a data analytics company that is also Australia’s largest credit reference agency. The sale, said Stewart, would allow the two “to grow the business with Veda… [and] enhance our research offering, and the union of our collections and marketing platforms [would] expand our market leading solutions.” (Veda itself was acquired by Equifax, a member of the S&P 500, in February 2016.)
Whether Chessell held himself or his newspapers responsible for not checking on UComms’s ownership, or blamed UComms for not presenting him or his editors with a statement of ownership, is not entirely clear. The former seems unlikely. According to the report, “none of the eleven uComms clients contacted by the ABC said they had thought to make a paid search of the company’s structure” — a search that would have set them back all of $17. “We do not routinely do ASIC searches of all companies with which we do business,” said the NSW Nature Conservation Council, one of UComms’s (now former?) clients.
If clients aren’t responsible for checking these things, should the company have told them? UComms thinks not. But rather than argue that it is up to clients to protect themselves from reputational damage, UComms says that any concern on that score would have been without foundation. “The notion there would be a conflict of interest is ludicrous — the whole point of establishing the company in the first place was to provide a quality service at lowest possible cost for both unions and the broader movement,” a representative was quoted as saying. “We’re growing the union movement to fight for fairer workplace rules and that means we need to make use of the latest technology. uComms is a part of that effort.” The trouble with this defence, of course, is that clients like the SMH and the Age are not part of any union or “broader movement”; if anything, just the opposite.
The ABC’s story noted that UComms had “received widespread media coverage in the past twelve months for its polls, including a recent front-page splash commissioned by the Sydney Morning Herald predicting a Labor win in the New South Wales state election” — an election Labor lost. Was the SMH concerned that UComms had tilted the result in Labor’s favour — on the assumption, perhaps, that a good set of figures for Labor would discourage the Coalition’s volunteers or deflate its vote? Apparently not. “There is no suggestion,” the report was careful to say, that “the outcome of uComms polling is influenced by its ownership structure.”
So, why the fuss? Because in the same way that justice not only needs to be done but must be seen to be done, polling that purports to give an objective measure of public opinion to any reader needs to be neutral and be seen as neutral — meaning, among other things, that it is conducted by those who don’t have a conflict of interest or agenda, however unconscious. (One reason why the uranium lobby dropped its own polling, in the late 1970s, and had some of its questions incorporated into other polls, was that Max Walsh of the Australian Financial Review, noting the provenance of the polls, discounted them.) While UComms’s state election poll may have been conducted in a thoroughly professional manner, if the company was “controlled by two of the most powerful forces on the left-side of politics,” as the report put it, there was a very real risk — if the ownership of UComms became known — that it would fail to satisfy the requirement that it be seen to be conducted in a thoroughly professional manner.
“Polling experts,” the ABC’s report insisted, “say uComms should have made clear to its clients, survey respondents and anybody reading their results that the Labor-aligned groups co-own the company.” But in the long history of polling in Australia, just when exactly have polling companies made it clear to their clients, to respondents, or to “anybody reading their results” who it is that owns them?
THE RISE OF IN-HOUSE POLLING

Where media outlets that publish poll results also own the company that produces them, the need to have the company clarify its ownership to its client hardly arises. From the early 1940s to the early 1970s, the only national poll — the Gallup Poll, formally known as Australian Public Opinion Polls (The Gallup Method), or APOP — was owned by a consortium of newspapers whose members, and associated mastheads, had exclusive rights to publish its results. The consortium consisted of the Melbourne Herald, whose managing director Keith Murdoch was responsible for bringing the Gallup Poll to Australia and for organising the group; the Sydney Sun; the Brisbane Courier-Mail; the Adelaide Advertiser; the West Australian in Perth; and the Hobart Mercury.
In 1971, when the Australian started publishing the results of APOP’s first national rival, Australian Nationwide Opinion Poll, or ANOP, it was publishing the results of a poll that News Ltd, majority owned by Keith’s son Rupert, had created, this time as a joint venture with Britain’s Associated Newspapers and its subsidiary, National Opinion Polls. After the 1974 election, ANOP’s managing director, Terry Beed, bought out Associated Newspapers’ half, sold because ANOP wasn’t making money; Murdoch lost interest (though the company’s losses were attractive for tax purposes), and the company was sold to two of ANOP’s other employees, Rod Cameron and Les Winton. By year’s end, the relationship between ANOP and the Australian had come to an end.
In 1985, News — this time with a local market research firm, Yann Campbell Hoare Wheeler, or YCHW — created a new poll, Newspoll, via Cudex, a joint venture company in which News and YCHW were equal partners. Newspoll’s findings were published by the Australian. In May 2015, after Cudex was dissolved, News was left without a direct stake in any polling organisation. Newspoll now became a brand within the stable of Galaxy, a company founded by David Briggs, a former Newspoll executive, who struck out on his own (with his wife as co-owner) in 2004. Since December 2017, Galaxy has been owned by the British polling organisation YouGov.
While in-house polling of the kind associated with two generations of the Murdochs may now be largely a thing of the past, in-house polling has not disappeared. The SMH publishes the results of its weekly “readers’ panel,” in which 2000 or so readers are asked to give their “feedback” on questions that touch on issues of public policy, politicians in the news, and so on. For the election campaign, the AFR, its stablemate, has also established a “reader panel,” though a much smaller one. But the most prominent of the in-house polls is the ABC’s Vote Compass. First run in 2013, it attracts more than a million participants. While the ABC’s reach is undoubtedly bigger and more diverse than the SMH’s, not to mention the AFR’s, respondents to Vote Compass self-select — not only in the sense of deciding whether to participate (a feature, not sufficiently recognised, of all polls), but also in the sense that respondents (as in all viewers’ or readers’ polls) are not brought into the poll through a process of sampling.
EXTERNAL POLLSTERS COME TO THE FORE

The first newspapers to commission national polls from companies they didn’t have a stake in were the Age and the SMH. When Age Poll (known in Sydney as the Herald Survey) was created in 1970, the polling was done by Australian Sales Research Bureau (subsequently Irving Saulwick & Associates), with samples drawn initially from voters in Sydney and Melbourne. From 1972 until the arrangement came to an end in 1994, polling was conducted nationally. Between 1996 and mid-2014, Fairfax — the Age, the SMH and the AFR — used AGB McNair, subsequently ACNielsen McNair, ACNielsen and finally Nielsen, for its national polls.
The ownership of these companies was not something to which the newspapers drew their readers’ attention; Fairfax was satisfied that no conflicts of interest were involved. Following Nielsen’s decision to withdraw from the field, Fairfax turned to another foreign-owned provider. Since October 2014 the old Fairfax mastheads (now owned by the Nine Entertainment Co.) have depended on the French-owned Ipsos — the third-largest market and public opinion research company in the world — for their national polling, and on UComms and ReachTEL (with occasional exceptions) for their state and single-seat polling.
In 1973, after losing the APOP contract to McNair — a consequence of an ill-advised National Press Club speech by Roy Morgan shortly before the 1972 election, in which he claimed not to have “read a textbook on statistics, nor on sampling… nor on public opinion polls,” and boasted of his very special ability to interpret the figures from his computer — Morgan Research, through Gary Morgan, began a long association with Sir Frank Packer’s (later, Kerry Packer’s) Bulletin. Again, the magazine saw no reason to say who owned the poll. In 1992, Morgan switched to Time magazine — he was replaced at the Bulletin by AGB McNair — before switching back to the Bulletin in 1995. But after the Morgan Poll badly misread opinion ahead of the 2001 election, its contract came to an end. The Morgan Poll has not been signed up by any media company since.
After the axing of APOP in 1987, when the Herald & Weekly Times — and hence APOP — was acquired by News Ltd, the various mastheads involved in the APOP consortium made new arrangements. Some polled in-house, others engaged outside suppliers. On occasion, they sang from the same song sheet: for the 1998 election, their polls were run by Quadrant, headed by Ian McNair, the last custodian of APOP. Again, there were no declarations of interest — or of the absence of any conflict — of the kind now the norm for contributors to some academic journals and online sites like the Conversation.
Since 2013, all News Ltd’s metropolitan mastheads — the widest-circulating newspapers in every state — have used (YouGov) Galaxy. Again, none makes any mention of YouGov’s interests. As with other outlets that don’t disclose such details, declarations of ownership are deemed irrelevant, and disclosing irrelevant information would simply waste valuable space. Ultimately, however, it is the mastheads — not the suppliers — that have to take responsibility for what questions are asked, when they are asked, and by whom they are asked.
SURVEYS FOR FREE

Other media outlets have established arrangements through which they get first access to polls they have neither purchased from an outside provider nor conducted in-house. The two most prominent pollsters to have come to arrangements of this kind are JWS Research, which produces a series called “True Issues,” and Essential or Essential Media (formerly Essential Media Communications), which publishes the Essential Report — originally weekly, now fortnightly, though more frequently during the campaign. JWS has a relationship with the AFR, Essential with Guardian Australia; previously, Essential had an arrangement with another online publication, Crikey.
Presumably, the AFR knows that JWS numbers the Minerals Council of Australia, the Australian Coal Association, and the Property Council of Australia among its clients; and the Guardian knows that Essential describes itself — a bit like UComms — as “a public affairs and research company specialising in campaigning for progressive social and political organisations.” If they don’t know about any of this, it’s not because either JWS or Essential keeps it a secret: the information is on their websites. The pollsters’ backgrounds and connections, far from discouraging the arrangements with their respective publishers, may serve to recommend them.
What’s in this kind of arrangement for the pollsters is publicity, their results being published on the front page of an important newspaper or its e-equivalent. What’s in it for the publishers is editorial material that is “exclusive” and free. The Roy Morgan Research Centre also gives away its poll findings, not via an intermediary but by posting them on its website and sending them to its clients.
OWNERS AND PLAYERS
When interviewers first ventured into the field in 1941 to conduct a poll for APOP, they didn’t tell respondents that the company was owned by a group of newspapers, much less tell them who owned the papers or managed them; typically, market research is conducted on the basis that respondents are not to be told for whom the research is being conducted lest it influence the results. (Telling respondents where they could read the results would have been a different matter.) Nor, when they published APOP’s results, did newspapers tell their readers who owned the poll or that newspaper executives had helped determine the questions.
Yet the fact that APOP was owned by a group of newspapers led by Keith Murdoch, an important political player on the conservative side of Australian politics, occasioned controversy; in particular, it caused concern to those who didn’t share Murdoch’s politics or trust him to conduct a proper poll. In the Worker, readers were warned that “the ‘polls’” were “financed by newspapers whose interests were opposed to the interests of the Labor Movement.” The stridently anti-Murdoch Smith’s Weekly, noting that APOP required its interviewers to “not be known as ardent supporters of a particular political party,” asked whether “the same qualifications” had been laid down “for its newspaper proprietor subscribers?” There were even demands that the government should set up an organisation — perhaps as “a branch of the Statisticians Department,” suggested one Labor MP — to conduct polls devoid of “political gerrymandering,” rather than leave polling to private enterprise.
None of the newspapers that had come together to create APOP (a not-for-profit company) and publish its findings were sympathetic to Labor. As Sally Young shows in her recently published history of “Australia’s newspaper empires,” Paper Emperors, between 1922 and 1943 none of these newspapers had editorialised in favour of Labor at a federal election; none, as she also shows, would do so until Fairfax broke ranks in 1961. In 1946, a member of the Tasmanian parliament alleged that Gallup interviewers had been conducting polls for the Liberal Party. Did the Mercury, a stakeholder in APOP, ask Roy Morgan whether this was true? Whether true or not, Morgan appears to have said nothing about it.
In 1959, while employed as APOP’s managing director, Morgan stood as a “Progressive Independent” for election to the Melbourne City Council; once elected, he would hold his seat until after his contract with APOP came to an end. Councillors representing business interests formed a non-official party, the Civic Group, which largely controlled the council. By the time he was defeated, in 1974, Morgan had become its leader. The only official party on the council, Labor, had seen its influence decline. By contrast, Morgan’s first mentor as a public opinion researcher, George Gallup, far from seeking public office of any kind, made a point of not even voting. The Melbourne Herald covered Morgan’s 1959 campaign, including the fact that he conducted a survey of electors in his ward. But it went on publishing APOP findings on party support and political issues without mentioning Morgan’s political involvement.
By the late 1960s, suspicions within Labor’s ranks that APOP was under-reporting Labor’s vote encouraged Rupert Murdoch to establish ANOP. In those days when being an “underdog” was not considered an advantage, Murdoch was keen to do what it took to see Labor win. How he would have reacted if ANOP had done work for the Labor Party while being published by the Australian is difficult to say; while ANOP did some work for the Whitlam government, possibly brokered by the party’s secretary, Mick Young, it did not work for the Labor Party. ANOP’s work for the party would come after its connection with News was severed.
FAILURES TO DISCLOSE HOW POLLS ARE CONDUCTED

What should polling companies — or, more to the point, those who publish their findings — disclose to those trying to make sense of the polls? During the current campaign, with its focus on the vote, the Australian (Newspoll), and the SMH and the AFR (Ipsos) have published the date(s) on which their polling was conducted, the size of the sample, and the sampling variance (the “margin of error” due to sampling). But sampling variance, sometimes misrepresented as “the maximum sampling error,” rarely becomes part of any discussion of what the figures produced by the poll mean, and helps drive out any mention of non-sampling error. Under Morgan, APOP was not required to disclose the date(s) on which the polling was conducted, the size of the sample, or the sampling variance; under McNair, APOP at least disclosed the size of its sample. Saulwick disclosed the date of the fieldwork and the size of the sample (usually 1000), but said nothing about sampling variance. The same is true currently of YouGov Galaxy and of the Morgan Poll. ReachTEL, which rejoices in publishing its results to one decimal place, also says nothing about sampling variance.
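For readers who want to check the arithmetic, the quoted figure is easy to reproduce. Under the textbook assumption of a simple random sample (an assumption no phone or online poll strictly satisfies), a minimal sketch in Python, illustrative rather than any pollster’s actual code:

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        # 95 per cent margin of error for a simple random sample of size n.
        # p = 0.5 is the worst case, which is why the figure is sometimes
        # misdescribed as the "maximum" sampling error.
        return z * math.sqrt(p * (1 - p) / n)

    print(round(100 * margin_of_error(1000), 1))  # 3.1 percentage points

The result speaks only to sampling error, of course; the non-sampling kind goes unmeasured as well as unmentioned.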
Not every polling company or its client publishes the actual questions the poll has asked. Even with the question(s) on voting intention, there is a lack of disclosure. While Newspoll (and, in turn, the Australian) publishes the question it asks all respondents about how they intend to vote, as does YouGov Galaxy, no one following the Morgan Poll online, or Essential, or reading the Ipsos results in the AFR or SMH would know the question respondents had been asked. In particular, they wouldn’t know whether respondents had been presented with a list of parties from which to choose.
Presenting respondents with a list of parties may well prompt certain responses and repress others; not presenting respondents with a list may have different consequences. While the use of both approaches during the current campaign hasn’t attracted much attention from poll-watchers, Newspoll’s decision to add the United Australia Party to its list generated a discussion about how to compare polls that list a particular party with polls that do not. The AFR and SMH (and presumably the Age) publish the Ipsos figures for the Coalition, Labor and the Greens only; support for the other parties, which Ipsos also gathers, is swept out of sight by the papers and hidden under “other.”
Overlooked by most newspapers — the Australian, reporting Newspoll, is a notable exception — is the pollsters’ practice of posing a follow-up question to respondents who say they “don’t know” or are “unsure” how they will vote. This question is designed to get respondents to say to which party they are currently “leaning”; hence, the term “leaner.” Only after these respondents have been pushed — Essential pushes them twice — and the “don’t knows” reduced to a minimum, are the final voting-intention figures calculated and made public.
What do pollsters do with the remaining “don’t knows” — a figure that neither Ipsos nor YouGov Galaxy publishes? Newspoll makes it clear, as does Essential: “don’t knows” are excluded. In the past, however, not all pollsters have excluded them. Some pollsters have distributed them to one or other of the parties on the basis of which leader these respondents prefer or how they recall having voted at the last election.
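Mechanically, the exclusion is a simple renormalisation over the decided respondents. A hypothetical illustration in Python, with invented figures rather than any published poll’s:

    raw = {"Coalition": 37, "Labor": 34, "Greens": 9, "Other": 10, "Don't know": 10}

    # Drop the "don't knows" and rescale so the published shares sum to 100.
    decided = {party: share for party, share in raw.items() if party != "Don't know"}
    base = sum(decided.values())  # 90
    print({party: round(100 * share / base, 1) for party, share in decided.items()})
    # {'Coalition': 41.1, 'Labor': 37.8, 'Greens': 10.0, 'Other': 11.1}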
There is also the not-so-small matter of the two-party-preferred figures. At the beginning of the campaign, Newspoll calculated these on the basis of preference flows at the 2016 election; so did Essential. How they distributed the first preferences that went to parties that didn’t exist in 2016 (the UAP, above all), they didn’t say. More recently, Newspoll has distributed preferences “based on recent federal and state elections,” an approach that has problems of its own. Whether YouGov Galaxy, its stablemate, adopted this method for its national poll, conducted during the second week of the campaign, is hard to say from newspaper reports. Ipsos uses two methods: it looks at preference flows at the 2016 election, and it asks respondents who support minor parties to indicate whether they “will give a higher preference to the Labor Party candidate or the Liberal/National Party candidate.” In its most recent poll, happily, the two methods produced the same result. In its latest release, Morgan says it uses “respondent’s [sic] stated preferences.”
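Ipsos’s first method, distributing each minor party’s support according to its preference flow at the previous election, amounts to a weighted sum. A sketch in Python, with invented flow rates rather than the actual 2016 figures:

    # First preferences (per cent) and the share of each minor party's
    # preferences that flowed to the Coalition last time. All figures
    # here are invented for illustration.
    first_prefs = {"Coalition": 38.0, "Labor": 36.0, "Greens": 10.0, "Other": 16.0}
    flow_to_coalition = {"Greens": 0.18, "Other": 0.55}

    # A party contesting its first election (the UAP, say) has no
    # historical flow to apply: exactly the problem noted above.
    coalition = first_prefs["Coalition"] + sum(
        first_prefs[party] * flow for party, flow in flow_to_coalition.items())
    print(f"Coalition {coalition:.1f}, Labor {100 - coalition:.1f}")
    # Coalition 48.6, Labor 51.4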
Where on the interview schedule the voting-intention questions are asked is something else few polls disclose. If the results of these questions are the most important results the poll generates — and no results are more closely scrutinised during a campaign — best practice suggests that the questions should be asked early in the interview; this ensures that the answers aren’t affected by the questions raised or answers given later. Ipsos asks its voting questions up-front. Under Morgan, perhaps to keep things low-key, APOP put them towards the end. Whatever it is that other pollsters do, they don’t advertise it.
Even something as basic as the medium through which the interviews were conducted is not always clear; indeed, with some of the new technologies, it is not obvious that “interview” is still the appropriate word. Once upon a time, almost all interviewing was conducted face-to-face; in America, and beyond, face-to-face interviewing appeared to be part of “the Gallup method.” By the late 1970s, when more than 80 per cent of Australian adults had access to landlines, the industry shifted, largely, to telephones — interviewers dialling numbers at random, asking for someone in the household who met a set of demographic specifications (age and gender, typically), reading out the questions, and recording the answers. Answers were either recorded manually and punched into cards as code for processing by computer, as Newspoll’s originally were; or, with computer-assisted telephone interviewing, or CATI — which soon became the industry standard — entered on screens and fed directly to a computer.
The main hold-out was the Gallup Poll, which maintained its commitment to face-to-face interviewing; under McNair it continued to interview face-to-face until the end. In an industry that has largely moved on, Morgan still uses face-to-face interviewing for much of its work; during this election, all of Morgan’s national polls have been conducted face-to-face. Valuable in its own right, face-to-face interviewing helps Morgan — one of the country’s biggest market research firms — build a database of respondents that can be reached for other purposes, and by other means.
With the tweaking of telephone technologies and the rise of the internet — both of which have reduced costs massively — the pollster’s toolkit has become increasingly diverse. Ipsos, polling for the old Fairfax mastheads, continues to use what it describes only as “random digit dialling.” In its first poll of the campaign (though neither the AFR nor the SMH noted the fact), it managed to combine landlines with mobile phones — whether via CATI or by some other means, it didn’t say. The Australian says nothing at all about how Newspoll conducts its interviews; last time, respondents either answered online or were reached by robo-polling — questions asked on the telephone, but not by a live interviewer, and answered by someone in the household, though not necessarily the person from whom the pollster wants to hear — the data from the two methods somehow being combined. YouGov Galaxy appears to have moved its national polling online; at the last election, Galaxy (in line with its other brand, Newspoll) combined online polling with robo-polling — a mode of polling that YouGov doesn’t use in Britain. Essential has always polled online.
Whatever the mode, raw responses are never wholly representative of the population from which they are drawn. This is because some demographics are easier to reach than others, with those of non-English-speaking background and young men traditionally posing the biggest challenge, and not all of those reached agree to an interview. It is also because response rates, falling for years, are now typically in single digits — a change that may be more marked among some groups than others. And it is because, within a particular demographic, those who do respond may not be representative of those who do not; with weighted data one has to hope that this isn’t true — even when, as with Vote Compass (which we are told weights by gender, age, education, language, religion and even respondents’ unreliable recall of their past vote), it almost certainly is true.
If the actual distribution of a population’s relevant characteristics is known — location, age and gender are the parameters pollsters usually look at — weighting the data so that it better matches the distribution of these characteristics in the population at large addresses only the first two reasons. If other or additional demographics matter — characteristics that are overlooked (ethnicity or education, for example) or for which there are no population data that can be used (income, possibly) — the ability of weighting to fix even these problems can be severely limited.
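The basic mechanics of cell weighting are simple enough: each respondent counts in proportion to the ratio of their group’s share of the population to its share of the sample. A minimal sketch, once more with invented figures:

    # Population shares (known, from census data) and sample shares.
    population = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}
    sample = {"18-34": 0.18, "35-54": 0.34, "55+": 0.48}  # younger people under-represented

    weights = {group: population[group] / sample[group] for group in population}
    print({group: round(weight, 2) for group, weight in weights.items()})
    # {'18-34': 1.67, '35-54': 1.03, '55+': 0.73}

Note what weighting cannot fix: if the young people who do respond differ from the young people who don’t, up-weighting them only amplifies the bias.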
A longstanding mystery is what pollsters actually do to weight their numbers. Newspoll acknowledges its numbers are weighted, but doesn’t say what variables it has used or what weights it has applied. Ipsos applies weights, but in its first poll of the campaign it didn’t adjust for all the variables the AFR says it adjusts for — age, sex, location. YouGov Galaxy weights its data, but the report of its most recent national poll, carried by News Ltd’s Weekly Times, doesn’t actually say that it does. Morgan, too, doesn’t say whether it weights its data, though it surely does.
In their failure to disclose almost anything about their polls, the political parties are in a class of their own. On Anzac Day, when the Coalition and Labor had agreed to a truce on advertising, the UAP declared in the News Ltd press — via one of its full-page ads, repeated several times since — that its polling showed that “15 per cent of Australians” had “decided to vote for the United Australia Party” and that “the majority” of those “undecided” (“over 28 per cent of Australians”) would also “vote for the United Australia Party and bring real change to Australia.” If these were the answers, any reader might have asked, what were the questions?
But in reporting poll findings that are unsourced — and, in this case, also completely implausible (taken together, the claims imply a UAP vote of at least 29 per cent) — the UAP is hardly alone. Where would newspapers be, especially in this campaign, without stories sourced to one party or another claiming to reveal what “internal polling” is showing in this electorate or that? Whether journalists ever see the original reports, or even summaries, is doubtful. No polling company is ever mentioned, no account of the methods is ventured, no data… no nothing. Reports of polls conducted by interest groups are almost never so bare.
Since November 2004, Britain has had an umbrella organisation to which virtually every polling organisation of any importance has belonged; members include both Ipsos and YouGov. Companies that join the British Polling Council agree to “fully disclose all relevant data about their polls.” Indeed, they agree to “publish details of the questions that they have asked, describe fully the way the data have been analysed and give full access to all relevant computer tables.” The council’s “objects and rules” require members to post both the unweighted data and a description of the weighting procedures on their websites within two working days of the original public release of the findings. This doesn’t offer pollsters many places to hide. The defence of proprietorial privilege and claims to intellectual property get short shrift.
An attempt to establish something much more modest for Australia was made more than thirty years ago, ahead of the 1987 election, by Jim Alexander (then at AGB McNair) and Sol Lebovic (at the time, running Newspoll). Their initiative was inspired, in part, by the formation of two groups said to have operated during the 1987 British election — the British Market Research Society Advisory Group on Opinion Polls and the Association of Professional Polling Organisations. But because they didn’t want to go it alone, and not everyone was up for it — Morgan Research, in particular, would not have supported it — nothing came of the proposal.
An initiative of this kind need not rest with the pollsters. There is nothing to stop media outlets or other Australian clients requiring polling companies to fully disclose their practices along the lines mandated in Britain. Some companies, no doubt, would rather forfeit the business than enter into a voluntary arrangement of this kind. But why would companies like Ipsos or YouGov, which have signed up to this sort of arrangement in Britain, decline to comply with such a request here?
CONFLICTS OF INTEREST

Ownership of polling companies, and of the companies that pay for their polls, routinely involves conflicts of interest that go beyond having a mission, like UComms, or a political position, like the consortium once built by the Herald & Weekly Times. Companies that conduct polls and companies that publish them employ labour — or, in the case of pollsters, as Gary Morgan is wont to insist, hire contractors. As a result, they stand to be affected by wage rates, payroll taxes, industrial disputes, leave entitlements, and so on. Does the polling they commission or conduct, however unwittingly, reflect this?
In the 1940s, Arthur Kornhauser, a researcher at Columbia University, set out to explore one aspect of American polling — was it “fair to organized labor?” After looking at the choice of topics on which the pollsters polled and the wording of their questions, he concluded that across the period he examined — the war years, 1940 to 1945 — they had shown “a consistent anti-labor bias.”
There were no technical impediments to overcoming this bias, Kornhauser argued; “necessary safeguards” could be put in place to ensure that the job was done “objectively.” There were, however, “more formidable hurdles.” Polling organisations were “sizable business organisations” in their own right, he noted. In addition, they had business clients to satisfy — newspaper and magazine publishers, among them. “How far these influences will persistently stand in the way of balanced inquiry and the reporting of opinions about labor must be left for the future to answer.” But he wasn’t optimistic. One solution was for organised labour to do its own public opinion research; seventy years later, the mission UComms set itself might be seen as part of this. Another solution, “urgently” needed, was “research centres devoted to thoroughgoing, continuing attitude studies in the labor relations field.”
Opinions sympathetic to organised labour may not be the only views that a newspaper might be less than keen to publish. Companies responsible for commissioning polls sometimes have other interests to protect. For a newspaper to suppress the results of a question asked in a poll, after its executives have been involved in deciding whether to ask it, is unusual. But it has happened. In October 1958, after Roy Morgan had written up the results of an APOP question on newspaper readership, the Herald suddenly took fright and refused to publish it. The instructions to Morgan were unambiguous: “completely kill, destroy and otherwise wipe.”
Polling organisations may also have interests that may threaten their integrity — or appear to do so. During the debate about Indigenous land rights in the early 1990s, Gary Morgan agreed that the following words should accompany the Morgan Poll published in Time magazine: “Statement of Interest: The executive chairman of the Roy Morgan Research Centre, Gary Morgan, is also chairman of the WA mining company Haoma North West NL.” Until this statement appeared in small print, on 14 February 1994, Time had been publishing Morgan’s polls on land rights for over a year without any acknowledgement that the company had a potential conflict of interest.
Morgan’s interest in goldmining was hardly news; until February 2018, Haoma was a publicly listed company, and Morgan has never made a secret of his mining interests. But since Time appears to have had no idea that Morgan was invested in mining — like UComms’s clients, presumably, it didn’t “routinely do ASIC searches of all companies with which we do business” — few of its readers, in the absence of Morgan’s statement, would have had any idea either.
IDENTITY MATTERS

If material interests matter, at least potentially, so might identity; typically, of course, the two are connected. Since the emergence of polling, no Indigenous Australian, so far as I know, has been in charge of a poll or worked as part of a media team commissioning a poll in the mainstream media. APOP asked not a single question on Indigenous (or “Aboriginal”) issues until 1947 and no further questions until 1954; after thirty years of polling and over 3600 questions, it had asked only twenty-two on Indigenous issues. It is difficult not to conclude that some Indigenous involvement in the process of determining what questions to ask might have made a difference. And not just under Morgan’s stewardship: from 1973, when APOP turned to McNair, it asked just six questions out of nearly 2000 on Indigenous issues. What was true of APOP was true of the polls more generally. Over the same years, the Morgan Gallup Poll asked at least 600 questions in total, no more than two of them on Indigenous issues. ANOP, polling for the Australian from 1971 to 1974, asked just five out of nearly 600; Saulwick, from 1970 to 1979, just six out of nearly 1000.
With Indigenous involvement, not just the number of questions but also the nature of the issues — or the terms in which they were asked — might have been different. A question, for example, about whether “Aborigines should have the right to vote,” included for the first time by APOP in 1954, might have been included earlier; it might have been repeated sometime before the Commonwealth extended voting rights to Indigenous people in 1962; and the question of whether “Aborigines… should or should not be given the right to vote at federal elections” might not have been asked in November 1963, since their right to vote had already been “given” more than a year earlier.
Overwhelmingly, polling organisations in Australia — like the media companies to which they have usually had to answer — have been run by men. In recent years, this has changed, but not dramatically. At Ipsos, Jessica Elgood is in charge of what used to be called the Fairfax–Ipsos or Ipsos–Fairfax poll; at Morgan, Michele Levine, chief executive since 1992, once managed the Morgan Gallup Poll; and at ANOP, Margaret Gibbs built a formidable reputation, though as a qualitative researcher rather than as a pollster.
Having few, if any, women involved in constructing the polls can make a difference. For a few years during the war, two organisations sampled opinion for the press. One, of course, was APOP. The other was Ashby Research Service, run by Sylvia Ashby, the first woman to own a market research firm — not only in Australia but very likely in the British Empire. Ashby sampled opinion in New South Wales for Packer’s two Sydney newspapers, the Daily Telegraph and the Sunday Telegraph. Polling in early 1942, she asked: “Should the Government form a People’s Army to fight in co-operation with the AIF and Militia if the Japanese invade Australia?” Respondents thought the government should. The men Ashby interviewed said that if a “people’s army” was formed, they wanted to join it; so, once she decided to ask them, did the women. Later that year, APOP asked its first question about a “merger” of the Australian Imperial Force and the Australian Military Forces. But it didn’t ask about the possibility of “a people’s army.” Even if it had, what are the odds that APOP would have asked whether women wanted to join?
Like the people they survey, pollsters — and those who pay them to ask some questions, not others, and to ask them in certain ways — range in their social attitudes from liberal to conservative, and in their political views from left to right. Whether these predispositions are conscious or unconscious is a separate matter. Among pollsters, diversity of outlook is much greater than diversity of ethnicity or gender. A similarly diverse media may well hire pollsters that make a good fit.
Polling in the 1970s, on issues the women’s movement was raising — the pill, abortion, prostitution, rape, divorce, child care, women in the workforce, how women should be addressed, and so on — and that other movements were raising — homosexual relations, the age of consent — provides one window into these predispositions at work. McNair especially, but also Roy Morgan Research — commissioned by the Herald & Weekly Times (McNair) and by the Bulletin and the Women’s Weekly (Morgan) — were inclined to ask about a more limited range of issues, or to frame their questions in a more conservative way, than Irving Saulwick & Associates or ANOP, commissioned by the SMH and the Age (Saulwick) and by the Australian (ANOP). The pattern wasn’t wholly consistent; many of the questions asked by each of the pollsters were relatively neutral. And a number of topics — rape crisis centres and the gender pay gap, for example — were ignored by all the men. But there was a pattern nonetheless.
AN OBLIGATION TO DIVULGE?
Knowing who owns a polling organisation can raise doubts about the bona fides of the polls it produces. In the case of UComms, Nine raised concerns about an independent operator, but when polling first began in Australia, much wider concerns were expressed about the in-house poll that Keith Murdoch had organised. While the UComms connection lasted no time at all, and its most controversial polling was conducted for the SMH only in New South Wales, APOP’s connection with the Herald & Weekly Times lasted for forty-six years — and for more than half of this time it was the only organisation conducting polls for the press nationwide.
Any list of the things that require fuller disclosure by the polls and by those who commission polls — if not to respondents then to readers — should not stop at naming who owns what or identifying who controls what they do. Pollsters and their paymasters are in the business of gathering information, publishing it, and using it to shape public deliberation and political debate. As a consequence, they should be under some obligation to reveal: anything that might pose, or appear to pose, a conflict of interest; the questions they ask and how they gather their data; and what they do to the data before they publish the results. •