Q&A Review Final Report, December 2015
This reveals that the age profile of those who asked the questions and reacted to the answers in the studio was markedly different from that of those being asked the questions, and from that of those watching the broadcast of the program.
The comments we made about this lack of alignment in Question #5: Audiences apply here. It is possible that the older viewing audience interprets the youthful style and tone of questioning as too direct, bordering on disrespectful and possibly even suggesting that the program, in their eyes, lacks impartiality.
We acknowledge that we are speculating, but that view, if it did exist, may lie behind some of the criticisms levelled at Q&A.
The validity of such an opinion is tested later in this section (see Range of Perspectives below).
The data was supplied by the ABC. It does not include those who asked questions via Twitter, Facebook or video, nor those who asked questions spontaneously by putting their hand up.
The age segmentation for panels does not perfectly match the age groups adopted for this exercise and therefore represents only a broad comparison.
Recommendation #12 Consistent with the requirement to present a diversity of perspectives, more questions should be selected from the over-35 age group.
Most of the programs we reviewed originated from the ABC studios in Ultimo, Sydney. Only two of the 23 were broadcast from outside Sydney,51 with one further program broadcast not from the ABC studios in Ultimo but from the Sydney Showground.52 The manner in which this dictates the make-up of the studio audiences is addressed in the next section on Audiences. We conclude in that analysis that not only were 90% of studio audiences residents of Sydney but that 86% of those Sydney audiences were drawn from the central and inner suburbs.
It follows that, as the questioners are largely drawn from the studio audience, a similar geographical profile is likely to exist.53 The ABC provided the following data analysis of those members of the live studio audience who were selected to ask questions.
51. Melbourne, April 20 and Canberra, June 15.
52. March 23.
53. We calculate that in excess of 200 questions were asked during the sample programs and fewer than 10% were delivered by video, Twitter or Facebook. The remainder originated from the live studio audience.
Of the questioners from NSW, 83% were residents of Sydney. We analysed the postcode data for the Sydney resident questioners, grouping them into four regions.
1. Central comprising the Central and Inner Metropolitan postcodes.
2. Inner Suburbs comprising the North Shore, Northern Beaches, Gladesville-Ryde-Eastwood, South Western Suburbs and St George & Sutherland Shire.
3. Mid Suburbs comprising Western Suburbs and Parramatta Hills District.
4. Outer Suburbs comprising Macarthur Region and Outer Western Suburbs.
The following is the distribution of the Sydney based questioners across those four regions.
[Table: Q&A Sydney Questioners by Metropolitan Region]
This correlates almost exactly with the configuration of the live audiences (see Question #5: Audience). In that section we comment on the practical reasons that contribute to the lack of attendees from the mid and outer suburbs.
We also acknowledge that the program’s decision to operate from the Sydney studios is driven by budgetary and logistical constraints and not by the preference of the Q&A team.
One of the consequences of this practice is that the studio audience is drawn from a comparatively small pool of residents of central and inner Sydney suburbs. This, in turn, leads to a high level of “repeat” attendees.
We believe the Sydney audience composition may also contribute to the large number of repeat questioners appearing on Q&A.
On 14 occasions during the review period a Q&A questioner had already asked a question in a previous program. In two instances the questioner was asking a question for the sixth time.
We do not hold the view that no one should ever be allowed to appear as a questioner more than once, but we were surprised at the frequency of repeat questioners54 and at the extreme examples of multiple questions referred to above.
With two exceptions there was no evidence of the repeat questioners having a specific agenda.
The exceptions were Nell Schofield whose two questions reflected environmental concerns and Andrew Wilson whose three questions reflected a consistent conservative viewpoint. But generally the questions appear to have been selected because they were interesting and original.
In Question #5: Audiences we conclude that the high number of repeat attendees in the studio audience is due substantially to the program being broadcast predominantly from a single location. In our view the fixed location also contributes to the number of repeat questioners.
We have reached the following conclusions:
1. The lack of questioners from areas outside of Sydney is not consistent with the program’s claim to represent “democracy in action” or with the ABC Editorial Standards 4.2 and 4.5. The questioners play a central role in the broadcast and that contribution should not be confined largely to residents of one city.
54. In 2014 Q&A adopted a new policy restricting questioners to asking only one question per calendar year.
2. Broadcasting predominantly from a single location greatly increases the likelihood that the questions are sourced from a small and insufficiently representative pool. The frequency of repeat questioners testifies to this. If the program were broadcast from more locations the need for repeat questioners would be largely eliminated resulting in a greater diversity of perspectives being represented in the questions.
Our recommendations are similar to those in Question #5: Audiences.
Recommendation #13 The ABC should commit to broadcasting Q&A from the fullest possible range of locations across Australia.
Recommendation #14 Repeat questioners should be allowed only on an exceptional basis.
Recommendation #15 A set of Q&A Program Principles should be agreed between the program and ABC editorial management that, among other matters, details how the program intends to select its questioners and what protocols it will adopt in this regard to ensure the standards set by the ABC Editorial Policies are met. The Program Principles should be a public document, displayed on the Q&A website.
The above findings and recommendations are based on our analysis of the 23 programs taken together.
When assessing the programs individually we found few issues with the questions selected.
But there was one occasion where we believe questions asked failed a basic test of fairness.
In the program of May 25, where the Treasurer, Joe Hockey, appeared on his own, he was asked the following question by a member of the studio audience:
“Mr Hockey, analysis from the National Centre for Social and Economic Modelling identifies how heavily the burden of budget consolidation falls on those less well off, highlighting the huge inequity in your Government's four-year blueprint for fiscal repair. How does this independent analysis fit with the budget which you described as responsible, measured and fair?”

The Treasurer replied that he hadn’t seen the modelling commissioned by the Labor Party, as it hadn’t been made public. The moderator, Tony Jones, advised him that he understood it was being released the following morning but that Q&A had already seen a copy.
Mr Hockey protested that he hadn’t seen the report.
“Firstly, it hasn't been publicly released, right? So we haven't seen it. You’re asking me about something I haven't seen, the Government hasn't seen and most of the media haven't seen. So you’ve obviously got an exclusive in getting this information.”

The moderator persisted in putting questions to the Treasurer that were based on the NATSEM report. Our concerns with this exchange are threefold.
1. The audience was not informed of the circumstances by which Q&A had been provided with an advance copy of the NATSEM report. As this was allegedly a document under the control of the Opposition, the question arises whether it was deliberately provided to Q&A for the purpose of wrong-footing the Treasurer. Whatever the circumstances, we believe Q&A should have declared how it came to be in possession of the report.
2. The initial question using the NATSEM data came from a member of the audience. The Q&A team had selected the questions to be posed by audience members earlier in the day. From the moderator’s preparedness to ask a series of follow-up questions also based on the NATSEM report, it can be assumed that the totality of the questioning on this matter, involving both the audience member and the moderator, was orchestrated. We question whether this is consistent with the normal practices of Q&A and whether it is fair to the interviewee.
3. In any event we believe it is not fair to the interviewee to be questioned on a detailed economic report that he was not able to access. Fair treatment requires that he be given the facts in advance so that he can respond to the resulting questions.
We make no recommendations on these conclusions as we regard this particular event as an isolated lapse in judgement and not symptomatic of any systemic issue.
Question #5: Audiences

In your opinion, did the behaviour and responses of audiences influence your perception of the program’s impartiality? Did the composition of the audience seem predictable from week to week (if not, were there any obvious factors involved, perhaps including broadcast location)? In your view, does the method currently used to identify audience members work well, or do you believe there might be ways to improve selection processes?
It is not surprising that the behaviour of the Q&A audience is the subject of scrutiny and criticism. The selection of a live television audience is not a scientific procedure. Yet its role in the program’s production is far more active than is the case with other programs. Many of the audience members have already submitted questions; others have come prepared to add their opinions to the discussion if they get the chance. In other words a significant proportion of the audience is seeking active involvement in the program.
In a live program such as Q&A this can provide volatility, unpredictability, even an element of risk. But this unscripted conduct is part of the appeal of the program. The panelists may be directing their comments mainly to the viewing audience but it is the reaction of the studio audience to those comments that has the most impact. Primarily through sound but sometimes visually as well, there is an instant, direct and on occasions confronting response.
While that response can be provocative it can also oblige panelists to be more direct and unequivocal in their answers. It provides a formidable reminder to politicians not to waffle, avoid answering the question or stick to a pre-scripted “message”.
Sometimes viewers (and certainly some panelists) may feel the response is unfair, perhaps even that the behaviour of audience members is questionable.
Some criticism of audience behaviour is linked to the manner in which the audience is selected, alleging that the composition of the audience pre-determines how it will respond to the debate. That view was put strongly by the Immigration Minister, Peter Dutton: “the audience is stacked, the panel is stacked…”55

Much of our response to this term of reference focuses on the composition of the studio audience, how it is selected and whether it is consistent with the obligations of impartiality contained in the ABC Editorial Policies.
Applications for places in the Q&A audience are made on-line. The application form states that the questions asked of each applicant are “to assist us in selecting a diverse and balanced audience.”

As well as personal information such as age, gender and postcode, applicants are also asked which party they would vote for if a federal election for the House of Representatives were held today, whether they are a member of a political party and whether, if successful, they would wish to ask a question of the panel.
Q&A retains this information in its own database.
55. Sydney Morning Herald, June 24, 2015.
Among questions asked of potential audience members on their on-line application form is their likely voting intention or, if uncertain, what party they are “leaning” towards. This information is then used to structure the audience so that composition by voting intention broadly conforms to current opinion polls56.
To test whether this occurs we compared the average composition of the Q&A audience to average voting intentions as measured by Newspoll during the first six months of 2015. To test consistency we carried out the same exercise using 2012 data.
[Table: Q&A Audience Voting Intentions v Opinion Polls, 2015 & 2012]
56. “The voting proportions are based on a rough approximation of the current opinion polls - by which we mean polls over the rolling last 3 months. The intention is not to have a precise match to any number but to have an audience where the significant voting groups in the population are represented by significant groups in the audience.” Peter McEvoy, Q&A Executive Producer, August 2015.
Q&A does not consistently report the proportion of its audiences registered as Others. For this exercise we have assumed those not reported as Coalition, ALP or Greens are Others.
The composition is calculated from the figures published by Q&A. We have assumed they are accurately presented.