December 2015
Q&A Review Final Report
During the 2015 review period the Q&A team achieved its objective with a remarkable level of accuracy. There was only a slight overrepresentation of potential Greens voters at the expense of the Coalition and the ALP. But given that the difference of 1.5% equated to only three or four audience members (out of 250), it is hard to regard this as having any significance.
It’s also apparent that the production team adopted a broadly consistent approach in 2012, although representatives of “Others” were certainly below the level the opinion polls would have indicated.
What is apparent from these figures, and what may not be fully appreciated by critics of Q&A, is that the program’s apportionment policy inevitably means the audience will comprise more opponents of the Government than supporters.
Assuming that the ALP, Greens and Others form an unofficial opposition, the audience composition can be portrayed in the following simplified form. Again, we have added the 2012 data as a point of comparison.
Q&A Audience Voting Intentions Government v “Opposition”
The 2012 figures are less valid given that the Greens provided critical support for the ALP Government and therefore can’t be regarded as a true opposition.
But it is clear that during the sample period in 2015 Government supporters in the Q&A audiences were outnumbered 3 to 2. And, notwithstanding the qualifications attached to the 2012 results, it is reasonable to assume that this is a fairly standard pattern regardless of who is in Government.
It is a reality of our democratic system that those who support other parties often eclipse Government support in opinion polls. It might be possible for Q&A to adopt a different approach to apportioning the audience by voting intention, using a model whereby Government supporters occupy 50 percent of places and Opposition parties the other 50 percent. But that would involve imposing an artificial balance, and one that Opposition parties might legitimately complain does not properly reflect their overall and individual party standing in the community.
We have concluded that Q&A audiences are, in terms of voter intention, an accurate and appropriate reflection of the prevailing sentiment of the population. Any expectation that a Government should automatically be entitled to majority or even equal support from the Q&A audience is misplaced and, in our view, would contravene the ABC Editorial Policies by unduly favouring one perspective over another.
We acknowledge that the process relies on audience members truthfully declaring their voting intention. It seems to us that abuse of this system would have an impact only if it were known that Q&A audience producers were struggling to source people of one particular voting intention, and members of another voting intention group falsified their allegiance in order to take up the available places. We are unaware of any evidence to suggest that this has occurred or could occur.
Members of all voting groups are able to misrepresent their voting intention, and unless there were an orchestrated campaign by just one group it is reasonable to assume the effects would largely cancel each other out.
The BBC goes to considerable lengths to try to eliminate the risk of false declarations by audience members on Question Time. It outsources the audience selection and requires that individual audience members be spoken to prior to attending the studio.59 This is an easier task for the BBC, as Question Time audiences are generally much smaller than Q&A's: around 140 members compared with 250 for Q&A.
Ric Bailey, who was responsible for more than 200 Question Time programs and is now the Corporation's Chief Advisor on Politics, told us that politicians in the UK are obsessed with the make-up of QT audiences, with frequent accusations of imbalance and stacking. He suggested politicians dislike the live audience because they cannot control it or its reactions.
“There is no perfect mathematical way to assemble an audience for QT. You need to find a way to be as fair as possible under all circumstances.”

Given that all the effort and resources expended by the BBC in vetting audiences have failed to eliminate criticism, we are not suggesting the ABC adopt this level of audience scrutiny.
But we do believe one aspect of the BBC's production of QT provides significant protection against audience manipulation. QT travels the country, drawing its audience from a constantly varying pool of applicants. Two-thirds of its programs are broadcast from outside London, and even those from the capital are produced in the suburbs rather than in a single central studio.
As we have identified earlier, Q&A during the sample period was broadcast from Sydney for 21 of the 23 episodes.
Later in this section we deal more fully with this issue.
The audience for the sample programs comprised slightly more females than males (53% female, 47% male).60
More significant though was the surprisingly youthful profile of the studio audience particularly when compared with the Q&A viewing audience and the age profile of the ABC TV audience as a whole.
Q&A Studio Audience Age Profile v TV Audiences and Panel Profiles
Three out of five members of the average Q&A studio audience were below the age of 35 (in fact one third of the audience was under the age of 25). This is the inverse of the ABC TV viewing audience generally and significantly out of kilter with the specific Q&A viewing audience.
It sets up an interesting dynamic. The home viewing audience, of which 84% is over the age of 34, was observing (and presumably judging) the behaviour of a studio audience with a markedly different age profile (only 40% over the age of 34).
It’s possible that this disconnect between audiences contributed to some of the criticism of the studio audience behaviour. The timing, frequency and tone of applause and laughter from a young studio audience may appear to the older viewing audience as inappropriate and even disrespectful.
Source: ABC data. The age segmentation for panels does not perfectly match the age groups adopted for this exercise and therefore represents only a broad comparison.
This could create a perception of a lack of impartiality among older viewers. But sustaining an argument that it amounts to an actual lack of impartiality requires evidence that a younger studio audience is collectively biased in its views on political and social issues.
While convention suggests younger people are more inclined to hold progressive views, we have no evidence that this applies to Q&A audiences, which have been selected to achieve a balance of political perspectives. Our analysis of the actual behaviour of the studio audiences (see Audience Behaviour below) is a more reliable measure.
But we do hold the view that it would be preferable if the age profile of the studio audience could be adjusted by encouraging more attendees from the over-35 age group. Given the active role of the studio audience this would contribute to a greater diversity of perspectives on the program.
Recommendation #16 The age profile of the studio audience should be adjusted by including more attendees from the over-35 age group.
It might be assumed that the youthful age profile of the audience was a consequence of the program being broadcast from a studio in central Sydney, where the immediate population is probably younger. That might be the case, but it is worth noting that the studio audiences for the two programs broadcast from outside Sydney, in Melbourne and Canberra,62 had an even more youthful profile.
Q&A Studio Audience Profile by location
In Sydney it was sometimes difficult to attract an appropriate audience of 250 (see below), while in other centres there were more applicants than places, even though the audiences in Melbourne and Canberra were much larger than in Sydney.
Later in this section we discuss the need for Q&A to be broadcast from a wider range of locations. If that were to happen it would enable the producers to have more control over the age profile of the studio audiences, particularly if the size of the audiences selected outside of Sydney was smaller.
It is worth noting that the BBC's Question Time operates with an audience of 140 regardless of location.63
When Ric Bailey was responsible for the BBC's flagship program, Question Time, he posted on the program's website the following response to the FAQ "Why are Question Time audiences always biased in favour of left wing policies?":

"They are not. As indicated above, they are selected to reflect a broad range of views right across the political spectrum. It is, however, notoriously impossible to make a judgment about the overall views of an audience based on the noise they make or the levels of applause. It is also impossible to force people to speak in favour of a particular view, even if you know they are in the audience and hold that view. In fact, there has been more criticism recently that the audiences 'sound' anti-government. That is not because there are not people in the audience who support the government but, in my view, because those people are less willing to air their views in public than those who attack the government. Five years ago, the opposite was true. This says more about the climate of British politics than it does about the balance of the Question Time audience; these are perceptions which tend to ebb and flow."

He holds that view today, and told us: "People and politicians think they can judge an audience by what it sounds like. That is not always true."

Nevertheless, we have attempted to analyse the behaviour of the Q&A studio audiences across 18 of the programs under review64 to determine whether any patterns exist that may challenge the requirements of the ABC Editorial Policies.
We have also carried out a similar exercise with the 13 programs in 2012 that we previously identified as suitable for comparison purposes.
Special programs were excluded.
We stress that our assessment is limited to expressions of a political viewpoint. We found that, while it was possible to classify some audience behaviour as either supporting or opposing a particular party, we could not identify a simple and effective methodology to measure patterns of support and opposition on broader social issues.
We viewed each program and noted the number of audience "events", usually comprising applause or laughter. We then counted the events where it was readily apparent that the response either supported or opposed the position or statement of a political party.
In both the 2015 and 2012 programs only about 25-30% of audience events qualified for classification in this manner. The remaining events were either unrelated to a political position, neutral in response, or not sufficiently clear-cut to justify inclusion (for example, mixed responses from different parts of the audience).
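The tallying approach described above can be sketched as a simple classification count. The following is a hypothetical illustration only: the event labels and the sample data are invented for demonstration and do not reflect the actual programs reviewed.

```python
from collections import Counter

# Invented sample of audience "events" as a reviewer might classify them.
# Only events clearly supporting or opposing a party are assignable;
# the rest (neutral, unrelated, mixed) are excluded from the analysis.
events = [
    "oppose_government", "neutral", "laughter_unrelated",
    "support_government", "oppose_government", "neutral",
    "mixed_response", "support_opposition", "neutral", "neutral",
]

ASSIGNABLE = {
    "support_government", "oppose_government",
    "support_opposition", "oppose_opposition",
}

tally = Counter(events)
assignable = sum(n for label, n in tally.items() if label in ASSIGNABLE)
share = assignable / len(events)

print(f"Assignable events: {assignable} of {len(events)} ({share:.0%})")
```

In this invented sample, 4 of the 10 events are assignable; the report's finding was that roughly 25-30% of real audience events could be classified this way.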
Consistent with our findings in previous sections it is immediately apparent that the Government of the day is the lightning rod that attracts most audience response.
2015: Audience Behaviour
More than 70% of qualifying audience events could be measured as a response to a Government policy, position or statement. More than half of those events indicated opposition to the Government position, whether by laughter or by applause for a critical or alternative viewpoint.
On the other hand the Government also received the most support.
At first glance some may consider these findings indicative of an anti-Coalition bias. Yet the results of our 2012 analysis demonstrate that the critical focus of the studio audience was on the Government of the day rather than on any particular party.
If anything, the results show the ALP in Government in 2012 received a more hostile response from the studio audience than the Coalition Government of 2015.
It should be emphasised that most audience responses are neutral: laughing at a witty riposte or applauding a passionate and articulate answer.
Our viewing of the programs and attendance in the studio also established that even when a response favours one political party over another it could be in response to hearing a well-argued position rather than just reflecting party allegiance.
In other words, it cannot be assumed that the conduct of the audience will strictly accord with the voting intention profile.
Furthermore, we saw little if any evidence that audience responses exceeded acceptable boundaries. Some may wish that the scrutiny of any Government by Australian society and media were conducted more gently, but that is outside the control of the producers of Q&A.
We conclude that the critical and, on occasions, even hostile response from the studio audience to the Government of the day does not fail the standard of impartiality contained in the ABC Editorial Policies.
Audience behaviour has been consistent whether in response to a Government of the Coalition or of the ALP.
Conversely any intervention to adjust the audience composition with a greater number of vocal Government supporters so as to balance out the critical response would, in our view, contravene standard 4.5 and be an inaccurate representation of prevailing community opinion. The Government of the day is the centre point of discussion and its actions and intentions are legitimate targets for audience reaction.
Location Constraints
As has already been pointed out, the vast majority of Q&A programs originated from the ABC studios in Ultimo, an inner-city locale in Sydney. Only two of the 23 programs sampled were broadcast from outside Sydney,65 with one further program broadcast not from the ABC studios in Ultimo but from the Sydney Showground.66 Inevitably, the composition of the studio audience reflected the production location.
Q&A Studio Audience Composition by State & Territory