Online citizens’ jury: Adapting deliberation for the virtual world

In November 2020, I was an independent facilitator for a citizens’ jury organised by Griffith University and the Prince Charles Hospital, convened to consider research priorities for stroke rehabilitation. I worked with my colleague Dr Kym Madden to facilitate the event and encourage the jurors to make informed recommendations.

What was unusual about this citizens’ jury was that it was conducted entirely online.

Taking the jury online was a necessity rather than a choice: the researchers needed to complete the project by December 2020, and COVID-19 restrictions made an in-person jury impossible.

In this article, I discuss some of our planning decisions and challenges.

What is a citizens’ jury?

A citizens’ jury is a deliberative technique for policy making and community engagement. It’s a bit like a courtroom jury, but without legal authority.

A citizens’ jury usually involves a diverse group of (often randomly selected) citizens coming together over an extended period to listen to evidence, ask questions, and form a conclusion on the issue under discussion. The rationale behind the technique is that everyday citizens can (and should) contribute to policy making, but that meaningful contributions require sufficient information from experts and time for discussion.

In our case, the citizens’ jury was a research-priority-setting exercise, designed to explore how health consumers would prioritise budgets for stroke rehabilitation research, particularly in four areas recognised as under-researched (moving from hospital to community rehabilitation; returning to work; living with communication challenges; and wellbeing, mental health, and family/community support). The thinking behind this citizens’ jury was that citizens indirectly fund research grants and should, therefore, have a role in setting research priorities.

Our citizens’ jury was originally planned as a two-day, in-person event. This is a short timeframe for a citizens’ jury, and we always knew our program would be tight. We planned to hold the event at a neutral venue on Brisbane’s northside, with citizens recruited from an area that roughly matched the footprint of the Prince Charles Hospital.

Learning from previous experiences

Our research and facilitation team had some previous experience with citizens’ juries. Both my co-facilitator, Dr Kym Madden, and I have previously facilitated citizens’ juries and other research processes based on deliberation and consultation. Griffith University has a long history of conducting deliberative-style research, particularly about health policy.

COVID-19 had provided us all with a rapid education in online teaching and virtual meetings, but none of us had conducted qualitative, deliberative research entirely online.

We searched for insights and experiences from other researchers that might help us anticipate the challenges ahead. What we found was very limited discussion about the practical issues we might face.

We discovered just one report about an online citizens’ jury, conducted in Australia for the RACQ[1] by DemocracyCo.[2] In March 2020, DemocracyCo worked online to facilitate the final weekend of a citizens’ jury, but this was done after the citizens had already met in person for three days to listen to witnesses and discuss information. The facilitators used tools like Zoom, Poll Everywhere, and Google Docs to help the jury finalise their deliberations and prepare a report. DemocracyCo’s conclusion was that online deliberation wasn’t as social as in-person deliberation, but they were able to get the job done and participants enjoyed the process.

We discovered only one other mention of an online citizens’ jury, this one run by the University of Wales, examining citizens’ views about social care (called Measuring the Mountain).[3] At the time of our jury, we could access their program but found no information about how the jury was conducted. Late in 2020, the facilitators of Measuring the Mountain released videos of their online sessions.

Choosing an appropriate platform

Choosing the appropriate online platform was more challenging than we anticipated. Even though Zoom has become the go-to option for many businesses and universities, it wasn’t available to us. Our ethics approval required Australian-based servers and high-level data encryption. That left us with two choices: Microsoft Teams or GoToMeeting.

After extensive research by our unflappable project manager, we settled on Microsoft Teams. It seemed the best choice because:

  • We could see all the jurors on screen at one time
  • We could record every session and its associated chat
  • We felt it would be more familiar to jurors (being a Microsoft product).

Our biggest challenge with Microsoft Teams was its lack of a small-group breakout function. We needed jurors to deliberate about issues, and deliberation requires small-group conversations. We needed to find a way to break the large group into multiple small groups and to efficiently move between large- and small-group conversations.

We eventually decided to set up five separate meetings in Microsoft Teams – one for the large-group witness sessions and four for the small-group deliberations. We pre-allocated jurors to a specific small group. Jurors received links for their meetings and a timetable for each day – both as a daily email and as part of their jury handbook.

Recruiting jurors

We initially planned to randomly select jurors from the electoral roll. For a range of reasons not linked to the decision to run the jury online, our recruitment methods needed to change.

After a number of false starts (each requiring new ethics clearance), we eventually recruited jurors through social media. We extended our geographical reach across south-east Queensland and recruited mostly by posting in Facebook groups. In an effort to recruit a broad range of jurors, and particularly to recruit jurors without lived experience of stroke, we focused our promotion on neighbourhood groups and special-interest groups not related to health.

We asked potential jurors to ensure they had the necessary technology (including a webcam), and to commit to the full jury program. In return, we offered each juror $200 as compensation for their time.[4]

Online recruitment proved successful, and we secured 15 participants. Even though our recruitment methods sacrificed the ideal of random selection, online recruitment brought a key advantage: all our jurors were comfortable with online interaction, reducing the need for extensive pre-jury training.

The week before the jury, our project manager met each juror online to check their technology and make sure they understood the jury process. A few days before the jury, jurors received a hard copy of the jury handbook, plus some stationery, tools for voting, and a supply of snacks. Jurors also received the jury handbook via email, plus a daily email with the day’s program and meeting links.

Designing the program

Our two-day, in-person citizens’ jury was reconfigured as a four-day, online event. We chose to extend the jury across four days because we wanted to limit the sessions to three hours – which we felt would be most people’s limit in the online environment. We scheduled the jury across four mornings to give us time to address any technology or content issues each afternoon.

A citizens’ jury involves a combination of witness presentations, deliberation, and time for reporting. We also wanted to include time for jurors to vote on key issues. We scheduled our program this way:

  • Day 1 – introduction to the process and first witness session
  • Day 2 – second and third witness sessions
  • Day 3 – fourth witness session plus an extended deliberation
  • Day 4 – extended deliberation, reporting back, and voting.

Each witness session included two or three speakers with time for questions, followed by deliberation in small groups.

At the end of each day, we asked jurors to email their notes to us. We spent the time between sessions summarising and preparing reports for the jurors to work with.

  • At the beginning of day two, we provided jurors with a verbal summary of the previous day’s deliberations.
  • At the beginning of the extended deliberation on day three, we provided jurors with a summary of all jurors’ contributions from days one and two. This summary became a working document that jurors could edit – it provided one column with every response received, one column with the responses summarised, plus empty columns for new ideas and priorities.
  • At the beginning of day four, we provided jurors with a summary of all jurors’ contributions from day three (structured in the same way as the previous day’s summary).

This program worked efficiently, but in retrospect it would have been more effective to have an overnight break between the final witness session and the extended deliberation. This would have allowed us to provide jurors with a complete summary before asking them to draw conclusions. We would have also benefited from an overnight break between the extended deliberation and reporting back.

An on-campus research hub

We decided to establish a research hub at Griffith University. For the four days of the jury, the project manager, co-facilitator, and I worked from a large teaching room to coordinate the event.

We were able to achieve this easily within the relevant COVID-safe planning guidelines, given the low level of community transmission in Queensland at the time.

We used multiple laptops within the one room, all linked into the jury via Microsoft Teams. During the small-group deliberations, we had one laptop assigned to each group. This allowed us to move between groups by physically moving between laptops, which made it easy for us to support group deliberations as needed and reduced the delays experienced when participants enter and leave online meetings.

The project manager handled any administrative or technology questions from jurors, provided any support we needed, and made sure the sessions were successfully recorded.

In some ways, having an in-person research hub was a success. It enabled us to support each other, debrief after each session, and make quick decisions to address little issues. It also provided a location for witnesses who wanted to present in person.

But we hadn’t fully anticipated the problems we experienced with sound. Even though we all used headphones to link into the jury and we were in a large room, we experienced significant feedback and echo. The first day was a steep learning curve, as we realised the importance of keeping our microphones on mute except when speaking and learned to interact with each other via the online space even though we were sitting in the same room.

Recruiting witnesses

Our witnesses were recruited through our hospital research partner (the Prince Charles Hospital). Witnesses included health professionals, people with lived experience of stroke, and the carers of people with lived experience of stroke.

We were unsure whether witnesses would be comfortable with an online, remote connection, so we offered them an option to present remotely or in person from the research hub at Griffith University. Our project manager briefed each witness about how the process would work and what questions we wanted them to address.

We felt that having witnesses present in person at the research hub would give us more control over the timing and potential technology issues, and that it would offer a gentler, more supportive environment (particularly for people with lived experience of stroke). Most of our witnesses chose the in-person option.

In some ways, in-person presentation was the right choice, because it gave an opportunity for witnesses to interact with the research team and, when necessary, to be supported during the sessions.

For me as facilitator, though, having witnesses attend in person proved almost irrelevant. My attention was so tightly focused on my computer screen that the goings-on in the room around me disappeared. I didn’t see people arrive or leave, and I didn’t have any flexibility for informal interaction (which I’d normally enjoy at an in-person event). While the jury was live and ‘in session’, my attention was fully focused on the online space.

On more than one occasion, I didn’t know whether a witness had arrived and was ready to present. I felt unable to turn around and look at the room because I didn’t want to distract jurors. I felt it would be disrespectful to jurors if they could see me looking around or having a conversation with someone in the room.

We decided to ask each witness to present informally, without audio-visual support (such as PowerPoint). We also asked them to respond to questions – in conversation with me – rather than simply speak for their allocated time. Our rationale for this was that we wanted to develop an engaging conversation with jurors, not a formal presentation. We also felt that avoiding audio-visual support would help to minimise the likelihood of technology problems. This approach was particularly successful for our witnesses with a lived experience of stroke.

Creating small groups for discussion

In an in-person citizens’ jury, participants move between large-group and small-group discussions. Facilitators have considerable flexibility about how and when to make this happen. Early in the jury’s time together, groups might be allocated and varied to encourage jurors to meet as many people as possible. As the jury progresses, groups may be structured to provide opportunities for jurors to discuss the topics that most interest them.

This type of flexibility wasn’t possible in our online citizens’ jury. To keep things simple for jurors, we designed a fixed schedule and pre-allocated small groups. We felt that the pre-allocated groups (each with just four participants) would increase the possibility of open and honest discussions amongst jurors, and provide more opportunities for jurors to build rapport. We also felt that having a fixed schedule would limit confusion and reduce the likelihood that technology failure might bring the entire process undone.

Decisions I might normally make during an in-person session, such as extending the discussion slightly or varying the program (perhaps in response to something a witness says or something I sense from jurors), were impossible in the online space.

Creating opportunities for decision-making, brainstorming, and voting

In an in-person citizens’ jury, jurors often record their ideas on butchers’ paper, which may be displayed around the room. Some facilitators use coloured dots for voting or sticky notes as a way to record new ideas. Jurors may spend time moving around the room, reading suggestions and chatting informally.

In an online environment, the opportunities for informal conversation and sharing ideas between groups are greatly reduced. We needed to find a workable method for jurors to record and share their ideas. We also needed to find a system for voting that would provide us with some flexibility while still being easy to use.

We were unsure whether our jurors would be comfortable with online note-taking and voting techniques, so we decided to keep the technology as simple as possible. When we first moved into small-group discussions, we encouraged jurors to record their ideas in whatever way made sense for the group. We asked each group to nominate a notetaker and timekeeper. Our only request was that they send their notes to us at the end of each day. We made sure everyone understood that we would be collating the groups’ contributions and sharing them with all jurors.

Initially, some groups took notes by hand and shared them with us as emailed photographs. But by the second day, all groups were comfortably using the screen-sharing function to work in a word processing document. The shared screen became like a whiteboard as a focus for their discussion and notes (and it’s interesting that participants chose the familiar word processing screen, not the whiteboard provided with Teams).

We provided jurors with suggestions about how to work together and identify their top priorities. Most groups seemed to have no trouble doing this, and some groups developed detailed methods for sharing ideas and voting. One thing we noticed was that group deliberations and the move towards consensus happened more quickly than we expected – much more quickly than usually happens during an in-person event.

Reporting back and voting

Before the jury, we distributed hard-copy voting tools to jurors, including a visual analogue scale, green and red cards (for yes/no), and large sheets with the letters A through D (for selecting options). We imagined that jurors would be able to hold up the relevant card or scale, and that we’d be able to count responses from the screen.

We weren’t sure whether these would be needed, but we wanted to have them available just in case. We also wanted small groups to have a readily available option for voting if they disagreed on something.

We quickly discovered that our screens were too small for us to reliably see and count the responses jurors held up to their cameras. We also decided that this voting method would be too cumbersome and slow for the jurors, who would be forced to wait while we counted. We encouraged jurors to use the visual analogue scale and ranking techniques in their small groups, and many of them did.

In an in-person citizens’ jury, the jury often presents a single report to researchers or policy makers, often as a combined written and verbal report.

As our online jury progressed, we decided that the best method for reporting back would focus on the ideas of each small group – with each group reporting its top three priorities for each research area to both other jurors and researchers. We collected these ideas, then asked all jurors to vote using the visual analogue scale. We also wanted jurors to vote individually on a series of discrete choice questions, which they had discussed during their final deliberation. Our idea was that these two methods would combine to give us an overall ranking of top research priorities.

We used the chat function in Teams to collect votes for the research priorities and discrete choice questions (I read each priority/question aloud and gave jurors time to type their score into the chat). This turned out to be somewhat chaotic, particularly for jurors experiencing delays caused by slow internet connections.

Because the voting seemed chaotic and we wanted to be sure we had accurately recorded all jurors’ views, we gave jurors an opportunity to check our records after the jury ended. Immediately after the jury, we created an individual voting summary for each juror (summarising their voting on 48 research priorities and 10 discrete choice questions as shown in the group chat). We emailed the summary to each juror for them to either confirm or amend for our final results.

In retrospect, it’s clear that collecting priorities (from small group reports) needed to be separated from asking jurors to vote. If we’d scheduled a break between these two activities, we would have been able to return to jurors with an established list for voting. We could easily have used an online polling system, because our jurors were clearly proficient enough with the technology. An online poll would have been quicker, more accurate, and much more straightforward.

My reflections on facilitating an online jury

Facilitating any event is hard work. There’s always a lot going on and many different issues to manage. I always find it tricky to pay attention to both the process and the content.

I think these challenges are magnified in the online space. There are many more things that can go wrong, and limited options available if the technology fails. There’s less flexibility to respond to events and issues that emerge. It’s more difficult to engage participants and to respond to their cues about how the event is going.

My conclusion about the online citizens’ jury is that it’s different from an in-person event – the dynamics are different, the interaction is different, and the outcomes may well be different. In many ways it’s simply different – not worse, not better. I think it’s probably harder work for the facilitator, and it may well be harder work (or less rewarding work) for the participants.

My current thinking is that if I were able to choose between an in-person citizens’ jury and an online citizens’ jury, I’d choose the in-person event every time. But if the choice were between an online jury and doing nothing, then online is clearly the way to go.


[1] Royal Automobile Club of Queensland

[2] Jenke, E., & Fletcher, E. (2020). Complex Engagement – 7 lessons from our virtual citizens’ jury. DemocracyCo Blog. Available: https://www.democracyco.com.au/complex-engagement-online-7-lessons-from-our-virtual-citizens-jury/

[3] University of Wales. (2020). Measuring the Mountain Citizens’ Jury. Available: https://www.mtm.wales/citizens-jury2020

[4] In market research terms, $200 is poor compensation for the jurors. But for university research, where compensation must be recognised as a gesture of thanks and not coercion to participate, it is reasonably generous.