Trial information
The trial was designed and delivered by: Sally Baker, Tim McLennan, Melaku Alemu, Stefan Alexander, Cindy Han, Joshua James, Aaron Lauterbach, Alan McAlpine,
Alphia Possamai-Inesedy, Lien Pham, Cara Scobie, Sonal Singh, Esita Sogotubu,
and Melissa Tham.
The trial was funded by the Australian Centre for Student Equity and Success (ACSES).
Conflict of interest statement: We declare no relevant conflicts of interest.
Executive summary
Brief overview of the trial
This project aimed to co-design a tailored online employability resource for Culturally and Linguistically Marginalised (CALM) students and to evaluate its impact on students’ self-reported graduate capitals, measured using the graduate capitals scale, via a Randomised Controlled Trial (RCT) and a follow-up qualitative evaluation. The online employability resource, titled “Your Story, Your Career in Australia”, was co-designed to meet the specific needs of CALM students by experts in careers, resettlement, and higher education, together with students with lived experience of forced migration. The resource was to be trialled twice at six universities, with two cohorts at two distinct timepoints. The first trial, scheduled for April to May 2025, was to include CALM students who had been enrolled at university for more than six months; the second, scheduled for September to October 2025, was to include first-year CALM students who had been enrolled for six months or less.
Only four universities remained active in the trial at the end of the recruitment window: one university withdrew because its CALM cohort was too small, and another because it could not compile student population data in time for recruitment. Although there were 9,697 eligible students across the four remaining universities, only 246 opted to participate, falling well short of the 106 participants each institution required to coordinate its own parallel-group RCT. In response to the low uptake, the research team switched to a pooled RCT across universities. However, only 95 of the 246 students completed the pre-intervention survey, an average dropout rate of 60% across the implementing universities, which reduced the statistical power to 67.4%. The RCT was therefore discontinued, as similar attrition over the remaining stages would have further eroded statistical power and the researchers’ ability to draw accurate inferences from the results (Frumento & Gimelli, 2025).
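The relationship between attrition and statistical power described above can be illustrated with a standard power calculation for a two-arm comparison. The sketch below uses a normal approximation to the two-sided two-sample test; the effect size (d = 0.5, a conventional "medium" effect) and the per-arm split of participants are illustrative assumptions, not the trial's actual design parameters, so the figures it produces will not exactly match the 67.4% reported for this trial.

```python
from math import erf, sqrt

def normal_cdf(x: float) -> float:
    """Standard normal CDF computed via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def two_sample_power(d: float, n_per_arm: int, alpha: float = 0.05) -> float:
    """Approximate power of a two-sided two-sample test for
    standardised effect size d, using the normal approximation."""
    z_crit = 1.959964  # critical z for alpha = 0.05, two-sided
    return normal_cdf(d * sqrt(n_per_arm / 2.0) - z_crit)

# Illustrative scenario: 246 consenting students split evenly (~123 per arm)
# versus 95 baseline completers (~47 per arm), at an assumed d = 0.5.
print(f"Power at ~123 per arm: {two_sample_power(0.5, 123):.3f}")
print(f"Power at ~47 per arm:  {two_sample_power(0.5, 47):.3f}")
```

Under these assumptions, power falls from well above the conventional 80% threshold to roughly two-thirds, which is why further attrition at later stages would have made reliable inference untenable.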
Summary of outcomes
Several institutional and sector-level challenges prevented the full implementation of the RCT, including contract delays, difficulties in obtaining reciprocal ethics and data sharing agreements, inconsistent or incorrect student data records, restrictive casual staffing contracts, barriers to participant recruitment, and compressed timelines.
While it was not possible to determine the causal impact of the intervention on students’ graduate capitals via an RCT, user experience feedback on the online resource was provided by Peer Career Supporters (PCS) employed at five of the implementing universities. Twenty-four PCS provided feedback on the resource, guided by reflective prompts embedded throughout the modules. Twelve of these students also consented to the research team undertaking reflexive thematic analysis of their responses, and five participated in a focus group. Key themes identified in the PCS feedback related to cultural responsiveness, accessibility, and insights into their learning about employability and careers.
Key lessons and recommendations
- Ensure ample time to appropriately plan, prepare, and design an RCT prior to implementation. In the context of this trial, at least 3–6 months should have been spent in planning and preparation.
- Conduct pilot trials at individual universities before multi-university trials and evaluation projects, so that variations in internal processes (for example, recruitment strategies and communication processes) can be identified and accounted for when timelines are developed.
- Plan for the fact that not all universities permit opt-out approaches to student recruitment, which has implications for internal validity, power, and comparability across sites in a multi-institution RCT.
- Establish a national ethical recruitment standard for higher education sector RCTs. A system-wide framework enabling consistent and culturally responsive recruitment (including conditions for opt-out or broad-consent recruitment, incentive protocols, and streamlined channels) would remove one of the key barriers that undermined recruitment across institutions and disproportionately affected equity cohorts.
- Establish a national higher education evaluation infrastructure. Not a new evaluation framework, but the practical systems needed to support consistent evaluation across the sector. This includes a national data architecture and data-sharing protocols, a shared ethics clearinghouse, standardised variable dictionaries, and a central platform to support multi-university trials. This infrastructure would address structural challenges identified in the project, including data inconsistencies, ethics delays, staffing pressures, and the need to build long-term evaluation capacity across the sector.
- Carefully review projects that combine co-design and RCT methodologies and only select projects for trials if the interventions being evaluated are grounded in evidence and guided by a clearly defined theory of change. If these approaches are combined, RCT Leads should be engaged throughout the co-design stage to prevent misalignments between intervention design and RCT requirements.
An accessible Word doc is available here: Post-trial review report: Lessons learned from a multi-site RCT of the co-designed Culturally Responsive Graduate Capital employability training for domestic Culturally and Linguistically Marginalised (CALM) students.