Using Live Employer Briefs as a Form of Authentic Assessment to Enhance Problem-Solving Skills: Evidence from an Undergraduate Psychology Module
This case study evaluates experiential learning and authentic assessment in a final-year Business Psychology elective module. Co-created with an employer, the module’s coursework required students to address real challenges presented in a live company brief. Students applied psychological theory to practice and submitted evidence-based consultancy reports, graded by the module organiser. The impact of the authentic assessment was evaluated via self-reported questionnaires administered before and after the assignment.
Responding to a need
Despite increasing recognition of the value of experiential learning and authentic assessment (Sokhanvar et al., 2021), psychology curricula have been slow to embed these approaches and to evaluate their impact on skills development. Research and communication skills are enhanced through oral conference presentation assessments (Greenwood, 2025), while assignments such as posters and information leaflets, which require students to present research for a general audience, promote academic integrity and improve performance (MacAndrew & Edwards, 2002; McGann et al., 2008). Problem-solving skills are highlighted by the Quality Assurance Agency for Higher Education (QAA, 2023) as both subject-specific and transferable skills essential for psychology graduates. Meanwhile, strategic problem solving is projected to remain among the skills employers value most by 2030 (Seelan, 2025), with advanced problem-solving skills linked to up to 20% higher wages (Ederer et al., 2015). However, evidence about which educational activities foster problem-solving skills is sparse.
Furthermore, personal anecdotal observations as an advisor suggest that many students are unaware of how or where they develop problem-solving skills during their degree, with the few examples they can offer in cover letters or interviews relating mainly to their capstone dissertation work. Finally, in line with the new QM Employability and Skills Framework, which states that “all learners should be given the chance to apply their learning to a real-life context”, authentic assessment is an excellent tool for fostering such skills (Koh et al., 2019; Banister, 2004). As such, this project responded to a clear opportunity: to embed work-integrated, authentic assessment rooted in employer collaboration into a final-year psychology module to enhance the development of problem-solving skills.
This case study forms part of a broader action research project evaluating the impact of authentic assessment on engagement, performance, and employability outcomes. For the purposes of this report, results relating to skills development are shared, and recommendations concerning both the action research and the implementation of such educational activities are discussed.
“I liked that the coursework was very practical and that we met the representative of the company we wrote our pieces for. This is very useful to our future lives in the workplace.”— Student One
Context and Design
This evaluation project took place in 2025 Semester B as part of a Queen Mary Academy Fellowship, within the PSY318 Business Psychology module. Although the authentic assessment approach has been in place since 2022, this year’s cohort of 96 third-year undergraduate students was the focus of the project. The coursework involved a live brief co-created with an employer from the business sector, sourced via the module organiser’s professional network. The brief addressed a real organisational problem: ineffective staff selection methods, which is also a core topic in the module.
Authentic Assessment and Co-Creation
The design of the authentic assessment in this case study sought to align as closely as possible with Wiggins’ (1989) key principles of authentic assessment, which are also central to problem-based learning. First, the task was representative of real-world challenges faced by organisational psychologists, requiring students to apply psychological theory to a complex staffing problem. Second, clear performance standards were shared through a well-developed rubric, with specific emphasis on the quality of the proposed solution, the application of theory, and originality. However, some principles were only partially met, which represents a limitation of this pilot iteration. While the opportunity to present to a real audience (the employer) was included, this was restricted to the top three performing students as an incentive, rather than being available to all. Additionally, there was no formative assessment process integrated into the activity, and self-assessment was only available to students who completed the survey. The recommendations section provides guidance for enhancing future iterations of the assessment design.
Importantly, the module organiser collaborated with the employer to develop a brief aligned with two key learning outcomes:
- Apply critical evaluation skills to real-world problems beyond lab findings.
- Critically assess workplace psychological interventions.
In Week 2, students were introduced to psychological theories of staff selection in a lecture. In Week 3, the coursework brief was released on QMplus, and students, guided by the lecturer, worked in pairs to develop questions for the employer. In Week 4, the employer led a live briefing session, which was one of the best-attended sessions of the semester.
Students submitted an individual evidence-based consultancy report to solve the problem at hand by Week 9 (worth 25% of the module grade). Standard psychology essay criteria were used for grading.
Research and Data Collection
All students were invited to participate in the study anonymously. Fourteen completed both the pre-assignment (Week 2) and post-assignment (Week 10) surveys (a 15% response rate), representing 58% attrition from the Time 1 sample. Surveys were conducted online via Gorilla, and the project received ethical approval. Participants received a £10 Amazon voucher per completed survey. Survey completion took approximately 15 minutes, and identical measures were included at both timepoints:
(a) Problem-Solving Inventory (Heppner & Petersen, 1981) – self-reported scale assessing the dimensions underlying the real-life, personal problem-solving process (e.g., “I trust my ability to solve new and difficult problems”)
(b) Problem-solving task consisting of situational judgment questions with a 4-minute response limit:
- General scenarios
- Organisational psychology scenarios (related and unrelated to the brief); for example, a scenario on diversity in hiring.
(c) Task-specific confidence rating – self-reported scale (e.g., “How confident are you in the solution you provided?”)
(d) Perceived development of QM graduate attributes
Additional data included assignment grades, module evaluation responses, and open-ended feedback.
Implementation and Evaluation
Students completed surveys during class time in Week 2 (pre-assignment) and Week 10 (post-assignment). During this period, and specifically in Week 4, the employer delivered a live briefing session as part of a scheduled lecture. After data collection, problem-solving responses were scored using a coding rubric co-designed by the module organiser and a Careers Advisor. Responses were rated on a 0–2 scale (0 = problem not identified/unrealistic solution; 2 = problem clearly identified with a structured, evidence-based solution), based on existing literature and interview scoring methods (De Leng et al., 2017). The researcher and module organiser coded all responses while blinded to participant ID and timepoint, and their ratings were used in the analysis. To assess inter-rater reliability, a private instance of ChatGPT-4o was used to apply the scoring key to all responses; the AI tool showed 75% agreement with the researcher, indicating moderate reliability (Cohen’s kappa = .43).
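To make the reliability check above concrete, the sketch below shows one way to compute percent agreement and Cohen’s kappa between two sets of 0–2 rubric ratings in Python; the rating vectors and variable names are illustrative placeholders, not the study data.

```python
# Minimal sketch of an inter-rater reliability check on 0-2 rubric scores.
# The rating vectors below are invented placeholders, not the study's data.
from sklearn.metrics import cohen_kappa_score

# Scores on the 0-2 rubric (0 = problem not identified/unrealistic solution,
# 2 = problem clearly identified with a structured, evidence-based solution).
researcher_scores = [2, 1, 0, 2, 1, 1, 2, 0, 1, 2, 2, 1]
second_rater_scores = [2, 1, 1, 2, 1, 0, 2, 0, 1, 2, 1, 1]

# Simple percent agreement: proportion of responses given the same score.
agreement = sum(r == s for r, s in zip(researcher_scores, second_rater_scores)) / len(researcher_scores)

# Cohen's kappa corrects observed agreement for agreement expected by chance.
kappa = cohen_kappa_score(researcher_scores, second_rater_scores)

print(f"Percent agreement: {agreement:.0%}")
print(f"Cohen's kappa: {kappa:.2f}")
```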
The analysis reported here comprises a repeated-measures ANOVA on problem-solving scores and descriptive statistics summarising the perceived impact on graduate attribute development following the assignment.
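As a minimal sketch of such an analysis, assuming a long-format table with one rubric score per participant, timepoint, and scenario type, a repeated-measures ANOVA could be run with statsmodels as below; the column names and scores are invented placeholders rather than the study data.

```python
# Minimal sketch of a repeated-measures ANOVA with two within-subject factors
# (time: pre/post; scenario type: general/related/unrelated). Placeholder data.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)

# One row per participant x timepoint x scenario type (14 x 2 x 3 = 84 rows).
data = pd.DataFrame({
    "participant": [p for p in range(1, 15) for _ in range(6)],
    "time":        (["pre"] * 3 + ["post"] * 3) * 14,
    "scenario":    ["general", "related", "unrelated"] * 28,
    "score":       rng.uniform(0, 2, size=84),  # replace with real 0-2 rubric scores
})

# Fit the repeated-measures model and print F tests for time, scenario,
# and their interaction.
result = AnovaRM(data, depvar="score", subject="participant",
                 within=["time", "scenario"]).fit()
print(result)
```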
“I liked the opportunity to have a guest speaker from a big company come in where we had our assignment on this case study as it allowed us to work with real-life problems.”— Student Two
Impact
Quantitative data revealed positive shifts in students’ problem-solving performance:
Problem-solving task scores improved across all scenarios after completing the assignment, F(1, 13) = 11.24, p = .005. Although the interaction between time and scenario type was not statistically significant, problem-solving performance after completing the assignment was descriptively higher for the scenarios related to the assignment (M = .82, SE = .18) than for the other two types (general: M = .61, SE = .19; unrelated: M = .75, SE = .17).
Graduate Attributes data (n = 14) showed high levels of student agreement with statements aligned to the development of critical analysis and problem-solving skills (see figure below). For example:
- 71% of the students agreed/strongly agreed that completing the assignment helped their ability to solve problems.
- 86% agreed/strongly agreed that completing the assignment enabled them to think critically and apply their expertise in real life.
Although the results suggest positive trends, the sample size was small (n = 14), limiting statistical power. Additionally, students engaged in various educational activities over time, so changes cannot be attributed solely to this assessment. However, the inclusion of different problem scenarios aimed to control for this. These preliminary findings indicate potential benefits of the authentic assessment and employer engagement in skills development and awareness of attributes relevant to problem-solving, but further investigation with a larger sample is essential.
“The coursework was enjoyable and gave us an opportunity to apply our knowledge to a real-life example.”— Student Three
Recommendations for Education
- Alignment with Learning Outcomes and Employability & Skills Framework
Ensure that the live brief and authentic assessment are meaningfully mapped to the module’s learning outcomes and to the embedding of employability in the curriculum across the programme, rather than added as a supplementary or token component.
- Early Employer Engagement
Initiate communication with the employer several months before the semester begins to co-create and refine the live brief. Clearly outline mutual expectations and potential benefits for the employer, such as gaining fresh, evidence-based insights that can inform their professional practice. Aim to build an ongoing partnership, as sourcing guest employers can be time-consuming. Leverage your professional network, the Careers and Enterprise Employer Engagement Team, and the School’s Industrial Advisory Board to support this effort.
- Student Preparation and Debriefing
Offer preparatory and/or debriefing sessions to help students understand expectations and reduce the uncertainty associated with solving real, ill-defined problems. Embedding these sessions as formative activities would give students a safe space to explore ideas, receive feedback, and build confidence before completing the summative task.
- Use of Appropriate Assessment Criteria
Employ marking criteria that reflect the skills being developed, particularly synthesis, application of knowledge, originality, and problem-solving. In this case study, traditional essay marking criteria were used, which may not explicitly capture these dimensions; these critical aspects were instead emphasised during debriefing sessions and in feedback to support student development.
- Structured Reflection Opportunities
Build in time for structured reflection, either through a reflective piece or in-class activities. Although reflection was not explicitly built into this case study apart from Time 2 of the survey, it helps students consolidate their learning and become more aware of the skills and knowledge they are developing.
- Engaging Students with a Real Audience
Provide students with an opportunity to present their work to a real audience. Although this case study gamified the experience and, due to constraints, limited employer interaction to a few students, a possible adaptation would be a brief online pitching session with the employer as part of a formative assessment.
Recommendations for Evaluation and Research
- Survey Design and Administration
Be mindful of low response rates caused by questionnaire fatigue and participant attrition, especially in studies requiring pre-/post-intervention data. To mitigate this, incentivise research participation, give in-class time for survey completion, keep surveys and tasks concise, or consider using rich qualitative methods such as interviews for deeper insights.
Queen Mary University of London: Employability and Skills Framework - Internal website: https://qmulprod.sharepoint.com/sites/QMAEnhancement/SitePages/Embedding%20employability.aspx
“It’s always a pleasure to engage with QMUL students and hear their fresh perspectives on our current operational challenges. Dr Argyriou ensures my contribution is both impactful and effortlessly integrated into the module, while also being mindful of my busy schedule, making the experience rewarding and manageable.”— Employer's Feedback
References
- Banister, P. (2004). Assessment as a tool for fostering key skills. Psychology Learning & Teaching, 3(2), 109–113.
- De Leng, W. E., Stegers-Jager, K. M., Husbands, A., Dowell, J. S., Born, M. P., & Themmen, A. P. N. (2017). Scoring method of a situational judgment test: Influence on internal consistency reliability, adverse impact and correlation with personality? Advances in Health Sciences Education, 22(2), 243–265.
- Ederer, P., Nedelkoska, L., Patt, A., & Castellazzi, S. (2015). What do employers pay for employees’ complex problem solving skills? International Journal of Lifelong Education, 34(4), 430–447.
- Greenwood, S. (2025). Using student conferences as a form of authentic assessment to develop and enhance transferable communication skills in quantitative and data-driven disciplines: Evidence from a postgraduate public health program. Journal of Teaching and Learning for Graduate Employability, 16(1), 18.
- Heppner, P. P., & Petersen, C. H. (1981). A Personal Problem Solving Inventory.
- Koh, K., Delanoy, N., Thomas, C., Bene, R., Chapman, O., Turner, J., ... & Hone, G. (2019). The role of authentic assessment tasks in problem-based learning. Papers on Postsecondary Learning and Teaching, 3, 17–24.
- MacAndrew, S. B., & Edwards, K. (2002). Essays are not the only way: A case report on the benefits of authentic assessment. Psychology Learning & Teaching, 2(2), 134–139.
- McGann, D., King, S., & Sillence, E. (2008). Information leaflets: An evaluation of an innovative form of assessment. Psychology Learning & Teaching, 7(1), 19–22.
- Seelan, S. T. (2025). Top future skills: What employers will value most by 2030. Retrieved from https://www.linkedin.com/pulse/top-future-skills-what-employers-value-most-2030-s-t-seelan-vunlc/
- Sokhanvar, Z., Salehi, K., & Sokhanvar, F. (2021). Advantages of authentic assessment for improving the learning experience and employability skills of higher education students: A systematic literature review. Studies in Educational Evaluation, 70, 101030.
- Wiggins, G. (1989). Teaching to the (authentic) test. Educational Leadership, 46(7), 41–47.