The overall goal of the Evaluate stage is to continuously improve training activities, maximize their impact on trainees and the targeted community, and ultimately contribute to societal advancement. Below we outline the what, how, and when of evaluating training activities.
What: Before evaluating your training, consider what long-term benefits you want to achieve for both trainees and society: ‘What do we want our training to enable for our target audience in the long term and for society at large?’. Consult key stakeholders to align expectations and identify their desired outcomes, and use these to define an impact statement. The ELIXIR Training Platform defines impact as ‘a measure of how participation in a training course improves someone’s understanding and awareness of a particular domain/topic, leading to change in their research/professional development as well as passing on of the knowledge/skills acquired to others’ (2018).
How: There are many elements to consider when designing and implementing evaluation processes that link your training activities to the impact you hope your work contributes towards. These include identifying impact pathways, choosing the key performance indicators to collect, and streamlining the collection, storage, analysis, and reporting of evaluation data.
When: We can build a picture of the quality and impact of training by evaluating different elements at different timepoints, namely:
| Evaluation Element | Timing | Examples | Respondents |
|---|---|---|---|
| Demographics and Reach | Before Training | information on the audience; where trainees found the training | trainees |
| Training Material | Before/After Training | are materials FAIR; balance of theory and practice; are exercises easy to follow | trainees, trainers, organisations |
| Trainee Learning | During/After Training | did trainees learn something new; will they use the tool again | trainees |
| Training Event Satisfaction | After Training | venue suitability; organisational aspects; trainer support; would you recommend this course to others | trainees, trainers |
| Trainee Impact | Medium-Long Term | have you used the tools in your work | trainees |
| Societal Impact | Long Term | | |
Pedagogical point of view
Evaluating training poses a major challenge: much of the learning starts only once trainees head back to their research groups or places of work. For trainees to gain the most from the training, they need to continue using their newly gained skills in their “home” setting. It is this “home” use that we are trying to assess through our long-term feedback questions.
Additional things to consider when you evaluate training:
- Peer Review: Collaborate with colleagues to receive valuable feedback and improve training materials and delivery methods - do not train alone
- Balanced Evaluation: Combine quantitative and qualitative data to gain a comprehensive understanding of training effectiveness.
- Prerequisites and aligned exercises: Evaluate if the prerequisites defined allowed the participants to benefit from the training and if the exercises supported and reinforced the stated learning objectives.
- Tailored Evaluation: Determine the appropriate level of detail for your evaluation, considering the specific needs and interests of your stakeholders.
Logistical point of view
- Identifying Impact Pathways: Make use of the Logic framework model and ELIXIR impact toolkit to link different types of evaluation into ‘pathways to impact’.
- Identifying the key performance indicators to collect: Consider both quantitative and qualitative measures to evaluate quality and impact. Map indicators onto the impact pathway to provide evidence for your pathways to impact.
- Long-Term Feedback Planning: Incorporate long-term feedback mechanisms into project planning to ensure sufficient time and resources for data collection and analysis.
- Streamline data collection:
- create a central repository
- standardize the feedback by using a template form with defined answer options for all suitable questions, see the Training Metrics Database for available templates to start from
- ensure channels for contacting trainees remain open over the long term, so that follow-up surveys can be conducted at 6 months, 1 year, and 2 years
- provide the means for anonymous feedback
- Data storage: Ensure that collected data is safely stored, accessible to relevant team members, and compliant with data protection regulations.
- Data analysis: Treat evaluation data like any other research data: automate and standardize the analysis as much as possible, and document each step of the process.
- Reporting: Present the findings in clear and concise reports, highlighting key trends and recommendations.
- Feedback from other sources: Identify a process for capturing and incorporating feedback from other channels, such as GitHub repositories, YouTube comments, and oral feedback.
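To illustrate how standardized answer options make analysis easy to automate, here is a minimal sketch in Python. It assumes a hypothetical template form whose questions (e.g. `would_recommend`) use a fixed Likert scale; the field names and the "percent positive" summary are illustrative choices, not a prescribed ELIXIR format.

```python
from collections import Counter

# Hypothetical standardized answer options, as defined in a template feedback form
LIKERT = ["strongly disagree", "disagree", "neutral", "agree", "strongly agree"]

def summarize(responses, question):
    """Count the standardized answers to one question and report the share
    of positive ('agree' / 'strongly agree') responses."""
    answers = [r[question] for r in responses if question in r]
    counts = Counter(answers)
    positive = counts["agree"] + counts["strongly agree"]
    return {
        "n": len(answers),
        "counts": {option: counts.get(option, 0) for option in LIKERT},
        "percent_positive": round(100 * positive / len(answers), 1) if answers else 0.0,
    }

# Example: three standardized post-course responses (field names are illustrative)
responses = [
    {"would_recommend": "strongly agree", "materials_clear": "agree"},
    {"would_recommend": "agree", "materials_clear": "neutral"},
    {"would_recommend": "agree"},
]
report = summarize(responses, "would_recommend")
print(f"n={report['n']}, positive={report['percent_positive']}%")
# → n=3, positive=100.0%
```

Because every form uses the same answer options, the same summary function works unchanged across courses and timepoints, which is what makes trend reporting and long-term comparisons feasible.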