Dr Annalisa Creta

Interview with Annalisa Creta, disaster management training specialist

Dr Annalisa Creta is a senior research fellow at the Sant’Anna School of Advanced Studies in Pisa, specialising in civilian crisis management and training. She serves as a Monitoring and Evaluation expert within the TVC Consortium implementing the Union Civil Protection Mechanism (UCPM) training programme for deployment experts (Lot 1).

By Knowledge Network – Staff member

Monitoring and evaluation in disaster management training: more than a questionnaire? 

In a demanding field such as disaster management, where time is limited and priorities are constantly shifting, monitoring and evaluation (M&E) can easily be overlooked. It is not uncommon to ask: why focus on evaluation when there are seemingly more pressing matters at hand? 

Evaluation serves two key purposes: learning and development, and accountability. It helps us understand what is working and where improvements are needed, while providing tangible evidence of success to stakeholders and funders. A well-designed evaluation integrates both aspects, making it a powerful tool for continuous improvement and transparency.

Within disaster management training, evaluation is crucial for several reasons. 

  • Improving effectiveness – Does the training achieve its intended outcomes? Are participants learning, and is there measurable change?
  • Ensuring accountability – Organisations and funders require proof that resources are being used effectively. Evaluation provides this evidence, enhancing professionalism and credibility.
  • Sharing knowledge – By documenting successes and challenges, evaluation contributes to a shared understanding of good practices.
  • Boosting motivation – Seeing the positive impact validates efforts and can foster team cohesion and organisational support. 

Monitoring and evaluation are not just about checking boxes — they’re about learning, improving, and showing accountability.

Conceptual framework and methodology used in Union Civil Protection Mechanism (UCPM) deployable training

Our M&E approach is built around three interconnected components. 

  • Internal M&E – This covers the planning, design and delivery of activities, ensuring that implementation is consistent and aligned with intended outputs.
  • Participants’ and trainers’ evaluation – This component gathers feedback on satisfaction and learning outcomes, including perceived changes in knowledge, skills, attitudes and competencies. Trainers’ observations are also integrated to enrich the evaluation process.
  • Curriculum review – This includes periodic analysis of course data to identify gaps, overlaps and areas for refinement, contributing to overall effectiveness.

This methodology supports a culture of evaluation by embedding M&E practices throughout the training cycle. Evaluation is treated as a shared responsibility across participants, trainers, course directors, consortium managers and the contracting authority. Encouraging active engagement helps foster an environment where feedback is valued and used.

Training is increasingly seen not as a cost, but as a strategic investment in resilience.

Innovative methods 

To evaluate the sustained impact of training beyond immediate outcomes, two methods go further than the traditional evaluation approaches previously used in UCPM training.

Post-course evaluation (6–12 months after training)  

Through a survey, participants assess the training's impact on their professional growth and readiness. The data collected include:

  • how participants applied their learning in professional settings;
  • perceived changes in their knowledge, skills, attitudes and behaviours;
  • the extent to which learning was transferred to others;
  • barriers and enablers to applying lessons. 

This helps us understand the medium-term effects of training on individual performance and organisational practice. 

Longitudinal evaluation measure 

Starting in 2025–2026, this will track a group of participants over time as they progress through multiple Lot 1 trainings, and potentially into MODEX or deployments. Unlike a one-off survey, it follows participants across their learning and operational journey, offering a more comprehensive view of how training influences preparedness and performance. 

Previously, evaluation was largely limited to immediate post-course feedback. With these new tools, the scope now includes transfer-level evaluation, essential for understanding how training contributes to broader organisational change. 

These innovations also reflect a shift in mindset:  

  1. training evaluation is no longer niche but a shared responsibility;
  2. training is increasingly considered a strategic investment. 

Post-course and longitudinal evaluations are helping us connect learning to operational impact.

Evidence so far 

Many participants reported improved confidence and preparedness for international and domestic missions, even if they had not yet been deployed.

The overall sentiment was overwhelmingly positive: the training was often described as transformative, particularly for those in leadership roles.

Contribution to policymaking 

M&E is essential for grounding civil protection policies in evidence. It allows us to systematically assess training and operations, showing what works and why. Tracking outcomes highlights immediate learning benefits as well as longer-term behavioural changes. This is crucial for evaluating whether investments in capacity-building translate into improved disaster response.

I firmly believe that M&E is not just a technical exercise. If used to its full potential, it can become a strategic tool for learning, accountability and continuous improvement.  

About the author

The Knowledge Network – Staff member

The Knowledge Network editorial team is here to share the news and stories of the Knowledge Network community. We'd love to hear your news, events and personal stories about your life in civil protection and disaster risk management. If you've got a story to share, please contact us.