Evaluations

Workshop: women’s and men’s groups comparing group work results – Shan State, Myanmar. [Photo: S. Marr]

The purpose of evaluations is usually two-fold: to provide accountability to donors by assessing project achievements, and to identify lessons learnt. This enables the replication of what went well and the modification of what did not.

In our evaluations, we assess projects against the OECD/DAC evaluation criteria of relevance, effectiveness, efficiency, impact, and sustainability. Depending on the project context and the clients’ requirements, other criteria such as accountability to beneficiaries, quality, appropriateness and replicability may also be taken into account.

We have a broad set of methodological instruments and pride ourselves on being thorough. For instance, we always base our impact evaluations on a robust approach in which conditions amongst a project’s target group ‘before’ and ‘after’ are compared with those of control groups. Firmly believing in the value of triangulation, we also favour mixed-method approaches that combine quantitative tools (such as household surveys) with qualitative methods (such as semi-structured interviews, story-telling, elements of appreciative inquiry, and workshops with a range of group exercises such as trend analysis).
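To illustrate the logic of this ‘before/after with control groups’ comparison – in essence a simple difference-in-differences estimate – here is a minimal sketch with purely hypothetical numbers (the indicator and values are illustrative only, not drawn from any of our evaluations):

```python
# Difference-in-differences sketch with hypothetical numbers:
# the change observed in the target group is compared with the
# change in a comparison (control) group, so that trends affecting
# both groups alike are netted out.

target_before, target_after = 42.0, 61.0     # e.g. % of households food-secure
control_before, control_after = 40.0, 47.0   # same indicator, comparison villages

target_change = target_after - target_before       # +19 points
control_change = control_after - control_before    # +7 points

# The gap between the two changes approximates the project's effect.
estimated_effect = target_change - control_change  # +12 points
print(f"Estimated project effect: {estimated_effect:.1f} percentage points")
```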

Being aware of the different roles and perceptions of women and men, we often disaggregate findings by gender (for group work, we usually have separate groups for men and women). As social capital (trust, mutual support, norms, networks) is an integral part of community resilience, we have also developed a tool to assess cognitive social capital. This tool can be used both in evaluations (to help assess a project’s effectiveness and the sustainability of collective action) and in baselines or assessments (to inform programming choices).

In our evaluation reports, we take great care to always substantiate or qualify our findings. Our recommendations never stand alone, but are always accompanied by the underlying reasoning (we realise that recommendations must be understood and accepted before they can be followed). Whenever possible, we present and discuss preliminary findings and recommendations with project teams.

We also aim to deliver reports that are reader-friendly. With concise writing, clear structures, frequent use of charts, maps and illustrations, and effective executive summaries, we produce reports that can be easily understood and shared. Seeing information sharing as essential to overall learning, we encourage our clients to publish reports; to facilitate publication, we deliver them in professional, publication-ready layouts.

Evaluations: planning, process, results

Planning  The time it takes to plan, prepare, conduct and finalise an evaluation is often under-estimated. From start (drafting of the ToR) to finish (final evaluation report), allow at least four months as a rule of thumb. Take your time to prepare the terms of reference (ToR): the more detailed they are, the better, as the ToR are usually the primary document on which consulting networks like ours base their offers. Comprehensive and precise ToR also help avoid misunderstandings.

Allow at least one month to receive, review and select bids. Leaving time between the conclusion of tendering and the start of field research (ideally one month or more) enables bidders such as Banyaneer to propose the most qualified consultants (after all, we are usually busy consulting and need to plan ahead as well). Finally, consider the time needed for the review of evaluation reports and their translation into local languages.

Sample research design

Process  How do we evaluate projects? See a sample research design here, showing an example of triangulation between various sources.

 

Results  Here are some of the evaluations we completed in the last two years.

Oxfam – Indonesia (2017)  This evaluation of a project in the West Sumatra district of Agam finds that the focus on the resilience of small businesses paid off well – benefits exceeded costs by up to 31 times. Business owners increased profits by 163% and expect a 2–8 times faster post-disaster recovery. The model of ‘SME resilience’ is worth replicating – see the report here.

Malteser International – Myanmar (2016)  This evaluation shows how a project in remote post-conflict areas of Kayin State helped enhance access to health services, water and sanitation. With an endline survey completed separately, we explored patterns of causality.

Welthungerhilfe – Cambodia, Laos, Myanmar (2016)  This evaluation of a project for which Banyaneer had conducted the baseline finds that links between farmer groups and local governments have improved – consolidated groups add value for wider villages.

CARE – Papua New Guinea (2015)  This study looks at a remote atoll, where ‘core groups’ became very innovative promoters of climate-smart agricultural techniques – a sound approach for similarly remote areas. View report

 

CARE – Timor-Leste (2015)  This evaluation of a community-based adaptation project shows how livelihoods were adapted to greater variability, and how water management was enhanced across two watersheds. View report

Canadian Red Cross – India (2014)  Based on a survey amongst 2,270 respondents and community workshops in 27 project-supported and comparison villages, this evaluation offers a nuanced analysis of project results (health, DRR, livelihoods).