Designing an Evaluation Methodology for Your Project

Monitoring and evaluation of project implementation are of key importance if your organization is working with donations from a third party. They create transparency and trust and also help your own organization to carry out good projects and to learn from past experiences. But how exactly do you evaluate your projects?

Without proper planning and design of an evaluation strategy and methodology, it will be very difficult to present good results. Even though the evaluation is normally the last step in the project cycle, you should design your strategy at the very first step so that you can collect the appropriate data throughout the project or program.

In the following paragraphs, we will describe in detail what you need to keep in mind while designing your evaluation methodology and how you can actually use it to your advantage when fundraising for your NGO.

What is evaluation?

To be able to design an evaluation methodology, you must clearly understand what the term evaluation means and how you can use it to your advantage.

Evaluation basically describes the analysis of the project’s success after the project cycle is finished. Based on the data collected in a baseline study, you describe and analyze the achievements reached through your project activities. At the same time, you also name and look in detail at possible problems and mistakes that occurred during that time, so that you can learn from these experiences in the future. Basically, you compare the planned results with the actual results and analyze possible disparities.

Figure 1: The role of evaluation in the project cycle

As you can observe in Figure 1, evaluation is an important step in the project cycle and makes sure that lessons are incorporated into the planning of future projects.

Why is it important?

For the donor, the evaluation gives the opportunity to see and understand what exactly you have done with their money and what impact it had on the communities you worked in. A good evaluation is key to further cooperation with that particular donor based on transparency and trust and will help you frame your organization as a competent partner.

Even if your organization is small and at first you might feel like an evaluation is not necessary, it actually has many benefits. Besides the effect on the donor described above, you also get to collect a lot of data that can be used in the future for applications, information brochures, or similar purposes. If you can clearly name the effect that your past projects had on the communities you are working in, it will be much easier to write new applications based on that and establish new relationships with other donors.

Of course, your evaluation will look different depending on whether you carry out a million-dollar project across several countries or a small program in one village. That is why, in the first step of the project planning, you should take the time and thought to design an appropriate and practicable evaluation methodology for your project.

What does “designing an evaluation methodology” mean?

As stated above, your expectations for your evaluation methodology will be very different if you have a project across several countries with a huge budget than if you have a small project with very limited resources. Of course, your donors will also have very different expectations.

Big organizations often outsource the evaluation to specialized organizations that have their own framework. As every project is different, there is no real blueprint for how to evaluate. While you don’t have to reinvent the wheel every time you start a new project, you should make sure that your evaluation methodology is adjusted to and appropriate for the purpose it is meant to serve.

Designing your evaluation methodology basically means assigning certain resources to it, determining the expected outcomes, and accommodating it in the project planning. You also determine the methods to be used to achieve results and the timeframe for the evaluation. We will describe the details of this process in the following paragraphs.

Once you have designed your evaluation methodology, you can also share it with your donor. This way, you let your donor know clearly what they can expect from the evaluation in the end and what will not be included. It is a very good way to manage expectations and make sure from the start that you are on the same page. By sharing your methodology at an early stage, you give your donor the chance to make remarks and ask for the inclusion of certain measures if needed, and you avoid misunderstandings at the end. Sharing a well-designed evaluation methodology with your donor is one more step towards transparency and good practice.

Designing an evaluation methodology – Important steps

There are several important steps to be taken into consideration while designing an evaluation methodology appropriate for your project or program. If possible, these steps should be taken jointly by the people responsible for the evaluation and those responsible for the project, so that the result is a well-informed and realistic strategy.

It is of key importance that the evaluation methodology is designed during the planning stage of the project, so that sufficient resources can be assigned to it and the necessary data can be collected throughout the project cycle. With a strategy in place and roles assigned, the evaluation and the connected data collection can take place on an ongoing basis and will not be an overwhelming task at the end.

Figure 2: Necessary steps for the design of the evaluation methodology

As can be seen in Figure 2, the steps for defining an evaluation methodology are the following: Defining the purpose, defining the scope, describing the intervention logic, formulating evaluation questions, defining methods and data, and assigning necessary resources to the evaluation. In the following paragraphs, we will describe these tasks in detail.

Purpose

To be able to design an appropriate evaluation methodology, you must be very clear about its purpose. Why is the evaluation carried out, and who is it for? Does your donor require you to do the evaluation? Do you want to evaluate your projects internally to identify potential for improvement? Is it both?

The purpose of the evaluation mostly sets the bar for its scope and form. Many times, the donor already has specific expectations that need to be met and specific regulations that need to be fulfilled. Sometimes even legal requirements come into play. The clearer you are about the purpose of your evaluation, the easier it is to define its form and the appropriate way to go about it.

Scope

The second thing you should take into consideration while designing your evaluation methodology is the scope of the evaluation. Deciding on the scope means deciding which interventions will be evaluated, which geographical area will be covered, and which timeframe will be considered.

If you are working on a very small project, these questions are normally easy to answer. If your project comprises just a few interventions, a defined geographical area, and a limited timeframe, your evaluation should cover the entire project. If you already know, though, that the evaluation of certain aspects will be particularly challenging, it might be a good idea to exclude them and adjust the donor’s expectations for the final evaluation accordingly. This might apply if your project only aims to kick-start a process that will show its impact in the long term (after your evaluation would take place), or if you already know that several factors outside your control will probably make it difficult to evaluate your own project activities. Be clear about it, though, so that the donors know what to expect or have the opportunity to object if they do not agree with your approach.

In bigger projects with a range of measures and geographical focal areas, it might be a good idea to focus on a selection of them. If the project is embedded in a bigger program, it might make sense to focus on areas that have not been evaluated lately or that have reported problems and challenges in the past. Again, though, make sure that you follow your legal obligations and that your donor agrees with your approach.

The intervention logic

In this step, you should be able to describe the planned interventions, their potential impact, and their interactions during your project phase. You should also take into consideration external factors that might have a positive or negative influence on the implementation of your project.

Writing down or making a diagram of the intervention logic makes sure that you clearly understand how the project is supposed to work and what was expected in the beginning. The intervention logic is dynamic and might change during the course of the project, but these changes can be documented and give a good insight into areas where plans and expectations needed to be adjusted to reality.

Evaluation Questions 

Once you have spelled out in detail how your project is planned to work and what its expected impact is, you are in a position to formulate good evaluation questions. Evaluation questions are the questions that your evaluation is supposed to answer. They give you the opportunity to specify what you actually want to analyze in your evaluation.

If you word your questions carefully, you can make sure a critical analysis can take place. Be careful not to end up with simple Yes/No questions, which seem easy to answer, but will give almost no insight in the end and thus will have very little additional value for the donor or your organization.

At the same time, you should try to find questions that you will actually be able to answer. While project applications are full of promised impact, in reality it is quite difficult to measure impact. To assess the impact of your project, you would also need a lot of data from outside your project to be sure that no external events influenced its outcome. Even with a huge dataset, it is almost impossible to be 100% sure of the impact your interventions had, because factors such as policy changes, shifts in public opinion, or other events you are not even aware of might play a role.

Sometimes that means breaking one issue down into several questions. These questions can be quantitative (answered by hard data, numbers, etc.) or qualitative (opinions, perceptions, etc.). As shown in Figure 3 below, you have to find the right middle ground between questions that are too broad and questions that are too narrow.

The question on the left (impact on education) is too broad; it would not be possible to answer it within a project evaluation. An impact evaluation on education would have to take into consideration many other factors, such as general shifts in attitudes, all other initiatives in the sector, policy changes, etc. Even if all this data were available, it would be very difficult to quantify the project’s impact in comparison to other interventions. If you tried to answer a question of this scope, your donors would know that your evaluation must be flawed. Be very careful with the use of the word “impact” in your evaluation!

Figure 3: Comparison between different evaluation questions. (own representation)

The question on the right, in comparison, is too narrow (number of schools). It could be answered with a simple number and would not give any further information about the quality of education or the actual use of these schools. It leaves no room for critical analysis and thus would not be a good evaluation question.

The questions in the middle show one way to ask a combination of quantitative and qualitative questions that can also lead to a more critical assessment of the project activities, but at the same time give a good picture of what the project has actually achieved.

Of course, the depth of these questions will vary according to the scope of your evaluation.

Methods and data

Once you have defined the appropriate evaluation questions, the next step is to think about the necessary data and the methods to analyze that data. There are plenty of tools and instruments available for conducting an evaluation, but to decide which ones are appropriate you have to take into consideration the availability of data, the quality of that data, and the resources available for the evaluation. Some tools need very detailed input data, so if that data is not available, you cannot use them. Some instruments are very time-intensive, so if you have not allocated sufficient time and manpower for the evaluation, these instruments are also not a good fit.
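To make this step a little more concrete, here is a minimal sketch in Python of the kind of simple quantitative comparison that many evaluation methods boil down to: comparing baseline data with endline data for a single indicator. The village names and attendance figures are entirely hypothetical, and the sketch is only meant to illustrate the principle, not to recommend a particular tool.

# Minimal sketch: compare hypothetical baseline and endline values
# of one indicator (school attendance rate, in %) for a few villages.
from statistics import mean

baseline = {"Village A": 62, "Village B": 55, "Village C": 70}
endline = {"Village A": 78, "Village B": 64, "Village C": 81}

for village, before in baseline.items():
    after = endline[village]
    # Report the change per village in percentage points.
    print(f"{village}: {before}% -> {after}% (change: {after - before:+d} pp)")

# Average change across the sample villages.
avg_change = mean(endline[v] - baseline[v] for v in baseline)
print(f"Average change across the sample villages: {avg_change:+.1f} percentage points")

Even a simple comparison like this only works if the baseline data was actually collected at the start of the project, which is why the evaluation methodology has to be designed in the planning stage.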

Allocating resources to your evaluation strategy

It is also important not to forget to allocate resources to your evaluation methodology to be able to carry it out. Many times, the evaluation does not get enough attention in regard to resources and people do not have enough designated time to carry it out. Particularly in smaller projects, sometimes the project manager has to do the evaluation “on the side” of his or her normal tasks. This poses several risks, as not enough time is designated to the important task and the project manager might be biased.

Setting aside manpower and resources for the evaluation from the first project phase shows responsible behavior on the organization’s side and helps ensure that the evaluation will be carried out professionally.

Designing an evaluation methodology

Once you have carried out the above-mentioned steps (ideally in a team), you have gathered enough information to design your evaluation methodology. You have decided which methods you will need and which data you will have to collect for that, and ideally already allotted the corresponding responsibilities to the assigned staff so that everybody knows what her or his role is in the process.

If you put this information together in a document, it is also a good opportunity to share it with your donors or potential donors. A well-thought-through evaluation methodology shows that you and your organization are very familiar with the working area of your project, have put a lot of thought into the design, and are able and willing to critically analyze your project interventions. It creates transparency and thus gives donors more reason to trust you and your organization. It also establishes common ground with respect to expectations for the final evaluation report and gives all stakeholders the opportunity to add input if needed and desired.

Of course, designing the methodology is only the first step. Throughout the project, you have to be careful that it also gets implemented according to the plan and that no big problems arise. You can adjust your strategy if need be, but you should always be able to plausibly explain the reasons for the necessary adjustments to your donors and stakeholders.


About the author

Eva Wieners

Eva is based in Germany and has worked for nearly a decade with NGOs on the grassroots level in Nepal in the field of capacity development and promotion of sustainable agricultural practices. Before that, she worked in South America and Europe with different organizations. She holds a Ph.D. in geography and her field of research was sustainability and inclusion in development projects.
