
Part V: Evaluation

Chapter 12: Evaluation

12.1 Why and with whom?

The goal of the evaluation must be clear to those involved. Evaluation of an intervention is carried out from two perspectives:

Change agents in nutrition education must not be defensive about evaluation. Whatever the qualifications of its members, the planning team must be competent enough to answer the questions related to the two issues mentioned above.

With whom?

The persons concerned with evaluation may be divided into four categories:

The population itself: it must be invited to participate in the evaluation process, since the actions to be evaluated concern it directly.

The change agents in the communication process: they will play an important role in the evaluation process, and the evaluation will in turn help them improve their performance.

The evaluation specialists, internal or external to the planning team: they provide technical expertise for the evaluation.

The sponsors and government representatives: their participation will allow them to see the impact of the activities they have promoted and to consider further expansion of the programme.

12.2 At what point should an evaluation be conducted?

The ideal is to plan for the evaluation of an intervention in social communication from the phase of conceptualization. Causal analysis and preliminary research are part of evaluation. Therefore, evaluation provides a strong foundation for the project. Evaluation can also be undertaken during the phase of formulation of a communication intervention. At that moment, there is still time to reflect, not only on the relevance of the intervention, but also on the order in which the activities should develop, on the expected results, and on the actions that need to be taken before any communication activity. The knowledge, attitudes and practices of the target population have to be measured before the intervention to provide a basis for comparison afterwards. It is never too late to think of evaluation; even during the implementation phase, lessons can be learnt from experience.


12.3 How to carry out dynamic and participatory evaluation

Lefevre and Beghin (17), among others, propose an approach which ensures the participation of all in the evaluation of nutrition interventions. Three types of tools are used: the causal analysis, the Hippopoc table and the dynamic model of the intervention.

1. Causal analysis can help determine the relevance of the intervention. Causal analysis, as presented before, consists of creating, in an intersectoral setting, a network of factors which affect the nutritional status of the population. The result is a tool for selecting the appropriate intervention, for communication between the members of the planning team, and for evaluating the relevance of the intervention. Causal analysis also enables the team to identify confounding factors which can influence the success or failure of the intervention.

2. In the Hippopoc table, the inputs, processes, outputs and outcomes of the intervention are placed in cells.

The inputs are the elements which are going to be transformed into outputs by the intervention. For example, in nutrition education in schools, they are the teachers, the pupils, and the money invested in the production of the programme.

The processes are the activities undertaken to transform inputs into outputs: for example, teacher training, the teaching programme produced together with the pupils, and the conceptualization and production of school manuals.

The outputs are the results of the activities carried out in the intervention. They correspond to the specific objectives of the intervention and are its direct effects: for example, better trained teachers; better educated pupils, whose behaviour has changed; school manuals of improved quality. All of these should contribute to the long-term objective of any nutrition project, that is, the improvement of the nutritional status of the target population.
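The four cells of the Hippopoc table can be pictured as a simple structure. The sketch below organizes the school example above into the four cells; the entries are illustrative only, and the names are our own, not part of the method.

```python
# A minimal sketch of a Hippopoc table for the school nutrition
# education example. Cell names follow the text; entries are illustrative.
hippopoc = {
    "inputs": ["teachers", "pupils", "production budget"],
    "processes": ["teacher training", "teaching programme", "production of school manuals"],
    "outputs": ["better trained teachers", "better educated pupils", "improved school manuals"],
    "outcomes": ["improved nutritional status of the target population"],
}

def show_table(table):
    """Print each Hippopoc cell followed by its entries."""
    for cell, entries in table.items():
        print(f"{cell}: {', '.join(entries)}")

show_table(hippopoc)
```

Laying the intervention out this way makes gaps visible at a glance: an output with no process feeding it, or a process that consumes no listed input, signals a planning problem.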

3. The dynamic model is the graphic presentation of the interactions between the inputs, processes, outputs and outcomes of an intervention. Like the two other tools mentioned above, it is elaborated by the team. The model presents the expected links between the inputs and the processes, between the processes and the outputs, and between the outputs and the outcomes, using arrows to illustrate the linkages. In this way all participants in an intervention (the population, the change agents, the "experts") can understand the internal and external links.
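Because the dynamic model is a set of arrows between elements, it can be treated as a directed graph. The sketch below, with hypothetical link names drawn from the school example, shows how the expected chains from inputs to outcomes can then be traced mechanically.

```python
# Sketch of a dynamic model as a directed graph: each pair is an
# expected (source -> destination) arrow. Names are illustrative only.
links = [
    ("teachers", "teacher training"),
    ("teacher training", "better trained teachers"),
    ("pupils", "teaching programme"),
    ("teaching programme", "better educated pupils"),
    ("better trained teachers", "improved nutritional status"),
    ("better educated pupils", "improved nutritional status"),
]

def downstream(node, links):
    """Follow the arrows from `node` and collect everything it influences."""
    reached, frontier = set(), [node]
    while frontier:
        current = frontier.pop()
        for src, dst in links:
            if src == current and dst not in reached:
                reached.add(dst)
                frontier.append(dst)
    return reached

# Trace the chain from one input through to the expected outcome.
print(downstream("teachers", links))
```

Tracing a chain in this way answers the same question the team asks when drawing the model: through which processes and outputs is a given input expected to contribute to the outcome?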


12.4 How?

Causal analysis

The people in charge of the evaluation are the "specialists" and the other participants, namely the target population, the sponsors and/or government representatives and the communication agents.

If the causal analysis has not yet been done, the group must make it a priority. This involves a few days' work at most. If the analysis has already been done, the group convenes to improve on the "causal network". Little time is needed to complete this activity.

The Hippopoc Table

Once the analysis of causes is completed, the group reassembles to develop the Hippopoc table of inputs, processes, outputs and outcomes. The evaluators will be particularly attentive to this table. They will seek answers to the following questions:

What are the expected outputs of an intervention in social communication?

1. Accessibility of the target population to the message. e.g. have women aged 15 to 45 been exposed to the message through one (or several) of the channels of communication?

2. Retention of the message. e.g. X percent of the target group have retained the message: "the child from 6 to 12 months must have at least four meals a day, in addition to the mother's milk".

3. Modification of the knowledge, attitudes and values of the target population. e.g. X percent of this target group were able to explain the reasons for a minimum of four meals per day and have shown the intention of following this advice.

4. Trial of the proposed behaviour. e.g. members of this target population have tried to give at least four meals per day to their breast-fed children aged six to twelve months.

5. Adoption of these habits. e.g. X percent of this target group have adopted these food habits and integrated them into their daily life.

Note that the examples contain measures for comparison of results after the intervention.
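Since each output indicator is measured before and after the intervention, the comparison itself is simple arithmetic. The sketch below, with invented percentages, shows the percentage-point change for each indicator; the indicator names and figures are hypothetical.

```python
# Illustrative before/after comparison of output indicators.
# All percentages are invented for the sketch.
baseline = {"retention": 20.0, "knowledge": 15.0, "trial": 10.0, "adoption": 5.0}
followup = {"retention": 65.0, "knowledge": 50.0, "trial": 30.0, "adoption": 22.0}

def change(before, after):
    """Percentage-point change for each indicator measured twice."""
    return {k: round(after[k] - before[k], 1) for k in before}

print(change(baseline, followup))
# e.g. retention rose by 45 percentage points
```

In practice these differences would also be compared against a control group, as the long-term outcome examples below suggest, to separate the effect of the intervention from background change.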

What are the possible long-term outcomes of such an intervention?

6. Improvement of nutritional status. e.g. there has been an improvement in the nutritional status of X percent of children (as compared to a control group).

7. Improvement of health status. e.g. X percent of children have suffered from childhood diseases, compared to a higher percentage in the control group.

These long-term outcomes result from both the educational intervention and other factors in the physical, social, economic and family environment.

The final results of the intervention could be considered not only as outputs but also as intermediary results. In our example, these include the number of officers trained or retrained in the programme and the number of posters produced to ensure the transmission of the messages.

The dynamic model

It is essential to illustrate graphically the relations between the inputs, processes, outputs and outcomes of the intervention.

This graph can be used by all those involved in the intervention to provide basic information on the project, to determine its successes and failures, and to plan for improvement.

Evaluation is like an energy source that can be drawn upon for the development of participatory communication activities. Evaluation is not simply an activity external to the intervention; it is a crucial component of nutrition education.
