Don’t assume that the effects of
your programme are always positive
It is tempting for MFIs to think that
positive impacts are the result of their
intervention, while negative impacts are
due to external factors beyond their
control. Cross-checking information from
other methods can help increase your
certainty about the findings.
9 Make the most of
your findings
Your survey will produce valuable
findings, and it is essential to make the
best use of them. The feedback loop
(see Imp-Act Practice Note 1 on
Feedback Loops) can help your MFI think
through the different ways in which your
survey findings may be useful and
ensure that the most is made of this
investment. For example, the board may
just want to read the report, but
management will want to act on it, while
field staff may find it helps them deepen
their understanding of clients.
The first key steps to take include:
1. Consider how the findings compare to
your expectations and what the
differences are. If evidence of
improvements is not as strong as you
had hoped, what will be your MFI’s
response and next steps?
2. Think about how these differences
can be explained; if they cannot, further
research may be required. You can follow
up in different ways, e.g. through focus
group discussions.
3. Consider how your findings will affect
aspects of policy such as product design
or delivery mechanisms.
Often a simple and focused impact
survey will provide more useful
information than a large, complex one
whose aims are unclear. When
expected changes are not supported by
survey findings, re-thinking what you
have assumed to be true can be very
revealing, as Case study 4 shows.
Further ways in which a survey can be
optimised include:
Repeating surveys at regular intervals:
If you make surveys a routine element of your work, for example every one or two years, this will add depth to your understanding and allow your MFI to build up “baseline” data over time. This way you can compare the same clients as they progress. However, if you wish to compare data over time, you will need to develop your analytical skills.
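As a rough illustration only: if survey data are held electronically, comparing the same clients across rounds can be as simple as matching records on a client identifier. The sketch below, in Python with pandas, merges two hypothetical survey waves and summarises the change in one indicator; the file names and columns (client_id, monthly_income) are assumptions for illustration, not part of the Imp-Act tools.

    # Illustrative sketch: compare the same clients across two survey rounds.
    # File and column names are hypothetical.
    import pandas as pd

    wave_a = pd.read_csv("survey_round1.csv")   # client_id, monthly_income, ...
    wave_b = pd.read_csv("survey_round2.csv")

    # Keep only clients interviewed in both rounds, so like is compared with like.
    panel = wave_a.merge(wave_b, on="client_id", suffixes=("_r1", "_r2"))

    # Change in the indicator for each client, then a simple summary.
    panel["income_change"] = panel["monthly_income_r2"] - panel["monthly_income_r1"]
    print(panel["income_change"].describe())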
Integrating your impact survey tool
into your management information
system:
Try to make this process as simple as possible. For example, link your survey database to your financial records so that you can access data on savings made or loans taken (frequency, size, intended use and so on) without having to include these questions in the survey.
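As a sketch of what such a linkage might look like in practice, the example below joins hypothetical survey responses to loan records exported from the MIS, matching on a shared client identifier. The tooling (Python with pandas) and all file and column names are assumptions for illustration; your own MIS will have its own structure.

    # Illustrative sketch: draw loan data from MIS records instead of asking
    # for it in the questionnaire. All names are hypothetical.
    import pandas as pd

    survey = pd.read_csv("survey_responses.csv")   # client_id plus impact questions
    loans = pd.read_csv("mis_loan_records.csv")    # client_id, loan_amount, purpose, ...

    # Summarise each client's borrowing history from the MIS.
    loan_summary = (
        loans.groupby("client_id")
             .agg(loan_count=("loan_amount", "size"),
                  average_loan=("loan_amount", "mean"))
             .reset_index()
    )

    # Attach the financial data to the survey answers, so the questionnaire
    # itself does not need to ask about loan frequency or size.
    combined = survey.merge(loan_summary, on="client_id", how="left")
    print(combined.head())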
Evaluating your success at
undertaking surveys, and their
benefit for your MFI:
This will ensure that you make the most of future surveys. Evaluation is all the more important in an exercise such as an impact survey, which consumes significant resources. You need to decide what internal changes to make to staff members’ skills and attitudes. You also need to ask whether the impact findings the survey has produced have been timely, relevant, reliable, cost-effective and replicable. A good impact study can yield important information that demonstrates to your own organisation, or to donors, that your interventions are worthwhile. But it will also allow you to improve the quality of your MFI’s services, particularly if it has included a few open-ended qualitative questions that let clients give their opinions on best practice and the changes they would like to see. If you used an external consultant, you should also highlight the benefits this brought.
CASE STUDY 4
At PRADAN, in India, the findings
from an impact survey confirmed
some of the MFI’s assumptions -
for example that they were
excluding both the wealthiest and
the poorest people.
It also prompted them to re-think a
number of assumptions. In
particular, PRADAN had assumed
that participation in their self-help
group programme was leading to
empowerment of women, based on
anecdotal evidence from “winners”.
However, the survey findings did
not support these expected
changes, and revealed that the
majority of women clients were not
experiencing any significant
changes in their status or self-
esteem.
This led PRADAN to undertake further investigations, which revealed that the technical background of many staff, as well as the focus on developing new livelihoods, meant that staff had focused mainly on economic improvements, and that “difficult issues” such as domestic violence were not being discussed.