Lack of reporting
Lack of reporting, or of easy access to reports on projects, has serious consequences for national implementers, potential partners, funding (see above) and general understanding of the importance and evolution of public education work. The extremely limited number and nature of reports obtained by DAP in support of completed survey questionnaires mirrors the difficulty experienced in its country support work in obtaining documentation on even well-known country projects. Reporting weaknesses were also evident in how some questionnaires were completed, possibly linked to staff turnover and inadequate project files. Given the currently limited scale of public education in RUD, the absence of such documentation is a serious constraint for all who work in the field. It is, of course, understandable that often hard-pressed projects wish to devote the bulk of their effort to implementation. But without some form of structured reporting, of both process and impact, the experience gained is easily lost. This is a particular problem given the often sporadic nature of funding and the frequent changes in personnel of many programmes and organizations.
Lack of evaluation
The type of evaluation will be determined by project capacity (both staffing and funding) and project goals. Many projects have diffuse community development goals that may be unsuitable for a rigorous intervention-type (pre-post, control) evaluation. Nevertheless, evaluation has to be built into every project at the planning stage, underpinned by an understanding of what is appropriate to the circumstances and the project goals. There are many indications that this is currently a very weak area in public education work generally, with confusion between process and impact evaluation. Criticism of what is regarded as a “soft” or less than rigorous approach is seriously damaging the prospects of support for needed programmes. Advocacy is needed to promote awareness of the legitimacy of different evaluation approaches, so that programme implementers are not “bullied” into adopting inappropriate models from other scientific disciplines. Some programmes with very specific targets will, however, lend themselves to a controlled-trial intervention model, and this can usefully be pursued. Programme implementers, who may have little background in evaluation models, need training and guidance on why and how to integrate an appropriate evaluation methodology into their work, and on giving the rationale for this choice in programme reporting.
Lack of published articles
The lack of published literature in this field is a major hindrance to its development (see above). Publication in peer-reviewed journals is increasingly difficult, with a major bias towards “positive” programmes following a quite narrow “interventionist” model. Moreover, programme implementers may be short of time, and may also lack experience in producing written descriptions of programme approaches and findings. It is important that programme organizers and funders identify ways to support the writing up and publication of programme activities. Producing an article acceptable for publication requires considerable time and editorial skill, which may not be easily available. Some networks (e.g. INRUD) and some organizations (e.g. DAP) have developed strategies to provide editorial support for projects writing up their work. This approach needs to be much more broadly based and actively pursued.