The study succeeded in collecting data on organizations that support and/or implement RUD public education projects; on project types, locations, duration and costs; on the basis on which projects were planned; on what materials were produced; and on the main activities carried out. It was also very successful in gathering information about facilitating and constraining factors, and lessons learned; these will be crucial in defining methods by which new projects can promote facilitating factors and prevent, or deal with, constraining ones.
This very rich data set still leaves some areas blank. It was not successful, for example, in describing in any depth the motivations implementers may have had for selecting particular themes and activities. In other words, it is not clear why people are doing what they are doing.
Second, although planning groups existed in most projects and some planning, however loose, took place, there is no description of how that planning was done.
Third, there is very little information on the impact of various activities and approaches. Even where structured evaluation is reported to have taken place, few evaluation reports were received, reflecting common experience in the field when seeking reports of IEC activities. This is unfortunate, for public education programmes are often accused of a lack of rigour in their work, leading to a questioning of their value and consequent difficulties in obtaining support for such programmes. It is perhaps understandable that many of the small organizations, struggling with resource constraints, might consider that available time and expertise should be devoted to implementation rather than to reporting and evaluation. The latter might be seen, however erroneously, as an exercise of more academic than practical value. However, even the larger programmes, with the notable exception of the Australian and Viet Nam projects, appeared weak in this area. It should be possible to build systematic monitoring and evaluation into even small projects - and certainly all large projects - if methodologies are kept simple, awareness is raised of the necessity and project benefits of such work, and simple guidelines are made easily accessible. More work must be done to report on and evaluate the impact of interventions if existing work is to be strengthened, successful approaches replicated, and public education given the necessary support.
At present, because impact is rarely measured, because coverage data are so sparse and varied, and because the costs reported do not appear to be complete programme costs, it is not possible to compare relative costs and benefits. Such comparative data would be very useful in guiding future programmes towards success.
How medicines are obtained and used is influenced by many societal and structural factors, such as traditional practices and beliefs, adequate regulation of the drug market, or drug promotion. But we do not know the relative impact of such factors on the possible success of given public education strategies, nor do we know the extent to which structural changes may exert a positive or negative influence on the chances of initiating and successfully implementing public education programmes. The national drug policy indicators, developed by the Action Programme on Essential Drugs, are one tool that will help to map correlations between upward or downward shifts in different components of drug policy implementation.
The data do not show how programmes could be made sustainable, and this is surely a key factor if awareness and behavioural change are to be maintained. Many projects pointed to planning difficulties related to sporadic funding and support. They also articulated the need for coalitions or alliances of interests/agencies. Building alliances across a broad range of actors may correlate not only with successful programmes in the short term but also with the likelihood of their continued support and sustainability. However, data are needed to support this assumption.
We still have little information about replicability: the degree to which a programme successfully implemented in one country or environment may be successfully replicated elsewhere, or what essential modifications will be necessary. Replication of developed country projects by developing countries should probably be approached with great caution, given the huge differences in culture and pharmaceutical sectors. The culturally specific nature of consumer education, and the need for materials to faithfully reflect visual and linguistic community realities, imply that replication without adaptation, at least of printed materials, will be an ineffective approach. However, some projects are already doing this, without formative research into the materials' acceptability or relevance.