This dissertation for the Degree of Doctor of Psychology (Health) presents three pieces of work: 1) A Re-analysis of a Systematic Review of Psychological Interventions Used to Aid Smoking Cessation; 2) Evaluation and Discourse Analysis of the EC's Health Promotion Programme; 3) A Consultancy Case Study: Evaluation of Educational Needs Assessment Methods Used in General Practices in Barking and Havering and Redbridge and Waltham Forest. The theme that ties these three pieces of work together is evaluation. The re-analysis of the systematic review of psychological methods for smoking cessation shows how errors can be made in evaluation, and how different researchers can obtain different results from what is considered to be a method that reduces bias and produces an accurate picture of 'evidence' to inform health policy and practice. The evaluation of the EC's Health Promotion Programme offers a case study of an evaluation intended to inform health promotion policy at a European level. This piece of work presents the results of an independent evaluation. It highlights unexpected difficulties in drawing conclusions from data, such as the practical problems of obtaining data, as well as the pressures that may come from the commissioners of evaluations. The discourse analysis of the Health Promotion Programme reveals how current discourses in health promotion may compel health promotion practitioners to carry out a type of evaluation of which, in truth, they may have little understanding and to which they may have little commitment. As a result, the practice of evaluation becomes a formality or ritual that is a burden to carry out. A panel of expert health promotion assessors found a lack of acceptable evaluation of projects funded by the European Commission, suggesting that if evaluation can be avoided, it will be. The same themes of a lack of understanding of, commitment to, and time for evaluation were revealed in the case study.
The consultancy case study evaluated educational needs assessment methods used in general practices. Evidence-based practice requires that practitioners understand how to evaluate research and incorporate it into their practice, and this needs more emphasis in the education and training of health professionals. However, there has been a move away from the more didactic approach to education in primary care towards one of listening to people's needs and preferred methods of learning. At the same time, the ubiquitous need to evaluate in order to find the best method prevails, regardless of obvious limitations to the interpretation of findings. In this case study, the evaluation seemed to be an afterthought, rushed to satisfy some other group higher up the hierarchy in the health authority. Similarly, the discourse analysis pointed to a situation in which the Commission's services are constructed as superior, leaving no mechanism to question their knowledge or ways of working. While there may be efforts on one level to encourage a two-way flow of information and knowledge, on another level the construction of decision-makers as superior means that information and knowledge flow only one way, from the top down. All three pieces of work have shown that practical limitations restrict the interpretation of evaluations. The lack of time, incomplete data, and limited commitment to and knowledge of evaluation revealed here raise questions about the possibility and desirability of evidence-based health promotion. For evaluation to advance, there needs to be a better understanding of its purpose, and it must have more meaning for all of the stakeholders involved. This requires a rethink of evaluation methods in health promotion: one that recognises the constraints of evaluation and starts inquiry from that premise.