Tracy has been with Eldis since 1996, having previously worked in the information sector, in both engineering and development libraries. She currently manages the day-to-day production of Eldis content with a particular interest in Ageing populations and Rising Powers in International Development.
Assessing the Quality of Evidence
The terms evidence-based and evidence-informed policy have become commonplace in the development vernacular, and many development actors, including Eldis, work to promote the use of research evidence in the design and delivery of development policies and practice. But with the volume and accessibility of potentially useful research increasing all the time, how do we ensure that the evidence we use to make decisions is reliable, balanced, accurate and applicable to our needs?
If you want to delve deeper but don't have much time, then the Assessing the Strength of Evidence How to Note, published by the UK Department for International Development (DFID), is a good place to start. It was developed to help DFID staff assess the strength of the evidence they use to inform their own policy and programming choices.
This guidance, like much in development, draws heavily on well-established and rigorous methodologies developed in the health sector. Some of the most commonly used of these are:
Grading of Recommendations Assessment, Development, and Evaluation (GRADE) offers a transparent and structured process for developing and presenting summaries of evidence, including its quality, for systematic reviews and recommendations in health care.
GRADE specifies an approach to:
- framing questions
- choosing outcomes of interest and rating their importance
- evaluating the evidence
- incorporating evidence with considerations of values and preferences of patients and society to arrive at recommendations.
Some useful resources on GRADE include:
- GRADE guidelines: 3. Rating the quality of evidence
- Grading of Recommendations Assessment, Development and Evaluation (GRADE) Working Group (GRADE)
PRISMA - or Preferred Reporting Items for Systematic Reviews and Meta-Analyses - is an evidence-based minimum set of items for reporting in systematic reviews and meta-analyses. PRISMA can also be used as a basis for reporting systematic reviews of other types of research, particularly evaluations of interventions. It may also be useful for critical appraisal of published systematic reviews, although it is not a quality assessment instrument in itself. It's worth noting that some questions have been raised about the applicability of the systematic review approach to international development and/or about how that approach has been applied.
The PRISMA Statement consists of a 27-item checklist and a four-phase flow diagram. The checklist includes items deemed essential for transparent reporting of a systematic review.
The following are some useful resources which go into more detail about the meaning and rationale of the checklist items, and explain how to apply the PRISMA Statement.
- The PRISMA Statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration
- PRISMA: Transparent Reporting of Systematic Reviews and Meta-Analyses (PRISMA)
UK NATIONAL INSTITUTE FOR HEALTH AND CLINICAL EXCELLENCE (NICE)
The National Institute for Health and Clinical Excellence (NICE) is the independent organisation responsible for providing national guidance on the promotion of good health and the prevention and treatment of ill health.
One important element in the development of their guidance is a rigorous assessment of the quality of the evidence base. This involves using appropriate quality appraisal checklists, which are included in the NICE resources below.
- Methods for the development of NICE public health guidance, third edition
- National Institute for Health and Clinical Excellence, UK (NICE)
Outside of health, other sectors are also working towards evaluating the quality of evidence in their work, and these provide some interesting alternative frameworks that some find more relevant and appropriate to their own fields. In the NGO sector in particular, growing calls for accountability and transparency mean that the quality of evidence is becoming ever more important.
BOND, the UK's NGO network, has developed a set of principles for assessing the quality of evidence as part of its effectiveness programme. This could be seen as a response to calls for greater NGO accountability, as expressed in good practice guidance such as Addressing accountability in NGO advocacy from One World Trust.
Coming from a more academic perspective, Quality assurance and assessment of scholarly research: a guide for researchers, academic administrators and librarians offers an overview of some of the key issues surrounding quality assurance and assessment of scholarly research. It provides definitions of quality; tools and mechanisms; and guidance on the assurance and assessment of projects and programmes, researchers, institutions and scholarly journals. There are also more specific guidelines aimed at particular stages of the research process. For example, the Research Excellence Framework provides an Assessment Framework and Guidance on Submissions.
Finally, the UK Cabinet Office has published guidance on assessing the quality of qualitative research evidence in the paper Quality in qualitative evaluation: a framework for assessing research evidence. It presents a framework for appraising the quality of qualitative evaluations, and was developed with particular reference to evaluations concerned with the development and implementation of social policy, programmes and practice.
Further reading:

- Knowledge, policy and power in international development: a practical framework for improving policy. H. Jones, N. Jones, L. Shaxson and D. Walker / Overseas Development Institute, 2013
- Assessing the Strength of Evidence: How to Note. Department for International Development, UK, 2012
- Quality in qualitative evaluation: a framework for assessing research evidence. L. Spencer, J. Ritchie, J. Lewis and L. Dillon / 2003
- A. Boaz and D. Ashby / 2003
- S. Sutcliffe and J. Court / Overseas Development Institute, 2006