14 Matching Annotations
- Dec 2017
-
uk.cochrane.org
-
Or have we substituted an easier one?” (Daniel Kahneman) [1]
Going for low-hanging fruit rather than what matters
-
-
effectivehealthcare.ahrq.gov
-
Qualitative content analysis
Report includes data analysis packages for qualitative research
-
-
effectivehealthcare.ahrq.gov
-
B-3.
i. If you need to make compromises in terms of the comprehensiveness and formatting of evidence, does the level of risk/compromise you are willing to make change based on the type of decision?
ii. What are factors you would consider in the risk you are willing to take (e.g., safety concerns, burden of disease, cost)?
iii. If the evidence is wrong, what is the acceptable level of risk (e.g., permanent vs. transient adverse effects)?
4. How important is the relationship with the producer of the evidence synthesis product?
a. In terms of providing useful information to make your decisions
b. In terms of credibility
Examples of different types of rapid review products (see attachments sent in advance of the call):
Evidence inventories list what evidence is available, and often other contextual information needed for making decisions, but do no synthesis and do not attempt to present summaries or conclusions. (Document name: Evidence Inventory Sample. Title: Acetylsalicylic Acid for Venous Thromboembolism Prophylaxis: an Update of Clinical Evidence.)
Rapid responses organize and evaluate the literature to present the end-user with an answer based on the best available evidence but do not attempt to formally synthesize the evidence into a new conclusion. Usually this means reporting the conclusions of guidelines or systematic reviews, but some rapid response products apply a best evidence approach and report the results of primary studies if no secondary sources are available. (Document name: Rapid Response Sample. Title: Knee-length versus Thigh-length Compression Devices for Treating Deep Venous Thrombosis.)
“True” rapid reviews perform a synthesis (qualitative, quantitative, or both) to provide the end-user with an answer about the direction of evidence and possibly the strength of the evidence.
Again a reference to "qualitative", but this means the analysis, not the data
-
“rapid reviews” perform a synthesis (qualitative and/or quantitative) to provide an answer about the direction of evidence and possibly the strength of evidence;
Notwithstanding the mention of "qualitative", the context indicates that this is about "effects"
-
- Sep 2017
-
www.researchgate.net
-
For over thirty years now, the terms ‘Hawk’ and ‘Dove’ have been used to describe the phenomenon of Hawks as assessors who have high expectations and subsequently fail many students and Doves who are more likely to pass students as they err on leniency as opposed to hawks who are more stringent (Alexander, 1996; McManus et al., 2006; Seldomridge and Walsh, 2006; Panzarella and Manyon, 2007). The consequences of this can be that one student can receive a higher grade than one of their peers either because of better performance or luck in being assessed by a lenient marker (Iramaneerat and Yudkowsky, 2007).
More experienced assessors, however, are not immune to grade inflation. Again, North American authors state that many assessors who are non-tenured are reluctant to give low grades because they rely on good evaluations from their students for their continued employment. Assessors who are tenured are often reluctant to give low grades because they do not want to waste time in dealing with student appeals (Chambers, 1999; Walsh and Seldomridge, 2005; Gill et al., 2006; Isaacson and Stacy, 2009).
Hawks and Doves in Grade Assessment
-
-
regroup-production.s3.amazonaws.com
-
There was no obvious cluster relating to the concept acceptability, therefore, clusters that could potentially relate to qualitative studies and people’s opinions were used instead. This approach had a low precision (27%) and recall (36%). It was even more difficult to identify studies automatically relating to the concept ‘uptake’. No clusters related to this concept, and a basic search on the term ‘uptake’ in the title and abstract did not yield any of the five research studies that had been classified manually under this heading.
Qualitative concepts that challenge clustering
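As a note on how the precision and recall figures above are computed, here is a minimal Python sketch (not from the paper; the study IDs are invented and sized only so the printed figures land near the reported magnitudes): precision is the share of automatically clustered studies that were manually coded under the concept, and recall is the share of manually coded studies that the cluster retrieved.

```python
# Minimal sketch, not the authors' code. The study IDs below are hypothetical and
# chosen only so the output lands near the reported 27% precision / 36% recall.

def precision_recall(auto_cluster, manual_set):
    """Compare an automatically derived cluster against a manual classification."""
    true_positives = len(auto_cluster & manual_set)
    precision = true_positives / len(auto_cluster) if auto_cluster else 0.0
    recall = true_positives / len(manual_set) if manual_set else 0.0
    return precision, recall

auto_cluster = {f"s{i:02d}" for i in range(1, 16)}              # 15 studies picked by the clusters
manual_set = {"s02", "s05", "s09", "s12",                       # 4 of them manually coded under the concept
              "s16", "s17", "s18", "s19", "s20", "s21", "s22"}  # plus 7 studies the clusters missed

p, r = precision_recall(auto_cluster, manual_set)
print(f"precision={p:.0%}, recall={r:.0%}")   # precision=27%, recall=36%
```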
-
We used the Lingo3G document clustering utility from CarrotSearch.com (Carpineto et al., 2009) to cluster (or group) the titles and abstracts of the studies in two scoping reviews.
Automated clustering tool Lingo3G
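Lingo3G itself is a commercial Java library, so its API is not reproduced here; the following is an illustrative stand-in using scikit-learn (TF-IDF vectors plus k-means) to show the general idea of grouping titles and abstracts into clusters. The example records are invented.

```python
# Illustrative stand-in only: the review used the commercial Lingo3G clusterer from
# Carrot Search; this sketch shows the same general idea (grouping titles/abstracts
# by textual similarity) with scikit-learn. The records are invented examples.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

records = [
    "Knee-length versus thigh-length compression devices for deep venous thrombosis",
    "Aspirin for venous thromboembolism prophylaxis: an update of clinical evidence",
    "Barriers to uptake of school-based physical activity programmes",
    "Parents' views on the acceptability of childhood obesity interventions",
]

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(records)              # one TF-IDF vector per title/abstract

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
for label, text in sorted(zip(labels, records)):
    print(label, text)
```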
-
As an estimate, for a scoping review of 120 studies, it could take the equivalent of 2 to 3 weeks of one person's time to undertake the process of coding studies, including developing and testing a coding tool, although this varies between each review
Time taken to code for scoping study
-
On the basis of our experience of carrying out scoping reviews, we estimate that it takes between 5 and 20 minutes to apply codes to one research abstract, depending on the detail of coding required. However, prior to this process, time is needed to develop a coding tool. This involves: considering what data would inform the research questions, and therefore deciding the facets upon which to code, and creating a coding tool that is unambiguous, and both broad and detailed enough to describe the dataset. The tool is often developed using a sample of data, and examining pre-existing tools.
Time taken to code records
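Putting the two estimates above together as rough arithmetic (my own back-of-envelope figures, assuming a 7.5-hour working day): 120 abstracts at 5 to 20 minutes each is roughly 10 to 40 hours of coding, with the remainder of the quoted 2 to 3 weeks absorbed by developing and testing the coding tool.

```python
# Back-of-envelope check of the figures quoted above (assumes a 7.5-hour working day).
n_studies = 120
minutes_low, minutes_high = 5, 20                  # minutes per abstract

hours_low = n_studies * minutes_low / 60           # 10 hours
hours_high = n_studies * minutes_high / 60         # 40 hours
days_low, days_high = hours_low / 7.5, hours_high / 7.5

print(f"coding alone: {hours_low:.0f}-{hours_high:.0f} hours "
      f"(about {days_low:.1f}-{days_high:.1f} working days); "
      "developing and testing the coding tool accounts for the rest of the 2-3 weeks")
```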
-
-
regroup-production.s3.amazonaws.com
-
Synthesizing qualitative research enables reviewers to ask questions that inform the development of, or the implementation of, interventions. For example, in the context of intervention evaluation, they can help define relevant and important questions, help determine appropriate outcome measures by looking at “subjective” outcomes, look in detail at issues concerning implementation or the acceptability or appropriateness of an intervention, identify and explore unintended consequences, contribute to service delivery and policy development by describing processes and contexts, and inform and illuminate quantitative studies, for example, by contributing to the design of structured instruments, assessing the fairness of comparisons in experimental studies, or unpacking variation within aggregated data (Davies, Nutley, & Smith, 2000).
Summary of purposes of QES
-
This article demonstrates the value of one relatively new approach, that of framework synthesis (Carroll, Booth, & Cooper, 2011; Thomas, Harden, & Newman, 2012). The distinguishing characteristic of this method is that it allows preexisting understanding (in the form of themes or categories) to be included in the analysis alongside (and combined with) concepts that emerge from the studies themselves (Dixon-Woods, 2011). This makes it particularly suitable for studies where a relevant related conceptual framework already exists, or where the findings from primary studies need to be explored in the light of perspectives of various stakeholders (e.g., practitioners, parents) in a structured and explicit way.
Justification and rationale for framework synthesis
-
-
www.mrc.ac.uk
-
Academic publication is likely to involve breaking the process evaluation down into smaller parts, but careful cross-referencing between papers, and to full reports, should ensure that the bigger picture, and the contribution of each article to the whole, is not lost.
Justification for the CLUSTER approach
-
It is useful if the intervention and its evaluation draw explicitly on one or more sociological or psychological theories, so findings can add to the incremental development of theory. However, evaluators should avoid selecting one or more pre-existing theories without considering how they apply to the context in which the intervention is delivered. Additionally, there is a risk of focusing narrowly on inappropriate theories from a single discipline. For example, some evaluations have been criticised for drawing predominantly upon individual-level behaviour change theories, where the aim is to achieve community, organisational or population-level changes, for which the sociological literature may have offered a more appropriate starting point (Hawe et al., 2009).
Repeats point made above
-
It is useful if interventions, and their evaluations, draw explicitly on existing social science theories, so that findings can add to the development of theory. However, evaluators should avoid selecting ‘off-the-shelf’ theories without considering how they apply to the context in which the intervention is delivered. Additionally, there is a risk of focusing narrowly on inappropriate theories from a single discipline; for example, some critics have highlighted a tendency for over-reliance upon individual-level theorising when the aim is to achieve community, organisational or population-level change (Hawe et al., 2009).
Potential limitations of framework approach to synthesis
-