2 Matching Annotations
  1. Jul 2018
    1. On 2016 Oct 11, Alem Matthees commented:

      References for the above comment

      1) Hawkes N. Freedom of information: can researchers still promise control of participants' data? BMJ. 2016 Sep 21;354:i5053. doi: 10.1136/bmj.i5053. PMID: 27654128. http://www.bmj.com/content/354/bmj.i5053

      2) White PD, Goldsmith KA, Johnson AL, Potts L, Walwyn R, DeCesare JC, Baber HL, Burgess M, Clark LV, Cox DL, Bavinton J, Angus BJ, Murphy G, Murphy M, O'Dowd H, Wilks D, McCrone P, Chalder T, Sharpe M; PACE trial management group. Comparison of adaptive pacing therapy, cognitive behaviour therapy, graded exercise therapy, and specialist medical care for chronic fatigue syndrome (PACE): a randomised trial. Lancet. 2011 Mar 5;377(9768):823-36. doi: 10.1016/S0140-6736(11)60096-2. Epub 2011 Feb 18. PMID: 21334061. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3065633/

      3) Confidentiality: NHS Code of Practice. November 2003. https://www.gov.uk/government/publications/confidentiality-nhs-code-of-practice

      4) General Medical Council (2009). Confidentiality guidance: Research and other secondary uses. http://www.gmc-uk.org/guidance/ethical_guidance/confidentiality_40_50_research_and_secondary_issues.asp

      5) Queen Mary University of London. PACE trial protocol: Final version 5.2, 09.03.2006. A1.6-A1.7 [Full Trial Consent Forms] https://www.whatdotheyknow.com/request/203455/response/508208/attach/3/Consent forms.pdf

      6) Appeal to the First-tier Tribunal (Information Rights), case number EA/2015/0269: http://informationrights.decisions.tribunals.gov.uk/DBFiles/Decision/i1854/Queen Mary University of London EA-2015-0269 (12-8-16).PDF

      7) Tracking switched outcomes in clinical trials. http://compare-trials.org/

      8) White PD, Chalder T, Sharpe M, Johnson T, Goldsmith K. PACE trial authors' reply to letter by Kindlon. BMJ. 2013 Oct 15;347:f5963. doi: 10.1136/bmj.f5963. PMID: 24129374. http://www.bmj.com/content/347/bmj.f5963

      9) Goldsmith KA, White PD, Chalder T, Johnson AL, Sharpe M. The PACE trial: analysis of primary outcomes using composite measures of improvement. Queen Mary University of London. 8 September 2016. http://www.wolfson.qmul.ac.uk/images/pdfs/pace/PACE_published_protocol_based_analysis_final_8th_Sept_2016.pdf

      10) Wikipedia. Multiple comparisons problem. Accessed 02 October 2016. https://en.wikipedia.org/wiki/Multiple_comparisons_problem

      11) Matthees A, Kindlon T, Maryhew C, Stark P, Levin B. A preliminary analysis of ‘recovery’ from chronic fatigue syndrome in the PACE trial using individual participant data. Virology Blog. 21 September 2016. http://www.virology.ws/wp-content/uploads/2016/09/preliminary-analysis.pdf

      12) Tuller D. Trial By Error, Continued: Questions for Dr. White and his PACE Colleagues. Virology Blog. 4 January 2016. http://www.virology.ws/2016/01/04/trial-by-error-continued-questions-for-dr-white-and-his-pace-colleagues/

      13) Walwyn R, Potts L, McCrone P, Johnson AL, DeCesare JC, Baber H, Goldsmith K, Sharpe M, Chalder T, White PD. A randomised trial of adaptive pacing therapy, cognitive behaviour therapy, graded exercise, and specialist medical care for chronic fatigue syndrome (PACE): statistical analysis plan. Trials. 2013 Nov 13;14:386. doi: 10.1186/1745-6215-14-386. PMID: 24225069. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4226009/

      14) White PD, Goldsmith K, Johnson AL, Chalder T, Sharpe M. Recovery from chronic fatigue syndrome after treatments given in the PACE trial. Psychol Med. 2013 Oct;43(10):2227-35. doi: 10.1017/S0033291713000020. PMID: 23363640. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3776285/

      15) Beth Smith ME, Nelson HD, Haney E, et al. Diagnosis and Treatment of Myalgic Encephalomyelitis/Chronic Fatigue Syndrome. Rockville (MD): Agency for Healthcare Research and Quality (US); 2014 Dec. (Evidence Reports/Technology Assessments, No. 219.) July 2016 Addendum. Available from: http://www.ncbi.nlm.nih.gov/books/NBK379582/

      16) Matthees A. Treatment of Myalgic Encephalomyelitis/Chronic Fatigue Syndrome. Ann Intern Med. 2015 Dec 1;163(11):886-7. doi: 10.7326/L15-5173. PMID: 26618293. http://www.ncbi.nlm.nih.gov/pubmed/26618293

      17) Kennedy A. Authors of our own misfortune?: The problems with psychogenic explanations for physical illnesses. CreateSpace Independent Publishing Platform. 4 September 2012. ISBN-13: 978-1479253951. https://www.amazon.com/Authors-our-own-misfortune-explanations/dp/1479253952

      18) Sharpe M, Goldsmith KA, Johnson AL, Chalder T, Walker J, White PD. Rehabilitative treatments for chronic fatigue syndrome: long-term follow-up from the PACE trial. Lancet Psychiatry. 2015 Dec;2(12):1067-74. doi: 10.1016/S2215-0366(15)00317-X. Epub 2015 Oct 28. PMID: 26521770. https://www.ncbi.nlm.nih.gov/pubmed/26521770

      19) Higgins JPT, Altman DG, Sterne JAC; on behalf of the Cochrane Statistical Methods Group and the Cochrane Bias Methods Group. Chapter 8: Assessing risk of bias in included studies. In: Cochrane Handbook for Systematic Reviews of Interventions, Version 5.1.0 [updated March 2011]. http://handbook.cochrane.org/chapter_8/8_assessing_risk_of_bias_in_included_studies.htm

      20) Schulz KF, Grimes DA. Blinding in randomised trials: hiding who got what. Lancet. 2002 Feb 23;359(9307):696-700. PMID: 11879884. http://www.who.int/rhl/LANCET_696-700.pdf

      21) Boot WR, Simons DJ, Stothart C, Stutts C. The Pervasive Problem With Placebos in Psychology: Why Active Control Groups Are Not Sufficient to Rule Out Placebo Effects. Perspect Psychol Sci. 2013 Jul;8(4):445-54. doi: 10.1177/1745691613491271. PMID: 26173122. http://pps.sagepub.com/content/8/4/445.long

      22) Button KS, Munafò MR. Addressing risk of bias in trials of cognitive behavioral therapy. Shanghai Arch Psychiatry. 2015 Jun 25;27(3):144-8. doi: 10.11919/j.issn.1002-0829.215042. PMID: 26300596. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4526826

      23) Wood L, Egger M, Gluud LL, Schulz KF, Jüni P, Altman DG, Gluud C, Martin RM, Wood AJ, Sterne JA. Empirical evidence of bias in treatment effect estimates in controlled trials with different interventions and outcomes: meta-epidemiological study. BMJ. 2008 Mar 15;336(7644):601-5. doi: 10.1136/bmj.39465.451748.AD. PMID: 18316340. PMCID: PMC2267990. http://www.bmj.com/cgi/content/full/336/7644/601

      24) Kindlon TP. Objective measures found a lack of improvement for CBT & GET in the PACE Trial: subjective improvements may simply represent response biases or placebo effects in this non-blinded trial. BMJ Rapid Response. 18 January 2015. http://www.bmj.com/content/350/bmj.h227/rr-10

      25) Knoop H, Wiborg J. What makes a difference in chronic fatigue syndrome? Lancet Psychiatry. 2015 Feb;2(2):113-4. doi: 10.1016/S2215-0366(14)00145-X. Epub 2015 Jan 28. PMID: 26359736. https://www.ncbi.nlm.nih.gov/pubmed/26359736

      26) johnthejack (Peters J). Using public money to keep publicly funded data from the public. 29 June 2016. https://johnthejack.com/2016/06/29/using-public-money-to-keep-publicly-funded-data-from-the-public/

      27) Coyne JC. Further insights into war against data sharing: Science Media Centre’s letter writing campaign to UK Parliament. 31 January 2016. https://jcoynester.wordpress.com/2016/01/31/further-insights-into-the-war-against-data-sharing-the-science-media-centres-letter-writing-campaign-to-uk-parliament/


      This comment, imported by Hypothesis from PubMed Commons, is licensed under CC BY.

    2. On 2016 Oct 11, Alem Matthees commented:

      On 4 October 2016 I submitted a BMJ Rapid Response to this article, but one week later it has still not been posted on the BMJ website ( http://www.bmj.com/content/354/bmj.i5053/rapid-responses ). I am not sure whether this delay is normal or whether the response has been rejected, so I am posting it on PubMed Commons instead. The following is as previously submitted, except for the addition of one word to clarify a sentence, the addition of a PMID for reference 22, and the removal of a stray character. I also had to move the references to a separate PubMed Commons comment below this one:

      The PACE trial investigators never had total control over the data to begin with

      Thank you for covering this issue. Some comments:

      a) This article states that it was “not possible” to contact me[1]. However, my email address appears on the first page of results of a Google search for Alem Matthees.

      b) Regarding the modification of consent forms for future trials to address FOIA data releases: the FOIA came into force in January 2005, before PACE trial participants were recruited[2]. Under the legislation, trial data that is unlikely to identify participants is not personal data, and it was always QMUL’s responsibility to be aware that trial data falls within the scope of the FOIA, yet QMUL failed to inform participants of this possibility. Similarly, confidentiality guidelines from the NHS[3] and the GMC[4] state that consent is not necessary to release de-identified data. The trial consent form promised that identities would be protected[5], and they have been. The Information Commissioner and the Information Tribunal considered and rejected the assertion that FOIA data releases would significantly affect recruitment into future studies[6].

      c) The lesson here is not about ‘controlling’ data; it is that if data is not analysed or published in a fair and transparent way, people will seek to acquire and re-analyse it, particularly when debatable claims have been made that affect the lives of millions of patients. The major deviations from the published trial protocol, to the recovery criteria in particular, are what motivated me. Outcome switching is recognised as a major problem in the research community[7].

      While it was important to find out the protocol-specified primary outcomes that were abandoned, the changes to the recovery criteria (a secondary analysis) were the most problematic. I sought the data after QMUL refused to release the protocol-specified outcomes for improvement and recovery. It is misleading to promote ‘recovery’ rates of 22% when they are based on indefensible criteria, such as thresholds of ‘normal’ fatigue and physical function that overlap with the trial eligibility criteria for severe disabling fatigue, and where one-third still met the Oxford CFS criteria.

      d) White et al. previously downplayed the changes to the primary outcomes as the primary measures “were the same as those described in the protocol”[8]. Now that the results for the protocol-specified primary outcomes are known and people are comparing them with the post-hoc equivalents, Professor White is arguing that “They’re not comparing like with like […] They are comparing one measure with a completely different one—it’s apples and pears”.[1]

      Professor White also stated that going back to the protocol makes no difference, as adjunctive CBT and GET are still statistically significantly better than specialist medical care alone[1]. However, statistical significance is not the same as clinical significance, and going back to the protocol decreases the response rates in the CBT and GET groups from approximately 60% down to 20% (compared to 45% down to 10% for SMC alone)[9].

      While it may be argued that the above does not change the conclusion that adjunctive CBT and GET are superior to SMC alone, the trial investigators conducted many analyses without correcting for multiple comparisons[10]; based on a quick look at the p values, some of the differences reported may not be statistically significant when using a more conservative approach.
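
      As an illustration of the kind of correction at issue, the sketch below applies a Holm-Bonferroni adjustment in Python, assuming a familywise alpha of 0.05; the p-values are hypothetical placeholders, not figures from the PACE trial or any reanalysis of it.

      ```python
      # Minimal sketch of a Holm-Bonferroni adjustment, assuming a familywise
      # alpha of 0.05. The p-values below are hypothetical placeholders, not
      # results from the PACE trial or its reanalyses.

      def holm_bonferroni(p_values, alpha=0.05):
          """Return a list of booleans: True where the null hypothesis is rejected."""
          m = len(p_values)
          order = sorted(range(m), key=lambda i: p_values[i])
          reject = [False] * m
          for rank, i in enumerate(order):
              # The k-th smallest p-value is compared against alpha / (m - k).
              if p_values[i] <= alpha / (m - rank):
                  reject[i] = True
              else:
                  break  # once one test fails, all larger p-values also fail
          return reject

      # Four comparisons that each look significant at 0.05 on their own;
      # only the first two survive the stepwise correction.
      print(holm_bonferroni([0.003, 0.012, 0.041, 0.049]))
      # [True, True, False, False]
      ```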

      Moreover, the data have been re-analysed: going back to the protocol not only decreased the 'recovery' rates from 7-22% down to 2-7%, but also left the differences between the adjunctive therapy groups and SMC alone statistically non-significant[11]. There appears to be a consistent pattern of outcome switching and major changes to thresholds that inflate the results severalfold.
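
      For anyone wanting to verify this sort of comparison against released data, one simple approach is a test of the difference between two proportions; the sketch below uses SciPy's Fisher's exact test on a 2x2 table, with purely illustrative counts that are not the figures from the trial or the reanalysis.

      ```python
      # Minimal sketch of comparing 'recovery' proportions between two trial arms
      # with Fisher's exact test (SciPy). The counts are illustrative placeholders,
      # not figures from the PACE trial or the reanalysis cited above.

      from scipy.stats import fisher_exact

      def compare_recovery(recovered_a, n_a, recovered_b, n_b):
          """Two-sided Fisher's exact test on a 2x2 table of recovered vs. not."""
          table = [
              [recovered_a, n_a - recovered_a],
              [recovered_b, n_b - recovered_b],
          ]
          odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
          return odds_ratio, p_value

      # Hypothetical example: 11/160 'recovered' in one arm vs 5/160 in another.
      odds_ratio, p = compare_recovery(11, 160, 5, 160)
      print(f"odds ratio = {odds_ratio:.2f}, p = {p:.3f}")
      ```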

      e) The article states that White et al. “had answered critics who have made legitimate scientific points”. However, there are numerous legitimate questions or problems that are unaddressed[12].

      f) The article states that, out of 37 FOIA requests made, “many” were rejected as vexatious. In fact, only about three have been rejected under section 14 (see, e.g., whatdotheyknow.com): the first concerned details about the timing and nature of the changes to the protocol, and the other two or so, made by others, related to trial data. Asking for trial data or for details about methodology is not harassment.

      g) It remains unclear whether all the changes to the trial protocol were made and independently approved before the data were analysed. Statements about this issue appear to relate to the 2011 Lancet paper only, but there is no mention of any change to the recovery criteria in the statistical analysis plan that was finalised shortly before the unmasking of data[13]. The ‘normal range’ is described in the 2011 Lancet paper as a post-hoc analysis[2], and this ‘normal range’ then formed part of the revised recovery criteria published in Psychological Medicine in 2013[14] without any mention of approval. I urge the trial investigators to clarify once and for all whether the changes to the recovery criteria were made after the unmasking of any trial data and whether they were independently approved.

      h) Patients want to get better, but many are simply not impressed with the methodology or results of PACE: 80% of candidates definitely or provisionally diagnosed with CFS were excluded from the trial[2]. The CFS and ME case criteria used were problematic[15-17]. Only a small minority of broadly defined CFS patients reported benefit from CBT or GET (around 10-15% over SMC), and that benefit was modest and transient, with no significant advantages at 2.5-year follow-up[18].

      Subjective self-reports are important, but modest improvements are difficult to separate from a placebo response and other reporting biases when a trial is non-blinded and tests therapies that aim to change patients’ perceptions about their illness[19-23]. This issue becomes more relevant given that there was a complete absence of meaningful improvements to multiple objective outcomes[24] (the small improvement in walking distance for the GET group has been attributed by CBT/GET proponents to participants pushing themselves harder on the test rather than being fitter[25]).

      i) QMUL spent £245,745 on legal fees trying to prevent release of the requested data[26], and was also part of a failed lobbying attempt to be removed from the FOIA[27].

      References

      [continued below...]


      This comment, imported by Hypothesis from PubMed Commons, is licensed under CC BY.
