On 2016 Jun 03, David C. Norris commented:
The ‘spam’ clinical messaging applications advanced by Schneeweiss will be Medicine’s just deserts for adopting “analytic approaches that embrace the data turmoil” [1] of a routine clinical practice which eschews scientific method. Only scientific practice, capturing in computable form exquisite clinical observation together with detailed clinical reasoning, holds the promise of individualized care within a true learning system [2]. Whatever obstacles Medicine presents to scientific reform, equally fearsome technical and social-engineering problems bedevil ‘big data’.
These problems arise from Judea Pearl’s “golden rule of causal analysis: No causal claim can be established by a purely statistical method, be it propensity scores, regression, stratification, or any other distribution-based design” [3, p. 350]. Sound causal inference in non-experimental settings apparently requires graphical methods to support correct reasoning about confounding [4]. When deconfounding variables are missing—surely typical wherever inference relies on ‘big data’ scavenged from the waste bins of routine clinical practice—then quantitative bias analysis [5] becomes essential for credibly evaluating the uncertainty engendered by residual confounding. Would the software automation Schneeweiss envisions be grounded in such methods? If so, will “users with little training” [1] adopt methods at which most professionals balk?
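To make concrete the sort of quantitative bias analysis gestured at above [5], here is a minimal sketch of the classic external-adjustment formula for a single unmeasured binary confounder. The observed risk ratio and the confounder parameters are hypothetical values chosen purely for illustration; nothing here is drawn from Schneeweiss’s paper or claimed as the commenter’s own method.

```python
# Minimal sketch: simple quantitative bias analysis for one unmeasured
# binary confounder U, via the classic external-adjustment formula.
# All numeric inputs below are hypothetical and for illustration only.

def bias_factor(p_exposed: float, p_unexposed: float, rr_ud: float) -> float:
    """Bias factor induced by an unmeasured binary confounder U.

    p_exposed   -- prevalence of U among the exposed
    p_unexposed -- prevalence of U among the unexposed
    rr_ud       -- risk ratio relating U to the outcome D
    """
    return (p_exposed * (rr_ud - 1) + 1) / (p_unexposed * (rr_ud - 1) + 1)


def adjust_rr(rr_observed: float, p_exposed: float, p_unexposed: float,
              rr_ud: float) -> float:
    """Observed exposure-outcome risk ratio, corrected for confounding by U."""
    return rr_observed / bias_factor(p_exposed, p_unexposed, rr_ud)


if __name__ == "__main__":
    rr_obs = 1.5  # hypothetical risk ratio estimated from routinely collected data
    # Sweep plausible ranges for the unmeasured confounder's behavior,
    # rather than pretending its parameters are known.
    for p1, p0, rr_ud in [(0.4, 0.2, 2.0), (0.5, 0.1, 3.0), (0.6, 0.1, 4.0)]:
        rr_adj = adjust_rr(rr_obs, p1, p0, rr_ud)
        print(f"P(U|E=1)={p1:.1f}  P(U|E=0)={p0:.1f}  RR_UD={rr_ud:.1f}"
              f"  ->  bias-adjusted RR = {rr_adj:.2f}")
```

Under the last hypothetical scenario the observed risk ratio of 1.5 corresponds to a bias-adjusted value below 1.0, which is precisely the kind of sensitivity that makes residual confounding impossible to dismiss when the deconfounding variables were never recorded.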
[2] Weed LL, Weed L. Medicine in Denial. Charleston, SC: CreateSpace; 2011.
[4] Howards PP, 2012
This comment, imported by Hypothesis from PubMed Commons, is licensed under CC BY.