Ed Butler
Policies Around Medical Data Ownership and Access
I have heard policy debates around the ownership of and access to medical data since the early 1990s. This is an area where the public holds conflicting goals: the first is the need to make rapid progress in medicine to save lives and reduce suffering; the second is the idea that personal data should be private and controlled by each individual. Medical scientists and software developers need access to high-quality clinical data, especially now that machine learning models built from such data can accomplish tasks not previously possible. The emergence of big data resources from digitizing most commerce, including healthcare, has led to concerns about how that data is used and compensated. Not least is the concern of individuals about their rights to privacy (including, in the EU, the “right to be forgotten”). Access to data should not be confused with ownership of data. Patients in the U.S. have the right (via HIPAA and the 21st Century Cures Act) to obtain copies of their medical records. It is also understandable that healthcare delivery systems need clinical documentation not only to serve patients but also to get paid, to defend themselves, and to improve their operations. These are not simple issues, and solutions go well beyond the FDA’s remit, but the future of radiology AI will be gated by the policy response.
Public Fear of AI
Support for Regulatory Reform
In an attempt to improve the current process, the FDA has proposed the Digital Health Software Precertification (Pre-Cert) Program. The Pre-Cert program aims to create a more streamlined regulatory process in which manufacturers who have demonstrated a culture of quality commit to monitoring the real-world performance of their products in the U.S. market.
It is noteworthy that the FDA and other regulatory agencies around the world recognize that the public will benefit from modernizing the process. However, as much as the industry may want self-certification, the recent tragedies associated with Boeing’s 737 MAX 8 should also be considered. Commercial aircraft are regulated by an entirely different federal agency, the Federal Aviation Administration (FAA), not the FDA, but a comparison can be made to the FAA’s program that allows certain aircraft manufacturers to self-certify. The FDA Pre-Cert program similarly allows AI developers to deploy new SaMDs that have passed internal review within their own Quality Systems, deferring agency attention to post-market surveillance for pre-certified entities. During the workshop, it will be informative to hear whether anyone questions the parallels between these self-certification programs. In Boeing’s case, recently disclosed emails from Boeing employees reveal the disturbing consequences of perceived business imperatives overshadowing quality concerns. Even though this example comes from a completely different domain of aircraft design, components, and pilot training, support for regulatory reform of AI in radiology SaMDs can be informed by knowledge of the unintended consequences of the FAA’s program. Murphy’s Law, “anything that can go wrong will go wrong,” has not been repealed.
Getting Involved
The FDA is inviting comments on this workshop through mid-March 2020. A robust discussion from multiple perspectives will help the agency balance these difficult questions. To learn more about the FDA comment process, visit: FDA Public Workshop – Evolving Role of Artificial Intelligence in Radiological Imaging.
This resource was first published prior to the 2020 rebranding of CuraCloud to Keya Medical. The content reflects our legacy brand.