Client Alert

November 18, 2022

GAO Issues Report on Artificial Intelligence in Health Care


On September 29, 2022, the United States Government Accountability Office (“GAO”) and the National Academy of Medicine (“NAM”) jointly published a detailed Report to Congressional Requesters titled Artificial Intelligence in Health Care: Benefits and Challenges of Machine Learning Technologies for Medical Diagnostics (“Report”), available as GAO-22-104629.  The Report proceeds in two parts: the first is the GAO’s Technology Assessment, which discusses advances presented by actual and emerging medical machine learning (“ML”) diagnostic technologies for five key disease states (certain cancers, diabetic retinopathy, Alzheimer’s disease, heart disease, and COVID-19), as well as challenges to the development and adoption of these technologies; the second is the NAM’s Special Publication regarding barriers to, and considerations for facilitating, the clinical use of artificial intelligence (“AI”)-based decision support tools in medical diagnosis.

Challenges to AI/ML Development and Use

The Report recognizes the vast potential of AI and ML (a subset of AI) to increase the effectiveness, efficiency, and accuracy of diagnostic processes.  The Report also observes, however, that these technologies are not yet widely implemented.  According to the Report, key challenges to the development and adoption of these technologies include:

  • Reluctance of healthcare providers (“HCPs”) to adopt technologies that have not been adequately validated, including in diverse and generalizable data sets reflective of real-world settings, or rigorously reviewed by FDA for safety and effectiveness.  The GAO cited a review of 516 studies evaluating image-based AI algorithms, which found that 94% of the studies were inadequate to validate the algorithms;
  • Difficulties faced by developers in evaluating and validating AI/ML diagnostic technologies, such as challenges in accessing high quality, real-world data to use for software training and validation;
  • Hesitation by HCPs to adopt technologies if they do not understand how a technology works, its limitations, or its biases, or if they are not provided with clinical evidence about the technology’s performance; and
  • Gaps in regulatory standards needed to facilitate the development of AI/ML, such as needed clarifications regarding the standards for clinical validation; how FDA will draw the line between AI/ML decision support software that is exempt from FDA regulation and software that is not; and how FDA will review modifications associated with adaptive algorithms.  Regarding the latter, the GAO noted that FDA issued a discussion paper in 2019 with a proposed regulatory framework for “Predetermined Change Control Plans” but has not yet met its stated goal of issuing draft guidance on this topic (originally planned for 2021; FDA’s Center for Devices and Radiological Health has identified a draft guidance on Marketing Submission Recommendations for a Change Control Plan for Artificial Intelligence/Machine Learning (AI/ML)-Enabled Device Software Functions as a document that it intends to publish in Fiscal Year 2023, as resources permit).  The NAM noted that uncertainties related to the potential for algorithms to be modified after initial FDA authorization as a result of ML (adaptive algorithms), without clarity regarding the regulatory standards for such modifications, “may leave health care providers in the uncomfortable position of…potentially facing liability if patient injuries occur” in connection with adaptive algorithms (Report at 61).  More broadly, the NAM observes that, “Uncertainty about which [AI/ML] tools will receive FDA oversight – and which marketing authorization process the FDA may require (e.g., premarket approval, 510(k) or de novo classification) – likely fuels provider discomfort with using AI-DDS [AI diagnostic decision support] tools” (Report at 58).

Policy Recommendations, Including Recommendations That May Influence FDA

In the Report, both the GAO and the NAM offer a number of policy recommendations to enhance the development and deployment of AI/ML diagnostic technologies.  These recommendations are wide-ranging and include measures geared toward providing incentives for clinicians to adopt these technologies, such as creating reimbursement programs to encourage consistent use of diagnostic AI decision support tools and integrating these technologies into medical education and accreditation programs.  Significantly, several recommendations are directed toward FDA:

  • To improve HCPs’ confidence in AI/ML diagnostic products, the GAO advised that FDA and other policymakers “could create incentives, guidance, or policies to encourage or require the evaluation of ML diagnostic technologies across a range of deployment conditions and demographics representative of intended use” (Report at 28).  Notably, in April 2022, FDA issued for public comment a draft guidance for medical devices titled Diversity Plans to Improve Enrollment of Participants from Underrepresented Racial and Ethnic Populations in Clinical Trials (available at https://www.fda.gov/media/157635/download).  The GAO’s recommendation may encourage FDA to maintain the draft guidance’s approach of placing responsibility on industry to create plans ensuring that AI/ML technologies (and other devices) are clinically validated in diverse populations.
  • The GAO also observed that, to help establish the clinical validity and generalizability of device results, it would be useful to ensure that AI/ML technologies are “trained using high-quality data that are representative of the intended patient population, then tested and validated on diverse external datasets” separate from the training data (Report at 23).  FDA has espoused the use of distinct training and validation data sets to ensure the generalizability of device performance and the absence of bias in AI/ML technologies, as reflected in the Agency’s Good Machine Learning Practice for Medical Device Development: Guiding Principles (available at https://www.fda.gov/media/153486/download).  Even though this document is not a binding regulation, the Report’s recommendation will likely reinforce FDA expectations consistent with the document when the Agency reviews individual AI/ML technologies.  Notably, certain other principles in the document align with the Report’s findings and with similar concepts in FDA’s recently issued final guidance on clinical decision support (“CDS”) software, Clinical Decision Support Software: Guidance for Industry and Food and Drug Administration Staff (issued September 28, 2022, and available at https://www.fda.gov/media/109618/download; King & Spalding’s client alert about the guidance is available here).  One example is the principle that users be provided with “clear, essential information” about an AI/ML technology, such as performance data, the data used to train and test the software, inputs and known limitations, and the basis for the software’s decision-making.  This principle relates to the Report’s finding, noted above, that clinicians’ lack of understanding of how AI/ML technology works hinders adoption; it also echoes the CDS guidance’s push for detailed information about CDS software to be made available to users so that they can “independently review the basis” for the software’s recommendations.
  • Of potential help to AI/ML developers, the GAO recommends that policymakers, including FDA, “develop or expand access to high-quality medical data” for use by software developers “to develop and test ML medical diagnostic technologies” (Report at 29).  To this end, the GAO advises that “policymakers could reach agreement about data standards or share best practices for collecting and sharing data,” facilitate “mechanisms for data sharing, such as data commons,” or “use incentives, such as grants or access to databases, to encourage data sharing” by healthcare institutions (id.).
  • Also of potential interest to AI/ML developers, the NAM recommends that FDA consider that, “over the long term, notice-and-comment rulemaking may offer advantages over the continued use of guidance documents” for regulating in this space (Report at 68).  Referencing FDA’s CDS guidances as an example, the NAM states: “Unfortunately, guidance documents – whether draft or final – have no binding legal effect and do not establish clear, enforceable legal rights and duties on which software developers, clinicians, state regulators, and members of the public can rely. …[T]here can be long term costs when agencies choose to rely on guidance and voluntary compliance instead of promulgating enforceable regulations” (Report at 61).

As one illustration, the NAM flags FDA’s 2021 Artificial Intelligence/Machine Learning (AI/ML)-Based Software as a Medical Device (SaMD) Action Plan, which envisions developers “incorporating ongoing post-marketing monitoring and updating of software tools after they enter clinical use” (id.).  Noting that this “implies that vendors and developers of AI/ML tools will need access to real-world clinical health care data to support ongoing monitoring of how the tools perform in actual clinical use,” the NAM states that “regulating via non-binding guidance documents,” rather than through regulations that create enforceable requirements, is less desirable because it removes a legal pathway by which HCPs could otherwise share with developers and manufacturers the protected health information needed for robust post-marketing monitoring (id.).  Specifically, “[t]he HIPAA Privacy Rule contains an exception that lets HIPAA-covered health care providers, such as hospitals, share data with device manufacturers to help them meet their FDA regulatory compliance obligations….  Unfortunately, when FDA regulates manufacturers by means of guidance documents and other non-mandatory programs, this important HIPAA pathway for accessing data may be unavailable, because guidance documents create no enforceable legal obligations” (id.).

The NAM also concludes that “[s]afe clinical use of AI/DDS tools will ultimately require state-level medical practice regulations and common law addressing issues such as appropriate staffing for, and use of, AI/DDS tools in clinical settings.  To foster optimal development of state law, it is helpful to have federal regulations providing a stable demarcation between the FDA’s role versus that of the states.  Federal guidance documents, due to their non-binding nature and ease of revision, may not meet this need” (Report at 62).

Conclusion

The Report was provided to Congressional Requesters, including ranking members and leaders of FDA Congressional oversight committees, as well as to FDA and other affected agencies.  We will watch closely to see how the Report’s recommendations may shape FDA’s activities going forward.  As noted, we anticipate that some recommendations will support the continuation of existing FDA initiatives.  The recommendation that the Agency consider moving away from regulating via guidances and other informal documents, and instead proceed by notice-and-comment rulemaking, is significant; if adopted, it would represent a substantial departure from FDA’s longstanding modus operandi in the software space.  In addition to the considerations identified in the Report, proceeding by rulemaking would better ensure FDA’s compliance with administrative law principles and, potentially, better align FDA’s regulatory positions with statutory provisions limiting its authority over software, as discussed in our recent client alert about FDA’s latest CDS guidance.

Considering the length of the Report (over 100 pages), we have prepared a detailed summary, accessible here.  We are available to discuss the Report’s implications for particular companies and will continue to monitor developments relevant to AI/ML technologies and other health-related software.

*      *      *

King & Spalding LLP regularly counsels digital health, medical device, and pharmaceutical companies and healthcare institutions on a range of issues affecting digital health technologies.  Please let us know if you have any questions or concerns regarding the Report discussed in this Client Alert, or if we can be of other assistance in navigating the rapidly evolving digital health landscape.