AI applied in healthcare

Source: aprendis
Revision as of 14:33, 24 December 2024 by Henrikpereira (talk | contribs) (→‎EU AI Act)

Authors (HEADS-INFORM-2025):

  • Henrique Pereira
  • Olívia Oliveira

Introduction

From early on, attempts have been made to apply Artificial Intelligence (AI) in various industries (successfully or not), and the healthcare sector is no exception. The application of AI and Machine Learning (ML) in medicine and healthcare services offers unprecedented opportunities to improve disease prevention, diagnosis, treatment, and management, contributing to more effective and efficient clinical outcomes, better public health indicators, and even the overcoming of major public health problems using big data.[1] [2] [3]

Applications of AI in Healthcare

The applications of AI in healthcare first emerged as a "grey zone" of Medical Devices (MD), but are now commonly aggregated under the classification of Software as a Medical Device (SaMD), with applications ranging from disease diagnosis to clinical decision support.

Disease Diagnosis and Prognosis: AI algorithms can analyse large volumes of medical data to identify patterns that may escape human observation. This is particularly useful in medical imaging, where AI can assist in the early detection of diseases such as cancer by analysing photographs, X-rays, magnetic resonance imaging (MRI), and computed tomography (CT) scans. [4] [5]

Patient Monitoring: Medical devices equipped with AI can continuously monitor patients' vital signs, predict potential deteriorations in health status, and alert healthcare professionals in real-time. This allows for quicker interventions which can significantly improve patient outcomes. [6] [7] [8]
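The alerting logic described above can be illustrated with a deliberately simple rule-based sketch. Real monitoring SaMD relies on validated clinical scores and regulated ML models; the thresholds below are hypothetical and for illustration only:

```python
# Toy rule-based deterioration alert. Thresholds are HYPOTHETICAL and
# not clinically validated; shown only to illustrate the idea of
# continuous vital-sign monitoring with real-time alerting.

def deterioration_alert(vitals: dict) -> bool:
    """Return True if any reported vital sign falls outside a (hypothetical) safe range."""
    safe_ranges = {
        "heart_rate": (50, 110),   # beats per minute
        "spo2": (92, 100),         # peripheral oxygen saturation, %
        "resp_rate": (10, 22),     # breaths per minute
        "systolic_bp": (95, 180),  # mmHg
    }
    for sign, (low, high) in safe_ranges.items():
        value = vitals.get(sign)
        if value is not None and not (low <= value <= high):
            return True  # at least one vital sign out of range -> alert staff
    return False

print(deterioration_alert({"heart_rate": 125, "spo2": 97}))  # True (tachycardia)
print(deterioration_alert({"heart_rate": 80, "spo2": 98}))   # False
```

A deployed system would of course combine trends over time and model-based predictions rather than single-point thresholds, which is precisely why such software falls under medical device regulation.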

Health Data Management: AI facilitates the management of electronic health records by aiding in the organisation, analysis, and interpretation of complex data. This can improve administrative efficiency with a limited degree of autonomy, freeing more time for healthcare professionals to focus on patient care and R&D. [9] [10]

Clinical Decision Support: AI-based clinical decision support systems provide recommendations to physicians by considering best practices and up-to-date clinical evidence. This helps standardise care and reduce medical errors.[11] [12]

Regulatory Aspects of AI as Medical Devices

According to regulatory agencies (the FDA and EMA), a medical device can generally be defined as any instrument, apparatus, software, or material used for diagnosis, therapy, mitigation of disease, or prevention of disease [13] [14]. AI and ML applications that fulfil any part of this definition therefore fall within the scope of this specific regulation. In fact, healthcare applications of AI and ML are classified as medical devices because they can directly influence clinical decisions. This requires developers and manufacturers to comply with stringent standards to ensure the safety and effectiveness of their products before they reach the market. [1] [15]

Regulatory Framework by the FDA

The Role of the FDA

The United States Food and Drug Administration (FDA) is responsible for ensuring the safety and effectiveness of medical devices in the United States, including those enabled by AI and ML. The FDA recognises the transformative potential of AI in healthcare and has been working to develop guidelines that address the unique challenges of these devices.[16] [17]

Guidelines and Published Principles

Good Machine Learning Practice (GMLP) for Medical Device Development: The FDA has published guiding principles for the development of AI-based medical devices, emphasising the need for transparency, robustness, and risk management. [18] [19]

Clinical Decision Support Software Guidance: The FDA also provides guidance on when clinical decision support (CDS) software is considered a medical device and thus subject to regulation. [20]

Regulatory Pathways

The FDA has established various regulatory pathways for AI-enabled medical devices, including:

  • 510(k) Premarket Notification: For devices that are substantially equivalent to a legally marketed predicate device. [21]
  • De Novo Classification: For novel devices with low to moderate risk that do not have a predicate. [22]
  • Premarket Approval (PMA): For high-risk devices that require extensive clinical evidence. [23]

Developers must submit appropriate documentation demonstrating safety and effectiveness, including clinical trial data when necessary. [24]
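As a rough illustration of how these three pathways relate, the selection logic can be sketched as follows. This is a toy simplification: the actual determination rests on FDA device classification regulations and involves far more nuance than one function can capture:

```python
# Illustrative sketch of the FDA pathway choice described above.
# "risk" and "has_predicate" are simplifications of the real
# classification process; do not use this as regulatory advice.

def suggest_pathway(risk: str, has_predicate: bool) -> str:
    """Map device risk and predicate availability to a likely FDA pathway."""
    if risk == "high":
        # High-risk (class III) devices need extensive clinical evidence.
        return "Premarket Approval (PMA)"
    if has_predicate:
        # Substantial equivalence to a legally marketed predicate device.
        return "510(k) Premarket Notification"
    # Novel low-to-moderate-risk device without a predicate.
    return "De Novo Classification"

print(suggest_pathway("high", False))     # Premarket Approval (PMA)
print(suggest_pathway("moderate", True))  # 510(k) Premarket Notification
print(suggest_pathway("low", False))      # De Novo Classification
```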

Post-Market Surveillance

Because AI and ML products may evolve over time through repeated learning on new data, the FDA has a particular interest in keeping them under close monitoring. It therefore requires ongoing post-market surveillance for AI medical devices to monitor performance, manage risks, and implement necessary updates. [25] [26]

FDA-Approved AI Medical Devices

The FDA has approved numerous AI-based medical devices and maintains an updated website where anyone can check which authorisations have been granted, which devices are under evaluation, the scope of their use, and many other details. [15] The data provided by the FDA on this matter has also generated public discussion about how medical AI devices are evaluated [27] [28], as well as the inevitable comparison of the FDA and EMA regulatory landscapes. [29]

Regulatory Framework by the EMA and the European Union

The Role of the EMA and MDR

The European Medicines Agency and the Medical Devices Regulation (MDR) of the European Union establish the regulatory framework for medical devices in Europe. [30] [31] The MDR, which became fully applicable in May 2021 (updated in 2024), updates and strengthens the requirements to ensure the safety and effectiveness of medical devices, including those based on AI. [32]

EU AI Act

The European Union, through the European Parliament, took the leap and became the first in the world to regulate AI with a comprehensive law: the EU AI Act. This regulation is expected to significantly impact the medical device sector. [33] [34] [35] The EU AI Act adopts a risk-based approach, categorising AI systems into different risk levels (unacceptable, high, limited, minimal). Medical devices using AI are typically considered high-risk and are subject to stringent requirements. [36] One important aspect of the EU AI Act is that it sets requirements for transparency, safety, and data governance, which also apply to AI systems used in medical contexts. Such obligations include a data quality framework, full documentation, and an emphasis on human oversight. [37] [38]
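The risk tiers summarised above can be sketched as a simple lookup. The obligations listed here are illustrative examples drawn from the summary, not an authoritative or exhaustive reading of the regulation:

```python
# Simplified illustration of the EU AI Act's risk-based approach.
# Tier descriptions are paraphrased examples for illustration only.

RISK_TIERS = {
    "unacceptable": "prohibited (e.g. social scoring by public authorities)",
    "high": "strict obligations: risk management, data governance, "
            "full documentation, human oversight (typical for medical AI)",
    "limited": "transparency obligations (e.g. disclosing AI interaction)",
    "minimal": "no specific obligations under the Act",
}

def obligations_for(tier: str) -> str:
    """Return the (illustrative) obligations associated with a risk tier."""
    return RISK_TIERS[tier]

print(obligations_for("high"))
```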

Conformity Assessment and CE Marking

Before being placed on the European medical market, medical devices must, according to the MDR, undergo a conformity assessment by a Notified Body to verify compliance. AI-based medical devices are no exception. [39] [31] Notified Bodies are independent organisations designated by EU countries to assess the conformity of certain products before they are placed on the market [39] [40]. Upon successful assessment, devices receive CE marking, indicating conformity with EU safety, health, and environmental protection standards. [41]

Post-Market Surveillance and Vigilance

Manufacturers are then required to establish and maintain a post-market surveillance system to monitor the performance of their devices, reporting any incidents or field safety corrective actions to the relevant authorities. [42]

Guidance from the EMA

The EMA provides guidance on the use of AI in the context of medicines regulation, focusing on aspects such as data quality, algorithm validation, and ethical considerations. [43] [44]

AI Medical Devices approved in European Union

Although there is a European database for approved medical devices (EUDAMED), it still lacks completeness because of its infancy as a centralised database. Previously, approvals and the oversight of Notified Bodies were managed by national authorities (such as INFARMED in Portugal). This central database is expected to be populated in the future. [39] [45] As EUDAMED is not yet complete and does not (for now) offer good access to the documentation of AI medical devices in Europe, only secondary information is publicly available, mainly in the radiology field [46] or in selected review papers. [29]

Comparison

Comparing the two regulatory agencies [29], visible differences emerge, as shown in the following table (source: https://www.thelancet.com/action/showFullTableHTML?isHtml=true&tableId=tbl1&pii=S2589-7500%2820%2930292-2):

Regulatory agency
  • Organisation
      USA: FDA
      Europe*: Accredited private Notified Bodies; manufacturer’s self-responsibility for (low-risk) medical devices; mutual recognition between EU States, EFTA States, and Turkey
  • Centralised or decentralised
      USA: Centralised
      Europe*: Decentralised

Regulatory pathway
  • Specific pathway for AI/ML-based medical devices
      USA: None
      Europe*: None; general requirements are safety, performance, and reliability; clinical studies generally assess high-risk devices; requirements can vary across Notified Bodies
  • Premarket approval
      USA: Most stringent regulatory category for high-risk medical devices (class III); devices must provide valid scientific evidence from non-clinical and clinical studies showing safety and effectiveness
      Europe*: NA
  • 510(k) pathway
      USA: For class I, II, and III medical devices for which premarket approval is not indicated; submitters must compare their device to one or more similar legally marketed devices; it can include non-clinical and clinical performance data
      Europe*: NA
  • De novo premarket review
      USA: For class I or class II medical devices for which general controls alone, or general and special controls, provide reasonable assurance of safety and effectiveness for the intended use
      Europe*: NA

Approval pathway
  • Type of approval
      USA: Approval by FDA
      Europe*: CE mark
  • Public access to approval documents
      USA: Yes
      Europe*: Very limited availability

CE=Conformité Européenne. EFTA=European Free Trade Association. FDA=US Food and Drug Administration. NA=not applicable.

  • *Europe refers to EU member states, EFTA countries, and Turkey.

Challenges and Best Practices in Developing Medical AI

Good Machine Learning Practices

Regulatory agencies agree on the need for transparency and explainability in proposed algorithms. These should be understandable and explainable to ensure trust in the results, which requires clear information about how the AI system reaches its conclusions [47] [48]. It is also essential to conduct clinical studies to validate the device's effectiveness and safety in real-world settings [3] [49], as well as to implement a framework to identify and mitigate potential risks associated with the use of the device. This involves continuous monitoring and updating of the AI system [50] [26].

Ethical and Safety Considerations

From the data privacy standpoint, ensuring the protection of patients' personal data is fundamental, in compliance with regulations like the General Data Protection Regulation (GDPR) in the EU or their equivalents elsewhere [51] [52]. Regarding bias and fairness, developers of AI medical devices must avoid biases (and identify those that persist) in algorithms that could lead to disparities in care. This involves using diverse and representative data sets during development [53] [54], or explicitly stating the specific uses in which bias was not detected. Developers must also clearly define responsibility in case of device failure, including liability for harm caused by decisions influenced by AI medical devices [55] [56], or adhere to a human-in-the-loop policy to lower the risk of unintended harm [57].

Ongoing Regulatory Developments

Regulatory bodies are continuously adapting to the rapid evolution of AI technologies, namely by addressing how to regulate AI systems that learn and evolve over time after deployment [58] [59], and by discussing how to harmonise AI medical device regulations globally, facilitating innovation while ensuring safety [60] [61].

Conclusion

AI applied in healthcare represents a promising frontier for improving medicine and health, potentially offering significant benefits to patients and healthcare professionals alike. However, it is crucial that these technologies are developed and implemented responsibly, following the regulations established by the competent regulatory agencies. Regulatory compliance ensures that AI medical devices are safe, effective, and adhere to the necessary ethical standards required to operate in clinical environments [62] [63].

References

  1. Lacalle, Helena. 2024. ‘AI Medical Device Software under EU MDR & IVDR’. Decomplix. 30 August 2024. https://decomplix.com/ai-medical-device-software-eu-mdr-ivdr/.
  2. Topol, Eric J. 2019. ‘High-Performance Medicine: The Convergence of Human and Artificial Intelligence’. Nature Medicine 25 (1): 44–56. https://doi.org/10.1038/s41591-018-0300-7.
  3. Luo, Jake, Min Wu, Deepika Gopukumar, and Yiqing Zhao. 2016. ‘Big Data Application in Biomedical Research and Health Care: A Literature Review’. Biomedical Informatics Insights 8 (January):1–10. https://doi.org/10.4137/BII.S31559.
  4. Esteva, Andre, Brett Kuprel, Roberto A. Novoa, Justin Ko, Susan M. Swetter, Helen M. Blau, and Sebastian Thrun. 2017. ‘Dermatologist-Level Classification of Skin Cancer with Deep Neural Networks’. Nature 542 (7639): 115–18. https://doi.org/10.1038/nature21056.
  5. ‘Artificial Intelligence (EMA)’. 2023. 18 December 2023. https://www.ema.europa.eu/en/about-us/how-we-work/big-data/artificial-intelligence.
  6. Matias, Igor, Matthias Kliegel, and Katarzyna Wac. 2024. ‘Providemus Alz: Ubiquitous Screening of Preclinical Alzheimer’s Disease with Consumer-Grade Technologies’. In Companion of the 2024 on ACM International Joint Conference on Pervasive and Ubiquitous Computing, 743–51. UbiComp ’24. New York, NY, USA: Association for Computing Machinery. https://doi.org/10.1145/3675094.3678425.
  7. swissinfo.ch, S. W. I. 2023. ‘Ferramenta de IA prevê estado de saúde de pacientes melhor do que a maioria dos médicos’. SWI swissinfo.ch (blog). 7 June 2023. https://www.swissinfo.ch/por/ferramenta-de-ia-prevê-estado-de-saúde-de-pacientes-melhor-do-que-a-maioria-dos-médicos/48575102.
  8. Lee, Su-In, Safiye Celik, Benjamin A. Logsdon, Scott M. Lundberg, Timothy J. Martins, Vivian G. Oehler, Elihu H. Estey, et al. 2018. ‘A Machine Learning Approach to Integrate Big Data for Precision Medicine in Acute Myeloid Leukemia’. Nature Communications 9 (1): 42. https://doi.org/10.1038/s41467-017-02465-5.
  9. Davenport, Thomas, and Ravi Kalakota. 2019. ‘The Potential for Artificial Intelligence in Healthcare’. Future Healthcare Journal 6 (2): 94–98. https://doi.org/10.7861/futurehosp.6-2-94.
  10. Johnson, Alistair E. W., Mohammad M. Ghassemi, Shamim Nemati, Katherine E. Niehaus, David A. Clifton, and Gari D. Clifford. 2016. ‘Machine Learning and Decision Support in Critical Care’. Proceedings of the IEEE. Institute of Electrical and Electronics Engineers 104 (2): 444–66. https://doi.org/10.1109/JPROC.2015.2501978.
  11. Shortliffe, Edward H., and Martin J. Sepúlveda. 2018. ‘Clinical Decision Support in the Era of Artificial Intelligence’. JAMA 320 (21): 2199–2200. https://doi.org/10.1001/jama.2018.17163.
  12. Musen, Mark A., Blackford Middleton, and Robert A. Greenes. 2014. ‘Clinical Decision-Support Systems’. In Biomedical Informatics: Computer Applications in Health Care and Biomedicine, edited by Edward H. Shortliffe and James J. Cimino, 643–74. London: Springer. https://doi.org/10.1007/978-1-4471-4474-8_22.
  13. Regulation (EU) 2017/745 of the European Parliament and of the Council of 5 April 2017 on Medical Devices, Amending Directive 2001/83/EC, Regulation (EC) No 178/2002 and Regulation (EC) No 1223/2009 and Repealing Council Directives 90/385/EEC and 93/42/EEC (Text with EEA Relevance). 2024. http://data.europa.eu/eli/reg/2017/745/2024-07-09/eng.
  14. Health, Center for Devices and Radiological. 2023. ‘How to Determine If Your Product Is a Medical Device’. FDA, August. https://www.fda.gov/medical-devices/classify-your-medical-device/how-determine-if-your-product-medical-device.
  15. Health, Center for Devices and Radiological. 2024. ‘Artificial Intelligence and Machine Learning (AI/ML)-Enabled Medical Devices’. FDA, July. https://www.fda.gov/medical-devices/software-medical-device-samd/artificial-intelligence-and-machine-learning-aiml-enabled-medical-devices.
  16. Commissioner, Office of the. 2024. ‘FDA Releases Artificial Intelligence/Machine Learning Action Plan’. FDA. 8 September 2024. https://www.fda.gov/news-events/press-announcements/fda-releases-artificial-intelligencemachine-learning-action-plan.
  17. Benjamens, Stan, Pranavsingh Dhunnoo, and Bertalan Meskó. 2020. ‘The State of Artificial Intelligence-Based FDA-Approved Medical Devices and Algorithms: An Online Database’. Npj Digital Medicine 3 (1): 1–8. https://doi.org/10.1038/s41746-020-00324-0.
  18. Health, Center for Devices and Radiological. 2023. ‘Good Machine Learning Practice for Medical Device Development: Guiding Principles’. FDA, October. https://www.fda.gov/medical-devices/software-medical-device-samd/good-machine-learning-practice-medical-device-development-guiding-principles.
  19. ‘Machine Learning-Enabled Medical Devices: Key Terms and Definitions | International Medical Device Regulators Forum’. 2022. 9 May 2022. https://www.imdrf.org/documents/machine-learning-enabled-medical-devices-key-terms-and-definitions.
  20. Health, Center for Devices and Radiological. 2023. ‘Your Clinical Decision Support Software: Is It a Medical Device?’ FDA, October. https://www.fda.gov/medical-devices/software-medical-device-samd/your-clinical-decision-support-software-it-medical-device.
  21. Health, Center for Devices and Radiological. 2024. ‘Premarket Notification 510(k)’. FDA. 22 August 2024. https://www.fda.gov/medical-devices/premarket-submissions-selecting-and-preparing-correct-submission/premarket-notification-510k.
  22. Health, Center for Devices and Radiological. 2024. ‘De Novo Classification Request’. FDA, September. https://www.fda.gov/medical-devices/premarket-submissions-selecting-and-preparing-correct-submission/de-novo-classification-request.
  23. Health, Center for Devices and Radiological. 2023. ‘Premarket Approval (PMA)’. FDA. 15 August 2023. https://www.fda.gov/medical-devices/premarket-submissions-selecting-and-preparing-correct-submission/premarket-approval-pma.
  24. Health, Center for Devices and Radiological. 2023. ‘Content of Premarket Submissions for Device Software Functions’. 8 September 2023. https://www.fda.gov/regulatory-information/search-fda-guidance-documents/content-premarket-submissions-device-software-functions.
  25. Health, Center for Devices and Radiological. 2024. ‘522 Postmarket Surveillance Studies Program’. FDA. 8 September 2024. https://www.fda.gov/medical-devices/postmarket-requirements-devices/522-postmarket-surveillance-studies-program.
  26. ‘Proposed Regulatory Framework for Modifications to Artificial Intelligence/Machine Learning (AI/ML)-Based Software as a Medical Device (SaMD)’. 2019. Accessed 4 December 2024. https://www.regulations.gov/docket/FDA-2019-N-1185.
  27. Wu, Eric, Kevin Wu, Roxana Daneshjou, David Ouyang, Daniel E. Ho, and James Zou. 2021. ‘How Medical AI Devices Are Evaluated: Limitations and Recommendations from an Analysis of FDA Approvals’. Nature Medicine 27 (4): 582–84. https://doi.org/10.1038/s41591-021-01312-x.
  28. ‘Medical AI Evaluation Database’. n.d. Accessed 4 December 2024. https://ericwu09.github.io/medical-ai-evaluation/#.
  29. Muehlematter, Urs J., Paola Daniore, and Kerstin N. Vokinger. 2021. ‘Approval of Artificial Intelligence and Machine Learning-Based Medical Devices in the USA and Europe (2015–20): A Comparative Analysis’. The Lancet Digital Health 3 (3): e195–203. https://doi.org/10.1016/S2589-7500(20)30292-2.
  30. ‘Overview - European Commission’. 2024. 21 November 2024. https://health.ec.europa.eu/medical-devices-sector/overview_en.
  31. ‘Regulation - 2017/745 - EN - Medical Device Regulation - EUR-Lex’. n.d. Accessed 24 November 2024. https://eur-lex.europa.eu/eli/reg/2017/745/oj.
  32. Regulation (EU) 2017/745 of the European Parliament and of the Council of 5 April 2017 on Medical Devices. 2024. http://data.europa.eu/eli/reg/2017/745/2024-07-09/eng.
  33. ‘EU AI Act: First Regulation on Artificial Intelligence’. 2023. Topics | European Parliament. 8 June 2023. https://www.europarl.europa.eu/topics/en/article/20230601STO93804/eu-ai-act-first-regulation-on-artificial-intelligence.
  34. ‘Artificial Intelligence Act, Regulation (EU) 2024/1689 - Links’. n.d. Accessed 5 December 2024. https://www.artificial-intelligence-act.com/Artificial_Intelligence_Act_Links.html.
  35. ‘The Impact of the EU’s AI Act on the Medical Device Sector’. n.d. Accessed 19 November 2024. https://www.ibanet.org/impact-european-union-artificial-intelligence-act.
  36. Veale, Michael, and Frederik Zuiderveen Borgesius. 2022. ‘Demystifying the Draft EU Artificial Intelligence Act’, June. https://doi.org/10.48550/arXiv.2107.03721.
  37. ‘Ethics Guidelines for Trustworthy AI | Shaping Europe’s Digital Future’. 2019. 8 April 2019. https://digital-strategy.ec.europa.eu/en/library/ethics-guidelines-trustworthy-ai.
  38. ‘Building Trust in Artificial Intelligence, Machine Learning, and Robotics’. n.d. ResearchGate. Accessed 24 November 2024. https://www.researchgate.net/publication/324006061_Building_Trust_in_Artificial_Intelligence_Machine_Learning_and_Robotics.
  39. ‘Notified Bodies - EUDAMED’. n.d. Accessed 19 November 2024. https://ec.europa.eu/tools/eudamed/#/screen/notified-bodies?submitted=true.
  40. ‘Guidance on Qualification and Classification of Software in Regulation (EU) 2017/745 – MDR and Regulation (EU) 2017/746 – IVDR’. n.d. Accessed 5 December 2024. https://ec.europa.eu/docsroom/documents/37581.
  41. ‘CE Marking - European Commission’. n.d. Accessed 24 November 2024. https://single-market-economy.ec.europa.eu/single-market/ce-marking_en.
  42. Lacalle, Helena. 2024. ‘Post-Market Surveillance - 10 Questions about PMS & MDR’. Decomplix. 13 January 2024. https://decomplix.com/post-market-surveillance-pms/.
  43. ‘Reflection Paper on the Use of Artificial Intelligence in the Lifecycle of Medicines | European Medicines Agency (EMA)’. 2023. 19 July 2023. https://www.ema.europa.eu/en/news/reflection-paper-use-artificial-intelligence-lifecycle-medicines.
  44. ‘Artificial Intelligence | European Medicines Agency (EMA)’. 2023. 18 December 2023. https://www.ema.europa.eu/en/about-us/how-we-work/big-data/artificial-intelligence.
  45. ‘EUDAMED Database - EUDAMED’. n.d. Accessed 5 December 2024. https://ec.europa.eu/tools/eudamed/#/screen/home.
  46. ‘Health AI Register’. n.d. Accessed 5 December 2024. http://radiology.healthairegister.com/.
  47. Mittelstadt, Brent, Chris Russell, and Sandra Wachter. 2019. ‘Explaining Explanations in AI’. In Proceedings of the Conference on Fairness, Accountability, and Transparency, 279–88. FAT* ’19. New York, NY, USA: Association for Computing Machinery. https://doi.org/10.1145/3287560.3287574.
  48. ‘Machine Learning-Enabled Medical Devices: Key Terms and Definitions | International Medical Device Regulators Forum’. 2022. 9 May 2022. https://www.imdrf.org/documents/machine-learning-enabled-medical-devices-key-terms-and-definitions.
  49. Kelly, Christopher J., Alan Karthikesalingam, Mustafa Suleyman, Greg Corrado, and Dominic King. 2019. ‘Key Challenges for Delivering Clinical Impact with Artificial Intelligence’. BMC Medicine 17 (1): 195. https://doi.org/10.1186/s12916-019-1426-2.
  50. ‘ISO 14971:2019(En), Medical Devices — Application of Risk Management to Medical Devices’. n.d. Accessed 24 November 2024. https://www.iso.org/obp/ui/#iso:std:iso:14971:ed-3:v1:en.
  51. ‘General Data Protection Regulation (GDPR) Compliance Guidelines’. n.d. GDPR.Eu. Accessed 24 November 2024. https://gdpr.eu/.
  52. Sun, Tara Qian, and Rony Medaglia. 2019. ‘Mapping the Challenges of Artificial Intelligence in the Public Sector: Evidence from Public Healthcare’. Government Information Quarterly 36 (2): 368–83. https://doi.org/10.1016/j.giq.2018.09.008.
  53. ‘Why Is My Classifier Discriminatory?’ n.d. ResearchGate. Accessed 24 November 2024. https://www.researchgate.net/publication/356086647_Why_is_my_classifier_discriminatory.
  54. Rajkomar, Alvin, Michaela Hardt, Michael D. Howell, Greg Corrado, and Marshall H. Chin. 2018. ‘Ensuring Fairness in Machine Learning to Advance Health Equity’. Annals of Internal Medicine 169 (12): 866–72. https://doi.org/10.7326/M18-1990.
  55. Directorate-General for Justice and Consumers (European Commission). 2019. Liability for Artificial Intelligence and Other Emerging Digital Technologies. Publications Office of the European Union. https://data.europa.eu/doi/10.2838/573689.
  56. Price, W. Nicholson, II, Sara Gerke, and I. Glenn Cohen. 2019. ‘Potential Liability for Physicians Using Artificial Intelligence’. JAMA 322 (18): 1765–66. https://doi.org/10.1001/jama.2019.15064.
  57. Bakken, Suzanne. 2023. ‘AI in Health: Keeping the Human in the Loop’. Journal of the American Medical Informatics Association 30 (7): 1225–26. https://doi.org/10.1093/jamia/ocad091.
  58. Gerke, Sara, Timo Minssen, and Glenn Cohen. 2020. ‘Ethical and Legal Challenges of Artificial Intelligence-Driven Healthcare’. Artificial Intelligence in Healthcare, 295–336. https://doi.org/10.1016/B978-0-12-818438-7.00012-5.
  59. Reddy, Sandeep, Sonia Allan, Simon Coghlan, and Paul Cooper. 2020. ‘A Governance Model for the Application of AI in Health Care’. Journal of the American Medical Informatics Association: JAMIA 27 (3): 491–97. https://doi.org/10.1093/jamia/ocz192.
  60. ‘WHO Guidance on Artificial Intelligence to Improve Healthcare, Mitigate Risks Worldwide | UN News’. 2021. 28 June 2021. https://news.un.org/en/story/2021/06/1094902.
  61. ‘IMDRF Strategic Plan 2021-2025 | International Medical Device Regulators Forum’. 2020. 25 September 2020. https://www.imdrf.org/documents/imdrf-strategic-plan-2021-2025.
  62. ‘Coordinated Plan on Artificial Intelligence 2021 Review | Shaping Europe’s Digital Future’. 2021. 21 April 2021. https://digital-strategy.ec.europa.eu/en/library/coordinated-plan-artificial-intelligence-2021-review.
  63. Officer (OCIO), Office of the Chief Information. 2021. ‘HHS Artificial Intelligence (AI) Strategy’. Page. 22 December 2021. https://www.hhs.gov/programs/topic-sites/ai/strategy/index.html.