
Regulation and standardization of AI in healthcare

With the advancement of information technology, AI is making its presence felt in many fields nowadays, and the medical field, in particular the medical device field, is adopting this technology as well.
As you may be aware, AI mimics capabilities of the human mind such as reasoning, learning and problem solving: in simple terms, recognising objects or translating language by meaning rather than word for word, based on the data it has been fed.
Now imagine this IT-based technology being used for medical treatment; its reliability has to be correspondingly high.
Since a lot of progress has been made with AI technology and its use in medical devices has begun, regulators are considering multiple approaches for addressing the safety and effectiveness of AI in healthcare, including how international standards and other best practices are currently used to support the regulation of medical software. This is not an easy task, because clinical evidence for AI has to be gathered throughout its life cycle; adaptive systems call for additional clinical evidence, unlike conventional software qualified as a medical device.
AI thus introduces new risks that are not covered by the present portfolio of standards and guidance for software. The normal approach taken for software will not address those risks, so different approaches will be required: the safety of an AI system and its performance must be thoroughly validated before it is placed on the market. As these new approaches are being defined, the present regulatory landscape for software should be taken as the starting point.
The Medical Device Regulation (MDR) and the In Vitro Diagnostic Regulation (IVDR) include several generic requirements that apply to software. They cover the following:
1. General obligations of manufacturers, such as risk management, clinical performance evaluation, quality management, technical documentation, unique device identification, post-market surveillance and corrective actions;
2. Requirements regarding design and manufacture, including construction of devices, interaction with the environment, diagnostic and measuring functions, active and connected devices; and
3. Information supplied with the device, such as labelling and instructions for use.
All of the above need to be adapted suitably for AI.
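To make this a little more concrete, here is a minimal, purely illustrative sketch of one activity that AI-oriented post-market surveillance might include: monitoring a deployed model for data drift. The variable names, the thresholds and the choice of the Population Stability Index are assumptions for illustration, not requirements taken from the MDR, the IVDR or any guidance.

```python
# Illustrative sketch only: monitoring a deployed model's score distribution
# for drift with the Population Stability Index (PSI). All names and data
# below (reference_scores, live_scores, the PSI threshold) are hypothetical.
import numpy as np

def psi(reference: np.ndarray, live: np.ndarray, bins: int = 10) -> float:
    """Population Stability Index between a reference and a live sample."""
    edges = np.histogram_bin_edges(reference, bins=bins)
    ref_frac = np.histogram(reference, bins=edges)[0] / len(reference)
    live_frac = np.histogram(live, bins=edges)[0] / len(live)
    # Floor the fractions to avoid division by zero and log(0).
    ref_frac = np.clip(ref_frac, 1e-6, None)
    live_frac = np.clip(live_frac, 1e-6, None)
    return float(np.sum((live_frac - ref_frac) * np.log(live_frac / ref_frac)))

# Compare model scores collected during pre-market validation with scores
# observed after deployment; a PSI above roughly 0.2 is a common informal
# trigger for investigation and possible corrective action.
rng = np.random.default_rng(0)
reference_scores = rng.beta(2, 5, size=5000)   # scores from validation
live_scores = rng.beta(2, 4, size=5000)        # scores seen post-market
print(f"PSI = {psi(reference_scores, live_scores):.3f}")
```

In practice a manufacturer would define such monitoring, its thresholds and the resulting corrective actions in its post-market surveillance plan rather than choosing them ad hoc as above.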
In the U.S., the FDA recently published a discussion paper proposing a regulatory framework for modifications to AI/machine learning based Software as a Medical Device (SaMD). It draws on practices from current FDA premarket programs, including the 510(k), De Novo and Premarket Approval (PMA) pathways. It uses the risk categorisation principles of the IMDRF, together with the FDA benefit-risk framework, the risk management principles in the software modifications guidance, and the Total Product Life Cycle (TPLC) approach from the FDA Digital Health Pre-Cert program.
Similarly, AAMI has published a white paper based on a conference held in 2020. AAMI and BSI, in consultation with stakeholders, put forward the following recommendations:
1. Develop standard medical terminology and taxonomy through the IMDRF and other regulatory bodies.
2. IMDRF to create a working group for AI.
3. Map other international regulatory standards (where they exist) and identify gaps for improvement.
4. Develop guidance on factors affecting data quality (an illustrative sketch of such checks follows this list).
5. Establish a common set of criteria for the deployment of AI in healthcare systems.
6. Develop risk management guidance to assist in applying ISO 14971 to AI as a medical technology.
7. Develop guidance on the factors considered in the validation of AI systems and on the use of non-traditional approaches to demonstrate reasonable assurance of product quality (see the validation sketch after this list).
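As a rough illustration of recommendation 4, the sketch below shows a few data-quality factors such guidance might ask a manufacturer to examine in a training dataset: completeness, duplicate records and class balance. The DataFrame columns, the "label" name and the checks themselves are hypothetical examples, not items taken from any published guidance.

```python
# Minimal sketch, assuming a tabular training set in a pandas DataFrame.
# Column names and thresholds are hypothetical, chosen for illustration.
import pandas as pd

def data_quality_report(df: pd.DataFrame, label_col: str = "label") -> dict:
    """Summarise a few basic data-quality indicators for a training set."""
    return {
        "n_rows": len(df),
        "missing_fraction_per_column": df.isna().mean().to_dict(),
        "duplicate_rows": int(df.duplicated().sum()),
        "class_balance": df[label_col].value_counts(normalize=True).to_dict(),
    }

# Example usage with a tiny toy dataset.
df = pd.DataFrame({
    "age": [64, 71, None, 58, 64],
    "biomarker": [1.2, 0.8, 1.1, None, 1.2],
    "label": [1, 0, 0, 1, 1],
})
print(data_quality_report(df))
```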
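And for recommendation 7, here is a minimal sketch, under the assumption of a binary classifier evaluated on a locked held-out test set, of the kind of summary a validation report might contain: sensitivity and specificity with Wilson score confidence intervals. The confusion-matrix counts are placeholders for illustration only.

```python
# Illustrative sketch: point estimates and 95% Wilson score intervals for
# sensitivity and specificity. The counts (tp, fn, tn, fp) are hypothetical.
from math import sqrt

def wilson_interval(successes: int, total: int, z: float = 1.96) -> tuple[float, float]:
    """Two-sided 95% Wilson score interval for a binomial proportion."""
    if total == 0:
        return (0.0, 0.0)
    p = successes / total
    denom = 1 + z**2 / total
    centre = (p + z**2 / (2 * total)) / denom
    half = z * sqrt(p * (1 - p) / total + z**2 / (4 * total**2)) / denom
    return (centre - half, centre + half)

# Hypothetical confusion-matrix counts from a held-out test set.
tp, fn, tn, fp = 180, 20, 460, 40

sens_lo, sens_hi = wilson_interval(tp, tp + fn)
spec_lo, spec_hi = wilson_interval(tn, tn + fp)
print(f"Sensitivity {tp/(tp+fn):.2%} (95% CI {sens_lo:.2%}-{sens_hi:.2%})")
print(f"Specificity {tn/(tn+fp):.2%} (95% CI {spec_lo:.2%}-{spec_hi:.2%})")
```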
