
The Food and Drug Administration has issued new guidelines on how it will regulate mobile health software and products that use artificial intelligence to help doctors decide how to treat patients.
The guidelines, contained in a pair of documents released Thursday morning, clarify the agency’s intent to focus its oversight powers on AI decision-support products that are meant to guide treatment of serious or critical conditions, but whose rationale cannot be independently evaluated by doctors.
To further define the types of products that will require greater scrutiny, the FDA gave the example of a clinical decision support (CDS) tool that, without explaining its rationale, identifies hospitalized patients with type 1 diabetes who are at high risk of severe heart problems following surgery. If such a product were to give an inappropriate recommendation, the agency said, it could result in serious harm to the patient.