
EU says health apps and app stores must follow medical device laws

The EU says makers and distributors of health-related app stores or applications will now be considered medical device makers and must adhere to associated regulations.
By Jessica Hagen, Executive Editor
Photo: Tassii/Getty Images

The European Union has released two significant guideline updates that expand its medical device framework, revising the classification rules for medical device software to cover app platform providers with a direct medical purpose and artificial intelligence used in medical devices.

The Medical Device Coordination Group's guidelines state that app stores, algorithm developers and software developers of apps intended for medical use are now considered "economic operators" and must comply with the regulations that apply to medical devices.

The EU says that any software providing therapy or diagnosis is generally considered class IIa, but it may move to class IIb or III depending on the risks associated with the software or whether it is "intended to prevent illness."

The guidelines also set higher expectations for offerings claiming interoperability with electronic health records: the EU states they will need to comply not only with the Medical Device Regulation and the In Vitro Diagnostic Regulation, but also with the forthcoming European Health Data Space Regulation.

"It's a good thing that the regulatory framework is catching up with the reality of modern digital health. For years, we have seen a proliferation of health apps and wearables, many with questionable medical claims," Dr. Guido Giunti, chief data officer at St. James Hospital in Ireland, told MobiHealthNews. 

"Until now, actors like Apple and Google have been acting as if they were mere vending machines for these digital health solutions. We need to account for the fact that in some aspects, they are an actual part of the healthcare supply chain."

Still, Dr. Giunti says some smaller startups or academic projects may struggle to meet the new regulatory requirements, which could dampen innovation in clinical settings.

"Having the bar set at a higher place will have a chilling effect," Dr. Giunti said. "From a patient safety perspective, this is a win. We need more transparency and accountability, especially now with the wave of AI solutions."

THE LARGER TREND

Last year, the EU AI Act went into effect, setting new guardrails for the development, market placement, implementation and use of artificial intelligence in the European Union.

The law introduced a voluntary Code of Practice that promotes transparency, data documentation and risk management. The law requires developers to disclose how their AI systems are trained and to ensure safety and security throughout the design process. 

The Council wrote that the Act is intended to "promote the uptake of human-centric and trustworthy artificial intelligence while ensuring a high level of protection of health, safety, [and] fundamental rights … including democracy, the rule of law and environmental protection, to protect against the harmful effects of AI systems in the Union, and to support innovation."