This very morning, the École normale supérieure in Paris is hosting a landmark colloquium: "What medicine in the AI era?" organised with the CCNE and AP-HP.[1] Yesterday evening, Euronews published a study showing that AI models rival physicians at complex medical reasoning — with the caveat that "humans must remain the ultimate reference".[2] Two signals, one question: how can physicians adopt medical AI intelligently and compliantly?
According to the 2026 AI in business panorama (Bpifrance), 2026 marks the shift from pilot projects to large-scale industrialisation.[3] For physicians, this means that choosing an AI tool is no longer experimental — it is a structural choice that commits the practice to legal compliance.
Microsoft (Copilot), OpenAI (ChatGPT), Google (Gemini) and Amazon (AWS) are US companies subject to the CLOUD Act of 2018. This law allows US federal authorities to demand access to data stored by US companies — even when physically hosted in Switzerland or Europe.
Concretely: if you use Microsoft Azure Healthcare (hosted in Zurich) to document consultations, US authorities could theoretically demand access to that data without notifying you, and without Swiss courts having any say. Only a provider whose parent company is established outside the United States escapes this jurisdiction — such as Infomaniak, a Swiss company under Swiss law based in Geneva.
The EU AI Act classifies medical AI systems as "high risk" (Annex III). From 2 August 2026, providers of such systems must, among other obligations, establish a risk management system (Art. 9), ensure quality data governance (Art. 10), maintain technical documentation (Art. 11), guarantee effective human oversight (Art. 14) and meet accuracy, robustness and cybersecurity requirements (Art. 15).[4]
See also our articles on medical secrecy and AI, on the nFADP and medical AI, and on liability for AI-related medical errors.
Why not simply use ChatGPT for medical documentation?
ChatGPT can generate acceptable medical text for general cases. But it has three major problems for clinical use in Switzerland: (1) data transits through OpenAI servers subject to the US CLOUD Act, potentially violating the nFADP; (2) there is no data processing agreement (Art. 9 nFADP); (3) OpenAI may use data to train its models. For occasional documentary use on anonymised cases, the risk is limited — but for regular clinical documentation with identifiable health data, it is non-compliant.
Does the CLOUD Act apply to tools hosted in Europe?
Yes, if the parent company is American. The CLOUD Act (2018) allows US federal authorities to demand access to data stored by US companies, even when physically hosted in Europe or Switzerland. Microsoft Azure Switzerland, Amazon AWS Zurich and Google Cloud Zurich all remain subject to the CLOUD Act because their parent companies are American. Only providers whose parent company is established outside the US — such as Infomaniak in Switzerland — escape this jurisdiction.
Can medical AI replace physicians?
No, and that is not its goal. The colloquium "What medicine in the AI era?" (6 May 2026, ENS Paris, CCNE) reaffirmed this principle: "humans must remain the ultimate reference for performance and safety assessment." AI frees up cognitive and administrative time so the physician can focus on what only they can do: clinical decision-making in the context of the patient.
How much does professional medical AI cost?
Solutions range from CHF 30 to 300/month depending on features, user volume and support level. Consumer solutions (ChatGPT Plus, Copilot Pro) cost CHF 20–30/month but are not nFADP- or GDPR-compliant. Professional medical solutions typically start at CHF 50–100/month for a solo practitioner. The real metric to compare is time saved on documentation: if a solution saves 3 hours/week, it pays for itself quickly even at CHF 100/month.
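To make that payback claim concrete, here is a minimal back-of-the-envelope sketch. The hourly rate of CHF 200 and the weeks-per-month factor are illustrative assumptions, not figures from the article; only the "3 hours/week saved, CHF 100/month" scenario comes from the text above.

```python
# Back-of-the-envelope payback check for a documentation AI subscription.
# ASSUMPTION (not from the article): a physician-hour is valued at CHF 200.

HOURLY_RATE_CHF = 200.0    # assumed value of one physician-hour
WEEKS_PER_MONTH = 52 / 12  # ~4.33 weeks in an average month


def monthly_value_of_time_saved(hours_saved_per_week: float) -> float:
    """Monetary value (CHF) of documentation time saved per month."""
    return hours_saved_per_week * WEEKS_PER_MONTH * HOURLY_RATE_CHF


def payback_ratio(hours_saved_per_week: float, subscription_chf: float) -> float:
    """How many times over the subscription pays for itself each month."""
    return monthly_value_of_time_saved(hours_saved_per_week) / subscription_chf


# The article's scenario: 3 hours/week saved vs. a CHF 100/month tool.
value = monthly_value_of_time_saved(3)  # ≈ CHF 2600/month of freed time
ratio = payback_ratio(3, 100)           # ≈ 26x the subscription cost
```

Even halving the assumed hourly rate still leaves the subscription paying for itself many times over, which is the article's point: compare tools on time saved, not sticker price.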
Swiss Infomaniak hosting · Data processing agreement included · 4 languages · 6 document templates · Data never used for training.
Try free →