
The Case Against Medical Chatbot Patentability

Author: Liu Wenkai

Over the past few years, patients have increasingly relied on chatbots for self-diagnosis and treatment. In most circumstances, mainstream chatbots have demonstrated surprisingly good accuracy and consistency, even though they were not specifically trained on medical datasets.[1] As tech companies continue to develop more powerful chatbots catered specifically toward improving diagnosis and treatment, such as Google’s Med-PaLM 2, AI will only become more influential in the field of medicine. This being the case, the question of medical chatbot patentability becomes increasingly prominent, particularly because there is an intrinsic tension between the profit-driven character of tech companies and the public nature of medicine and health. The first step toward resolving this question -- defining medical chatbots in patent law -- is not as straightforward as one may think.


Generally speaking, while there has been some debate over the years about AI legal personhood,[2, 3] most would likely agree by now that we are not yet ready, either legally or ethically, to treat AI systems as persons at their current stage of development. This perspective is well reflected in patent law -- while AI systems may themselves be patented upon satisfying certain criteria,[4] both US and UK courts have ruled against AI inventorship.[5, 6] This provides a fairly settled general understanding of what AI is in patent law: the invented, rather than the inventor. Under the current legal regime, no real differentiation is made between medical chatbots and other AI systems. In principle, therefore, the developers of medical chatbots should be able to obtain patents for their work in most major jurisdictions, as long as the normal requirements of utility, novelty and non-obviousness are satisfied.


In delineating medical chatbots within the scope of patent law, however, I believe we ought to diverge from the conventional interpretation. Like the majority, I do not perceive medical chatbots as robotic physicians, irrespective of their ability to ace medical exams or interpret a chest X-ray. But I also find it difficult to categorise them as patentable inventions, as is commonly asserted. The reason is that the TRIPS Agreement -- the global agreement under the World Trade Organization that sets minimum standards for intellectual property protection -- contains a specific provision stating that members may exclude from patentability “diagnostic, therapeutic and surgical methods for the treatment of humans or animals.”[7] While optional, most major jurisdictions, such as the UK, the EU, China and Japan, have chosen to adopt this provision, thereby making diagnosis and treatment methods unpatentable. The US remains a notable exception.


As radical as it may sound, I think a medical chatbot is in and of itself a diagnosis and treatment method. Medical chatbots essentially employ the deep-learning and data-processing abilities of AI to provide new ways to diagnose conditions and search for treatments. While there is no denying that using AI for these tasks is highly innovative, from a technological-neutrality standpoint the addition of a new technology does not change the fact that medical chatbots derive their foundation from traditional clinical wisdom, having been trained on existing medical data and understanding. Admittedly, a medical chatbot as a diagnosis and treatment method differs from conventional medical methods in that it is not directly administered to the patient. But ultimately, methods are just different means to an end. Like conventional medical methods, a medical chatbot assists physicians and patients in working towards the end of accurate diagnosis and effective treatment.


There are compelling legal and ethical reasons why most major jurisdictions choose to exclude diagnosis and treatment methods from patentability. From a legal standpoint, the exclusion guarantees that doctors are free to use the latest medical techniques without fear of patent infringement or licensing expenses.[8] This aligns with the ethical argument that medicine, as an organized profession founded upon the higher code of practice and honor embodied in the Hippocratic Oath, should not be treated as a commercial industry. Consequently, the underlying objective of patent law -- to incentivize innovation by rewarding inventors -- not only lacks relevance in the field of medicine but also contradicts the overarching goal of public health: to provide the best possible treatment.[9, 10]


These arguments extend naturally to excluding medical chatbots from patentability, considering that healthcare workers and patients are the primary users of such technology. Imagine a scenario in which a highly advanced and well-trained medical chatbot consistently provides more accurate diagnoses and suggests better treatment approaches. In such a case, it would raise serious ethical concerns if certain doctors or patients were denied access to this chatbot because the developer refused to grant a license or because they could not afford it.


And before even addressing the issue of access, it is important to question the ethics of tech companies profiting from a reconstructed use of conventional diagnosis and treatment methods that were originally developed altruistically by healthcare workers and scientists. This raises concerns about the appropriation of medical knowledge for commercial gain, which could hinder the widespread availability and affordability of these innovative tools.


To be clear, it is unlikely that any court would cite the diagnosis and treatment method provision to deny or invalidate a patent for a medical chatbot; such an interpretation would be too radical for any fair-minded judge to make. It is essential, however, that lawmakers around the world consider the reasoning behind this provision when drafting future AI laws. The potential of medical chatbots to address the global disparity in medical resources cannot be fully realized if the most capable chatbots are monopolized by tech companies. By excluding medical chatbots from patentability, we can help ensure that these technologies remain accessible and affordable for all healthcare professionals and patients.

  1. Reardon S. AI chatbots can diagnose medical conditions at home. How good are they? [Internet]. Scientific American. 2019 [cited 2024 Mar 30]; Available from: https://www.scientificamerican.com/article/ai-chatbots-can-diagnose-medical-conditions-at-home-how-good-are-they/

  2. Jowitt J. Assessing contemporary legislative proposals for their compatibility with a natural law case for AI legal personhood. AI & Society. 2021;36:499-508.

  3. Marshall B. No legal personhood for AI. Patterns. 2023;4(11).

  4. Lud S. Patentability of AI chatbots in Germany [Internet]. Reddie & Grose. 2023 [cited 2024 Mar 30]; Available from: https://www.reddie.co.uk/2023/06/13/patentability-of-ai-chatbots-in-germany/.

  5. Thaler v. Vidal, 43 F.4th 1207 (Fed. Cir. 2022); Available from: https://cafc.uscourts.gov/opinions-orders/21-2347.OPINION.8-5-2022_1988142.pdf.

  6. Thaler v Comptroller-General of Patents, Designs and Trade Marks [2023] UKSC 49; Available from: https://www.supremecourt.uk/cases/uksc-2021-0201.html.

  7. Agreement on Trade-Related Aspects of Intellectual Property Rights, Annex 1C of the Marrakesh Agreement Establishing the World Trade Organization. 1994 [cited 2024 Mar 30]. Article 27(3)(a); Available from: https://www.wto.org/english/docs_e/legal_e/27-trips.pdf.

  8. Basheer S, Purohit S, Reddy P. Patent exclusions that promote public health objectives. Annex 4 in: Experts' study on exclusions from patentable subject matter and exceptions and limitations to the rights. SCP/15/3. Geneva: World Intellectual Property Organization. 2010 Sep 2.

  9. Mitnovetski O, Nicol D. Are patents for methods of medical treatment contrary to ordre public and morality or "generally inconvenient"? Journal of Medical Ethics. 2004;30(5):470-475.

  10. Wadlow C. Regulatory data protection under TRIPs Article 39(3) and Article 10bis of the Paris Convention: Is there a doctor in the house? Intellectual Property Quarterly. 2008;12(4):355-415.
