Article
Version 1
Preserved in Portico
This version is not peer-reviewed
Adversarial Artificial Intelligence in Insurance: From an Example to Some Potential Remedies
Received: 7 October 2022 / Approved: 11 October 2022 / Online: 11 October 2022 (15:44:36 CEST)
A peer-reviewed article of this Preprint also exists.
Amerirad, B.; Cattaneo, M.; Kenett, R.S.; Luciano, E. Adversarial Artificial Intelligence in Insurance: From an Example to Some Potential Remedies. Risks 2023, 11, 20.
Abstract
Artificial intelligence (AI) is a tool that financial intermediaries and insurance companies already use, or are willing to use, in almost all their activities. AI can have a positive impact on nearly every stage of the insurance value chain: pricing, underwriting, marketing, claims management, and after-sales services. Useful as it is, AI is not free of risks, including limited robustness against cyber-attacks and so-called adversarial attacks, which are conducted by external entities to misguide and defraud AI algorithms. This paper provides a review of adversarial AI and discusses its implications for the insurance sector. The study starts with a taxonomy of adversarial attacks and presents a fully fledged example of claims falsification in health insurance. Some remedies, consistent with the current regulatory framework, are presented.
Keywords
AI; insurance; adversarial attacks
Subject
Business, Economics and Management, Finance
Copyright: This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.