News

AI: Air Canada liable for chatbot errors

Many people, companies and private individuals alike, can no longer imagine everyday life without the support of artificial intelligence (AI). A case from Canada, however, makes it clear that the use of chatbots also carries risks: an airline there has been held liable after its chatbot falsely promised a refund.

AI error is attributed to Air Canada

To err is human, but what about artificial intelligence? Of course, it makes mistakes too. If you use chatbots such as ChatGPT privately, the consequences of such errors are usually manageable. But what about companies? The airline Air Canada has now learned the hard way that using chatbots carries a real business risk: a court has ordered the company to pay 812 Canadian dollars (judgment).


Although this sum hardly threatens Air Canada's existence, the case shows that the use of AI will also keep us busy legally in the near future. The background is incorrect information provided by a chatbot used by Air Canada. The AI promised a customer the airline's discounted bereavement fare, which it offers to customers who have recently suffered a loss. However, these bereavement fares must be requested when booking the last-minute ticket itself, which was not what happened in the dispute.

Customer followed instructions from chatbot

Without the AI's advice, the customer might never have made the mistake: he simply followed the chatbot's instructions. Interested in the bereavement fare, he asked the bot how to proceed. Contrary to Air Canada's own guidelines, the AI recommended booking a regular ticket first and then contacting the airline afterwards to claim the discount. Because the chatbot did not act in line with Air Canada's policies, the airline did not consider itself responsible for the error.

Instead, the company argued that the chatbot was not acting as part of Air Canada but as an "independent legal entity" that must therefore be liable for its own statements. This argument evidently did not convince the court. Ultimately, the court held that a company can be held liable for all content published on its own website; how that content is generated is irrelevant. The case clearly shows that companies should not blindly rely on the reliability of AI.
