Air Canada Held Liable for Chatbot's Misleading Advice

TL;DR Summary
Air Canada was ordered to give a partial refund to a passenger who was misled by its chatbot about the airline's bereavement travel policy. The passenger, Jake Moffatt, followed the chatbot's advice to book a flight at the regular fare and apply for the bereavement discount within 90 days, only to have the request rejected. Despite Air Canada's argument that the chatbot was a separate legal entity responsible for its own actions, a British Columbia tribunal ruled in favor of Moffatt, awarding him a partial refund plus additional damages. Air Canada has since disabled its chatbot and says it will comply with the ruling.
- Air Canada Has to Honor a Refund Policy Its Chatbot Made Up (WIRED)
- Air Canada found liable for chatbot's bad advice on plane tickets (CBC.ca)
- Air Canada responsible for errors by website chatbot after B.C. customer denied retroactive discount (Vancouver Sun)
- Air Canada is responsible for chatbot's mistake: B.C. tribunal (CTV News Vancouver)
- Air Canada ordered to pay customer who was misled by airline's chatbot (The Guardian)