- cross-posted to:
- [email protected]
- [email protected]
- [email protected]
Air Canada must pay damages after chatbot lies to grieving passenger about discount::Airline tried arguing virtual assistant was solely responsible for its own actions
This is the best summary I could come up with:
Air Canada must pay a passenger hundreds of dollars in damages after its online chatbot gave the guy wrong information before he booked a flight.
Jake Moffatt took the airline to a small-claims tribunal after the biz refused to refund him for flights he booked from Vancouver to Toronto following the death of his grandmother in November last year.
Before he bought the tickets, he researched Air Canada’s bereavement fares – special low rates for those traveling due to the loss of an immediate family member – by querying its website chatbot, which told him he could book at the regular price and claim the bereavement discount retroactively.
Unhappy with this situation – a support bot feeding him the wrong info – Moffatt took the airline to a tribunal, claiming the corporation was negligent and had misrepresented information, leaving him out of pocket.
Air Canada, however, argued it shouldn't be held liable for the chatbot's faulty outputs, without explaining why, a position that baffled tribunal member Christopher Rivers.
Air Canada said its chatbot provided a link to a page on its website explaining that refunds for discounted fares cannot be claimed retroactively, and Moffatt should have clicked on it.
The original article contains 626 words, the summary contains 180 words. Saved 71%. I’m a bot and I’m open source!