Canadian Court Rules Air Canada Liable for A.I. Chatbot Error
by Daniel McCarthy
Photo: Shutterstock.com
Air Canada is on the hook for incorrect information that its artificial intelligence (A.I.) chatbot gave a traveler, one of the first of what could be a slew of A.I.-related disputes as the technology spreads to every corner of travel.
The issue boiled down to how a traveler, Jake Moffatt, could use a bereavement fare, and whether Air Canada could be held liable for incorrect information delivered by the chatbot on its website. Here is what happened, according to the case.
Moffatt, in November 2022, needed to book a flight from Vancouver to Toronto following the death of his grandmother. While booking, he asked the A.I. support chatbot on the Air Canada website about getting access to a bereavement fare, which not all airlines offer.
According to the case, the chatbot told Moffatt that Air Canada does offer bereavement fares, which is true, and that he could apply for one retroactively after purchasing a full-fare ticket.
The issue was that Air Canada does not actually allow bereavement fares to be applied retroactively, something it states clearly on its website, whatever the chatbot said. Moffatt booked the flight anticipating that he could go back to Air Canada after the fact and get a discount on the $845 he paid for the one-way ticket.
When he applied for the bereavement fare about a week later, submitting a screenshot of the chatbot's response along with proof of death, Air Canada told him the chatbot had been incorrect and that he would have to pay the full fare.
In the case, the carrier argued that it should not be held liable for the information its chatbot gave, despite the chatbot's placement on the airline's own website, because the chatbot was a "separate legal entity that is responsible for its own actions."
Members of the civil tribunal ruled that Air Canada, despite its assertions, was in fact liable to pay Moffatt the difference between the bereavement fare and what he had paid. In the decision, tribunal member Christopher Rivers wrote, "I find Air Canada did not take reasonable care to ensure its chatbot was accurate."
“While Air Canada argues Mr. Moffatt could find the correct information on another part of its website, it does not explain why the webpage titled ‘Bereavement travel’ was inherently more trustworthy than its chatbot. It also does not explain why customers should have to double-check information found in one part of its website on another part of its website,” he said.

