A Legal Analysis of the Air Canada A.I. Chatbot Decision
by Paul Ruden

As all TMR readers are surely aware, the arrival of ChatGPT and other Artificial Intelligence-based large language models with cute names has taken the supplier side of the travel industry by storm. Travel is not alone in its ambition to use AI to cut costs, but, given the industry’s direct and unavoidable connection with tens of millions of travelers going far away from home, the use of AI technology raises some special issues.
To be clear, this article is not intended as a general treatment of artificial intelligence or its diverse uses throughout the travel supply chain. Our purpose here is to learn some lessons from a court decision in Canada, rendered, coincidentally I’m sure, on Valentine’s Day.
The case is Moffatt v. Air Canada, 2024 BCCRT 149 (CanLII), issued by British Columbia’s Civil Resolution Tribunal (“CRT”), the equivalent of an American small claims court. Our small claims courts do not typically issue written opinions, however, and this case is very interesting for that and many other reasons.
I am mainly going to use the court’s own words, indented, to explain what happened. The bolding of text is mine. My comments are in brackets and are intended to draw out the lessons this case holds for U.S.-based travel advisors and their clients.
In November 2022, following the death of their grandmother, Jake Moffatt booked a flight with Air Canada. While researching flights, Mr. Moffatt used a chatbot on Air Canada’s website. The chatbot suggested Mr. Moffatt could apply for bereavement fares retroactively.
[That advice was wrong. Air Canada’s policy did not allow retroactive access to bereavement fares. Another part of Air Canada’s website made this clear, but Mr. Moffatt did not check it.]
Moffatt says Air Canada must provide them with a partial refund of the ticket price, as they relied upon the chatbot’s advice. They claim $880 for what they say is the difference in price between the regular and alleged bereavement fares.
Air Canada says Moffatt did not follow the proper procedure to request bereavement fares and cannot claim them retroactively. Air Canada says it cannot be held liable for the information provided by the chatbot. Finally, it relies on certain contractual terms from its Domestic Tariff. Air Canada asks me to dismiss Mr. Moffatt’s claim….
I find that I am properly able to assess the documentary evidence and submissions before me. Further, bearing in mind the CRT’s mandate that includes proportionality and a speedy resolution of disputes, I find that an oral hearing is not necessary in the interests of justice.
… the CRT may accept as evidence information that it considers relevant, necessary, and appropriate, whether or not the information would be admissible in a court of law.
[That procedural decision is similar to what happens in U.S. small claims courts]
Did Air Canada negligently misrepresent the procedure for claiming bereavement fares, and if so, what is the remedy?
In a civil proceeding like this one, Mr. Moffatt, as applicant, must prove their claims on a balance of probabilities. This means “more likely than not”.
[This standard of proof is the same as the U.S. “preponderance of the evidence” standard for civil cases.]
It is undisputed that Air Canada provides certain accommodations, such as reduced fares, for passengers traveling due to the death of an immediate family member.
Mr. Moffatt says while using Air Canada’s website, they interacted with a support chatbot. While Air Canada did not provide any information about the nature of its chatbot, generally speaking, a chatbot is an automated system that provides information to a person using a website in response to that person’s prompts and input. The parties implicitly agree that Mr. Moffatt was not chatting with an Air Canada employee.
Mr. Moffatt says they asked the Air Canada chatbot about bereavement fares. They include a screenshot of the chatbot’s response, which says, in part, as follows: Air Canada offers reduced bereavement fares if you need to travel because of an imminent death or a death in your immediate family.
…
If you need to travel immediately or have already traveled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form.
It is undisputed the words “bereavement fares” were a highlighted and underlined hyperlink to a separate Air Canada webpage titled “Bereavement travel” with additional information about Air Canada’s bereavement policy. Air Canada provided a screenshot of part of what I infer is the hyperlinked Air Canada webpage.
The webpage says, in part, the bereavement policy does not apply to requests for bereavement consideration after travel has been completed.
Relying on the information provided by the chatbot, on November 11, Mr. Moffatt booked a one-way flight from Vancouver to Toronto, departing on November 12, for $794.98. On November 16, relying on the same information, they booked a one-way flight from Toronto to Vancouver, departing on November 18, for $845.38.
Mr. Moffatt says on November 11, they spoke to an Air Canada representative by telephone about bereavement rates to determine what the discount may be. Mr. Moffatt says they were told the fare for each flight would be approximately $380. There is no evidence the Air Canada representative told Mr. Moffatt about whether or not they could retroactively apply for bereavement rates.
[Mr. Moffatt requested the bereavement fare within the 90-day window specified by the chatbot. Additional correspondence with Air Canada continued for several months, including Mr. Moffatt’s submission of the screenshot taken earlier.]
On February 8, an Air Canada representative responded and admitted the chatbot had provided “misleading words.” The representative pointed out the chatbot’s link to the bereavement travel webpage and said Air Canada had noted the issue so it could update the chatbot.
[Applying a common sense understanding of the dispute, the court found that Mr. Moffatt was alleging “negligent misrepresentation.”]
Negligent misrepresentation can arise when a seller does not exercise reasonable care to ensure its representations are accurate and not misleading.
To prove the tort of negligent misrepresentation, Mr. Moffatt must show that Air Canada owed them a duty of care, its representation was untrue, inaccurate, or misleading, Air Canada made the representation negligently, Mr. Moffatt reasonably relied on it, and Mr. Moffatt’s reliance resulted in damages.
[The court’s analysis of the law is consistent with U.S. law that would govern a similar dispute here.]
Here, given their commercial relationship as a service provider and consumer, I find Air Canada owed Mr. Moffatt a duty of care. Generally, the applicable standard of care requires a company to take reasonable care to ensure their representations are accurate and not misleading.
Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives – including a chatbot. It does not explain why it believes that is the case. In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission. While a chatbot has an interactive component, it is still just a part of Air Canada’s website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.
…. There is no reason why Mr. Moffatt should know that one section of Air Canada’s webpage is accurate, and another is not.
[The court then rejected Air Canada’s attempt to escape liability by asserting the terms of its tariff as a binding contract with the traveler. Perhaps because the carrier was represented by an employee who was not an attorney, Air Canada did not provide the actual tariff language and the court therefore rejected the contract defense.]
Mr. Moffatt is entitled to be put in the position they would have been in if the misrepresentation had not been made. The measure of damages is generally considered the difference between the price paid and the actual market value at the time of the sale.
[That analysis is also consistent with the one generally used in the U.S. in similar circumstances. The opinion then addressed the difference between the fare Mr. Moffatt paid and the bereavement fare; Air Canada disputed the amount of the bereavement fare but introduced no evidence of its own. The court therefore undertook an analysis of its own regarding the various taxes and other charges that applied to each fare in question.]
In total, then, I find Mr. Moffatt should have paid $989.48 for their two flights. Since they paid $1,640.36, I find they are entitled to damages of $650.88.
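[The arithmetic can be checked from the figures quoted above. What follows is a minimal sketch, in Python, using only the totals stated in the decision; the tribunal’s detailed breakdown of taxes and other charges is not reproduced here.]

```python
# A minimal check of the tribunal's damages arithmetic, using only the
# figures quoted in the decision; the detailed tax-and-charge breakdown
# the tribunal worked through is omitted here.
fare_out = 794.98    # Vancouver-Toronto fare actually paid (Nov. 12 flight)
fare_back = 845.38   # Toronto-Vancouver fare actually paid (Nov. 18 flight)
total_paid = fare_out + fare_back                  # 1,640.36

should_have_paid = 989.48  # tribunal's figure for the two bereavement fares

damages = round(total_paid - should_have_paid, 2)  # 650.88
print(f"Paid ${total_paid:,.2f}; owed ${should_have_paid:,.2f}; "
      f"damages ${damages:,.2f}")
```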
[Air Canada argued that it was entitled to a set-off of $200 for the value of a coupon it had offered Mr. Moffatt but which they had refused. The court rejected the set-off argument and added pre-judgment interest dating from the “first email requesting the bereavement fare refund, to the date of this decision” and reimbursement of the $125 in fees Mr. Moffatt had to pay the CRT to file and hear their case.]
What can we learn from the Moffatt decision?
One key lesson from the Moffatt decision is that anyone relying on a website’s chatbot for information or decisions should, in every case, take screenshots and/or save the chat transcript (a transcript option is not always offered). Mr. Moffatt would have had a more difficult case had they not preserved the specific advice/information provided by the Air Canada chatbot.
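For advisors who want to make that record-keeping routine, here is a minimal sketch, in Python, of one way to preserve each chatbot exchange with a timestamp. The function name, file format, and the sample question and answer are illustrative assumptions, not any vendor’s actual interface.

```python
import json
from datetime import datetime, timezone
from pathlib import Path

def archive_chat_turn(log_path: Path, question: str, answer: str) -> None:
    """Append one question/answer exchange, stamped with the current UTC
    time, to a JSON-lines file so the record can be produced later."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "question": question,
        "answer": answer,
    }
    with log_path.open("a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

# Illustrative use, echoing the kind of exchange at issue in Moffatt:
archive_chat_turn(
    Path("chatbot_log.jsonl"),
    question="Can I apply for a bereavement fare after I have traveled?",
    answer="(paste the chatbot's exact reply here)",
)
```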
It is well established by now that the large language models that underlie ChatGPT and other AI apps and software have important limitations. They often make mistakes, in part because of limits on the data on which their “learning” was based. Some mistakes are simply the result of the fact, and it is a fact, that an LLM, chatbot, or whatever you want to call it is not human, and cannot have human experiences or comprehend everything the way a human does.
That is not to say that humans never make mistakes; they do. But as the “next new thing,” AI apps are often taken to be more advanced and “perfect” than they in fact are. So, again, if you choose to rely on the information provided by one, keep a good record of what you, or your client, was told.
My second observation about the learning from the Moffatt case is that there was an alternative theory on which Moffatt’s claim could have been sustained. The court relied on a “negligent misrepresentation” concept to find that Mr. Moffatt had been misled to his detriment, but I think a “breach of contract” concept also would have worked.
Air Canada’s presentation of a chatbot that responded to Moffatt’s inquiry about bereavement fares could have been treated as an “offer to deal.” In effect, Moffatt asked the chatbot, “Can I buy this fare on these terms?” and the chatbot replied, “Yes, pay now and come back later for your refund.” To which Moffatt responded, “Deal” and proceeded.
This alternative basis for decision may be critically important because of a feature of U.S. law that, to my knowledge, does not have a counterpart in Canada. I refer to the “preemption of state law” provision included in the 1978 Airline Deregulation Act (ADA) and codified at 49 U.S.C. § 41713(b)(1), which prohibits the enforcement of state laws “related to a price, route, or service of an air carrier.”
Initially, the preemption provision was interpreted broadly: in Morales v. Trans World Airlines, Inc., 504 U.S. 374 (1992), the Supreme Court held that the Airline Deregulation Act preempted a state’s effort to enforce guidelines regarding the content and format of airline fare advertising. The Court reasoned that the state advertising guidelines expressly referenced airfares and were likely to have a significant impact on them.
This is an enormously complicated area of the law. You can get a deeper understanding, if you dare, by reading Federal Preemption: A Legal Primer, Congressional Research Service, updated May 18, 2023. Fair warning: this subject is a brain twister.
Suffice it to say for present purposes that viewing the Moffatt case under the alternative legal principle of breach of contract potentially escapes the jaws of federal preemption more easily than a misrepresentation approach would. Faced with this morass, you will, of course, want to consult your attorney. The extent to which state tort law is preempted by the ADA is, it’s fair to say, not settled.
But, for your sake as a businessperson, and in the interests of your clients, just always remember:
• chatbots are not human;
• chatbots and AI-based apps are not infallible;
• if you are going to rely on what a chatbot or AI app tells you, make and keep a confirming record of all of it;
• don’t count on another airline repeating the mistakes Air Canada made in the Moffatt case; and
• don’t count on a U.S. judge being as accommodating as the tribunal member who decided Moffatt.