The Flixbus chatbot claims nothing it says is legally binding (but a court found otherwise in Canada, ruling against an airline)
cross-posted from: https://slrpnk.net/post/27072322
A chatbot erroneously told a traveler that he qualified for free travel in a particular situation. I don’t recall the exact circumstances, but it was something like a last-minute trip for a funeral. The airline then denied him the free ticket. He sued, and the court found that the chatbot represents the company, which is therefore legally bound by what it says.
It’s interesting to note that some chatbots now present an agreement you must click to accept before the conversation starts. E.g., from Flixbus:
You are interacting with an automated chatbot. The information provided is for general guidance only and **is not binding**. If you require further clarification or additional information, please contact a member of our staff directly or check out our terms and conditions and privacy notice.
(emphasis mine)
I’m not in Canada, so that disclaimer may well hold where I am. I just wonder whether such an agreement is enforceable in Europe.