You're Still Responsible
Module 1 · Section 9 of 11
In 2024, Air Canada’s chatbot gave a customer incorrect information about bereavement fares. The airline argued it wasn’t liable for what its AI said. The tribunal disagreed.
The principle is simple: the output of your AI tools is yours to check, yours to verify, and yours to take responsibility for. “The AI wrote it” is not a defence your clients will accept.