Am I liable if my app sends user data to an LLM API that uses it for training?
Quick Answer
Potentially, yes. Recent cases point toward liability: in Jones v. Peloton (2024), the court allowed claims against the company that deployed a third-party AI chatbot which used conversation data for training, and in Ambriz v. Google (2025), Google's Contact Center AI was ruled a third-party eavesdropper. At a minimum, you need a DPA with your AI vendor, Zero Data Retention where available, and PII redaction before data leaves your app.
Detailed Answer
Potentially, yes. Recent court decisions point to growing liability exposure for companies that deploy third-party AI:
- Jones v. Peloton (2024): The court found that using a third-party AI chatbot that intercepts user data for its own purposes, including improving its AI software, could constitute wiretapping. The company deploying the chatbot was named as a defendant, not just the AI vendor.
- Ambriz v. Google (2025): Google's Contact Center AI was ruled a third-party eavesdropper because it used conversation data to train models.
Your responsibilities:
- Ensure your AI vendor has a DPA
- Enable Zero Data Retention if available
- Disclose AI processing in your privacy policy
- Obtain appropriate consent from users
- Implement PII redaction before sending data (see the sketch after this list)
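
Because the cases above turn on what data actually reaches the AI vendor, redacting PII before the API call is a concrete mitigation. Below is a minimal, hypothetical sketch in Python: the `redact_pii` and `send_to_llm` names and the regex patterns are illustrative assumptions, not any vendor's API, and production systems generally rely on a dedicated PII-detection library or service rather than hand-written regexes.

```python
import re

# Hypothetical, minimal patterns for illustration only -- real deployments
# typically use a dedicated PII-detection tool with broader coverage.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}"),
    "PHONE": re.compile(r"\b(?:\+?1[-.\s]?)?\(?\d{3}\)?[-.\s]?\d{3}[-.\s]?\d{4}\b"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact_pii(text: str) -> str:
    """Replace detected PII with typed placeholders before the text leaves your app."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

def send_to_llm(user_message: str) -> None:
    redacted = redact_pii(user_message)
    # Hypothetical call site -- substitute your vendor's SDK here, ideally
    # under a DPA and a Zero Data Retention agreement.
    print("Payload sent to LLM API:", redacted)

if __name__ == "__main__":
    send_to_llm("Hi, I'm Jane, reach me at jane.doe@example.com or 555-123-4567.")
```

Redaction reduces what a vendor can intercept or train on, but it does not by itself satisfy the disclosure and consent obligations in your privacy policy; treat it as one layer alongside the DPA and retention controls listed above.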

