WeTransfer clarifies it doesn’t use uploaded files to train AI models
WeTransfer has confirmed it does not use files uploaded to its service to train artificial intelligence models, following significant customer backlash over confusing terms of service changes. The file-sharing company updated its language to clarify that content moderation—not AI training—was the intended purpose, highlighting how unclear AI policies can quickly erode user trust in digital platforms.

What happened: WeTransfer faced widespread criticism on social media after updating its terms of service in late June or early July with language that users interpreted as granting permission to use their files for AI training.

  • The original terms stated WeTransfer could use content “including to improve performance of machine learning models that enhance our content moderation process.”
  • The company also included rights to “reproduce, distribute, modify,” or “publicly display” files uploaded to the service.
  • Creative professionals, including illustrators and actors who regularly use the platform to share work, expressed concerns and considered switching to alternative providers.

The clarification: A WeTransfer spokeswoman told BBC News the company does not use machine learning or AI to process shared content, nor does it sell content or data to third parties.

  • The clause was initially added to “include the possibility of using AI to improve content moderation” and identify harmful content.
  • WeTransfer updated the terms on Tuesday, stating it had “made the language easier to understand” to avoid confusion.
  • The revised clause now says: “You hereby grant us a royalty-free license to use your Content for the purposes of operating, developing, and improving the Service, all in accordance with our Privacy & Cookie Policy.”

Why this matters: The incident reflects growing sensitivity around AI training data and the importance of transparent communication from tech companies about how user content is handled.

  • Users are increasingly vigilant about whether their creative work might be used to train AI models without explicit consent.
  • The backlash demonstrates how quickly unclear terms of service can damage customer relationships, particularly among creative professionals who depend on these platforms for their work.

Similar incidents: WeTransfer joins other file-sharing platforms that have faced similar scrutiny over AI-related terms.

  • Dropbox, a cloud storage company, had to issue a similar clarification in December 2023, confirming it was not using uploaded files to train AI models after a social media outcry.
  • These incidents suggest a pattern of companies struggling to communicate AI policies clearly to users.