Slack trains machine learning models on user messages, files and other content without express permission. The training is opt-out, which means your private data will be used by default. Making matters worse, you’ll need to ask your organization’s Slack admin (HR, IT, etc.) to email the company to request the opt-out. (You can’t do it yourself.) Welcome to the dark side of the new AI training data gold rush.

Corey Quinn, CEO of DuckBill Group, noticed the policy in Slack’s Privacy Principles and posted about it on X (via PCMag). The section reads: “To develop AI/ML models, our systems analyze Customer Data (e.g. messages, content and files) submitted to Slack, as well as Other Information (including usage information) as defined in our Privacy Policy and in your customer agreement.”

The opt-out process puts all the work on you to protect your data. According to the privacy notice, “To opt out, please have your Org or Workspace Owners or Primary Owner contact our Customer Experience team at with your Workspace/Org URL and the subject line ‘Slack global model opt-out request.’ We will process your request and respond once the opt out has been completed.”

The company responded to Quinn’s message on X: “To clarify, Slack has platform-level machine learning models for things like channel and emoji recommendations and search results. And yes, customers can opt their data out of helping train those (non-generative) ML models.”

It’s unclear how long ago the Salesforce-owned company slipped the tidbit into its terms. It’s misleading at best to say customers can opt out when “customers” doesn’t include the individual employees working inside an organization. They have to ask whoever manages Slack access at their employer to do it for them, and hope they comply.

Inconsistencies in Slack’s privacy policies add to the confusion. One section reads: “When developing AI/ML models or otherwise analyzing Customer Data, Slack can’t access the underlying content. We have various technical measures preventing this from occurring.” However, the machine learning model training policy appears to contradict that statement, leaving plenty of room for confusion.

In addition, Slack’s web page marketing its premium generative AI tools reads: “Work without worry. Your data is your data. We don’t use it to train Slack AI. Everything runs on Slack’s secure infrastructure, meeting the same compliance standards as Slack itself.”

In this case, the company is talking about its premium generative AI tools, separate from the machine learning models it trains without express permission. However, as PCMag notes, implying that all of your data is safe from AI training is misleading at best when the company gets to pick and choose which AI models that statement covers.

Engadget attempted to contact Slack through multiple channels, but did not receive a response at the time of publication. We’ll update this story if we hear back.