Slack seems to have sneakily opted its customers in to AI training
A couple of days ago, a note on Hacker News highlighted that, per the service's Privacy Principles, Slack analyzes customer data to develop "non-generative AI/ML models for features such as emoji and channel recommendations." Users can opt their data out, supposedly without compromising their Slack experience, by sending a specific email. The Privacy Principles communicate only the essentials, simply stating that Slack analyzes messages, content, files, and other information submitted to the platform in a way that protects the underlying content and ensures the trained models are unable to learn, memorize, or reconstruct data, preventing leaks across workspaces.
Strangely enough, the opt-out confirmation email may be more informative than the Privacy Principles themselves. The Hacker News note reproduces Slack's reply to the author's opt-out request, in which Slack finally explains the difference between its "non-generative AI/ML models" and the recently introduced Slack AI, an add-on powered by generative AI models that are not trained on customer data. According to the reply, those models are hosted within Slack's private AWS infrastructure, ensuring that all customer data remains in-house and inaccessible to any LLM provider.
Even if there is no suspicious data mining and this turns out to be mere carelessness, with Slack not bothering to update its policy in light of the Slack AI launch, it is still a telling example of how user rights remain an afterthought rather than a priority, both for service providers and for users themselves. The Privacy Principles have been in effect since at least September 2023, and Slack AI was introduced in February 2024. That leaves at least three months of confusion during which Slack did not find it pressing to update its policy to even acknowledge that it offered an LLM-powered service, and most users remained blissfully unaware that they had been automatically opted in to a data collection policy that may or may not have applied to generative AI training.