
SoundCloud appears to have changed its terms of use to allow the company to train AI on audio that users upload to its platform.

This update was discovered by tech ethicist Ed Newton-Rex, who noticed that the latest version of SoundCloud’s terms includes a provision allowing the platform to use uploaded content to “inform, train, develop, or serve as input to artificial intelligence or machine intelligence technologies or services” as part of its services.

The terms, which were last updated on February 7, state: “You explicitly agree that your Content may be used to inform, train, develop, or serve as input to artificial intelligence or machine intelligence technologies or services as part of and for providing the services.”

The terms have an exception for content under “separate agreements” with third-party rightsholders, such as record labels. SoundCloud has licensing agreements with various indie labels, as well as major music publishers like Universal Music and Warner Music Group.

Upon reviewing the platform’s settings menu on the web, TechCrunch was unable to find an explicit opt-out option. SoundCloud did not immediately respond to a request for comment.

SoundCloud, like many large creator platforms, is increasingly investing in artificial intelligence.

Last year, SoundCloud partnered with nearly a dozen vendors to introduce AI-powered tools for remixing, generating vocals, and creating custom samples to its platform. In a blog post, SoundCloud stated that these partners would receive access to content ID solutions to “ensure rights holders receive proper credit and compensation” and pledged to “uphold ethical and transparent AI practices that respect creators’ rights.”


A number of content hosting and social media platforms have updated their policies in recent months to allow for first- and third-party AI training. In October, Elon Musk's X updated its privacy policy to permit outside companies to train AI on user posts. Last September, LinkedIn amended its terms to allow it to scrape user data for training. And in December, YouTube began letting third parties train AI on user clips.

Many of these changes have sparked backlash from users who argue that AI training policies should be opt-in rather than opt-out and that they should be credited and paid for their contributions to AI training datasets.

Updated 2:22 p.m. Pacific: A SoundCloud spokesperson provided a statement via email, which we’ve published in part below:

“SoundCloud has never used artist content to train AI models, nor do we develop AI tools or allow third parties to scrape or use SoundCloud content from our platform for AI training purposes. In fact, we implemented technical safeguards, including a ‘no AI’ tag on our site to explicitly prohibit unauthorized use.

The February 2024 update to our terms of service was intended to clarify how content may interact with AI technologies within SoundCloud’s own platform. Use cases include personalized recommendations, content organization, fraud detection, and improvements to content identification with the help of AI technologies.

Any future application of AI at SoundCloud will be designed to support human artists, enhancing the tools, capabilities, reach, and opportunities available to them on our platform. Examples include improving music recommendations, generating playlists, organizing content, and detecting fraudulent activity. These efforts are aligned with existing licensing agreements and ethical standards. Tools like [those from our partner] Musiio are strictly used to power artist discovery and content organization, not to train generative AI models.

We understand the concerns raised and remain committed to open dialogue. Artists will continue to have control over their work, and we’ll keep our community informed every step of the way as we explore innovation and apply AI technologies responsibly, especially as legal and commercial frameworks continue to evolve.”
