Sony Music has moved to remove more than 135,000 songs from streaming platforms that were produced by fraudsters impersonating its artists.

The deepfake songs, generated by AI technology, have targeted prominent figures like Beyoncé, Queen, and Harry Styles. Sony has asserted that such counterfeits can severely harm genuine artists, especially during promotional campaigns for new albums.

Dennis Kooker, Sony's president of global digital business, stressed that these deepfakes not only financially impact artists but can also tarnish their reputations and disrupt marketing efforts.

The company notes that the fraudulent songs identified so far are likely only a fraction of the total created, with approximately 60,000 tracks flagged in the past year alone. Other affected artists include Bad Bunny and Miley Cyrus.

At a recent music industry event in London, it was reported that recorded music revenues rose 6.4% over the past year, reaching $31.7 billion. At the same time, unregulated AI and music piracy remain persistent threats, prompting industry leaders to call for better regulation and clearer labelling of AI-generated content.

The rise of fraudulent tracks built on deepfake technology has raised alarms, particularly as it coincides with a broader recovery in music industry revenues after years of decline driven by piracy. Industry stakeholders are now meeting to discuss ways to combat the problem and push toward a more transparent music ecosystem.