YouTube Warns Crypto Channel over Policy Breach

YouTube Targets Crypto Channels

YouTube issued a strike notice earlier this week to a crypto channel owned by Sunny Decree for violating its ‘harmful and dangerous’ content policy.

What Is The Story?

On 5 September, YouTube halted Decree’s most recent livestream on his English-language channel in Switzerland and Germany.

The video-sharing platform also warned him that a second offense would result in a one-week suspension of livestreaming, uploading, and posting privileges.

Breaking: My #Bitcoin Livestream halted, and I got a warning…@YouTube @ytcreators.

Decree wrote on his Twitter account.

Wrong Call

This is not the first time YouTube has targeted high-profile crypto channels. Decree’s channel – which has more than 123,000 subscribers – previously had videos removed without explanation. YouTube later called the move an “error” and restored many of the videos.

YouTube also terminated Altcoin Daily, a channel with 214,000 subscribers run by brothers Aaron and Austin Arnold, on 31 July for “encouraging illegal activities.” After more than two days of complete inaccessibility, YouTube reversed the ban.

Another shutdown incident happened on 15 June, when the platform cut off the official channel of a cryptocurrency news-focused website for “a violation of YouTube’s Terms of Service.”

After an appeal, YouTube reactivated the channel, with all 40,000 subscribers intact, two days later.

In late December 2019, the platform began aggressively deleting videos with crypto-related content. Channels with tens of thousands of subscribers, including Chris Dunn’s, saw videos removed.

Moreover, the platform entirely removed Robert Beadles’ channel, Crypto Beadles. YouTube referred to one of these bans as “an error” in the review process.

Crypto Channels Seen as Harmful

On 16 March, YouTube released a statement warning its creator community that video removals might increase during the coronavirus pandemic.

The company said its system currently relies on machine-learning algorithms to detect potentially harmful content. Once these algorithms flag such content, human reviewers are called upon to assess it.

The statement read: “Automated systems will start removing some content without human review, so we can continue to act quickly to remove violative content and protect our ecosystem, while we have workplace protections in place.”

Sally ElShorbagy 25 Articles
Sally ElShorbagy is a freelance journalist and translator who currently covers the future of cryptocurrencies and the digital-economy revolution.