What is not allowed on YouTube?

YouTube's Sale of Illegal Goods and Services policy outlines the goods and services that must not be sold on YouTube. These include alcohol; nicotine and vaping products; financial information (such as passwords or credit card details); counterfeit documents or currency; controlled narcotics and other drugs; explosives; organs; endangered species or their parts; firearms and certain firearm accessories; online gambling sites not approved by Google or YouTube; pharmaceuticals without prescriptions; sex or escort services; unlicensed medical services; and human smuggling.

The policy applies to videos, descriptions, comments, live streams, and all other YouTube products and features. Why, then, can we still search for and watch videos with links to casinos, drugs, medical services, and the like? Can we trust YouTube not to assist in selling us illegal products or services?

Well, some of this content isn't strictly illegal: a product can be unlawful in one country but legal in another. Medical cannabis, for example, is banned in Sweden but allowed in the Netherlands. Companies selling medical cannabis, crypto, and gambling products must hold local licences and advertising permits, and those that do can legitimately sell and advertise while complying with all of YouTube's requirements.

According to YouTube, "selling" covers direct sales, linking, posting links or contact details, and facilitating access to a product or service. A phone number in a video description or in the comments therefore counts as a sale.

But how much of this content can we not trust?

Google publishes a Transparency Report summarising policy enforcement. In 2021, YouTube removed almost 9.6 million videos for non-compliance. Unfortunately, we don't know precisely how long those videos remained live before removal, or how many non-compliant videos are uploaded daily; only Google knows. Google outsources some of its content review to various vendors, with the majority of the work handled by the consultancy Accenture. YouTube's product management director for trust and safety, Jennifer Flannery O'Connor, has said that YouTube uses a mixed approach of human reviewers, artificial intelligence, governmental agencies, and user reports to catch channels and videos that violate its policies. What we do know is that once a video is removed, it has almost no chance of returning: Google reinstated only 66,020 videos after appeals.
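To put those figures in perspective, here is a quick back-of-the-envelope calculation (using the rounded numbers reported above) of how rarely a removed video comes back:

```python
# Approximate figures from Google's 2021 Transparency Report, as cited above.
removed = 9_600_000   # videos removed for non-compliance in 2021 (approx.)
reinstated = 66_020   # videos reinstated after appeal

reinstatement_rate = reinstated / removed * 100
print(f"Reinstatement rate: {reinstatement_rate:.2f}%")  # roughly 0.69%
```

In other words, fewer than 1 in 140 removed videos ever returns.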

What does this mean for businesses?

Should a brand worry about its commissioned content appearing on various streaming channels?

Absolutely. If a streamer is an affiliate and the advertised content is banned, it cannot be monetised, which financially damages both parties. The situation becomes even more complex when regulated products and services carry licences and other legal obligations. A complaint can come from multiple places, including YouTube or the regulator itself. In many countries, companies must monitor published affiliate content to ensure it complies with those obligations, for example by checking that appropriate age-gating and disclaimers are in place.

If content violates the policy, the channel may receive several warnings and eventually be removed from the YouTube platform. In some cases, content is not removed but age-restricted instead. Age-restricted videos cannot be monetised and cannot be watched on most third-party websites. YouTube does make exceptions, however: depending on the context, some videos can be approved for ads and therefore monetised.

As such, brands must take responsibility for online video content. Identifying and reviewing this content is time-consuming and can create a conflict of interest between a brand's sales and compliance functions. Third-party services and technologies, much like those Google uses itself, offer an effective way to identify and review it.

One such service provider is TraceHat. Our solutions help both companies and streamers protect themselves by making better-informed decisions. We use AI technologies to analyse video content, detecting audio and visual brand mentions in the stream. Streamers can review how they advertise a specific brand and compare themselves with other streamers, helping them make informed decisions, learn best practice, fulfil their obligations to a brand, and position themselves competitively. Brands can identify and monitor content exposure, such as copyright infringements, missing disclaimers, and inappropriate content, and take appropriate safeguarding measures.

YouTube will punish many content publishers, while others will slip under the radar. Balancing freedom of expression with effective content policing is challenging for YouTube, especially across such a broad regulatory landscape. Ultimately, YouTube holds the power of control, and users can decide whether or not to trust it. Brands should take a slice of that control by monitoring their exposure and managing risk through analysis of their streamer content and performance.

Resources:

§ Google Transparency Report

§ Advertiser-friendly Content Guidelines
