YouTube AI to automatically block videos that violate age restrictions


YouTube will use machine learning to automatically apply age restrictions to videos, the Google-owned video site said Tuesday, expanding its use of artificial intelligence to automatically block some videos from viewers who either aren’t signed into a YouTube account or are signed in as a viewer under the age of 18.

Creators who believe their videos were blocked unfairly can appeal. YouTube said these automated age restrictions and some tweaks to what it categorizes as inappropriate for people under 18 will all “roll out over the coming months.”

Currently, YouTube has a human team that applies the age restrictions when it reviews a video that isn’t appropriate for younger viewers. “Going forward, we will build on our approach of using machine learning to detect content for review, by developing and adapting our technology to help us automatically apply age restrictions,” YouTube said.

YouTube, with more than 2 billion monthly users, is the world’s biggest online video source — so big, in fact, that it’s the world’s top source for kids’ videos too. Content for kids is one of the site’s most-watched categories, but YouTube has come under fire for a range of scandals involving children. It was slapped with a record $170 million US penalty over the data YouTube collected on kids without parents’ consent. YouTube has also faced scandals involving videos of child abuse and exploitation and nightmarish content in its YouTube Kids app, pitched as a kid-safe zone.

Hundreds of hours of video are uploaded to YouTube every minute, making comprehensive human review impossible. So YouTube executives have touted machine learning as a crucial tool to supplement human moderation. But content decisions made by automated algorithms can be prone to mistakes. These errors can occur when value judgments and context are important to making the correct call; other times, a new kind of problem arises that software hasn’t been trained to address.

Footage of a mass shooting at a New Zealand mosque last year, for example, was initially able to spread on YouTube partly because machine learning had trouble detecting it automatically.


Source: CNET