How we're helping creators disclose altered or synthetic content
Generative AI is transforming the ways creators express themselves – from storyboarding ideas to experimenting with tools that enhance the creative process. But viewers increasingly want more transparency about whether the content they’re seeing is altered or synthetic.
That’s why today we’re introducing a new tool in Creator Studio requiring creators to disclose to viewers when realistic content – content a viewer could easily mistake for a real person, place, or event – is made with altered or synthetic media, including generative AI.
As we announced in November, these disclosures will appear as labels in the expanded description or on the front of the video player. We’re not requiring creators to disclose content that is clearly unrealistic or animated, includes special effects, or uses generative AI for production assistance.
The new label is meant to strengthen transparency with viewers and build trust between creators and their audience. Examples of content that requires disclosure include:
- Using the likeness of a realistic person: Digitally altering content to replace the face of one individual with another's or synthetically generating a person’s voice to narrate a video.
- Altering footage of real events or places: Making it appear as if a real building caught fire, or altering a real cityscape so it looks different than it does in reality.
- Generating realistic scenes: Showing a realistic depiction of fictional major events, like a tornado moving toward a real town.
Example of a label on the video player
Of course, we recognize that creators use generative AI in a variety of ways throughout the creation process. We won’t require creators to disclose if generative AI was used for productivity, like generating scripts, content ideas, or automatic captions. We also won’t require creators to disclose when synthetic media is unrealistic or the changes are inconsequential.
These cases include:
- Clearly unrealistic content, such as animation or someone riding a unicorn through a fantastical world
- Color adjustment or lighting filters
- Special effects like background blur or vintage effects
- Beauty filters or other visual enhancements
You can see a longer list of examples in our Help Center. For most videos, a label will appear in the expanded description, but for videos that touch on more sensitive topics – like health, news, elections, or finance – we’ll also show a more prominent label on the video itself.
Example of a label in the expanded description
You’ll start to see the labels roll out across all YouTube surfaces and formats in the weeks ahead, beginning with the YouTube app on your phone, and soon on your desktop and TV. And while we want to give our community time to adjust to the new process and features, in the future we’ll look at enforcement measures for creators who consistently choose not to disclose this information. In some cases, YouTube may add a label even when a creator hasn’t disclosed it, especially if the altered or synthetic content has the potential to confuse or mislead people.
Importantly, we continue to collaborate across the industry to help increase transparency around digital content. This includes our work as a steering member of the Coalition for Content Provenance and Authenticity (C2PA).
In parallel, as we previously announced, we’re continuing to work towards an updated privacy process for people to request the removal of AI-generated or other synthetic or altered content that simulates an identifiable individual, including their face or voice. We’ll have more to share soon on how we’ll be introducing the process globally.
Creators are the heart of YouTube, and they’ll continue to play an incredibly important role in helping their audience understand, embrace, and adapt to the world of generative AI. This will be an ever-evolving process, and we at YouTube will continue to improve as we learn. We hope that this increased transparency will help all of us better appreciate the ways AI continues to empower human creativity.