The Transparency Charter Boards Need for AI-Generated Content
Transparency for AI-generated content is often framed as a question of hidden markings and automated checks. For boards, it is mainly a question of authority: who decides when a label appears, what proof backs that decision, and who can pause publication when the organisation cannot show compliance.
The paradox is that weak decision ownership, not weak technology, is what turns a transparency duty into inconsistent practice.
On 17 December 2025, the European Commission published the first draft Code of Practice on transparency of AI-generated content and opened feedback until 23 January 2026, with a second draft expected by mid-March 2026 and a final text targeted for June 2026. The transparency rules are due to apply from 2 August 2026.
What the draft Code changes
The draft Code is voluntary, but it supports compliance with Article 50 of the EU Artificial Intelligence Act, which sets transparency duties for AI-generated or edited content. Article 50 splits responsibility between providers, who supply generative systems, and deployers, who use them in their operations. Providers are pushed toward enabling marking and detection: tags that can be verified later. Deployers must disclose certain uses, including deepfakes and, in some cases, AI-generated or edited content that informs the public on matters of public interest.
This split matters because it sits on the seam where accountability often fractures. Marking and detection sit close to engineering and product decisions. Labelling sits inside public-facing workflows, where speed, judgement, and accountability collide.
The governance failure mode
Most organisations will not fail because a technical tag is missing. They will fail because transparency decisions will be made by habit and delegation, and no one will be able to answer who owned the call, what evidence was reviewed, and who approved an exception.
This is the kind of control gap that lands on an Audit Committee agenda when evidence is requested and cannot be produced.
In March 2023, an image of Pope Francis in a white puffer jacket went viral before a Reuters fact check confirmed it was AI-generated.
In practice, these decisions bounce between legal, communications, security, and product, and without a named owner they become slow, inconsistent, or quietly avoided.
Inside an organisation, the same thing happens when a synthetic voiceover is added to a video or an AI-assisted rewrite becomes a public statement. The hard part is deciding what must be disclosed, then being able to show, quickly, who decided and why.
In large organisations, I have seen that failures of public truth usually come from unclear ownership and missing evidence, not from a lack of principles.
That is why boards should ask for a Transparency Charter: a one-page table, designed specifically for Article 50 decisions, that assigns decision owners, required proof, exception approval, and pause authority.
The Transparency Charter
A good Charter is not a communications guideline. It is a control. It should define the content types that trigger a transparency decision, name the accountable owner for each decision, and specify the minimum evidence file that must exist before publication: what was checked, who approved it, and why. It should set a review cadence and a retention period for the decision record.
It should define an exception path: who can approve it, what threshold they apply, and what record is required. It should also name who can pause publication until the proof exists.
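To make this concrete, one row of such a Charter could be captured as a simple structured record. The sketch below is purely illustrative: the field names and role titles are assumptions, not terms taken from Article 50 or the draft Code.

```python
from dataclasses import dataclass

# Hypothetical sketch of one Charter row. Field names are illustrative only;
# they are not taken from Article 50 or from the draft Code of Practice.
@dataclass
class CharterRow:
    content_type: str             # what triggers a transparency decision
    decision_owner: str           # the named accountable owner
    required_evidence: list[str]  # minimum evidence file before publication
    exception_approver: str       # who may approve an exception
    pause_authority: str          # who can pause publication until proof exists
    review_cadence: str           # how often the row is reviewed
    retention: str                # how long the decision record is kept

# One illustrative row: a synthetic voiceover added to a public video.
voiceover_row = CharterRow(
    content_type="Synthetic voiceover in a public video",
    decision_owner="Head of Communications",
    required_evidence=["what was checked", "who approved it", "why"],
    exception_approver="General Counsel",
    pause_authority="Chief Risk Officer",
    review_cadence="Quarterly",
    retention="6 years",
)
print(voiceover_row)
```

Whether this lives in a spreadsheet, a policy document, or a workflow tool matters far less than the fact that each row names one owner and one pause authority.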
The purpose is not to create a path around disclosure. It is to make disclosure, and any exceptions, consistent, documented, and auditable.
Boards already recognise this pattern from financial reporting and regulated announcements: sign-off, documentation, and escalation. The Transparency Charter applies that discipline to synthetic content decisions.
Proof, not slogans
The evidence trail should not depend on screenshots and memory. Provenance can help: a record of how content was made and edited. Standards such as Content Credentials from the Coalition for Content Provenance and Authenticity provide a way to attach that record to the content so it can be validated later.
Provenance will not make the disclosure decision for you. It cannot decide whether disclosure is required, what wording is appropriate, or when an exception is defensible. That is why the Charter and the evidence file must be designed together: the Charter defines the decisions, tools supply inputs, and the evidence file proves the decision was made under control.
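As a minimal sketch of what "designed together" can mean in practice, the snippet below builds one evidence-file entry at decision time. The structure and field names are assumptions for illustration; a provenance record such as a Content Credential would appear here as one of the inputs that was checked, not as the decision itself.

```python
import hashlib
import json
from datetime import datetime, timezone

def record_decision(content: bytes, owner: str, decision: str,
                    checks: list[str], rationale: str) -> dict:
    """Build a minimal evidence-file entry for one transparency decision.

    The content hash ties the record to the exact asset that was reviewed.
    Provenance data (for example, a Content Credential) belongs in `checks`
    as an input; the decision itself is attributed to a named owner.
    """
    return {
        "content_sha256": hashlib.sha256(content).hexdigest(),
        "decision_owner": owner,     # who owned the call
        "decision": decision,        # e.g. publish with or without a label
        "checks_performed": checks,  # what evidence was reviewed
        "rationale": rationale,      # why the call is defensible
        "decided_at": datetime.now(timezone.utc).isoformat(),
    }

entry = record_decision(
    content=b"<published asset bytes>",
    owner="Head of Communications",
    decision="Publish with an 'AI-generated' label",
    checks=["Content Credential present and valid", "Legal review complete"],
    rationale="Synthetic voiceover used in a public statement",
)
print(json.dumps(entry, indent=2))
```

The point is not the code. It is that the fields a board would ask about, who decided, what was checked, and why, exist as a record before publication rather than being reconstructed afterwards.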
Before the consultation window closes on 23 January 2026, boards should ask management for a draft Transparency Charter that fits on one page, and for a demonstration that an evidence file can be produced within hours for a simple synthetic content scenario. Use the consultation window to test the control, and fix what breaks, before the duty goes live. If the Charter cannot name an owner and produce proof, the gap is governance, and governance is the board’s job.
__________________
Kostakis Bouzoukas is a London-based technology leader focused on board-level governance for AI-generated content.