FINOS CC4AI (Common Controls for AI Services)
CC4AI is the FINOS initiative to establish a common, machine-readable evidence-artifact format, so cloud providers and AI vendors can attest once to their AI data practices and every consuming financial institution can inherit that assurance without running bilateral audits.
The initiative is led by a consortium that includes BMO, Citi, Microsoft, Morgan Stanley, RBC, Bank of America, Google Cloud, Red Hat, and AWS. CC4AI exists because policy documents are not sufficient evidence of AI data governance — regulated institutions need standardized, structured attestations they can plug directly into their third-party risk programs.
What CC4AI Covers
CC4AI defines the evidence artifacts that an AI vendor produces to prove its AI data practices are enforced — covering the controls regulated institutions must demonstrate under DORA, the EU AI Act, and the FINOS AIGF:
- Training data provenance — which data sources contributed to which model
- Retention periods — how long AI-related data is held and for what purpose
- Cross-tenant isolation — guarantees that one customer's data does not influence another customer's AI outputs
- Opt-out enforcement — evidence that customer opt-out decisions are operationally honored
- Policy-change notification cadence — how much advance notice customers receive for material changes
- Machine-readable format — attestations structured for automated ingestion by third-party risk tooling
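To make the list above concrete, here is a purely illustrative sketch of what a machine-readable attestation covering these controls could look like. CC4AI's actual schema is still being defined by the consortium; every field name and value below is an assumption for illustration, not the published format.

```python
# Hypothetical CC4AI-style attestation artifact. All field names are
# illustrative assumptions, not the CC4AI specification.
attestation = {
    "vendor": "ExampleAI Corp",          # hypothetical vendor
    "service": "example-llm-api",        # hypothetical AI service
    "issued": "2025-06-01",
    "controls": {
        # Training data provenance: which sources fed which model
        "training_data_provenance": {
            "model": "example-model-v2",
            "data_sources": ["licensed-corpus-a", "public-web-filtered"],
        },
        # Retention: how long AI-related data is held, and why
        "retention": {"customer_prompts_days": 30, "purpose": "abuse monitoring"},
        # Cross-tenant isolation: customer data stays out of shared training
        "cross_tenant_isolation": {"customer_data_used_for_shared_training": False},
        # Opt-out enforcement: opt-outs are operationally honored
        "opt_out_enforcement": {"honored": True, "verified_on": "2025-05-15"},
        # Policy-change notification cadence, in days of advance notice
        "policy_change_notice_days": 90,
    },
}
```

Because the artifact is structured rather than prose, each control maps to a field that third-party risk tooling can read and check without a human translating a policy document.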
Why CC4AI Matters
The current reality is that vendors publish policy documents, and customers manually translate those documents into control attestations during audits. This does not scale, and it does not produce evidence a regulator will accept as operational proof.
CC4AI replaces that manual loop with a standardized artifact. A vendor attests once; every consuming institution inherits and verifies that attestation through its governance pipeline. When a regulator asks, the answer is already structured and already mapped to AIGF risk categories.
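The inherit-and-verify step above can be sketched as a simple automated check. This is a minimal illustration, assuming a hypothetical attestation structure and institution-chosen thresholds; it is not CC4AI tooling or a defined CC4AI schema.

```python
# Hypothetical sketch: a third-party risk pipeline checking a CC4AI-style
# attestation against an institution's own control thresholds.
# Field names and thresholds are illustrative assumptions, not the CC4AI spec.

def verify_attestation(attestation: dict, min_notice_days: int = 60) -> list[str]:
    """Return a list of control failures; an empty list means the attestation passes."""
    failures = []
    controls = attestation.get("controls", {})

    # Cross-tenant isolation must be explicitly attested
    if controls.get("cross_tenant_isolation", {}).get(
        "customer_data_used_for_shared_training", True
    ):
        failures.append("cross-tenant isolation not attested")

    # Opt-out enforcement must be explicitly attested
    if not controls.get("opt_out_enforcement", {}).get("honored", False):
        failures.append("opt-out enforcement not attested")

    # Advance notice for policy changes must meet the institution's floor
    if controls.get("policy_change_notice_days", 0) < min_notice_days:
        failures.append("policy-change notice period below threshold")

    return failures
```

The point of the sketch is the shape of the workflow: because the vendor's evidence is structured, the institution's check runs automatically on every new attestation instead of during a bilateral audit.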
CC4AI in the Context of Vendor AI Data Changes
CC4AI exists precisely because the vendor AI data collection pattern — SaaS platforms unilaterally modifying training practices with lead times shorter than enterprise compliance cycles — makes the traditional control-attestation workflow impossible. A vendor that supports CC4AI gives its customers the evidence they need to operate their third-party risk programs at the speed the AI era now requires.
How iTmethods Aligns with CC4AI
iTmethods Reign is being aligned to consume and produce CC4AI-format attestations as they become available. For customers, this means third-party AI vendor governance evidence flows into the Evidence Engine automatically — and Reign's outputs to regulators map to the same controls CC4AI defines.
