High-risk categories under Annex III are marked. GPAI obligations under Art. 51–56 apply to foundation model deployments.
To generate the board liability statement, answer every question in the current phase — including the question on your legal role under the EU AI Act. This information is required for accurate obligation mapping.
The board question of 2026 is not "Are we using AI?" It is: "If this AI-supported decision is challenged tomorrow — legally, regulatorily, reputationally — can we prove it was made correctly, by the right people, with the right evidence?"
Patrick Upmann advises supervisory boards, executive management, and family-owned businesses on structuring AI governance — with a focus on defensibility, accountability, and regulatory resilience under the EU AI Act. The "Thinking AI vs. Governing AI" framework addresses the decisive gap: companies understand AI strategically, but few can defend AI-supported decisions under legal or regulatory pressure.
He specialises in Provider-vs.-Deployer obligation mapping, Annex III classification, and board-level accountability architecture under the EU AI Act (Regulation (EU) 2024/1689).
This tool reveals the exposure. Patrick Upmann's board session format closes it: structured role mapping, Annex III classification, liability architecture, and governance documentation — from gap to defensible board position.