AI Tool Approval Checklist

Use this checklist before an AI tool spreads from a quick experiment into an operational dependency. The goal is not fear; the goal is controlled adoption.

Why this exists

AI tools often enter through convenience and become policy questions later.

A good approval process separates low-risk experimentation from sensitive data, customer workflows, financial decisions, legal exposure, and operational dependency.

Checklist

The checks to run before approving an AI tool.

1. Allowed use case

  • Define the exact workflow where the AI tool may be used.
  • Separate drafting, summarization, analysis, automation, and decision support.
  • State whether outputs require human review before use.
  • Name use cases that are explicitly not allowed.

2. Data boundaries

  • List data categories users may enter and categories they must not enter.
  • Check whether customer data, employee data, financial data, legal data, or confidential files are involved.
  • Confirm whether prompts, files, logs, and outputs are stored.
  • Confirm deletion options and retention periods.
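The allow/deny boundary in step 2 can be written down as a deny-by-default policy check. A minimal sketch, assuming hypothetical category names and an `is_allowed` helper that do not come from any real tool:

```python
# Illustrative data-boundary policy. Category names are examples only;
# replace them with your organization's actual data classification.
ALLOWED_CATEGORIES = {"public docs", "marketing copy", "anonymized metrics"}
BLOCKED_CATEGORIES = {"customer data", "employee data", "financial data", "legal data"}

def is_allowed(category: str) -> bool:
    """Deny by default: a category must be explicitly allowed and never blocked."""
    return category in ALLOWED_CATEGORIES and category not in BLOCKED_CATEGORIES
```

Deny-by-default matters here: a category nobody has reviewed should fail the check rather than slip through.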

3. Training and model chain

  • Verify whether customer inputs or outputs may be used to train models.
  • Identify underlying model providers and subprocessors.
  • Check data locations and cross-border transfer statements.
  • Flag unclear language such as 'may improve services' for vendor clarification.

4. Admin controls

  • Check SSO, role management, workspace controls, usage restrictions, and audit logs.
  • Verify whether controls are available on the intended plan.
  • Confirm whether admins can enforce data-sharing settings.
  • Record which controls must be tested during pilot.

5. Workflow and output risk

  • Identify where inaccurate output could create operational, customer, legal, or financial harm.
  • Define when a human must verify output before action.
  • Check whether the tool can cite sources or preserve evidence trails.
  • Decide whether the tool may automate actions or only assist humans.

6. Approval conditions

  • Write the decision as approve, block, or approve under conditions.
  • List allowed teams, allowed data, blocked use cases, and required controls.
  • Create vendor questions for unresolved data, retention, training, or control claims.
  • Set a review date after pilot or policy change.
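The approval conditions in step 6 are easier to audit when recorded as structured data rather than prose. A minimal sketch of such a record, with hypothetical field names and an invented tool name:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class ApprovalDecision:
    """One approval record per tool: decision, scope, and open vendor questions."""
    tool: str
    decision: str                                   # "approve", "block", or "condition"
    allowed_teams: List[str] = field(default_factory=list)
    allowed_data: List[str] = field(default_factory=list)
    blocked_use_cases: List[str] = field(default_factory=list)
    required_controls: List[str] = field(default_factory=list)
    vendor_questions: List[str] = field(default_factory=list)
    review_date: Optional[date] = None              # re-check after pilot or policy change

# Example: approve under conditions, with an unresolved training-data question.
record = ApprovalDecision(
    tool="ExampleAI",
    decision="condition",
    allowed_teams=["support"],
    blocked_use_cases=["automated customer replies"],
    vendor_questions=["Are prompts used to train models?"],
    review_date=date(2026, 1, 1),
)
```

Keeping the record structured means the review date and open vendor questions can be queried later instead of being buried in a document.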

How to use it

Use the checklist before adoption becomes invisible.

The strongest AI governance starts before teams normalize a tool. This checklist creates just enough friction to make adoption deliberate.

  • Approve: Low-risk use case, clear data boundaries, documented controls.
  • Condition: Useful tool, but training, retention, or admin controls need verification.
  • Block: Sensitive data exposure, unclear vendor chain, or no workable controls.
  • Review: Re-check after pilot, vendor policy change, or expanded use case.
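The four outcomes above can be read as a decision function over the checklist answers. A minimal sketch, assuming hypothetical boolean flags that stand in for the real review findings:

```python
def decide(sensitive_data_exposed: bool,
           vendor_chain_clear: bool,
           controls_verified: bool,
           low_risk_use_case: bool) -> str:
    """Map checklist findings to approve / condition / block, per the criteria above."""
    # Block on the hard stops first: data exposure or an unclear vendor chain.
    if sensitive_data_exposed or not vendor_chain_clear:
        return "block"
    # Approve only when the use case is low risk and controls are verified.
    if low_risk_use_case and controls_verified:
        return "approve"
    # Otherwise the tool is useful but something still needs verification.
    return "condition"
```

The ordering encodes the policy: hard stops are checked before anything can be approved, and "condition" is the default when the evidence is incomplete.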

EvidenceOps

If the checklist surfaces higher stakes, request a full vendor review.

Request scope