In March 2026, the Office of Management and Budget issued memo M-26-04, directing all CFO Act agencies to complete, by September 30, 2026, an inventory of AI systems used by contractors in the performance of federal services. The memo defines a "high-risk AI use" in contracting as any AI system that makes or informs consequential decisions affecting individuals or critical infrastructure, and it requires agencies to obtain disclosure attestations from affected contractors.

What agencies must do

  • Complete AI use inventories across their active contract portfolio by September 30, 2026
  • For high-risk AI uses, require contractor disclosure attestations (the form is appended to the memo)
  • Post anonymized summaries of contractor AI use to agency open data portals by December 2026
  • Update acquisition plans for major IT programs to include AI governance review checkpoints

What this means for contractors

If your firm uses AI tools in contract performance, including commercially available tools like Copilot, ChatGPT Enterprise, or AI-assisted analytics platforms, expect your contracting officer's representative (COR) or contracting officer (CO) to ask about them in the coming months. The memo's disclosure framework distinguishes three tiers:

  • Tier 1 (high-risk): AI systems used in decisions affecting individuals (benefits, security, enforcement) — full disclosure and documentation required
  • Tier 2 (moderate-risk): AI used in analysis or recommendation tools where a human reviews the output — attestation required but documentation is lighter
  • Tier 3 (low-risk): AI tools for internal productivity (grammar, search, scheduling) — no disclosure required unless specifically asked

Action items

  • Inventory your AI tool usage across all federal contracts now, categorized by tier
  • Update your quality management documentation to reflect human-in-the-loop review steps for Tier 1 and Tier 2 uses
  • Brief your contracts and program management teams — CORs will start asking in Q3 2026
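The first action item, a tier-categorized inventory, can be kept as simple structured records. The sketch below is a hypothetical illustration: the field names and the classification logic are assumptions layered on the memo's three-tier summary above, not language from the memo itself, and the contract number is invented.

```python
from dataclasses import dataclass
from enum import Enum

class Tier(Enum):
    HIGH = 1      # Tier 1: makes or informs decisions affecting individuals
    MODERATE = 2  # Tier 2: analysis/recommendation tool, human reviews output
    LOW = 3       # Tier 3: internal productivity (grammar, search, scheduling)

@dataclass
class AIToolRecord:
    # Hypothetical inventory fields, not prescribed by the memo.
    tool_name: str
    contract_number: str
    use_description: str
    informs_consequential_decisions: bool  # benefits, security, enforcement
    is_analysis_tool: bool
    human_reviews_output: bool

def classify(record: AIToolRecord) -> Tier:
    """Simplified mapping from an inventory record to a disclosure tier."""
    if record.informs_consequential_decisions:
        return Tier.HIGH
    if record.is_analysis_tool and record.human_reviews_output:
        return Tier.MODERATE
    return Tier.LOW

# Example: an applicant-screening tool on an invented contract number.
rec = AIToolRecord(
    tool_name="ResumeScreener",
    contract_number="47QTCA26D0001",
    use_description="screens job applicants for a staffing task order",
    informs_consequential_decisions=True,
    is_analysis_tool=True,
    human_reviews_output=True,
)
print(classify(rec))  # Tier.HIGH, because it affects individuals even with human review
```

Note the ordering in `classify`: a tool that informs consequential decisions lands in Tier 1 regardless of human review, which only matters for downgrading analysis tools to Tier 2.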

Sources