The Federal Acquisition Regulatory Council (FAR Council) published an interim rule in March 2026 establishing baseline requirements for artificial intelligence transparency in federal contract performance. The rule, effective 60 days after publication in the Federal Register, requires both agencies and contractors to take new steps when AI tools are used to deliver contracted services, per the Federal Register notice and reporting by Federal News Network.
What the rule requires of contractors
- Disclosure: Contractors must notify the Contracting Officer when AI tools are materially used in performance; "material" means AI-generated outputs are incorporated into deliverables without human review for accuracy
- Data handling: Federal data used to train or fine-tune AI models during performance becomes subject to existing data rights clauses; in short, no silent training on government data
- Bias testing: For AI used in decisions affecting individuals (e.g., benefits determinations, security adjudications), contractors must maintain bias testing documentation
What it does NOT prohibit
The interim rule is disclosure-focused, not a prohibition. Using AI tools in proposal writing, internal analytics, or code generation is not covered unless the output goes directly into a deliverable without human review. The rule explicitly carves out:
- Spelling and grammar tools
- Search and retrieval systems (including AI-assisted search)
- AI tools used solely for contractor internal operations not touching the deliverable
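The disclosure threshold and its carve-outs reduce to a simple decision test. Here is a minimal, purely illustrative sketch of that logic; the category names, field names, and function are assumptions for illustration, not terms from the rule itself:

```python
# Illustrative sketch only: encodes the disclosure threshold described above.
# Category labels and field names are assumptions, not language from the rule.
from dataclasses import dataclass

# Explicit carve-outs listed in the interim rule
CARVED_OUT = {"spell_grammar", "search_retrieval", "internal_only"}

@dataclass
class AIToolUse:
    tool_category: str           # e.g. "code_generation", "spell_grammar"
    output_in_deliverable: bool  # does AI output go into a deliverable?
    human_reviewed: bool         # was that output reviewed for accuracy?

def disclosure_required(use: AIToolUse) -> bool:
    """True when the use meets the 'material' threshold: AI-generated
    output incorporated into a deliverable without human review for
    accuracy, and the tool is not an explicit carve-out."""
    if use.tool_category in CARVED_OUT:
        return False
    return use.output_in_deliverable and not use.human_reviewed

# A code-generation tool whose output ships unreviewed triggers disclosure;
# the same tool with human review, or a grammar checker, does not.
print(disclosure_required(AIToolUse("code_generation", True, False)))  # True
print(disclosure_required(AIToolUse("code_generation", True, True)))   # False
print(disclosure_required(AIToolUse("spell_grammar", True, False)))    # False
```

Note the ordering: the carve-outs are checked first, so even unreviewed output from an exempt tool category never trips the threshold.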
Action items
- Inventory your team's AI tool use on active federal contracts — identify any that meet the "material deliverable" threshold
- Update your quality review SOPs to document human review steps for AI-generated content
- Watch for agency-specific supplements — NASA, DoD, and DHS have all indicated they will issue class deviations with stricter requirements for sensitive programs