What sections 9 and 10 of the executive order on AI mean for government contractors
On October 30, 2023, President Biden signed an Executive Order (EO) on the Safe, Secure and Trustworthy Development and Use of Artificial Intelligence. While the EO contains many mandates that will impact commercial entities, sections 9 and 10 of the EO, related to protecting privacy and advancing the federal government’s use of AI, are of particular interest to government contractors, as they contain requirements that will directly impact the government’s development, funding and procurement of private sector AI resources.

Section 9: Protecting privacy

Section 9 orders federal government action to mitigate potential threats to privacy posed by AI. In particular, the EO expresses concern regarding “AI’s facilitation of the collection or use of information about individuals, or the making of inferences about individuals.” Accordingly, the EO obligates the federal government to take several steps to protect people’s privacy when it makes use of AI. To that end, the Office of Management and Budget (OMB) must take steps to identify commercially available information (CAI) procured by federal agencies. CAI is defined in the EO as “information or data about an individual or group of individuals, including their device or location, that is made available or obtainable and sold, leased or licensed to the general public or to governmental or non-governmental entities.”

OMB will also evaluate agency standards for the collection, processing or use of CAI that contains personally identifiable information (PII), and issue a request for information to inform potential revisions to such standards within 180 days of the EO’s publication. Separately, the National Science Foundation (NSF) will promote the research, development and implementation of privacy-enhancing technologies (PETS), including by creating a research coordination network dedicated to advancing privacy research and by working with federal agencies to identify opportunities to incorporate PETS (e.g., AI-generated synthetic data) into agency operations.

What section 9 means for government contractors:

  • Contractors should evaluate whether current or future contract performance may involve CAI, especially CAI containing PII. Contractors who do expect to use CAI in contract performance should be on the lookout for CAI guidance from OMB and/or their contracting agencies.
  • Contractors or other commercial entities seeking to develop or sell PETS products should monitor NSF and other federal government communications for PETS-focused grants, requests for information, or other procurement opportunities.

Section 10: Advancing federal government use of AI

The federal government has been using or developing AI for several years. Section 10 of the EO now provides uniform direction for federal government agency efforts to develop and use AI in their operations. Section 10.1 focuses on the provision of government-wide guidance for agency use, management, and procurement of AI, while Section 10.2 outlines a series of priorities and initiatives intended to improve and accelerate federal hiring of AI talent consistent with recent congressional action, including the National Defense Authorization Act (NDAA) for Fiscal Year (FY) 2024, signed into law on December 22, 2023.

Section 10.1: Providing guidance for AI management

Under section 10.1, the director of OMB will create and chair an interagency council to coordinate agencies’ development and use of AI. Section 10.1 also directs OMB to issue guidance on agency AI governance, advancing AI innovation, and managing risks posed by the federal government’s use of AI.

OMB AI guidance

In accordance with the EO, OMB published its draft AI guidance on November 1, 2023. Per the EO, OMB has 180 days from November 1 to ensure that agency procurement of AI systems and services aligns with its guidance. The guidance applies to the use or procurement of AI by a federal agency, but does not apply to AI used as a component of national security systems or to AI use by select federal intelligence entities (e.g., the Central Intelligence Agency, the Director of National Intelligence and the Defense Intelligence Agency).

On governance, the guidance mandates the designation of a chief AI officer at each agency. The chief AI officer will be tasked with overseeing their agency’s use of AI, representing their agency on the AI interagency council, and carrying out other responsibilities to be determined by OMB. The guidance also provides for the creation of AI governance boards or similar bodies at each agency. The boards are intended to serve as a vehicle for senior agency leadership to coordinate with other agencies on the use of AI and to manage the use of AI within their own agency. The guidance also contains recommendations to agencies on a variety of AI governance topics, including external AI testing or evaluation and documentation of procured AI.

Additionally, the OMB guidance sets forth risk management practices for government uses of AI that could impact people’s rights and safety, which must include EO-mandated practices from the Office of Science and Technology Policy’s (OSTP) Blueprint for an AI Bill of Rights and the National Institute of Standards and Technology’s (NIST) AI Risk Management Framework. The guidance also defines specific federal government uses of AI that are presumed by default to impact rights and safety and are therefore automatically subject to applicable risk management practices.

The guidance also provides recommendations for the federal workforce’s use of generative AI. To promote innovation and the responsible adoption of AI capabilities, the guidance discourages agencies from imposing broad or general bans on agency use of generative AI. Instead, agencies should employ safeguards, training and risk management practices as needed to protect people and information that may be impacted by agencies’ use of AI.

Other section 10.1 provisions

Section 10.1 of the EO tasks OMB with developing a framework to prioritize critical and emerging cloud offerings in the Federal Risk and Authorization Management Program (FedRAMP) authorization process. OMB is instructed to begin by prioritizing “generative AI offerings whose primary purpose is to provide large language model-based chat interfaces, code-generation and debugging tools, and associated application programming interfaces, as well as prompt-based image generators.”

In addition, section 10.1 directs the General Services Administration (GSA), in coordination with OMB, the departments of Defense and Homeland Security, NASA and other federal agencies, to take steps to ease access to government-wide acquisition solutions for AI services and products, potentially including the creation of an AI acquisition resource guide or similar tools for potential government AI vendors.

Section 10.2: Increasing AI talent in government

Section 10.2 facilitates faster and better-coordinated hiring of AI talent, as well as better training of federal employees to make use of AI technologies. Among other mandates, the head of each agency will make AI training and familiarization programs available to employees, managers and leadership working in AI-relevant roles. These training programs should provide enough knowledge of emerging AI technologies for federal employees to assess the risks and opportunities such technologies present. As the Government Accountability Office recently highlighted in its December 14, 2023 report, a deficit in AI talent may be “one of the greatest impediments” to achieving the United States’ AI goals.

What Section 10 means for government contractors:

  • Contractors should note that EO section 10 and the OMB AI guidance are intended to promote (not necessarily restrict) the federal government’s use of AI, so long as such usage is safe and responsible. Contractors who proactively tailor their AI products/services to align with section 10 and the OMB guidance will likely have a leg up on their competitors when it comes to selling their products/services to government customers.
  • Once the guidance is finalized, contractors should consider proactively adopting the risk management practices and safeguards prescribed by the guidance and by any other AI-relevant procurement materials (e.g., the GSA’s AI acquisition resource guide) published by the federal government.
  • Contractors interested in providing cloud-based generative AI products to the federal government should consider exploring prioritized FedRAMP authorization, which is mandatory for all federal agency cloud deployments, including those operated by contractors.

Evan D. Wolff is a partner, and Michael G. Gruden and Laura J. Mitchell Baker are counsels, in Crowell & Moring’s Washington, D.C. office. Counsel Michelle D. Coleman and associate Jake Harrison also contributed.
