
Implications of the AI executive order for business

This resource analyzes the policies and principles reflected in Executive Order 14110, which will have direct and indirect effects on AI governance for years to come.


Published: November 2023



Over the last few years, many countries have released various forms of oversight for artificial intelligence, prompting speculation about how the U.S. would approach this generational challenge. On 30 Oct. 2023, the U.S. took the next step toward answering that question when President Joe Biden issued a comprehensive Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence.

Executive Order 14110 identifies the immense opportunity of AI, acknowledging that it must have guardrails to be truly harnessed for the betterment of society. With safety and responsibility as cornerstones of the order, the Biden-Harris administration is directing government departments and agencies to inform these important efforts. Additionally, the order provides a wide range of new opportunities for input from the private sector, academia and civil society.

The order is guided by eight policies and principles that set the direction for the administration's approach to AI governance, though much of it requires further implementation. However, there is a clear sense of urgency, with deadlines ranging from 45 to 375 days for a wide range of projects, from public consultations to new regulations. The first consultations are already beginning. Immediately after the order's release, the Office of Management and Budget released draft guidance on the federal government's use of AI, along with an invitation for stakeholders to comment.

As you develop AI governance initiatives within your organization, you should understand the policies and principles reflected in the executive order, which will have direct and indirect effects for years to come. One theme is clear in the order: the administration understands that people are at the heart of AI development and use.


Why an executive order?

In the absence of congressional action, the Biden-Harris administration made clear it intends to use every tool available to respond to the priority of AI policy. Executive orders are rules and instructions issued by the president that bind the executive agencies they address. They have the force of law with respect to those addressees and, though principally relevant to them, can have wide-ranging impacts on the economy.

While Executive Order 14110 will lead to an incredible amount of government action in a relatively short period — building on many existing government initiatives — it offers only limited new requirements that apply today. However, the order serves as a watershed moment in its articulation of the types of rules under consideration and its setting of guideposts for the development of best practices. As agencies and regulators continue to engage with the order, the expectations of reasonable AI practices will be informed by what comes next.


The scope of AI guardrails

The order defines AI in broad terms, covering any machine-based system that can make predictions, recommendations or decisions influencing real or virtual environments. However, many of the order's specific requirements apply only to clearly defined higher-risk systems, such as dual-use foundation models. Dual-use foundation models are distinguished both by their size (trained on tens of billions of parameters) and by their potential use for both security and civilian purposes.


Policies and principles

  • Ensuring AI is safe and secure.
  • Promoting responsible innovation, competition and collaboration.
  • Supporting American workers.
  • Advancing equity and civil rights.
  • Protecting the interests of Americans who use, purchase or interact with AI.
  • Protecting privacy and civil liberties.
  • Managing the risks from the federal government's own use of AI and increasing its capacity to regulate, govern and support responsible AI use.
  • Leading global societal, economic and technological progress.


Top takeaways for AI governance professionals

This executive order is a significant step forward for the governance of AI in the U.S. and will have an impact on the development of policies and standards internationally.

Though binding on most federal agencies—excepting independent agencies—the order addresses all aspects of the AI supply chain and is therefore directly relevant when the private sector does business with federal agencies. It is also indirectly relevant for parts of the private sector not engaged with federal agencies because of the AI governance norms that will flow from new standards and evolving best practices.

While many details and requirements will be forthcoming in various consultations, frameworks and regulations, now is the time to act by preparing your organization.

Some considerations to get you started:

  1. Consider fostering AI governance practices within your organization, such as:
    • Appointing a chief AI officer or another accountable individual or group within your organization.
    • Training individuals involved in the development and use of AI across the organization. For example, check out the IAPP AI Governance Professional training.
    • Developing policies and procedures within your organization, including reporting, documenting and tracking the use of AI and data involved throughout the lifecycle.
    • Introducing accountability and compliance mechanisms for AI, including laying the foundation for likely future reporting requirements.
    • Developing AI assurance practices, such as tests, evaluations, assessments, and audits for your people, processes and tools.
  2. Familiarize yourself with the NIST AI RMF and other emerging AI standards development processes.
    • If you work in or with the U.S. government, future AI guidance and any sectoral regulation will likely align with the AI RMF and future NIST standards.
    • Support for implementing the concepts in the AI RMF can be found in the companion playbook.
    • Interested organizations can also help shape standards development through NIST's open call for participation in the AI standards development process.
  3. Prepare to align with guidance and direction provided for federal departments and agencies if you work for an AI vendor that sells to the government.
  4. Start to review workforce impacts due to increased AI use, by:
    • Being aware of the data you are collecting for use in AI systems. For example, it is likely new rules will be required for collecting and using employee data in the workplace to ensure civil rights and liberties.
    • Thinking about what jobs will change and how you are preparing for an AI-enabled workforce, including requirements for new AI roles in technical, operational or governance teams.
  5. Keep aware of forthcoming AI rules and regulations:
    • The order and the accompanying OMB memo contain many clues about the types of requirements that may appear in future frameworks, guidance and regulations. While one general-purpose law for AI is unlikely, it appears there will be several sectoral and use-case-specific rules.
  6. Look for new grants and funding opportunities if you plan to expand your business to include AI.
  7. Keep an eye out for upcoming opportunities if you are an AI expert or are looking for a career change. It appears that there will be many federal government jobs available in the near future.

Key items to watch

The executive order sets out 150 requirements for federal agencies, with variable scope and timing. For a full list of these requirements, see the Safe, Secure, and Trustworthy Tracker developed by the Stanford Institute for Human-Centered AI.

The IAPP AI Governance Center will provide updates on relevant work completed under these requirements as they evolve. Here are some key items we will be watching:

  • Development of new frameworks, guidance and/or regulations for AI in criminal justice, law enforcement, transportation, hiring, education and health care.
  • Continued development of NIST standards to promote the safe and responsible design, development and use of AI.
  • Additional safeguards that will be required of companies developing large language models, operating large-scale computing clusters or developing high-risk applications with cyber- or biosecurity implications, in addition to the new reporting requirements.
  • The U.S. government's approach to open-source model development.
  • Clarity around new AI talent rules to attract and retain AI expertise to work and study in the U.S.
  • AI standards development for organizations, whether through procurement requirements, government standard-setting or multistakeholder engagement.
  • Credentialing requirements for AI engineers and other AI roles, including AI governance professionals.

While high expectations have been set for this work, it is important to note the limitations of an executive order, which does not carry the same enforcement mechanisms as a law. As a result, changes in direction could occur with changes in administration. In addition, the president is unable to authorize new funding to complete the work prioritized within the order. This means changes to budgets, or existing funding constraints, could affect the capacity for projects to be completed within the identified time limits.

However, even if milestones are missed, the impact of this order will be felt for years.

