Pressure continues to mount in the U.S. to keep data privacy and protection at the forefront. With the California Consumer Privacy Act set to go into effect in January 2020, consumers, businesses, governmental entities and other third parties are closely watching the data privacy developments taking shape in the U.S.
The Algorithmic Accountability Act of 2019
On April 10, Sens. Cory Booker, D-N.J., and Ron Wyden, D-Ore., sponsored the Algorithmic Accountability Act of 2019 in the U.S. Senate, with a House of Representatives equivalent sponsored by Rep. Yvette Clarke, D-N.Y. The bill, also referred to as S. 1108, is influenced by the GDPR and the CCPA, and directs the Federal Trade Commission to require entities that use, store or share personal information to conduct automated decision system impact assessments and data protection impact assessments.
Those entities that would be mandated to adhere to the Accountability Act are referred to as “covered entities.” The Accountability Act’s definition of covered entities includes any person, partnership or corporation over which the FTC has jurisdiction under Section 5(a)(2) of the FTC Act (15 U.S.C. 45(a)(2)) that had $50,000,000 in average annual revenue over the past three years, or that possesses or controls personal information on more than 1,000,000 consumers or 1,000,000 consumer devices.
The bill mandates conducting a data protection impact assessment
One of the key requirements proposed by the Accountability Act is to mandate that covered entities conduct a DPIA under certain circumstances. A DPIA is defined as “a study evaluating the extent to which an information system protects the privacy and security of personal information the system processes.”
The Accountability Act’s overall intent is to prevent the risks and harms that automated decision systems pose to the privacy and security of consumers’ personal information. Among the risks mentioned that could negatively impact consumers are privacy and security violations, inaccuracies, bias and discrimination, all of which can result from automated decisions.
Other key requirements of the Accountability Act
The bill would require covered entities to conduct "an assessment of the relative benefits and costs of the automated decision system in light of its purpose, taking into account relevant factors, including:
- data minimization practices;
- the duration for which personal information and the results of the automated decision system are stored;
- what information about the automated decision system is available to consumers;
- the extent to which consumers have access to the results of the automated decision system and may correct or object to its results; and
- the recipients of the results of the automated decision system.”
To avoid the above-referenced risks to consumers, covered entities are required to implement technological and physical safeguards. Special attention is given to what is defined as a “High-Risk Automated Decision System” or a “High-Risk Information System.” Systems are categorized as “high-risk” when, among other things, they involve “the personal information of a significant number of consumers regarding race, color, national origin, political opinions, religion, trade union membership, genetic data, biometric data, health, gender, gender identity, sexuality, sexual orientation, criminal convictions, or arrests.”
The Accountability Act states that, once it is enacted, the FTC would have up to two years to promulgate regulations in accordance with Section 553 of Title 5, U.S. Code. Those regulations must require covered entities to conduct a DPIA “of existing high-risk information systems, as frequently as the Commission determines is necessary.” Publication of the DPIA is optional and left to the sole discretion of the covered entity.
Enforcement of the Accountability Act
The Accountability Act provides for enforcement by the FTC under the unfair or deceptive acts or practices provisions of Section 18(a)(1)(B) of the FTC Act (15 U.S.C. 57a(a)(1)(B)). However, it also allows state attorneys general to “bring a civil action on behalf of the residents of the State in an appropriate district court of the United States to obtain appropriate relief.”
The enforcement net for the Accountability Act has been cast very wide, as it also allows for actions to be brought by other state officials. More specifically, it states “in addition to a civil action brought by an attorney general under paragraph (1), any other officer of a State who is authorized by the State to do so may bring a civil action under paragraph (1), subject to the same requirements and limitations that apply under this subsection to civil actions brought by attorneys general.”
The Accountability Act does not preempt state law; it expressly states that nothing in the act may be construed to preempt any state law. The message being conveyed to potential covered entities is that they are expected to ensure fairness, impartiality and transparency in their automated decision-making, which should provide consumers with enhanced data privacy and protection from algorithmic biases and risks.
Over the past several years, there has been a heightened focus on data privacy and protection. All indications point to the fact that this will only increase for the foreseeable future — and on a global level.