For a region like Latin America, where access to credit and banking is especially challenging and often limited by consumers' credit history, it is important to understand the role of the companies that generate these predictive instruments and how transparent they are with consumers.
Chile is expected to enact a new personal data protection law. Like the EU General Data Protection Regulation, the draft grants data subjects the right to object to decisions based on the automated processing of their data, as well as the right to access information on the logic used in such decisions. However, a question arises as to whether the proposed regulation will apply to consumer reporting companies that use predictive models to generate credit risk scores, which are decisive in determining access to financial products.
Financial institutions use credit risk scores to evaluate and predict a consumer's credit behavior. Scoring models apply mathematical and statistical methods to compute a consumer's specific level of default risk, drawing on information, such as past insolvency or default on obligations, that is commonly shared among financial entities. Scoring companies generate these predictive values, assign a credit risk level based on the resulting score, and share a report with financial institutions, which then evaluate the findings and set the conditions of the financial products and services offered. The companies do not necessarily have a direct relationship with the consumer, so any dispute must be raised through the financial entity.
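To make the mechanics concrete, the following is a minimal sketch of how such a scoring model might work, using a logistic regression that converts consumer attributes into a probability of default and then scales it to a familiar score range. The variables, weights, base score and points-to-double-the-odds scaling are all illustrative assumptions, not any actual provider's model.

    import math

    def default_probability(features, weights, intercept):
        # Logistic regression: a weighted sum of consumer attributes
        # mapped to a probability of default between 0 and 1.
        z = intercept + sum(weights[name] * value for name, value in features.items())
        return 1 / (1 + math.exp(-z))

    def risk_score(p_default, base=600, pdo=50):
        # Scale the odds of non-default to a score. "Points to double the
        # odds" (pdo) is a common industry convention; the base score and
        # pdo values used here are arbitrary examples.
        odds = (1 - p_default) / p_default
        return base + pdo * math.log2(odds)

    # Hypothetical consumer with one past default and a moderate debt burden.
    weights = {"past_defaults": 1.2, "debt_to_income": 2.0, "years_of_history": -0.1}
    consumer = {"past_defaults": 1.0, "debt_to_income": 0.4, "years_of_history": 5.0}
    p = default_probability(consumer, weights, intercept=-3.0)
    print(f"P(default) = {p:.2f}, score = {risk_score(p):.0f}")

Note how a single input, such as a recorded past default, can shift the final score substantially, which is why the accuracy of the underlying data is so consequential.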
The problem is that credit score reports are often the only instrument used to evaluate consumers' access to financial products like mortgages, credit cards and bank accounts, so the accuracy and completeness of these reports are critical. Inaccurate credit score reports cause great harm by reducing the chances of accessing credit and banking services. The situation is particularly acute for low-income families who, for example, rely on short-term loans to meet their monthly living expenses.
The fact that consumers can only dispute the information contained in these reports with financial institutions puts them at an enormous disadvantage, leaving them no way to access details on the logic used to create their score. Because financial institutions are merely recipients of a credit score report, they cannot provide this information, which consumers need to assess whether their score is objective, given the risk of discrimination based on the data used.
Despite the need for greater transparency, algorithm-based scoring methods are considered confidential and are protected as trade secrets, both to prevent other companies from copying the models and to prevent manipulation by consumers. The absence of a standard, easily verifiable mathematical model across companies has further contributed to the lack of transparency in the credit industry.
The opacity surrounding existing rating methods has been criticized for preventing consumers, stakeholders and regulators from challenging these models. Even so, existing credit-rating systems remain an accepted way of measuring a person's financial health and are generally regarded as fair and objective.
A 7 Dec. 2023 ruling by the Court of Justice of the European Union on automated individual decision-making could resolve this dilemma at the European level. The decision, which prohibits certain automated credit scoring and extended data retention practices under the GDPR, analyzes the scope of the regulation's Article 22 and specifically states that the concept of a "decision" can extend to companies that develop predictive tools for third parties without directly intervening in the decisions those third parties subsequently make. In other words, according to the CJEU, the mere act of producing a risk score falls within the scope of automated individual decision-making insofar as the score is decisive for the third party to whom it is transmitted when establishing, executing or terminating a contractual relationship with a given person.
Article 8 of Chile's draft law replicates Article 22 of the GDPR. On automated individual decisions, the provision states, "The data subject has the right to object to and not be subject to decisions based on automated processing of his personal data, including profiling, which produces legal effects on him or significantly affects him." Unlike the GDPR's Article 22, the provision does not require that the decision be based "solely" on the automated processing of data, which could give this right wider applicability, since it will be sufficient for automated processing to be a factor in the decision.
Additionally, like the GDPR, the draft gives data subjects a right of access to information. Article 5 specifically refers to data subjects' right to request "significant information on the logic applied in the event that the data controller carries out data processing" in accordance with Article 8. Therefore, if an interpretation similar to the CJEU's were applied, consumers in Chile could object to companies conducting automated processing of their data, unless the processing is necessary for the performance of a contract, their prior and express consent is given, or the processing is authorized by law. Consumers could also directly ask the companies to provide information on how their scoring and weighting logic is developed.
Unlike the GDPR and other EU regulations, Chile's draft law does not include recitals to help interpret Article 8's scope, especially on whether the concept of a decision extends to entities that do not directly participate in determining whether a loan or other financial product is granted. The administrative interpretation of this rule by the Personal Data Protection Agency, which would be created under the draft legislation, will therefore be relevant, as will the interpretation of courts in the cases that arise.
Decisions based solely on automated processing can endanger consumers by creating situations of discrimination that violate individuals' fundamental rights. The scope of Article 8 of Chile's draft law must therefore be understood within the protective logic that informs all personal data protection regulation, and in line with the comparative experience of not leaving individuals subject to these decisions unprotected.
Editor's Note:
The views and opinions expressed in this article are the author’s alone and do not necessarily represent the views or opinions of Uber or any of its subsidiaries, affiliates or employees.