
Daily Dashboard | ITIF: Privacy law mirroring GDPR, CCPA could cost U.S. $122B


A report from the Information Technology and Innovation Foundation found that a law mirroring either the EU General Data Protection Regulation or the California Consumer Privacy Act could cost the U.S. $122 billion per year. Should the U.S. Congress pass “a more targeted set of privacy protections,” it could still protect citizens' privacy and reduce costs to $6.5 billion a year, according to the ITIF report. “The costs of regulations can be significant, and are justifiable only when their benefits both outweigh their costs and are achieved in the most cost-effective way,” write ITIF Vice President Daniel Castro and Senior Policy Analyst Alan McQuinn. “Neither of these are true with regard to the kind of overly restrictive data protection law many privacy advocates have demanded.”
Full Story

1 Comment


  • Trevor Mead • Aug 6, 2019
    Deeply put off by this report for several reasons, mostly due to credibility issues:
    1. Clear bias throughout. The authors lament the costs of "unnecessarily stringent" privacy regulations, without acknowledging the benefits afforded to consumers through those regulations or why stringent regulations were adopted in the first place. The piece is a catalog of industry gripes and one-sided arguments which boil down to "regulation is expensive, and we think some consumers are sometimes mildly inconvenienced by privacy regulations, therefore privacy regulations should be curtailed as aggressively as possible or avoided completely." 
    2. Poorly sourced or supported arguments. A staggering number of the claims made are either guesses due to a lack of available data or refer back to the same authors' opinions from the same think tank, with equally thin original sources. Take this nugget as an example: "Unfortunately, opt-in requirements frame consumer choices in a way that leads to suboptimal data sharing because most users select the default option of not giving consent—for a number of irrational reasons." Without acknowledging the harms to consumers of increased data sharing, the increased risk to organizations storing excessive personal information, the benefits to consumers of more control, choice, and transparency over the sharing of personal information--or any of the other perfectly rational reasons to opt out of superfluous sharing with undisclosed third parties--the author references his own opinion piece as a supporting citation, which somewhat arbitrarily and perversely concludes that privacy regulations only burden consumers and pit consumers against each other.
    3. Distressingly bad math. Since the authors openly invent numbers where hard data doesn't exist, they make some strange assumptions to better pad their figures. First, their estimated cost to US companies of implementing a GDPR-style compliance program assumes no privacy-related staffing resources are available and that no prior privacy-related work (i.e., GDPR, CCPA, or related compliance exercises) has been performed--while in the same breath noting 68% of US companies with over 500 employees are spending an estimated $1-10 million on GDPR compliance alone. Second, their estimated cost of a smaller, "targeted" approach completely disregards key expenses: the authors flatly confirm, "We did not estimate costs for federal or state agencies in this report." This is worse than comparing apples to oranges; it's intentionally misrepresenting what the apples and oranges are. Lastly, the calculated costs again fail to take privacy benefits into consideration: in estimating that cookie banners would cost the US economy $1.9 billion in lost productivity (based on the arbitrary figure of 2 seconds of interaction per adult), for example, the authors decline to weigh that figure against the increased productivity resulting from fewer spam messages to consumers--messages which cost the US economy $20-21 billion annually in lost productivity alone, more than tenfold the cost the authors are concerned with.
    4. The proposed solution is a joke. To save on all these costs--unsupported and exaggerated estimates untethered from the moorings of countervailing evidence, mind you--the authors call for the adoption of a framework which applies only to "sensitive information in certain industries," not personal information as understood by the industry at large. Not only does this cut against the prevailing trend of increased data privacy protections for consumers in the face of increasingly efficient, asymmetrical, and invasive data processing practices by organizations, but reducing the scope of privacy protections to such a narrow sliver of data would eviscerate consumer protections and encourage the continued abuse of inferences drawn from personal information--abuses which led to calls for increased privacy regulation in the first place. Device fingerprinting information is not sensitive, but it can be used in ways which reveal sensitive information about individuals. The same goes for zip codes, which are far from sensitive yet are used as a proxy for potentially sensitive information related to race, education, income, and political affiliation. This proposal would apply to neither, and the authors offer no justification for slashing consumer rights beyond an invented price tag, with no suggestion of how to address the remaining and substantial consumer privacy concerns GDPR- and CCPA-like regulations were developed for.
    Any serious policy proposal must address both the costs and benefits of privacy regulations, and must place consumer protections above cost justifications. The goal of privacy regulation is to increase consumer control over personal information and decrease consumer harm from privacy incidents--a goal which, when regulations are implemented correctly, benefits consumers and organizations alike.
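    The "more than tenfold" comparison in point 3 is easy to sanity-check. A minimal back-of-envelope sketch in Python, using only the dollar figures quoted above ($1.9 billion for cookie banners vs. $20-21 billion for spam) and nothing from the underlying report:

```python
# Back-of-envelope check of the "more than tenfold" claim, using the
# figures cited in the comment above: $1.9B/year (ITIF's cookie-banner
# productivity estimate) vs. $20-21B/year (the quoted annual
# productivity cost of spam).
banner_cost_bn = 1.9       # cookie-banner estimate, $ billions per year
spam_cost_low_bn = 20.0    # low end of quoted spam cost, $ billions per year
spam_cost_high_bn = 21.0   # high end of quoted spam cost

low = spam_cost_low_bn / banner_cost_bn
high = spam_cost_high_bn / banner_cost_bn
print(f"spam costs {low:.1f}x to {high:.1f}x the banner estimate")
# → spam costs 10.5x to 11.1x the banner estimate
```

    So the ratio is roughly 10.5x to 11.1x, and "more than tenfold" holds at both ends of the quoted range.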