Greetings from Portsmouth, New Hampshire!
As I write this, I am only hours removed from getting my second dose of the COVID-19 vaccine. As with the first shot I received three weeks ago, I feel immeasurable relief, tentative optimism for the future and a decently sore left arm. For the first time since this pandemic began, I am now looking at the months ahead with joy rather than dread. It’s a nice change of pace, I must say.
By the way, the 5G signals haven’t seemed to kick in just yet, but I can locate all ham radios and eight-track players within a 100-mile radius. I guess that’s something?
Now onto the world of privacy.
Artificial intelligence has been the name of the game this week. The biggest news came from across the pond as the European Commission published its highly awaited proposal for regulating AI. The proposed regulations could be a legitimate game-changer for the industry, and as Hogan Lovells Senior Associate Dan Whitehead wrote for the IAPP, they may also be “of real consequence to the privacy profession” in particular.
While the commission’s proposal may be the headliner of the week, the U.S. Federal Trade Commission published a blog post of its own that privacy professionals may want to keep an eye on.
In the simplest terms, the FTC is warning organizations to be careful when handling AI, because if they aren't, the agency will come after them. The FTC even said the sale or use of racially biased algorithms may constitute a violation of Section 5 of the FTC Act.
The FTC’s declaration may signal a new era in which these harmful algorithms are addressed. University of Washington School of Law Assistant Professor Ryan Calo seems to think so, as he called the agency’s messaging “a very stark example of what looks to be a sea of change.”
Of course, the potential fines organizations may face for using and selling racially biased algorithms shouldn’t be the main reason the practice must be stopped. Far from it, actually. The first, second, third and fourth through tenth reasons these algorithms must be dealt with are that they greatly harm minorities — human beings whose lives are affected by forces they cannot control.
Everything starts and ends there. Tackling these algorithms isn’t going to be the cure-all that fixes everything in tech, and it certainly won’t remedy all the issues this country has to face, but it’s a step, however small, in the right direction.