Corporate Spy Tech and Inequality: 2023 Year in Review


Our personal data, and the ways private companies harvest and monetize it, play an increasingly powerful role in modern life. Throughout 2023, corporations have continued to collect our personal data, sell it to governments, use it to draw inferences about us, and exacerbate existing structural inequalities across society.

EFF is fighting back. Earlier this year, we filed comments with the U.S. National Telecommunications and Information Administration addressing the ways that corporate data surveillance practices cause discrimination against people of color, women, and other vulnerable groups. Data privacy legislation is civil rights legislation, and we need it now.

In early October, a bad actor claimed to be selling data stolen from the genetic testing service 23andMe. The initial trove included the display names, birth years, sex, and some genetic ancestry details of one million users of Ashkenazi Jewish descent and another 100,000 users of Chinese descent. By mid-October, the breach had expanded to another four million accounts. It remains unclear whether the thieves deliberately targeted users based on race or religion. EFF provided guidance to users on how to protect their accounts.

When it comes to corporate data surveillance, users’ incomes can alter their threat models. Lower-income people are often less able to avoid corporate harvesting of their data: some lower-priced technologies collect more data than their pricier counterparts, and others ship with malicious programs pre-installed. This year, we investigated the low-budget Dragon Touch KidzPad Y88X 10 kids’ tablet, purchased from Amazon, and found malware and pre-installed riskware on the device. Lower-income people may also suffer the most from data breaches, because it costs money and takes considerable time to freeze and monitor credit reports and to obtain identity theft prevention services.

Disparities in whose data is collected by corporations lead to disparities in whose data is sold by corporations to government agencies. As we explained this year, even the U.S. Director of National Intelligence thinks the government should stop buying corporate surveillance data. Structural inequalities shape whose data is purchased by governments, and when government agencies have access to the vast reservoir of personal data that businesses have collected from us, bias is a likely outcome.

This year we’ve also repeatedly blown the whistle on the ways that automakers stockpile data about how we drive, and about where self-driving cars take us. There is an active government and private market for vehicle data, including location data, which is difficult if not impossible to de-identify. Cars can collect information not only about the vehicle itself but also about what’s around it. Police have seized location data about people attending Black-led protests against police violence and racism. Further, location data can have a disparate impact on consumers, who may be penalized simply for living in a particular neighborhood.

Technologies developed by businesses for governments can yield discriminatory results. Take face recognition, for example. Earlier this year, the Government Accountability Office (GAO) published a report highlighting the inadequate or nonexistent rules for how federal agencies use face recognition, underlining what we’ve said over and over again: governments cannot be trusted with this flawed and dangerous technology. The technology all too often fails, particularly when applied to Black people and women. In February, Porcha Woodruff was arrested by six Detroit police officers on charges of robbery and carjacking after face recognition technology incorrectly matched an eight-year-old image of her (from a police database) with video footage of a suspect. The charges were dropped, and she has since filed a lawsuit against the City of Detroit. Her lawsuit joins two others against the Detroit police over incorrect face recognition matches.

Developments throughout 2023 affirm that, to end the disparate impacts of corporate data processing, we need to reduce the amount of data that corporations can collect and sell. EFF has repeatedly called for such privacy legislation. To be effective, it must include strong private enforcement and prohibit “pay for privacy” schemes that hurt lower-income people. In the U.S., states have been more proactive and more willing to consider such protections, so legislation at the federal level must not preempt state legislation. The pervasive ecosystem of data surveillance is a civil rights problem, and as we head into 2024 we must continue to treat privacy and civil rights as parts of the same fight.

This blog is part of our Year in Review series. Read other articles about the fight for digital rights in 2023.



