The FTC May (Finally) Protect Americans From Data Brokers


On August 29, the Federal Trade Commission announced that it had filed a landmark lawsuit against data broker Kochava for “selling geolocation data from hundreds of millions of mobile devices” that can be used to track people’s time-stamped movements to and from sensitive locations. These include reproductive health clinics, places of worship, addiction recovery centers, and shelters for the homeless and for survivors of domestic violence. Kochava, the FTC alleges, “is enabling others to identify individuals and exposing them to threats of stigma, stalking, discrimination, job loss, and even physical violence.”

In response, the Idaho-based company says it “operates consistently and proactively in compliance with all rules and laws, including those specific to privacy.” In other words, Kochava fell back on a standard defense from the data broker’s playbook: Well, it’s legal.

But that’s like saying you’ve read every book on a subject when all that’s been written is a waiting-room pamphlet. In a colossal failure of US policymaking—and in many cases the product of deliberate attempts to undermine or neglect the privacy of marginalized people—the US has generally weak privacy laws. Very few laws in the US even relate to data brokers, let alone restrict their actions. But the fact that Kochava has broken no law does not make its behavior harmless, nor does it make the company immune from lawsuits. The FTC case could establish that this type of data surveillance, monetization, and exploitation is an unfair or deceptive business practice, exposing brokers to penalties. And the agency has the argument to get there.

Despite the lack of privacy laws, the FTC can still take companies to court for engaging in “unfair or deceptive acts or practices.” FTC lawsuits against data brokers are not unprecedented, but they have typically focused on behavior such as facilitating criminal scams. By suing Kochava for brokering individuals’ geolocation data without their knowledge and exposing them to risk, the FTC is effectively pushing for a broader basis for action against the harms of data brokering.

While data brokers and other tech companies (from Experian to financial data broker Yodlee) have absurdly claimed that their data is “anonymized,” Kochava’s billions of data points are anything but. The company sold mobile advertising IDs, which allow marketers to track the person behind a device, combined with people’s location information, allowing a buyer to “identify the user or owner of the mobile device,” the lawsuit states. Kochava also put individuals at risk in a simpler way: if you have someone’s entire location history, you can easily discover their identity. A phone sitting on a nightstand from 10 pm to 6 am, for example, might indicate a home address, just as a phone in the same office building or retail store from 9 to 5 might indicate a workplace. The FTC says Kochava knew this and even tried to profit from it, suggesting “Household Mapping” as a potential use case for its data on the Amazon Web Services marketplace, where a buyer could group devices by the time and frequency they appear at shared locations in order to map individual devices to homes. Selling this information blatantly puts many people, especially the already marginalized and vulnerable, at risk.

The entire business model of data brokers is based on secretly collecting, analyzing, and selling or otherwise monetizing people’s information. Within the realm of location data alone, companies have been caught advertising the real-time GPS locations of Americans, quietly surveilling Black Lives Matter protesters to profile individuals, and providing location data to law enforcement agencies such as the FBI and Immigration and Customs Enforcement (ICE) without a warrant. Even after the Supreme Court overturned Roe v. Wade, numerous data brokers continued to sell location data related to abortion clinic visits, and some agreed to stop only when pressed by journalists and members of Congress. Earlier this month, NextMark CEO Joe Pych told Politico, in a supposed defense of his own company’s behavior, that “to my knowledge, there’s no law today that prohibits prenatal mailing lists.” Whether these practices fuel domestic and intimate partner violence, enable unjustified surveillance of over-policed communities, or put women and LGBTQ+ people at risk of harassment and physical violence, many data brokers continue selling location data anyway.

If data brokers look at the surveillance of vulnerable communities and claim not to understand the harm of collecting and selling this kind of data, they are either outright lying or simply don’t care. If they are secretly collecting people’s locations, linking that data to individuals, and selling it on the open market, making it easier to track people going to churches and mosques, hospitals and health clinics, queer nightclubs and anti-police demonstrations, then “it’s not illegal” is no defense: they are pushing bad-faith arguments. In a state of constant surveillance and an absence of privacy regulation, legality is not the measure of harm.

Critically, the agency alleges that Kochava violated the “unfair or deceptive acts or practices” clause of the FTC Act, because it unfairly sells highly sensitive location information that poses a risk of “substantial injury” to consumers. People who are tracked without fully knowing about and understanding the surveillance cannot reasonably avoid such harm on their own. So, for all that Kochava claims the FTC is perpetuating “data privacy misinformation,” this case may further establish that brokering people’s highly sensitive information is legally actionable.

As state legislatures remain slow to enact more privacy laws and congressional initiatives on the issue stall, with some members refusing even to touch the harms of data brokering, the FTC’s case may be the best option the country has. The agency should press its case hard and take every step to link the sale of location data to outcomes such as harassment, discrimination, and other forms of data exploitation; otherwise, this highly sensitive information will remain on the open market and continue to harm millions of Americans.

