Policy

Why the FTC's Worries About Online Data Brokers Are Overblown


The Federal Trade Commission is worried that online retailers may know too much about you thanks to online "data brokers" that collect, categorize, and sell information about potential customers. Unsurprisingly, its top official wants Congress to pass a law giving the agency power to regulate on your behalf.

The data broker industry "largely operates in the dark," FTC Chair Edith Ramirez told reporters yesterday, according to Ars Technica. "We want to shed the veil of secrecy that surrounds data broker practices."


"The Commission recommends that Congress consider enacting legislation to make data broker practices more visible to consumers and to give consumers greater control over the immense amounts of personal information about them collected and shared by data brokers," said an agency statement released yesterday.

The FTC is not acting because consumers are worried that retailers know too much about them. The agency's presumption is that consumers don't know what's going on, and that the agency should therefore be given the authority to act on their behalf.

Along with the statement calling for legislation, the agency also released a detailed report examining the practices of data brokers. The report focuses heavily on the industry's practice of collecting personal information from a variety of sources and then organizing that information into potentially targetable silos for retailers.

"Data brokers infer consumer interests from the data that they collect," the report says. "They use those interests, along with other information, to place consumers in categories." Some are based on income and ethnicity, others on life events such as pregnancy or geographic features. This information is then combined with offline data to market to online consumers.

To its credit, the report notes that there are benefits to this sort of activity, including fraud prevention, better products, and tailored advertisements. But it also warns about potential risks, including security breaches, identity confusion, and the possibility that "they may facilitate the sending of advertisements about health, ethnicity, or financial products, which some consumers may find troubling and which could undermine their trust in the marketplace."

That last bit strikes me as a potentially real problem, but not one that the FTC needs to be involved with. If consumers turn on a company for sending ads that seem too creepy, then the company has a pretty good incentive to reverse course, or at least to moderate the practices that consumers find creepy. In fact, we've seen that happen already: Mega-retailer Target sparked a backlash by too aggressively targeting pregnant women—in one case actually figuring out that a teenage girl was pregnant before her father did. Eventually, the store dialed back its marketing, mixing ads for yard equipment and other unrelated products in with the material it was targeting to pregnant women. When it did, it found that pregnant women were in fact happy to use the coupons and specials sent directly to them.

I'm sure that still seems at least a little bit creepy to some people once they find out about it. But it suggests a few things about how this sort of targeting works. First, companies will change their targeting behaviors in response to negative consumer reactions. Second, consumers are worried mostly about the feeling that a company's data collection efforts are overly invasive; once that sensation is gone, people seem happy to use the discounts and specials—the benefits—that come from such targeting. It's not that they don't want companies to market to their needs; it's that they don't want to feel like they're being spied on.

I'm fairly sympathetic to some of the privacy concerns that marketing practices like these inevitably raise. Companies that collect this sort of data shouldn't be able to mislead people about their practices, and in general, it ought to be possible to opt out of many types of data collection, although that probably means opting out of some services and benefits as well.

But the deep fears that this sort of marketing seems to stoke, and the related calls for regulation, have also struck me as rather overblown. These companies are just doing a more sophisticated version of what sales representatives have done for years: attempting to get a better handle on their customers so they can better meet their wants and needs—and sell them more stuff in the process. In the pre-digital era, salesmen would watch for cues, noticing a customer's style of dress or speech or the car they drove, and tailor their pitch to suit. This is just a big-data, digital-era version of the same basic practice. And the result is that potential customers get products designed more to their tastes, ads they might actually be interested in, and, hopefully, a happier experience as a result.

The biggest worry here is not that companies will use this data to nefariously try to sell you stuff you like, but that the government will eventually get hold of it and use it for something else. But somehow I doubt that will be a big part of any new FTC regulations.