Internet entrepreneurs have been trying for years to create "Yelp for people," a service that would take the practice of reductively quantifying other human beings beyond the rarefied realms of beauty contests and the National Football League draft.
In October 2015, the latest aspirant, Peeple, generated a backlash while still in the elevator pitch stage. When The Washington Post reported that the app, which didn't exist yet, would allow users to post both positive and negative reviews of anyone they interacted with, even if the reviewee hadn't opted in to the system, humanity responded with an emphatic thumbs-down.
"Since the interview with the Washington Post, I've received death threats and extremely insulting comments aimed at me, my investors, and my family on almost every social media tool possible," Peeple co-founder Julia Cordray wrote on LinkedIn. But don't look for Cordray to vindictively name names. "Peeple is a POSITIVE ONLY APP," she wrote, pivoting from earlier descriptions of the service. "We want to bring positivity and kindness to the world." It will be "100% OPT-IN" and "there is no way to even make negative comments." Presumably, even the sadface emoji will be persona non grata there.
China may not let its citizens off quite so easily. According to government documents that were translated by Oxford University China expert Rogier Creemers in June 2014, China is developing a "social credit system" that will create a numerical rating for individuals using information drawn from financial transactions, criminal records, and social media behavior. The system, the International Business Times concluded in an April 2015 article, will be designed to "hold all citizens accountable for financial decisions as well as moral choices."
But while China's vision may escalate the idea behind platforms like Peeple from the merely annoying and intrusive into the realm of oppressive and coercive, there is, in fact, a positive case to be made for the liberating virtues of reputation systems. One reason the "Yelp for people" dream remains so persistent is that Yelp for optometrists, life coaches, and laundromats has proven so useful. Instead of relying on marketing and other forms of branding, scattershot news media coverage, and government regulations designed to protect consumers (which may or may not succeed), we can now easily access detailed information about people's actual past experiences with businesses, products, services, places, and more. Yes, reviews that appear on Yelp, TripAdvisor, Amazon, and elsewhere may contain inaccurate or deliberately misleading information, but these systems work so well on the whole that millions of people regularly consult them before making decisions about how to spend their time and money.
Just as important, millions of retailers, manufacturers, service providers, and commercial entities of other kinds now embrace reviews, too. In the early days of e-commerce, many businesses considered the prospect of uncontrollable consumer reviews threatening. Now, virtually every time you buy a sweater online or book a hotel room, you get an email from the merchant in the wake of the experience, asking for feedback and assessment. In the Internet era, the after-sell is often harder than the sell.
Merchants mass-nag their customers in this manner because they know people tend to provide positive feedback more often than negative feedback—numerous studies have shown that higher review volume correlates with higher overall ratings, and higher ratings lead to more sales.
In theory, individuals can also benefit from this new era of hyper-scrutiny, and to a certain extent, of course, they already do—at sites like RateMyTeacher.com, LinkedIn, eBay, Uber, Airbnb, Yelp itself, and many others. In fact, on transaction-oriented platforms such as eBay, Uber, and Airbnb, reputation systems aren't just tolerated—they're embraced as the mechanism that provides the trust that helps these marketplaces function efficiently.
As Washington Post reporter Caitlin Dewey suggests in her coverage of Peeple, this kind of reputation assessment is generally considered permissible because it involves an economic transaction of some kind. "You paid thousands of dollars to take that class, so you're justified and qualified to evaluate that transaction," she writes.
But most existing reputation systems are also flawed from an individual's point of view, because the results can rarely if ever be transferred from one place to another. In the digital age, character may be capital, but the character capital you accrue through years of conscientious eBay transactions isn't fungible. It has value only, or at least primarily, on one website. (You could, at best, highlight your high eBay global feedback score on your LinkedIn profile.)
Companies like Amazon and Uber have little incentive to make such information portable. The reputation scores you establish on their sites, and the benefits you accrue because of them—more potential transactions because people trust you, fewer problematic interactions because you know whom to avoid—help keep you locked in to these platforms.
But as the gig economy becomes a larger part of people's work lives, and as technologies like Bitcoin, telepresence, and instant translation expand the possibilities for how and with whom people do their jobs, broader forms of reputation will become increasingly attractive. Julia Cordray may not have initially done a good job of assuring the world that Peeple would mitigate the potential for major abuses and give users at least some control over how their reputations are presented, but the platform is, at least, a bid to establish a more all-encompassing and portable reputation system.
Her app will apparently incorporate only one form of information input—reviews from other users. Karma (havekarma.com), a Los Angeles-based site that launched a public beta version last spring without attracting the sort of backlash that greeted Peeple, also encourages user reviews (which it calls "vouches") but is primarily built around incorporating data from other digital platforms and marketplaces, including Facebook, LinkedIn, Airbnb, Etsy, eBay, DogVacay, Twitter, and Foursquare.
Your overall Karma score derives from the sum of your activity on these sites, some of which it weights differently. (For example, reviews from sites that require face-to-face interaction, like DogVacay or Airbnb, are weighted more heavily than reviews from eBay or Etsy.) While Karma is opt-in only, you might still end up with a low overall rating, because the site evaluates you in part on the degree to which you participate on the platforms it monitors. (If your rating is low, it advises, it could be "because of lack of activity.")
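Karma hasn't published its actual formula, but the mechanics it describes — per-platform ratings combined with different weights, discounted for inactivity — can be sketched in a few lines. Everything below is illustrative: the weights, the 0–100 scale, and the activity penalty are assumptions, not Karma's real numbers.

```python
# Hypothetical sketch of a weighted cross-platform reputation score.
# The platform names come from Karma's own list; the weights, scale,
# and activity discount are invented for illustration.

def karma_style_score(ratings, weights, active_platforms, monitored_platforms):
    """Combine per-platform average ratings (0-5) into a single 0-100 score.

    ratings: dict mapping platform -> user's average rating there
    weights: dict mapping platform -> relative weight
             (face-to-face sites like Airbnb weigh more than eBay)
    """
    total_weight = sum(weights[p] for p in ratings)
    if total_weight == 0:
        return 0.0
    # Weighted average of the ratings the user actually has.
    weighted_avg = sum(ratings[p] * weights[p] for p in ratings) / total_weight
    # Scale to 0-100, then discount for inactivity: a user active on only
    # a few of the monitored platforms is marked down, mirroring Karma's
    # note that a low score may reflect "lack of activity."
    activity = len(active_platforms) / len(monitored_platforms)
    return round(weighted_avg / 5 * 100 * activity, 1)

score = karma_style_score(
    ratings={"airbnb": 4.8, "ebay": 4.9},
    weights={"airbnb": 2.0, "ebay": 1.0},   # face-to-face counts double
    active_platforms={"airbnb", "ebay"},
    monitored_platforms={"airbnb", "ebay", "etsy", "dogvacay"},
)
print(score)  # 48.3 -- strong ratings, halved for being on 2 of 4 platforms
```

Note how the activity discount produces exactly the counterintuitive result described above: a user with near-perfect ratings can still land with a middling overall score simply for being absent from half the monitored platforms.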
In their 2010 book Building Web Reputation Systems, Randy Farmer and Bryce Glass point out that eBay feedback scores and other forms of web reputation are highly contextual, and note the dangers of trying to extrapolate a "global user reputation" that can be deployed anywhere from only a few primary values.
But while it's true, as they suggest, that someone who achieves a high eBay feedback score "may in fact also steal candy from babies, cheat at online poker, and fail to pay his credit card bills," that concern diminishes at least somewhat as platforms like Karma expand their scope. Chances are good they'll eventually be able to learn from, say, negative TaskRabbit reviews that you actually are the kind of person who steals candy from babies, and ding you accordingly. And as they draw from a wider and wider range of sites, these services will eventually develop a genuinely thorough record of your actions and behaviors.
Of course, even Karma ultimately has its own economic interests and incentives, just like eBay and Uber. The site is biased toward rewarding participation, because more participation makes its own service more accurate and thus more valuable. And ultimately it may not be willing to tell users exactly how it arrives at its overall scores, because then competitors would have access to that information, too.
What would benefit individuals most is a global open-source system, a Bitcoin for reputation, outside the control of any corporation or government. Unlike a mandatory national system (e.g., China's proposed social credit system) or a corporate one with incentives that don't always align with users' best interests, an open-source system would likely offer the greatest transparency and portability for individual users.
As our opportunities to interact and transact with more and more humans and machines proliferate, so too does the need for reputation systems that can tell us whom to trust and whom to avoid. In time, it will become less and less possible to opt out of this new world without suffering major economic and even social consequences. In such a world, the people need something better than Peeple, a system worth opting into, designed to serve individuals rather than governments or corporations. But that will only happen if people create it.