Orwell comes to China
China’s 1.4 billion people are getting ‘social credit’ scores that rate their trustworthiness—and determine their place in society.
What is social credit?
It’s similar to the credit score system familiar to many Americans, but along with financial information, China’s version will take account of a person’s political activity, social interactions, and purchase history. All that data is fed into a computer algorithm that calculates a citizen’s trust score. Take care of your parents, pay your bills on time, and give to charity and you’ll be rewarded with a high rating, which can get you access to visas to travel abroad and good schools for your children. Run a red light, criticize the government on social media, or sell tainted food to consumers and you could lose access to bank loans, government jobs, and the ability to rent a car. Beijing aims to have the program running by 2020; pilot versions are underway in some 30 cities. “This is like Big Brother,” said Chinese novelist and social commentator Murong Xuecun, “who has all your information and can harm you in any way he wants.”
Why such a sweeping system?
Partly, it’s because China wants to better control its freewheeling and poorly regulated economy, now the world’s second largest. A social credit system will let the government easily punish businesspeople who sell toxic baby formula or rotten meat, as well as bureaucrats who take bribes. “Swindlers have to pay a price,” says Lian Weiliang, vice chairman of the National Development and Reform Commission. China also lacks a national equivalent of the FICO score that U.S. lenders use to assess consumer credit risks, so most Chinese can get credit cards and loans only from their own bank. The social credit score system should result in more lending and less fraud. But for the Communist Party, social credit is mainly a way to push citizens toward approved behaviors—obeying the party, for example, or conserving energy.
How will Beijing score behavior?
By monitoring the wealth of data generated by citizens’ smartphones. Many Chinese have given up on cash and use their phones almost exclusively to pay for goods and services: about $5.5 trillion in mobile payments is made in China every year, compared with roughly $112 billion in the U.S. The most popular smartphone payment portals, Alipay and WeChat Pay, are much more than financial apps. These so-called superapps have built-in social networks and can be used to hail a cab, order food, schedule a doctor’s appointment, or check into a hotel. All of that data can be harvested to build a social credit score. E-commerce giant Alibaba, which owns Alipay, has already built a private forerunner of the planned government system, called Sesame Credit.
How does it work?
An algorithm assigns users a score between 350 and 950. The higher the number, the more perks you get. Low scorers have to pay larger deposits to do things like reserve hotel rooms, and they can be shut out of first-class seats on trains and planes. Personal factors weigh heavily—the degrees you hold, how much time you spend playing video games, and even the scores of your friends. So if your rating drops, your friends have an incentive to shun you, lest their scores dip too. Users can even link their scores to dating apps to screen potential mates. The system, says Sesame Credit CEO Lucy Peng, “will ensure that the bad people in society don’t have a place to go, while good people can move freely and without obstruction.” This is meant literally: Video surveillance will track everyone through facial recognition.
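The mechanics described above — a score clamped to a 350–950 band, weighted personal factors, and friends’ scores pulling on your own — can be sketched as a toy model. To be clear, the real Sesame Credit algorithm is proprietary and undisclosed; every weight, factor name, and formula below is invented purely for illustration.

```python
# Toy illustration of a Sesame-style trust score.
# All weights, factor names, and the blending formula are invented;
# the actual Sesame Credit algorithm is proprietary and undisclosed.

SCORE_MIN, SCORE_MAX = 350, 950  # score band reported for Sesame Credit

def trust_score(factors, friend_scores):
    """Combine weighted personal factors (each 0.0-1.0) with the
    average score of one's friends, then clamp to the valid band."""
    weights = {                      # hypothetical weights
        "bills_paid_on_time": 0.35,
        "education_level":    0.20,
        "charity_giving":     0.15,
        "video_game_hours":  -0.10,  # heavy gaming lowers the score
    }
    base = sum(weights[k] * factors.get(k, 0.0) for k in weights)
    # Scale the weighted sum into the score band.
    raw = SCORE_MIN + base * (SCORE_MAX - SCORE_MIN)
    # Friends' scores pull yours toward their average, which creates
    # the shunning incentive the article describes.
    if friend_scores:
        raw = 0.8 * raw + 0.2 * (sum(friend_scores) / len(friend_scores))
    return max(SCORE_MIN, min(SCORE_MAX, round(raw)))
```

Even in this crude sketch, the social dynamic is visible: holding your own behavior fixed, a drop in your friends’ average score drags your number down, so cutting ties with low scorers becomes individually rational.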
How widespread is that technology?
Some apartment buildings already use facial recognition to unlock their doors, and a growing number of restaurants let customers “smile to pay.” As more apps roll out, they will feed their data into a new government surveillance program called Sharp Eyes, a reference to the Mao Zedong–era system of neighbors informing on one another. Security cameras, ubiquitous in stores and on street corners, will be integrated into that surveillance platform, and artificial intelligence will analyze the mountain of video data. Suspicious behavior will be flagged, potentially affecting a person’s social credit score. “If you know gambling takes place in a location, and someone goes there frequently,” says Li Xiafeng of the facial-recognition firm Cloudwalk, “they become suspicious.”
What if the algorithm makes a mistake?
The consequences will be dire. One element that will be merged into the social credit system is the Supreme People’s Court blacklist of more than 7 million people who have outstanding fines or judgments. Journalist Liu Hu discovered he was on that list last year when he found himself unable to book a flight on a travel app. It turned out he had entered an incorrect account number when paying a fine, and the result was a blanket ban from all travel except the worst seats on the slowest trains. He has since paid the fine correctly but remains on the blacklist. Other people have been blacklisted for minor offenses, including one man who had shoplifted $70 worth of cigarettes. Lin Junyue, an academic seen as the father of social credit theory, says such mistakes and excesses are unfortunately inevitable. Yet “compared with the improvement in the atmosphere of the entire society,” he adds, “their sacrifice is worth it.”