Scoring

Key Insight 

In order for our automated systems to work, they need both our data and a framework for making decisions. Everyone alive today is being scored.   

What You Need To Know

In the U.S., we have a credit reporting system that measures our creditworthiness. Banks, financial institutions and others use that score to determine the likelihood that we might default on a loan or mortgage. Financial credit scoring is tightly regulated and available to all consumers—we can request copies of our financial credit scores, check their accuracy and correct errors. Now, hundreds of types of data are being harnessed to assign us scores. However, unlike the credit reporting system, which is federally regulated and follows set processes, this kind of data isn’t subject to enforceable rules. It can be impossible to find out what our scores are, how they are being calculated and how to correct them if there are inaccuracies. 

Why It Matters

Recent advancements in data mining and artificial intelligence promise new opportunities for business intelligence and law enforcement. There are risks, too: China is selling its government-funded scoring tools to authoritarian regimes elsewhere in the world. We anticipate that in the coming year, regulators will take a deeper interest in scoring.

Deeper Dive

The Age of Algorithmic Determinism

Our data are being mined, refined and productized to sort, tag and catalogue us. Why? To make it easier for systems to make decisions for, on behalf of and about us. We’re living in a new age of algorithmic determinism, where we increasingly rely on A.I. systems to make choices—even if those systems score us without being able to understand our nuanced histories, special circumstances and the unique characteristics of our humanity.

The Price is Right—For Some

Algorithms can determine how likely we are to break the law, what kinds of mobile plans we should qualify for, what sort of news we should be shown and even how much we’re willing to pay for a roll of toilet paper. Researchers at the Consumer Education Foundation found that when visiting e-commerce sites, anonymous shoppers were offered products at lower prices than the researchers themselves. For example, when browsing anonymously at Walmart.com, a box of ballpoint pens was listed for $4.15, but when Walmart had access to other data from the researchers, that price spiked to $9.69. Customers are being algorithmically assigned scores based on the predicted profit they might generate for the company, and served different prices accordingly.
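The pricing behavior described above can be sketched in a few lines. This is a hypothetical illustration only: the scoring signals, weights and function names below are invented assumptions, not any retailer's actual logic, and the real models involved are far more complex.

```python
# Toy sketch of score-based dynamic pricing: a shopper's predicted-profit
# score (derived from tracked signals) scales the price they are quoted.
# All signals and weights here are illustrative assumptions.

BASE_PRICE = 4.15  # price shown to an anonymous shopper


def profit_score(customer_data):
    """Predict a customer's profit potential from tracked signals (toy model)."""
    if customer_data is None:
        return 0.0  # anonymous shopper: no tracked signals, no markup
    score = 0.0
    if customer_data.get("premium_zip_code"):
        score += 0.5
    score += min(customer_data.get("past_purchases", 0) * 0.05, 0.5)
    if customer_data.get("price_insensitive_clicks"):
        score += 0.3
    return min(score, 1.5)


def quoted_price(customer_data):
    """Scale the base price by the predicted-profit score."""
    return round(BASE_PRICE * (1 + profit_score(customer_data)), 2)


anonymous_price = quoted_price(None)
tracked_price = quoted_price({
    "premium_zip_code": True,
    "past_purchases": 12,
    "price_insensitive_clicks": True,
})
```

Because the tracked shopper's signals raise their score, `tracked_price` comes out well above `anonymous_price` — the same asymmetry the researchers observed, with the shopper given no indication that a score was applied.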

China is Scoring its Citizens

By now, you are familiar with China’s Social Credit System, a vast ranking system that this year will see a national rollout. First announced in 2014, it promises to make good on the government’s stance that “keeping trust is glorious and breaking trust is disgraceful.” The system will take some time to become fully operational nationwide, but already it’s granting and restricting permissions for Chinese citizens algorithmically. Last year, the system determined that 13 million Chinese citizens are untrustworthy. Citizens gain points for activities such as paying bills on time, and lose points for activities such as spreading news contrary to the government’s viewpoints or spending too much time playing video games.

In late 2019, a leak of highly classified government documents revealed an operations manual for detention camps in the far western region of Xinjiang, where scoring is used for predictive policing. This region is home to China’s Muslim Uighur community. The International Consortium of Investigative Journalists published a detailed report showing the scope and ambition of Beijing’s scoring system, which assigns points and punishments to inmates in the camps. China argues its “re-education camps” and scoring system were built to combat terrorism and radical religious extremism.

China is Scoring Companies, Too

A longstanding goal of China’s Social Credit System is to create what the Communist Party of China (CCP) calls a “fair, transparent and predictable” business environment. To accomplish that goal, beginning in 2020 businesses that operate in China will earn and lose points as part of China’s corporate social credit system. The plan includes centralizing data from domestic and foreign companies in a government system that allows the CCP to monitor the activities of all entities operating in China.

The Sky is Watching

There’s an old Chinese adage that says, “People are doing things, and the sky is watching.” But it holds true for the West, too. Increasingly, everything we do is being watched and recorded. Algorithms assign us scores all the time, by governments in some countries and by the commercial sector in others.

The Impact

Scoring presents tremendous opportunities to help businesses understand their customers better, which is why in 2020 every organization must develop a data governance strategy and ethics policy. For those who work in risk and compliance, 2020 will be the start of a newly complicated landscape. Organizations will need to hire compliance specialists who understand the complexities of using scoring systems.

For those in the public sector, massive-scale scoring impacts every facet of our daily lives, and it will soon influence geoeconomic relationships around the world. Chinese tech companies—Huawei, Hikvision, Dahua and ZTE—are building and supplying the scoring apparatus for 63 countries around the world. Of those, 36 have signed on to China’s Belt and Road Initiative. It isn’t just that China is exporting technology, it is actively creating an environment where the CCP can also export its influence.

Watchlist for section

AI Now Institute, Airbnb, Alibaba, Amazon, Amazon Neighbors, Amazon Rekognition system, Amazon Ring, American Civil Liberties Union, Apple Home Kit, Baidu, Best Buy, California Consumer Privacy Act (CCPA), China’s Belt and Road Initiative, China’s Social Credit System, Clarifai, Communist Party of China, Consumer Education Foundation, Dahua, Electronic Frontier Foundation, Electronic Privacy Information Center, European Union, Facebook, Google, Google Nest, Hikvision, Huawei, IBM Watson, International Consortium of Investigative Journalists, Iovation, Kount, Kustomer, MaxMind, Megvii’s Face++, Microsoft facial analysis systems, MIT Media Lab, Retail Equation, Riskified, Sensetime, Sephora, Tencent, Uber, University of Colorado-Boulder, U.S. Federal Communications Commission, U.S. Federal Trade Commission, Walmart, Walmart.com, Zeta Global, ZTE.
