It’s no secret A.I. has a serious and multifaceted bias problem.
Just one example: the data sets used for training often come from sources like Reddit, Amazon reviews and Wikipedia, all of which carry their own ingrained biases. Meanwhile, the teams building models tend to be homogeneous and are often unaware of their own biases.
As computer systems get better at making decisions, algorithms may sort each of us into groups that don’t make any obvious sense to us—but could have massive repercussions. Every single day, you are creating unimaginable amounts of data, both actively (like when uploading and tagging photos on Facebook) and passively (driving to work, for example).
Those data are mined and used by algorithms, often without your direct knowledge or understanding. They are used to target advertising, to help potential employers predict your behavior, to determine your mortgage rate and even to help law enforcement predict whether or not you're likely to commit a crime.
Researchers at a number of institutions (including the University of Maryland, Columbia University, Carnegie Mellon, MIT, Princeton, the University of California, Berkeley and the International Computer Science Institute, among others) are studying the side effects of automated decision-making. You, or someone you know, could wind up on the wrong side of an algorithm and discover you're ineligible for a loan, a particular medication or the ability to rent an apartment, for reasons that aren't transparent or easy to understand.
Increasingly, those data are harvested and sold to third parties without your knowledge. And algorithmic biases can reinforce themselves over time: a skewed decision produces skewed data, which then trains the next round of skewed decisions.
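The self-reinforcing dynamic above can be made concrete with a toy simulation. The scenario and all numbers below are hypothetical: two neighborhoods with identical true crime rates, where patrol units are repeatedly shifted toward whichever area has more *recorded* incidents. Because incidents are only recorded where patrols are present, a small initial gap widens on its own.

```python
# Hypothetical illustration of a bias feedback loop (not real data).
# Two areas have the SAME underlying crime rate, but recorded incidents
# scale with patrol presence. Each round, patrols are shifted toward the
# area with more recorded incidents, so an initial gap amplifies itself.

def simulate_patrol_feedback(initial_patrols, detection_rate=0.1,
                             rounds=8, shift=5):
    """Return patrol allocations after `rounds` of feedback-driven shifts."""
    a, b = initial_patrols
    for _ in range(rounds):
        # Recorded incidents depend on where patrols are, not on crime.
        recorded_a = a * detection_rate
        recorded_b = b * detection_rate
        if recorded_a > recorded_b:
            moved = min(shift, b)
            a, b = a + moved, b - moved
        elif recorded_b > recorded_a:
            moved = min(shift, a)
            a, b = a - moved, b + moved
    return a, b

# A near-even start (52 vs. 48 units) ends heavily lopsided,
# even though both areas are identical in reality.
final_a, final_b = simulate_patrol_feedback((52, 48))
```

A perfectly even start stays even; any initial asymmetry, however small, compounds each round. That is the core mechanism behind self-reinforcing bias.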
As A.I. applications become more ubiquitous, the negative effects of bias will have greater impact.
The Apple Card offered men higher credit limits than women, in some cases by a factor of 20. Wearables such as Google's Fitbit can be considerably less accurate for darker skin tones because of how melanin absorbs the green light used by optical heart-rate sensors.
This is problematic when insurance companies rely on data from those devices to assess heart rate, blood pressure and risk for conditions like arrhythmia or a potential heart attack.
This trend is part of our section on Artificial Intelligence.