A.I. Systems Intentionally Hiding Data

Deep Linking
March 10, 2020
Prioritizing Accountability and Trust
March 10, 2020

A.I. Systems Intentionally Hiding Data

Researchers at Stanford and Google discovered that an A.I. created to turn satellite images into usable maps was withholding certain data.

Computers do exactly what they are told to do. Command a machine to win at a game, and it will do everything in its power to achieve that goal. Apparently that now includes cheating. Researchers at Stanford and Google discovered that an A.I. created to turn satellite images into usable maps was withholding certain data. Researchers were using a neural network called CycleGAN, which learns how to map image transformations. For example, it could take an old aerial photograph of a neighborhood, distinguish between streets, alleys, driveways, buildings and lamp posts, and generate a map that could be used by a GPS. Initially, they used an aerial photograph that hadn’t been seen by the network. The resulting image looked very close to the original­­—suspiciously close. But on deeper inspection, the researchers found that there were lots of details in both the original image and the image generated that weren’t visible in the map made by the A.I. It turns out that the system learned to hide information about the original image inside of the image it generated.