We’re already using machine learning to make subjective decisions, even ones that have life-altering consequences. Medical applications are among the least controversial uses of artificial intelligence; by the end of the last decade, AIs were locating stranded victims of Hurricane Maria, controlling the German power grid, and killing civilians in Pakistan.

The sheer scope of these AI-controlled decision systems is why automation has the potential to transform society on a structural level. In 2012, techno-sociologist Zeynep Tufekci pointed out that the Obama reelection campaign employed “an unprecedented number of data analysts and social scientists,” bringing the traditional “confluence of marketing and politics” into a new age.

Intelligence that relies on data from an unjust world suffers from the principle of “garbage in, garbage out,” futurist Cory Doctorow observed in a recent blog post. Diverse perspectives on the design team would help, Doctorow wrote, but when it comes to certain technology, there might be no safe way to deploy it at all:

“Given that a major application for facial recognition is totalitarian surveillance and control, maybe we should be thinking about limiting facial recognition altogether, rather than ensuring that it is equally good at destroying the lives of women and brown people.”
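
Doctorow’s point can be reproduced in miniature. The sketch below is purely illustrative: the two groups, the feature layout, and the sample sizes are invented assumptions, not drawn from any real system. A model trained on data dominated by group A ends up accurate for A and far less so for B, even though nobody wrote a biased rule; the disparity comes entirely from what went into the training set.

```python
# A miniature "garbage in, garbage out" demonstration. Everything here is
# synthetic: the two groups, the feature layout, and the sample sizes are
# illustrative assumptions, not drawn from any real dataset.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, signal_col):
    """Generate n samples whose label depends on one feature column."""
    X = rng.normal(size=(n, 5))
    y = (X[:, signal_col] > 0).astype(int)
    return X, y

# Group A dominates the training data; group B's labels follow a different
# feature, standing in for any pattern the collected data under-represents.
Xa, ya = make_group(5000, signal_col=0)
Xb, yb = make_group(100, signal_col=1)

model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.hstack([ya, yb]))

# Fresh test data: the same model is accurate for A, close to chance for B.
Xa_test, ya_test = make_group(2000, signal_col=0)
Xb_test, yb_test = make_group(2000, signal_col=1)
print("accuracy on group A:", model.score(Xa_test, ya_test))
print("accuracy on group B:", model.score(Xb_test, yb_test))
```

The learning algorithm treats both groups identically; the skew lives in the data it was fed, which is exactly why a fairer design team alone can’t fix it.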

It doesn’t help that data collection for image-based AI has taken advantage of the most vulnerable populations first. The Facial Recognition Verification Testing Program is the industry standard for testing the accuracy of facial recognition tech; a strong showing in the program is all but mandatory for new facial recognition startups seeking funding.

But the datasets of human faces that the program uses are sourced, according to a report from March, from images of U.S. visa applicants, arrested people who have since died, and children exploited in child pornography. The report found that the majority of data subjects were people who had been arrested on suspicion of criminal activity. None of the millions of faces in the program’s datasets belonged to people who had consented to this use of their data.
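
Accuracy testing of the kind the program performs reports how often a matcher is right; whether it is right equally often for everyone is the separate question Doctorow’s quote raises. The sketch below shows what a per-group error audit measures. It is not the program’s methodology: the similarity scores are wholly synthetic, and the 0.5 threshold is an invented assumption.

```python
# A toy per-group error audit for a face matcher. The match scores are
# synthetic stand-ins and the threshold is an assumption; a real audit
# would use a matcher's actual similarity scores for genuine pairs
# (same person) and impostor pairs (different people).
import numpy as np

rng = np.random.default_rng(1)
THRESHOLD = 0.5  # similarity above this counts as a "match"

def error_rates(genuine, impostor, t=THRESHOLD):
    fnmr = np.mean(genuine < t)    # false non-match: real pairs rejected
    fmr = np.mean(impostor >= t)   # false match: different people accepted
    return fmr, fnmr

# Pretend the matcher separates pairs less cleanly for group B.
groups = {
    "A": (rng.normal(0.80, 0.10, 10_000), rng.normal(0.20, 0.10, 10_000)),
    "B": (rng.normal(0.65, 0.15, 10_000), rng.normal(0.35, 0.15, 10_000)),
}
for name, (genuine, impostor) in groups.items():
    fmr, fnmr = error_rates(genuine, impostor)
    print(f"group {name}: FMR={fmr:.2%}  FNMR={fnmr:.2%}")
```

A single aggregate number would average the two groups together and hide the gap; breaking out false match (FMR) and false non-match (FNMR) rates per group is what surfaces it.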