
Government AI Surveillance: Predictive Policing, Fusion Centers, and the Dossier Being Built on You
Minority Report was science fiction in 2002. In 2026, it's a procurement category.

In 2020, Robert Williams was arrested in his driveway, in front of his wife and daughters. The charge: shoplifting watches from a store he had never entered. He was detained for 30 hours on the strength of a facial recognition match. The AI was wrong, and the detective treated its probabilistic output as a positive ID. Williams became the first documented wrongful arrest from facial recognition AI in the US. He was not the last.

Predictive Policing

Chicago's Strategic Subject List scored 400,000 residents on shooting risk (0-500). The factors: arrest history (not convictions), age, and social network connections. ACLU FOIA results showed that 56% of all Black men in Chicago aged 20-29 were on the list. People with no criminal record appeared because they knew someone on the list. Individuals were not notified, and there was no mechanism to contest a score. The list was shared with DHS and parole officers.

The PredPol feedback loop: the model predicted where police would report crime, not where crime actually occurred.
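The feedback loop can be shown with a toy simulation. This is a minimal sketch under my own simplifying assumptions, not PredPol's actual model: patrols go wherever recorded crime is highest, and crime is only recorded where patrols are present, so an initial disparity in the records compounds even when true crime rates are identical.

```python
def simulate_hotspot_feedback(true_rates, initial_records, patrols, rounds):
    """Toy predictive-policing feedback loop (illustrative assumptions only).

    Each round, all patrols are sent to the district with the most *recorded*
    crime, and new incidents are recorded only where patrols are. True crime
    rates never change.
    """
    recorded = list(initial_records)
    for _ in range(rounds):
        # the "prediction": the district with the largest record is the hotspot
        hotspot = max(range(len(recorded)), key=lambda i: recorded[i])
        # expected incidents observed there = patrols * true rate
        recorded[hotspot] += patrols * true_rates[hotspot]
    return recorded

# Two districts with IDENTICAL true crime rates; district 0 starts with a
# slightly larger record (e.g., historical over-policing).
print(simulate_hotspot_feedback([0.3, 0.3], [11, 10], patrols=20, rounds=5))
# → [41.0, 10] — district 0's record grows every round; district 1's never does
```

The model never "learns" the true rates are equal, because its training data is generated by its own dispatch decisions. That is the loop the article describes.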




