#PredictivePolicing

HistoPol (#HP) 🏴 🇺🇸 🏴 (HistoPol)
2025-05-10

@tg9541 @mattotcha



👉A friendly warning to the Government👈

(3/n)

... advent of and the continuing crackdown on the right to in the , it seems the government is following down that road.

👉The despicable use of anti-terror force by 30 in a place of in against six young women👈 discussing...

2025-05-08

‘Predictive’ policing tools in France are flawed, opaque, and dangerous.

A new report from @LaQuadrature, now available in English as part of a Statewatch-coordinated project, lays out the risks in detail.

The report finds that these systems reinforce discrimination, evade accountability, and threaten fundamental rights. La Quadrature is calling for a full ban—and we support them.

📄 Read more and access the full report: statewatch.org/news/2025/may/f

#PredictivePolicing #France

2025-05-03

Does anyone have a sober analysis of the UK's #predictivepolicing practices?

nemo™ 🇺🇦 (nemo@mas.to)
2025-04-25

Great Britain wants to use #PredictivePolicing to calculate who will become a murderer, using sensitive data & AI. People thus become suspects before they have done anything. Dystopian & dangerous for minorities & the poor! More: 👉 netzpolitik.org/2025/predictiv #Überwachung #MinorityReport 🕵️‍♂️🤖 #newz

Autonomie und Solidarität (autonomysolidarity@todon.eu)
2025-04-14

How algorithms in #Deutschland are supposed to "foresee" crimes #PredictivePolicing

"The report 'Automating Injustice' examines selected systems that are developed or deployed by the police, law-enforcement authorities, and prisons in Germany. It also analyses publicly available information about such practices in order to explain how the systems work, what data they use, why they can lead to greater discrimination, and why they pose a general threat to fundamental rights…"

algorithmwatch.org/de/predicti via @algorithmwatch

#Polizei #Überwachung #Repression #KI #KünstlicheIntelligenz #Technology #Digitalisierung #Polizeiproblem

2025-04-11

This week in the Weekly News Roundup, UK testing Minority Report, a man stalks people from inside the house, and AI could be third-world contractors. We also visit SillyVille.
#weeklyNewsRoundup #predictivepolicing #ai

8:00p EST

All Articles:
switchedtolinux.com/news/where

Miguel Afonso Caetano (remixtures@tldr.nettime.org)
2025-04-11

"Alexander, more than midway through a 20-year prison sentence on drug charges, was making preparations for what he hoped would be his new life. His daughter, with whom he had only recently become acquainted, had even made up a room for him in her New Orleans home.

Then, two months before the hearing date, prison officials sent Alexander a letter informing him he was no longer eligible for parole.

A computerized scoring system adopted by the state Department of Public Safety and Corrections had deemed the nearly blind 70-year-old, who uses a wheelchair, a moderate risk of reoffending, should he be released. And under a new law, that meant he and thousands of other prisoners with moderate or high risk ratings cannot plead their cases before the board. According to the department of corrections, about 13,000 people — nearly half the state’s prison population — have such risk ratings, although not all of them are eligible for parole.

Alexander said he felt “betrayed” upon learning his hearing had been canceled. “People in jail have … lost hope in being able to do anything to reduce their time,” he said.

The law that changed Alexander’s prospects is part of a series of legislation passed by Louisiana Republicans last year reflecting Gov. Jeff Landry’s tough-on-crime agenda to make it more difficult for prisoners to be released."

propublica.org/article/tiger-a

#USA #Louisiana #Algorithms #PredictiveAI #PredictivePolicing #PoliceState

Miguel Afonso Caetano (remixtures@tldr.nettime.org)
2025-04-10

"The UK government is developing a “murder prediction” programme which it hopes can use personal data of those known to the authorities to identify the people most likely to become killers.

Researchers are alleged to be using algorithms to analyse the information of thousands of people, including victims of crime, as they try to identify those at greatest risk of committing serious violent offences.

The scheme was originally called the “homicide prediction project”, but its name has been changed to “sharing data to improve risk assessment”. The Ministry of Justice hopes the project will help boost public safety but campaigners have called it “chilling and dystopian”."

theguardian.com/uk-news/2025/a

#UK #PredictivePolicing #PredictiveAI #Algorithms #Surveillance #PoliceState

Verfassungklage (Verfassungklage@troet.cafe)
2025-04-09

#PredictivePolicing:

Great Britain wants to calculate who will become a murderer.

The British government is funding research on a programme that is supposed to predict whether a person will become a murderer. For the study, researchers are merging the personal data of hundreds of thousands of people, including whether they have been victims of domestic violence and which mental illnesses they suffer from.

netzpolitik.org/2025/predictiv

Klassengesellschaft Deutschland (klassismus.bsky.social@bsky.brid.gy)
2025-04-09

#Precrime is a term for activities concerned with potential crimes & potential offenders. The term comes from the sci-fi author Philip K. Dick. In George #Orwell's dystopian novel 1984, such offences were called "thoughtcrimes". #PredictivePolicing

RE: https://bsky.app/profile/did:plc:bd2ad25frux6jav66chr6aiw/post/3lmeupv65gc2s

rosamunde van brakel (re_vbrakel)
2025-03-07

Very proud to have a chapter in this handbook! Many thanks to Nathalie Smuha for the invitation!

One of the pertinent questions I ask in the conclusion is: Should the money that is invested in predictive policing applications not be invested instead in tackling causes of crime and in problem-oriented responses, such as mentor programs, youth sports programs, and community policing, as they can be a more effective way to prevent crime? cambridge.org/core/books/cambr

Amnesty in Salisbury & South Wiltshire (salisburyai.com@salisburyai.com)
2025-02-25

Perils of predictive policing

Amnesty publishes a report warning of the perils of predictive policing

February 2025

Many TV detective series have technology at their core as our heroes vigorously pursue the wrongdoers. CCTV footage is scrutinised for the criminals' movements, DNA evidence is obtained and, of course, fingerprints are taken. Forensic evidence is a key component of police detection in countless story lines. These series are reassuring: they show law enforcement officers using every scientific and technological technique to keep us all safe and lock up the bad guys. Using science and algorithms to enable police forces to predict crime must surely be a good idea?

It is not. The Amnesty report, and other research, explain in great detail what the problems and risks are. One of the persistent biases in the justice system is racism, and it is worth reading The Science of Racism by Keon West (Picador, 2025). The author takes the reader through copious peer-reviewed research conducted over many years in different countries explaining the extent of racism. Examples include many CV studies (US: résumé) in which identical CVs, differing only in names that indicate the candidates' ethnicity, produce markedly different results. There are similar examples from the worlds of medicine and academia. Racism is endemic and persists. As Keon West acknowledges, a similar book could be written about how women are treated differently.

The Amnesty report notes that Black people are twice as likely to be arrested, three times as likely to be subjected to force, and four times as likely to be subjected to stop and search as white people. With such bias in place, the risk is that predictive policing simply perpetuates existing prejudice. The concern partly centres on skin colour, where people live, and socio-economic background all being used as predictive inputs.

People have a deep faith in technology. On a recent Any Answers? programme (BBC Radio 4) debating the death penalty and the problem of mistakes, several callers showed a touching faith in DNA in particular, implying that mistakes cannot happen. People are mesmerised by the white-suited forensic officers on television, who give a sense of science and certainty. Technology, however, is only as good as the human systems that use it. There have been many wrongful arrests and prison sentences of innocent people despite DNA, fingerprints, CCTV and all the rest. Mistakes are made. The worry is that predictive policing could entrench discrimination.

People who are profiled have no way of knowing that they have been. There is a need to publish details of which systems the police and others are using; the report notes that the police are reluctant to do so. What is the legal basis for effectively labelling people by their skin colour, where they live and their socio-economic status?

The police are keen on the idea and around 45 forces use it. The evidence for its effectiveness is doubtful. The risks are considerable.

#policing #predictivePolicing #Racism

vrheid (vrheid)
2025-02-23

Palantir and the privatisation of surveillance

Big tech and the state are working ever more closely together, without transparency or democratic oversight. Palantir's algorithms predict behaviour, profile citizens, and reinforce state power. How deeply does this technology intrude on our freedoms?

vrheid.nl/palantir-surveillanc

Predictive policing systems exacerbate racism and discrimination against people from lower socio-economic groups.

ORG supports Amnesty in calling for predictive policing systems to be BANNED.

Sign the petition to #StopAutomatedRacism TODAY ⬇️

#policing #police #precrime #predictivepolicing #codedbias #ukpolitics #ukpol

amnesty.org.uk/actions/ban-pre

Amnesty's new report shows that the police are supercharging racism through predictive policing.

At least 33 UK police forces have used prediction or profiling tools.

“These systems are developed and operated using data from policing and the criminal legal system. That data reflects the structural and institutional racism and discrimination in policing and the criminal legal system.”

#policing #police #precrime #predictivepolicing #codedbias #ukpolitics #ukpol

theguardian.com/uk-news/2025/f

AutisticMumTo3 She/Her or They/Them (autisticmumto3.bsky.social@bsky.brid.gy)
2025-02-19

UK use of predictive policing is racist and should be banned, says Amnesty | Police | The Guardian www.theguardian.com/uk-news/2025... #Amnesty #PredictivePolicing #Racism #Police


2025-02-18

Here's some important research on the use of algorithmic decision making in Italy, Netherlands, Spain

investigativejournalismforeu.n

#AI #Italy #Netherlands #Spain #EU #PredictivePolicing #Privacy
