#codedbias

Predictive policing systems exacerbate racism and discrimination against people from lower socio-economic groups.

ORG supports Amnesty in calling for predictive policing systems to be BANNED.

Sign the petition to #StopAutomatedRacism TODAY ⬇️

#policing #police #precrime #predictivepolicing #codedbias #ukpolitics #ukpol

amnesty.org.uk/actions/ban-pre

Amnesty's new report shows that the police are supercharging racism through predictive policing.

At least 33 UK police forces have used prediction or profiling tools.

“These systems are developed and operated using data from policing and the criminal legal system. That data reflects the structural and institutional racism and discrimination in policing and the criminal legal system.”

#policing #police #precrime #predictivepolicing #codedbias #ukpolitics #ukpol

theguardian.com/uk-news/2025/f
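
A minimal Python sketch of the feedback loop the report describes, with entirely hypothetical numbers: if patrols are allocated in proportion to past arrests, and arrests can only be recorded where patrols are sent, an initial skew in the data compounds even when the underlying crime rates are identical.

```python
import random

random.seed(0)

# Two areas with the SAME underlying crime rate, but area B starts with
# more recorded arrests because it was historically over-policed.
TRUE_CRIME_RATE = 0.1
PATROLS_PER_DAY = 100
arrests = {"A": 10, "B": 30}

for day in range(365):
    total = sum(arrests.values())
    shares = {area: count / total for area, count in arrests.items()}
    for area, share in shares.items():
        # "Predictive" allocation: patrols follow past arrest counts...
        patrols = round(PATROLS_PER_DAY * share)
        # ...and arrests can only happen where patrols are sent, so the
        # skewed historical record keeps reinforcing itself.
        arrests[area] += sum(random.random() < TRUE_CRIME_RATE
                             for _ in range(patrols))

print(arrests)  # area B accumulates far more arrests despite equal crime
```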

AI is rapidly expanding into all areas of public life.

Automated systems will be commonplace in making decisions that can entrench discrimination and inequality.

Join us in pushing back against this dangerous threat to your rights in the UK by signing our petition ⬇️

#DUABill #DataBill #dataprotection #AI #gdpr #privacy #codedbias #ukpolitics

you.38degrees.org.uk/petitions

Automated decision-making exposes racialised communities to greater discrimination from algorithmic biases, as seen in recruitment.

The UK Data Use and Access Bill will expand the use of decisions made solely by AI without human review, so unfair practices could go unchallenged.

#DUABill #dataprotection #codedbias #gdpr #AI #ukpolitics #databill

independent.co.uk/news/world/a
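
For illustration, a minimal sketch of how such disparities in automated screening can at least be detected, using the "four-fifths rule" heuristic from employment-discrimination practice; the applicant and selection counts are hypothetical, not taken from the Bill or any real system.

```python
# Hypothetical selection counts from an automated CV screen.
outcomes = {
    "group_a": (1000, 300),  # (applicants, selected)
    "group_b": (1000, 180),
}

rates = {group: selected / applicants
         for group, (applicants, selected) in outcomes.items()}
highest = max(rates.values())

for group, rate in rates.items():
    ratio = rate / highest
    # Four-fifths rule: a selection-rate ratio below 0.8 is a red flag.
    flag = "possible adverse impact" if ratio < 0.8 else "ok"
    print(f"{group}: rate {rate:.0%}, ratio {ratio:.2f} -> {flag}")
```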

Le Jeune Turc Mais Americain @thiscannotbechanged@sharkey.world
2024-12-02

Both Gemini and ChatGPT are so woke that they can't answer a simple question about evolution, even when the facts are plain to see.

#google #gemini #openai #chatgpt #gpt #codedbias #bias #ai

Francesco Iannuzzelli @francesco@sociale.network
2024-08-15

Facial recognition has many problematic aspects - see #CodedBias - but until now it had at least spared a vulnerable and rightly protected part of the population: minors. Now it turns out that, to better train its algorithms, the US DHS (Department of Homeland Security) has for some time been collecting facial images of foreign minors at the Mexican border, including infants and unaccompanied children
technologyreview.com/2024/08/1
#usa #messico

"AI raises the stakes... data is not only used to make decisions about you, but rather to make deeply powerful inferences about people and communities."

Beware greater automated decision-making with fewer safeguards over our data.

The fight for algorithmic justice is imperilled by the #DataGrabBill.

#HandsOffOurData #DataGrab #GDPR #DPDI #DPDIBill #dataprotection #privacy #ukpolitics #humanrights #datarights #digitalrights #AI #codedbias #facialrecognition

themarkup.org/hello-world/2023

Without oversight or strong data rights, facial recognition will further embed discrimination if the #DataGrabBill becomes law.

Accuracy diminishes when the subject is a person of colour, and drops further the younger the person is, disproportionately misidentifying young Black men.

#HandsOffOurData #DataGrabBill #GDPR #DPDIBill #dataprotection #privacy #ukpolitics #facialrecognition #codedbias #ai #surveillance

amnesty.ca/surveillance/racial
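
A minimal sketch of why per-group error rates matter here, with made-up match records: a single aggregate accuracy figure can hide exactly the disparity described above, so false match rates need to be measured per demographic group.

```python
from collections import defaultdict

# Hypothetical records: (group, system_said_match, actually_same_person).
records = [
    ("older_white_men", True, True), ("older_white_men", False, False),
    ("older_white_men", False, False), ("older_white_men", False, False),
    ("young_black_men", True, True), ("young_black_men", True, False),
    ("young_black_men", True, False), ("young_black_men", False, False),
]

non_matches = defaultdict(int)    # comparisons of two different people
false_matches = defaultdict(int)  # different people wrongly "identified"
for group, predicted, actual in records:
    if not actual:
        non_matches[group] += 1
        if predicted:
            false_matches[group] += 1

for group in non_matches:
    fmr = false_matches[group] / non_matches[group]
    print(f"{group}: false match rate {fmr:.0%}")
```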

"The UK government has bungled what could have been an opportunity for real global AI leadership due to the Summit’s limited scope and invitees.

The agenda’s focus on future, apocalyptic risks belies the fact that government bodies and institutions in the UK are already deploying AI and automated decision-making in ways that are exposing citizens to error and bias on a massive scale."

🗣️ ORG's @abigail

#AISafetySummit #AISummitOpenLetter #AI #codedbias

Meta's algorithm labelling people as terrorists highlights how even seemingly straightforward automated systems can make mistakes that invariably exacerbate racism and discrimination.

This is going to be an even bigger problem when the Online Safety Bill is implemented and tech companies are obliged to identify illegal content and prevent it from being posted.

Over-moderation will seriously harm freedom of expression.

#OnlineSafetyBill #freedomofexpression #codedbias

theguardian.com/technology/202

Data is being scraped from innocent citizens to build predictive profiles of criminality.

Entrenched in these models are racialised assumptions that have long perpetuated institutionalised racism in the criminal justice system.

#precrime #policing #facialrecognition #codedbias

Find out more ⤵️

openrightsgroup.org/campaign/r

Predictive policing technologies have received incredibly little scrutiny considering the harms they're causing and inequalities they're exacerbating.

We urge UK local authorities to pause and consider the utility, harms and trajectory of predictive and automated policing technology before procuring it.

At the very least we expect transparent and wide public consultation when the stakes are so high.

#precrime #policing #facialrecognition #codedbias

Proposed Real Time Crime Centres would use technologies already responsible for miscarriages of justice, false arrests, and harmful impacts on benefits and housing:

🔴 Facial recognition
🔴 Automatic number plate recognition
🔴 Social media monitoring
🔴 Algorithmic decision-making

The pace at which these predictive tools are becoming embedded in policing is frightening, especially with undemocratic commercial interests involved.

#precrime #policing #facialrecognition #codedbias #surveillance

opendemocracy.net/en/real-time

Automated technology frequently contains inherent biases.

The use of algorithms for moderation creates a clear risk that the screening systems will disproportionately block content relating to or posted by minority ethnic or religious groups.

#OnlineSafetyBill #censorship #freespeech #codedbias
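
A minimal sketch of that risk, with hypothetical toxicity scores: if a moderation classifier systematically over-scores posts written in one community's dialect, a single blocking threshold silently removes far more of that community's harmless speech.

```python
# Hypothetical scores for posts that are ALL harmless, so every
# blocked post below is a false positive.
posts = [
    ("majority", 0.2), ("majority", 0.3), ("majority", 0.4), ("majority", 0.1),
    ("minority", 0.6), ("minority", 0.7), ("minority", 0.4), ("minority", 0.8),
]
THRESHOLD = 0.5  # posts scoring above this are blocked before publication

blocked = {"majority": 0, "minority": 0}
totals = {"majority": 0, "minority": 0}
for group, score in posts:
    totals[group] += 1
    if score > THRESHOLD:
        blocked[group] += 1

for group in totals:
    print(f"{group}: {blocked[group]} of {totals[group]} harmless posts blocked")
```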

Decisions on illegality won't be made by the courts, but by private providers with broad discretion.

Vast amounts of content are posted every day, so picking out 'illegal' content will be done by faulty algorithms.

#OnlineSafetyBill #censorship #freespeech #codedbias

Quote from Dr Monica Horten: "Prior restraint is a particularly draconian form of censorship that implements a ban before publication, and importantly, before a court has made a judgement that it is illegal."

Civil society groups say the use of sensitive metrics is exacerbating discrimination in policing.

See for yourself with this predictive policing tool from Fair Trials.

13/15

#policing #precrime #codedbias #PredictivePolicing

fairtrials.org/predictive-poli

UK #Police have algorithms analysing huge amounts of information, using discriminatory factors like ‘cramped houses’ and ‘jobs with high turnover’ to predict someone's likelihood of committing a crime.

See Liberty's factsheet on #PredictivePolicing

12/15

#policing #precrime #codedbias

libertyhumanrights.org.uk/fund
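
A minimal sketch of why such "neutral" factors are not neutral, with hypothetical records: a risk score that never sees a protected attribute can still discriminate when its input features correlate with that attribute in the underlying data.

```python
# Hypothetical records: the model never sees group membership, only
# "neutral" factors like those Liberty lists, but in the data those
# factors correlate with membership of an over-policed group.
people = [
    # (in_over_policed_group, cramped_housing, high_turnover_job)
    (True, True, True), (True, True, False), (True, False, True),
    (False, False, False), (False, True, False), (False, False, False),
]

def risk_score(cramped, turnover):
    # The score uses only the proxy features, never the group itself.
    return int(cramped) + int(turnover)

scores = {True: [], False: []}
for group, cramped, turnover in people:
    scores[group].append(risk_score(cramped, turnover))

for group, vals in scores.items():
    label = "over-policed group" if group else "everyone else"
    print(f"{label}: mean risk score {sum(vals) / len(vals):.2f}")
```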

2023-04-07

@dltj

Big sigh. So many stories out there like this. Check out #AlgorithmicJusticeLeague and #CodedBias for their work. We need more accountability.

ajl.org/

#EthicalAI

Your personal data is used to profile you, making it easier for biased algorithms to decide on job applications, loans, housing and more. Data discrimination has real consequences, and marginalised groups bear the brunt. We ask councils to fight this practice. #DataDiscrimination #DataProtection #GDPR #CodedBias

openrightsgroup.org/publicatio
