#HashMatching

2026-02-20

The UK is moving toward mandatory proactive detection of nonconsensual intimate images.

Under proposals backed by Keir Starmer, platforms must:
• Remove flagged content within 48 hours
• Prevent reuploads using hash matching
• Deploy proactive detection “at source”
• Face fines up to 10% of global revenue

Regulator Ofcom is accelerating its decision on requiring technical enforcement mechanisms.
Technical considerations:
- Hash collision and false-positive risks
- Cross-platform hash database coordination
- Encryption vs scanning tradeoffs
- Abuse-report automation workflows
- AI-generated image detection accuracy
Is mandatory proactive scanning the future of online content governance?
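On the collision and false-positive point: perceptual hashing compares images by bit distance rather than exact digest equality. A minimal sketch (a toy "average hash" over a flat list of grayscale values, with an assumed Hamming-distance threshold; real deployments use more elaborate algorithms such as PhotoDNA or PDQ):

```python
# Toy sketch of perceptual "average hash" (aHash) matching.
# Assumes 8x8 grayscale thumbnails already flattened to 64 ints (0-255).
# Illustrates why fuzzy matching carries false-positive risk: distinct
# images can land within the match threshold.

def average_hash(pixels):
    """Flat list of 64 grayscale values -> 64-bit integer hash."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# A toy "image" and a slightly brightened copy (simulating re-encoding).
original = [i * 4 % 256 for i in range(64)]
brightened = [min(p + 10, 255) for p in original]

h1, h2 = average_hash(original), average_hash(brightened)
# Deployments flag matches below some Hamming threshold (e.g. <= 10);
# tuning that threshold trades recall against false positives.
print(hamming(h1, h2) <= 10)
```

The brightened copy hashes to (nearly) the same value as the original, which is the point of the technique, and also why unrelated images occasionally collide.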

Source: therecord.media/united-kingdom

Drop your technical analysis below.

Follow @technadu for advanced cybersecurity and policy reporting.

#Infosec #DetectionEngineering #AIsecurity #HashMatching #ContentModeration #DigitalForensics #CyberPolicy #OnlineSafety #DeepfakeDetection #PrivacyEngineering #ThreatModeling #SecurityArchitecture

UK to require tech firms to remove nonconsensual intimate images within 48 hours or face fines
2026-02-18

OFCOM update: so apparently the embargoed press-release email was circulated to one (or more?) of their general discussion mailing lists, which feels symbolic of their approach to security

Highlights are as follows:

  • they will be “fast-tracking” their decision to impose requirements on apps and sites (“tech firms”, of course, because all apps and sites are now “tech firms”), bringing the decision forward
  • their proposal will be that everyone be forced to use fuzzy image matching, which they dress up as “perceptual hashing” or “hash matching”; it is old technology wrapped up in legal agreements with inadequate oversight
  • they frame this as a means to combat non-consensual deepfakes, even though those are not currently a large part of the existing “hash databases” used by the likes of Meta, which are geared toward CSAM and terrorism content
  • management of such databases, and likewise the logging of queries where those databases are offered as an “outsourced service” (clearly where the money will be made), both lead to major security issues; none of this is mentioned
  • they will decide in May and expect the remaining legal steps to be sorted by “this summer”
#censorship #clientSideScanning #hashMatching #ofcom #surveillance
Alec Muffett (@alecmuffett)
2026-02-18

alecmuffett.com/article/146056

Alec Muffett (@alecmuffett)
2026-02-18

HEADSUP: Ofcom *tonight* to announce demand for apps, websites to deploy “hash matching” (i.e. client-side scanning, fuzzy matching) of uploaded images “to protect children”
alecmuffett.com/article/146028

privacy impact: logfiles of fuzzy-matching hash databases become long-term surveillance pipeline to retrospectively track whistleblower leak images, Snowden 2.0, etc; plus enabling censorship of arbitrary content.
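The retrospective-tracking risk can be made concrete with a hypothetical sketch (all names, hashes, and the `retrospective_search` helper are invented for illustration): if an outsourced matching service logs every query it serves, then adding a new hash to the database later lets whoever holds the logs search history for everyone who previously uploaded a matching image.

```python
# Hypothetical sketch: query logs of a fuzzy-matching service become a
# retrospective surveillance pipeline. All identifiers here are invented.

from datetime import datetime, timezone

# Each logged query: (timestamp, client identifier, perceptual hash).
query_log = [
    (datetime(2026, 1, 5, tzinfo=timezone.utc), "app-A/user-123", 0xDEADBEEF),
    (datetime(2026, 1, 9, tzinfo=timezone.utc), "app-B/user-456", 0x12345678),
]

def hamming(a, b):
    """Bit distance between two perceptual hashes."""
    return bin(a ^ b).count("1")

def retrospective_search(log, target_hash, threshold=8):
    """Find historical uploads fuzzily matching a hash added *after* the fact."""
    return [(ts, client) for ts, client, h in log
            if hamming(h, target_hash) <= threshold]

# Months later, a leak image's hash is added to the database; old queries
# now identify who uploaded near-matches, long before the hash existed.
hits = retrospective_search(query_log, 0xDEADBEEE)  # 1 bit off the logged hash
print(hits)
```

Because matching is fuzzy, even re-encoded or slightly edited copies of a target image would surface from the logs.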

WATCH THIS SPACE; ETA 2230H LONDON.

#censorship #clientSideScanning #fuzzyMatching #hashMatching #ofcom #surveillance
