#NCII

Ars Technica Newsarstechnica@c.im
2026-01-22

Asking Grok to delete fake nudes may force victims to sue in Musk's chosen court arstechni.ca/5ssC #nudifyapps #ElonMusk #chatbot #Policy #aicsam #csam #grok #ncii #xAI #AI #X

2026-01-11

Why regulators worldwide are acting on the Grok AI nonconsensual deepfake crisis

X's AI chatbot Grok has been mass-generating nonconsensual sexual deepfakes of women and minors, prompting emergency responses from regulators around the world. We analyze the severity of this failure of AI safety guardrails.

aisparkup.com/posts/8152

2026-01-09

Masterful Gambit: Musk Attempts to Monetize Grok's Wave of Sexual Abuse Imagery

fed.brid.gy/r/https://www.404m

Elon Musk, owner of the former social media network turned deepfake porn site X (www.ft.com/content/ad94db4c-95a0-4c65-bd8d-3b43e1251091), is pushing people to pay for its nonconsensual intimate image generator Grok, meaning some of the app's tens of millions of users are being hit with a paywall when they try to create nude images of random women doing sexually explicit things (www.404media.co/grok-ai-sexual-abuse-imagery-twitter/) within seconds.

Some users trying to generate images on X using Grok receive a reply from the chatbot pushing them toward subscriptions: "Image generation and editing are currently limited to paying subscribers. You can subscribe to unlock these features."

Users who fork over $8 a month can still reply to random images of random women and girls directly on X and tag in Gro
AI Daily Postaidailypost
2026-01-06

Legal experts are probing whether the child‑undressing images generated by Grok’s AI violate US CSAM and NCII statutes. The Department of Justice’s “Take It Down” unit is weighing the case, with even political figures like Donald Trump weighing in. What could this mean for AI policy? Read the full analysis.

🔗 aidailypost.com/news/legal-rev

Rohini Lakshanérohini
2025-12-10

'Men Against Violence and Abuse' created this awesome Instagram post from my longform article on image-based abuse published last year in FactorDaily.

instagram.com/p/DSCQ8aZjCJG/

Edit: Sharing a PDF for those who can't access Instagram: drive.google.com/file/d/1hujZY

2025-11-22

The draft amendment pertains to the regulation of #deepfake content. We elucidate on the aspects of image-based abuse, among other topics. zenodo.org/records/1768... #imagebasedabuse #iba #ibsa #tfgbv #ncii #ogbv #gbv #vaw

Comments on the draft amendmen...

Rohini Lakshanérohini
2025-11-22

Submission made by Sapni GK and yours truly in response to the call for comments issued by the Ministry of Electronics and Information Technology (MeitY), Government of India, on the draft amendment to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

The draft amendment pertains to the regulation of deepfake content.

We elucidate on the aspects of image-based abuse. zenodo.org/records/17680230

Ars Technica Newsarstechnica@c.im
2025-10-17

Teen sues to destroy the nudify app that left her in constant fear arstechni.ca/5z8Q #AI-generatedimages #nudifyapps #fakenudes #Policy #csam #ncii #AI


Rohini Lakshanérohini
2025-09-26

It is evident from the guide that the writer has never actually supported a victim-survivor of image-based abuse. I can tell because I have supported victims for more than a decade.

Why write it without having any real-world experience or credentials? This is a disservice to victim-survivors and those who support them.

Bonus question: Why does this sh*t get funded when I get rejections because of "many, competing high-quality applications?"

Fin.

2025-08-20

EPIC joins 13 other advocacy organizations in sending a letter urging the #FTC to investigate Grok’s “Spicy Mode.” The feature allows users to easily create and distribute Non-Consensual Intimate Imagery, potentially violating #NCII and age verification laws. Learn more and read our full letter ⬇️

RE: https://bsky.app/profile/did:plc:p4fmuj6gl2q3pnrbi3pvokpx/post/3lwev2x4h2s2z


Rohini Lakshanérohini
2025-08-09

I often get requests to help survivors of image-based abuse with these takedowns. As much as I want to support each one, I don't always manage to do so in my personal time. I request survivors to make use of this avenue.

linkedin.com/feed/update/urn:l

Rohini Lakshanérohini
2025-08-03

Re-sharing a guide I published last year. It offers practical and granular suggestions for victim-survivors of image-based abuse, especially those in the global majority world, to remove the violative content while safeguarding their privacy and identity online.

A few survivors have told me they appreciate the guide. One said she wishes she had access to this resource when she was dealing with her situation a few years ago.

genderit.org/articles/part-1-i

ccinfo.nlCCINL
2025-05-21

The nonconsensual distribution of intimate images is a growing problem that increasingly inflicts psychological and social harm on victims. In this article, we take a deeper look at the hidden consequences of this crime, comparing legislation and policy in Belgium, the Netherlands, and Sweden.

Podcast Spotify: open.spotify.com/episode/4xgJ5

Artikel Cybercrimeinfo: ccinfo.nl/menu-onderwijs-ontwi

Flipboard Tech DeskTechDesk@flipboard.social
2025-05-20

President Trump signed the Take It Down Act into law, enacting a bill that will criminalize the distribution of nonconsensual intimate images — including AI deepfakes — and require social media platforms to promptly remove them when notified. Via @theverge. #AI #NCII #TakeItDown #SocialMedia #Tech #Technology flip.it/4bu5p9
