Here’s a non-paywalled link to an article published in the Washington Post a few days ago. It’s great to see this kind of thing getting some mainstream attention. Young children have not made an informed decision about whether they want their photos posted online.
Interesting how there are so many mentions of people worried about AI and only sharing photos in closed groups on Instagram/Facebook. I’m not sure that’s actually keeping the photos away from AI.
Came here to say this. If you upload pictures to Instagram, they are already being processed by Facebook ("Meta"). If you have an online backup of your photos in Google or Apple's cloud, they are already being processed too.
I think a large part of their concern is AI-altered photos generated by an individual.
The problem with posting pictures of kids in closed groups is that pervs will just join those groups, because the groups contain exactly what they're looking for. You're basically making it easier for them.
It’s not that parents are afraid of their kids being part of a training set, though that is a bad thing in and of itself. It’s more about all of these AI undressing app ads that are showing up on every social media site, showing just how much of a wild-west situation things currently are, and that this brand of sexual exploitation is in-demand.
Predators are already automating the process so that certain Instagram models get the AI undressing treatment as soon as they upload an exploitable pic. Pretty trivial to do at scale with Instaloader, GroundingDINO, SAM, and SD. Those pics are hosted outside of Instagram where victims have no power to undo the damage. Kids will get sexually exploited in this process, incidentally or intentionally.
I believe by closed groups they mean the family or friends chat with like 5 people.
Although I personally wouldn't share too much in those groups either.