Here’s a non-paywalled link to an article published in the Washington Post a few days ago. It’s great to see this kind of thing getting some mainstream attention. Young children have not made an informed decision about whether they want their photos posted online.
Interesting how there are so many mentions of people worried about AI and only sharing photos in closed groups on Instagram/Facebook. I’m not sure that’s actually keeping the photos away from AI.
I think a large part of their concern is AI-altered photos generated by an individual.
Came here to say this. If you upload pictures to Instagram, they are already being processed by Facebook (“Meta”). If you have an online backup of your photos in Google's or Apple's cloud, then they are already being processed.
The problem with posting pictures of kids in closed groups is that pervs will just join those groups, because those groups have exactly what they’re looking for. You’re basically making it easier for them.
It’s not that parents are afraid of their kids being part of a training set, though that is a bad thing in and of itself. It’s more about all of these AI undressing app ads that are showing up on every social media site, showing just how much of a wild-west situation things currently are, and that this brand of sexual exploitation is in-demand.
Predators are already automating the process so that certain Instagram models get the AI undressing treatment as soon as they upload an exploitable pic. Pretty trivial to do at scale with Instaloader, GroundingDINO, SAM, and SD. Those pics are hosted outside of Instagram where victims have no power to undo the damage. Kids will get sexually exploited in this process, incidentally or intentionally.
I believe by closed groups they mean the family or friends chat with like 5 people.
Although I personally wouldn’t share too much even in those groups.
On the flip side, search for “mom run” or “parent run” on Instagram to see the kids whose parents have decided to parade them in front of thousands of people online. Usually moms posting their little girls in leotards and swimsuits for their mostly adult male followers… 🤢🤮
But don’t worry, Meta isn’t complicit, if you search “child model” they give you a scary child abuse warning message.
Someone else on Lemmy pointed this out a while back, and after seeing it for myself that firmly solidified my decision to stay the fuck away from anything Meta does.
As the internet gets scarier
How the fuck is the internet getting scarier? This isn’t the random gore- and porn-filled internet I grew up with, where you could go to a forum and immediately get targeted by a sex predator. The internet is a corporate walled garden of mega services that feed disinformation and bullshit to people, but your odds of getting genuinely victimized as a child are so much lower than they used to be.
IMO the “getting scarier” is the swinging-back part. I grew up in the same time; my parents were big on “No identifying information to anyone on the internet!” I joke with them now that their generation, the ones that told us to stay off the internet, now post all their business on Facebook and the like.
But that’s the thing: there was a small segment of society, the internet nerds, who didn’t trust anything on the internet and hid themselves and the like. Now, like you say, it’s the corporate walled garden that’s sanitized and happy, and that creates a veneer of trust. And boy do people trust it, posting anything and everything.
The odds of being genuinely victimized as a child are lower in percentage terms, but the lack of attention to what gets posted has led to a lot of bad effects, so people are getting worried again.
I think it is just a different scary. It is less predators snatching you up in their white van and more social media totally screwing with people’s heads. It is more addictive than ever before. People have parasocial relationships with online personalities. All the photos you see online have been edited and changed to make them look better, creating huge body image issues. And that is just the stuff I can think of off the top of my head.
And the predator in the white van hasn’t actually gone away. There have been some very questionable things over the years that have gotten news coverage: the Spiderman and Elsa shit, some very, very questionable Musical.ly/TikTok videos with kids, and I have heard about some kids doing ASMR videos. Minecraft YouTubers seem to always end up grooming some kids.
Bottom line: the Internet be scary. Stay safe, and definitely don’t leave kids unattended online.
I really hope it becomes the new normal to stop posting everything about ourselves non-anonymously online in general. But especially photos and information about kids. I am hopeful that in the near future, we’ll all look back and say “What the fuck were we thinking? We all looked like narcissists exploiting our kids for likes!”
I have not posted a single photo of my kids on any platform for this reason. My wife on the other hand thinks I’m overly paranoid, so thanks to her, Zuck has a ton of photos of them…
My mama taught me back in the day to never use your real name online. Now, multiple decades later, I laugh at people who are my age and just now learning that lesson.
Anything with your name on it should be very controlled and curated. Anything you said 10 years ago can and likely will be used against you.
Managing digital photos is quite hard to do reliably.
Where do you store them? Optical disc, it might get mushrooms; HDD, mechanism might fail; SSD or flash, this one’s better but it might get corrupted, and so on.
Cloud services provide a convenient solution to all this: apart from the service going down (which is less likely), they have no other issues. You can also access them wherever you are.
Privacy is an important concern. It would be nice to have them encrypted in the cloud, encrypted by a local, trusted (open-source) client that is also convenient to use. If each time I want to show a photo to my granny I have to download and gpg a file manually, I pass.
But most people don’t care about their privacy at all anyways, so why bother.
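For what it’s worth, the manual workflow being complained about above really is only a couple of commands. A minimal sketch with GnuPG symmetric encryption (the filenames and passphrase here are made up; a real setup would use a password manager or proper keys):

```shell
# Encrypt a photo locally before it ever touches a cloud service.
# Symmetric mode: anyone with the passphrase can decrypt; no keypair needed.
gpg --batch --yes --pinentry-mode loopback \
    --passphrase "correct-horse-battery-staple" \
    --symmetric --cipher-algo AES256 \
    -o vacation.jpg.gpg vacation.jpg

# Later, on any machine with gpg, recover the original:
gpg --batch --yes --pinentry-mode loopback \
    --passphrase "correct-horse-battery-staple" \
    --decrypt -o vacation-restored.jpg vacation.jpg.gpg
```

Only the .gpg file ever leaves your machine, so the cloud provider stores opaque bytes. The friction is real, though, which is exactly the granny complaint.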
Optical disc, it might get mushrooms
Um… what?
Syncthing!
Android/Linux/Windows/Mac/iOS clients. Simply sync your photos to all of your devices; if you only have the one device, use a trusted friend and cross-sync…
Don’t bother with cloud.
Also Signal groups for sharing with those that matter.
I recently found out about Circles and was hoping to migrate friends and family to it, but it’s just too much of a learning curve to get things set up.
That looks cool, I hadn’t heard of Circles before. I want to check it out now. I’m curious if it somehow keeps your data private from the server owner. That feels like the missing feature in most federated, privacy-focused social networks.
Side note: looks like it’s made by Futo; I hadn’t realized they were working on something like that. I’ve been using another one of their apps, Grayjay, for almost all of my mobile YouTube viewing lately. It works great.
I read through their EULA the other day, and it seems everything is E2EE so only the recipients can see the data, but they do have access to some stuff such as last login, usernames, etc.
I have a few friends using it, and it’s nice once you get it going, but adding/finding friends is a bit of a headache in my experience.
If it is made by anyone associated with Grayjay then I’m out
I haven’t heard anything bad about Grayjay before; what’s the issue with it?
Non-free software pretending to be FOSS. It pisses me off, and it can’t possibly be better than third-party clients.
https://gitlab.futo.org/videostreaming/grayjay — it is a paid program, which is why you are not allowed to fork it, at least without approval. Although you can use it without paying.
Exactly, it is not FOSS.
I mean, it is not FOSS, but it is more open than the YouTube client app.
I use 23Snaps. Gated social sharing among your contacts.
Yes, closed-source, unencrypted and hosted by a party you can “trust”. Anyone can write that they are a parent and care for your privacy.
Storing offline is great and all, but I hope everyone is storing on multiple disks at multiple locations…
Yeah, didn’t think so. I’m sure photos are being lost.
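If you do keep copies on multiple disks, it’s also worth verifying now and then that the copies still match, since bit rot is silent. A minimal sketch (the example paths are placeholders) that compares SHA-256 checksums of two directory trees using only the Python standard library:

```python
import hashlib
from pathlib import Path


def tree_checksums(root: Path) -> dict[str, str]:
    """Map each file's path (relative to root) to its SHA-256 hex digest."""
    sums = {}
    for path in sorted(root.rglob("*")):
        if path.is_file():
            sums[str(path.relative_to(root))] = hashlib.sha256(
                path.read_bytes()
            ).hexdigest()
    return sums


def compare_backups(primary: Path, backup: Path) -> list[str]:
    """Return a list of problems: files missing from the backup or corrupted."""
    a, b = tree_checksums(primary), tree_checksums(backup)
    problems = []
    for rel, digest in a.items():
        if rel not in b:
            problems.append(f"missing from backup: {rel}")
        elif b[rel] != digest:
            problems.append(f"checksum mismatch: {rel}")
    return problems


# Example with placeholder paths:
# print(compare_backups(Path("/photos"), Path("/mnt/backup-disk/photos")))
```

An empty list means every file in the primary tree exists in the backup with identical contents; anything else tells you exactly which photo to re-copy before the last good copy dies too.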