themachinestops@lemmy.dbzer0.com to Technology@lemmy.world · English · 4 months ago

A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It (www.404media.co)

cross-posted to: hackernews@lemmy.bestiver.se, technology@lemmit.online, fuck_ai@lemmy.world
Miðvikudagur@lemmy.world · 4 months ago

"Child pornography" is a term NGOs and law enforcement are trying to get phased out. It makes it sound like CSAM is related to porn, when in fact it is simply abuse of a minor.
TipsyMcGee@lemmy.dbzer0.com · edited 2 months ago

deleted by creator