jeffw@lemmy.world (M) to News@lemmy.world · 1 month ago
“CSAM generated by AI is still CSAM,” DOJ says after rare arrest (arstechnica.com)
16 comments
sparky@lemmy.federate.cc · 1 month ago
The problem is that the only way to train an AI model is on real images, so the model can’t exist without crimes and suffering having been committed.
Madison420@lemmy.world · 1 month ago
The real images don’t have to be CSAM, though, just images of children. In theory it could be trained on legal sexual content alongside them and let the AI connect the dots.