Tretiak@lemmy.ml to Privacy@lemmy.ml · English · 1 year ago
Apple Expands Its On-Device Nudity Detection to Combat CSAM (archive.ph)
evalda@lemmy.ml · 1 year ago
This would scan regardless of whether iCloud is enabled or not. But only for minors. Correct?
The Bard in Green@lemmy.starlightkel.xyz · 1 year ago
Once the capabilities exist, how hard would it be for future fascist regimes to tell Apple to turn it on for whatever other purposes? Under His Eye. Blessed be the fruit.
KickMeElmo@beehaw.org · 1 year ago
What it’s looking for is irrelevant; it has to scan everything to be able to look for that.
pumpsnabben@beehaw.org · 1 year ago
If you believe Apple, then yes.