Apple sued over leaving behind CSAM detection for iCloud | TechCrunch
Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM). The lawsuit argues that by