For now, an iPhone can store no more than 30 prohibited photos before being flagged

After a week of criticism over its new Child Sexual Abuse Material (CSAM) detection system, Apple said on Friday that it would only search for images that have been flagged by clearinghouses in multiple countries.

Initially, Apple did not say how many matching images would have to be detected on a phone or computer before the operating system alerts Apple to verify the account and possibly report it to the authorities. On Friday, Apple confirmed that the threshold is initially set at 30 images, though the number may be lowered as the system improves.

Apple also denied speculation that the new mechanism could be used to target individuals: the list of image hashes is universal and identical on every device it applies to.

Apple also explained that the new system stores an encrypted hash database of known child sexual abuse material on the device, sourced from at least two organizations operating under different national governments.
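The two safeguards described above can be illustrated with a toy sketch. This is not Apple's implementation (which relies on the NeuralHash perceptual hash and cryptographic private set intersection); it only demonstrates, under simplified assumptions, the two rules the article reports: a hash counts only if at least two independent organizations list it, and no action is considered until 30 on-device matches accumulate.

```python
# Hypothetical illustration only: Apple's real system uses perceptual
# hashing and on-device cryptography. This sketch shows just the two
# reported policies: multi-organization agreement and a 30-match threshold.
from collections import Counter

MATCH_THRESHOLD = 30  # initial threshold per Apple's Friday statement


def build_flag_list(org_lists):
    """Keep only hashes that appear in at least two organizations' lists."""
    counts = Counter(h for org in org_lists for h in set(org))
    return {h for h, n in counts.items() if n >= 2}


def exceeds_threshold(device_hashes, flag_list):
    """Return True only once matches reach the 30-image threshold."""
    matches = sum(1 for h in device_hashes if h in flag_list)
    return matches >= MATCH_THRESHOLD
```

In this toy model, a hash submitted by only one organization never enters the flag list, and a device with 29 or fewer matches is never reported, mirroring the safeguards Apple described.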

Apple admitted that it had done a poor job of explaining the new strategy, which prompted backlash from influential technology policy groups and even its own employees, who worried that the company was jeopardizing its reputation for protecting consumer privacy.

Apple declined to say whether the criticism had influenced any policies or software, but said the project is still in development and will undergo changes.
