Apple is suspending the rollout of its Child Sexual Abuse Material (CSAM) photo-scanning system after the backlash it received. The company said it needs more time to improve the feature and will gather further input from customers, advocacy groups, researchers, and other concerned parties.
Last month, we announced plans to introduce features to help protect children from predators who use communications to recruit and exploit them, and to limit the distribution of child sexual abuse material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time in the coming months to gather information and make improvements before we release these critical features.
Apple
Apple previously stated that the system had been in development for several years and was not designed to let governments monitor citizens' activities. It also said that users in Russia and other countries do not need to worry, because the system will initially be available only in the United States and only when iCloud is enabled.
Even so, many security experts have warned that Apple's new tool could be repurposed for surveillance, putting the personal data of millions of people at risk.
The feature was originally expected to ship with the final release of iOS 15 in September; no new date has been announced.