The move follows widespread criticism from privacy groups and others worried that on-device scanning set a dangerous precedent.
Apple said that it had listened to the negative feedback and was reconsidering.
There were concerns the system could be abused by authoritarian states.
The so-called NeuralHash technology would have scanned images just before they were uploaded to iCloud Photos, then matched them against a database of known child sexual abuse material maintained by the National Center for Missing and Exploited Children.
Any match would then have been reviewed by a human and, if confirmed, steps taken to disable the user’s account and report it to law enforcement.
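The pipeline described above can be sketched in a few lines. This is an illustrative sketch only, not Apple's implementation: NeuralHash is a proprietary *perceptual* hash designed to survive resizing and re-encoding, so a plain SHA-256 stands in for it here, and the database contents are placeholders, not real NCMEC data.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Real system: a perceptual hash robust to re-encoding;
    # here, an exact cryptographic hash as a stand-in.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints of known abuse material,
# standing in for the NCMEC-maintained list described above.
known_hashes = {fingerprint(b"<known-image-bytes>")}

def flag_before_upload(image_bytes: bytes) -> bool:
    """True means the image matched the database and would be queued
    for human review before any enforcement action is taken."""
    return fingerprint(image_bytes) in known_hashes

print(flag_before_upload(b"<known-image-bytes>"))  # True -> human review
print(flag_before_upload(b"holiday snapshot"))     # False -> upload proceeds
```

Note the key design point the article implies: a database match alone triggers no action; it only routes the image to a human reviewer.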
It was due to launch later in the year.
In a statement, Apple said: “Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material.
“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Privacy campaigners expressed concern that the technology could be expanded and used by authoritarian governments to spy on citizens.
The Electronic Frontier Foundation has been one of the most vocal critics of the system, gathering a petition signed by 25,000 customers opposing the move.
Its executive director Cindy Cohn told the BBC: “The company must go further than just listening and drop its plans to put a backdoor into its encryption entirely.
“The enormous coalition that has spoken out will continue to demand that user phones – both their messages and their photos – be protected and that the company maintains its promise to provide real privacy to its users.”
Apple has been a proponent of privacy and end-to-end encryption in the past.

News Source: BBC