Apple will check iPhones, iPads for images of child sex abuse

Apple Inc. announced new software, due later this year, that will analyze photos saved in a user’s iCloud Photos account for sexually explicit images of children and report any instances to the appropriate authorities.
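The announcement does not spell out the mechanics, but reporting on the feature describes it as comparing fingerprints of photos against a database of fingerprints of already-known abuse imagery. The Swift sketch below illustrates only that general idea; the type and function names are hypothetical, and a plain SHA-256 hash stands in for the perceptual hash such a system would actually use.

```swift
import CryptoKit
import Foundation

// Illustrative sketch only, not Apple's implementation. Names are hypothetical,
// and SHA-256 stands in for a perceptual hash.
struct PhotoScanner {
    /// Fingerprints of already-known illegal images (contents hypothetical).
    let knownFingerprints: Set<String>

    /// Reduce a photo's bytes to a hex fingerprint.
    func fingerprint(of photo: Data) -> String {
        SHA256.hash(data: photo).map { String(format: "%02x", $0) }.joined()
    }

    /// A photo is flagged only when its fingerprint matches a known one;
    /// photos that match nothing are not reported.
    func shouldFlag(_ photo: Data) -> Bool {
        knownFingerprints.contains(fingerprint(of: photo))
    }
}
```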

The company also revealed a tool that will analyze photos sent to or received by children in the Messages app and check whether they are explicit, part of a broader set of protections for young users. Apple is also adding tools to Siri, its digital voice assistant, that will intervene when users search for such harmful content. The Cupertino, California-based tech giant unveiled the three new features on Thursday, stating that they would roll out later in 2021.

Apple was one of the first major corporations to adopt “end-to-end” encryption, which scrambles data so that only the sender and recipient can read it. However, law enforcement has long pushed for access to that information in investigations of crimes such as terrorism and child sexual exploitation.
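For readers unfamiliar with the term, the sketch below illustrates the “only the sender and recipient can read it” property using Apple’s CryptoKit, assuming a key the two devices have already agreed on. It is a simplified illustration of the concept, not how Messages is actually implemented.

```swift
import CryptoKit
import Foundation

// Minimal sketch of the end-to-end idea, assuming the two endpoints already
// share a key: any server relaying the message sees only ciphertext.
func demoEndToEnd() throws {
    let sharedKey = SymmetricKey(size: .bits256)   // known only to the two endpoints
    let message = Data("See you at six".utf8)

    // Sender encrypts; a relaying server would store or forward only `ciphertext`.
    let ciphertext = try AES.GCM.seal(message, using: sharedKey).combined!

    // Recipient decrypts with the same shared key; no intermediary can.
    let opened = try AES.GCM.open(try AES.GCM.SealedBox(combined: ciphertext),
                                  using: sharedKey)
    print(String(decoding: opened, as: UTF8.self))  // "See you at six"
}
```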

“Apple’s expanded protection for children is a game-changer,” John Clark, the president and CEO of the National Center for Missing and Exploited Children, said in a statement. “With so many people using Apple products, these new safety measures have the lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material.”

Julia Cordua, the CEO of Thorn, said that Apple’s technology balances “the need for privacy with digital safety for children.” Thorn, a nonprofit founded by Demi Moore and Ashton Kutcher, uses technology to help protect children from sexual abuse by identifying victims and working with tech platforms.
