Wednesday, August 25, 2021

Digital Godsend or Curse? Apple’s New CSEM Measures and the Latest Chapter in the Privacy/Protection Debate

By Alex Rodrigues, PsyD

Recently, the tech juggernaut Apple garnered considerable attention after announcing new steps to combat child sexual exploitation material (CSEM), or child pornography. The company declared it would take more decisive action to counter the proliferation and sharing of child sexual abuse content. More specifically, Apple plans to launch protections in the Messages and iCloud Photos apps as part of routine software updates scheduled for later this year.

For many, Apple’s announcement was long overdue. In 2020, the National Center for Missing and Exploited Children (NCMEC) received more than 21.7 million reports of suspected online child pornography in the U.S. (the reporting process is detailed below). Compared with other companies like Google and Facebook, Apple has initiated far fewer NCMEC CyberTips, or referrals for suspected CSEM. However, Apple’s announcement was also met with concern from privacy advocates. While no one was championing the rights of CSEM producers, privacy experts were alarmed by the unprecedented steps the company was taking, the threat such measures pose to personal privacy, and the possibility that these new tools could cause more harm than good.

Before diving into the privacy/protection debate surrounding Apple, one needs to understand how most Internet CSEM is identified and the role NCMEC plays. NCMEC serves as a clearinghouse for child pornography investigations and is responsible for much of the initial work; it maintains a massive database of known CSEM images and videos. Typically, the process starts when a tech company or electronic service provider (ESP) identifies CSEM on its platform and refers the case to NCMEC. For instance, someone may attach CSEM to a Google email or upload child abuse content to a Dropbox account. Tech platforms identify such content as part of routine monitoring and then refer the matter to NCMEC with a CyberTip, a report containing information about the suspected offender, including an email address and internet protocol (IP) address.
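
For the technically curious, a CyberTip can be pictured as a small structured record. The sketch below is hypothetical Python, not NCMEC’s actual report schema, but it illustrates the kinds of details an ESP passes along:

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class CyberTip:
        """Hypothetical sketch of an ESP referral to NCMEC; the real report
        schema is more detailed, and these field names are illustrative only."""
        reporting_esp: str       # e.g., the email or cloud-storage provider
        suspect_email: str       # account associated with the upload
        suspect_ip: str          # internet protocol (IP) address of the upload
        file_hashes: List[str]   # hash values of the suspected CSEM files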

Next, NCMEC reviews the media in question. Like all digital media, CSEM is identified by its hash value, a string of characters specific to that image or video; the hash value serves as a digital fingerprint. Technically, the database is ignorant of the actual image depicted and is only looking for a pre-identified character string. NCMEC checks the image’s hash value against its database of recognized CSEM and associated hash values. With the assistance of geolocation technology, NCMEC can help identify where the content was uploaded and notify local law enforcement, which then assumes control of the investigation.
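
To make the fingerprint idea concrete, here is a minimal Python sketch. It uses an ordinary SHA-256 hash and a made-up set of known values standing in for NCMEC’s database; the production systems rely on specialized perceptual-hashing schemes rather than this simple approach:

    import hashlib

    # Made-up stand-in for NCMEC's database of known hash values.
    KNOWN_CSEM_HASHES = {"<known hash value 1>", "<known hash value 2>"}

    def fingerprint(path):
        """Compute a SHA-256 fingerprint of a file's raw contents."""
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    def matches_known_database(path):
        # The check never "looks at" the picture itself; it only asks whether
        # the file's fingerprint appears in the pre-identified set of values.
        return fingerprint(path) in KNOWN_CSEM_HASHES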

While the technology and referral process outlined above have been around for some time, Apple’s new approach differs: the search for CSEM will occur on the individual user’s device. Supporters argue this approach will combat CSEM earlier, or “upstream.” The company recently reported, “Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations” (apple.com/child-safety, 2021). A hit, or match against the CSAM hash database, results in a “safety voucher” being produced. Safety vouchers condense and encrypt information about the suspected CSEM. Once a specified number of safety vouchers has accumulated, the information is decrypted and reviewed by an actual person.

At this point, I suspect most tech neophytes are feeling overwhelmed by all the heavy jargon. So, let’s use the traditional mail system, or “snail mail,” to understand the process better. Imagine that your mailbox represents Apple and individual mail envelopes represent suspected CSEM. Whenever an image with known CSEM hashes is found, it is packed into an envelope and placed in your mailbox. Obviously, you cannot see into the sealed envelope and identify its contents, just as no one can initially read the encrypted contents of an Apple safety voucher. You are also not permitted to open any envelope of possible child pornography until your mailbox is packed with enough similar envelopes of suspected CSEM. Once the threshold is reached (in Apple’s case, the number is 30 safety vouchers), you are allowed to unseal all those stockpiled envelopes and investigate their contents. For Apple, the encrypted information in a safety voucher cannot be decrypted and read until enough other safety vouchers have been collected. This threshold minimizes the risk of false positives.
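
In code, the mailbox analogy reduces to a simple counting rule. The sketch below is a simplified stand-in, not Apple’s implementation (in the real design the vouchers are encrypted so they genuinely cannot be read early), but the 30-voucher limit is Apple’s stated number:

    VOUCHER_THRESHOLD = 30  # Apple's stated number of matches before review

    class VoucherMailbox:
        """Simplified stand-in for the collection of encrypted safety vouchers."""

        def __init__(self):
            self._vouchers = []

        def add(self, voucher):
            self._vouchers.append(voucher)

        def open_for_review(self):
            # Below the threshold, nothing can be opened or read by anyone.
            if len(self._vouchers) < VOUCHER_THRESHOLD:
                return None  # the "envelopes" stay sealed
            # Only now would the contents be decrypted and passed to a human
            # reviewer (the decryption itself is omitted in this sketch).
            return list(self._vouchers)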

Only photos uploaded to iCloud will be subject to review. Apple users can opt out of this process by disabling iCloud Photos, and the system does not flag images of children that are absent from the known-hash database. For instance, a parent who takes a picture of her child bathing will not be flagged for CSEM; that personal family photo’s hash value will not match any of the values already identified by NCMEC as CSEM.

Many see such technology as a welcome tool in the fight against child sexual exploitation. However, privacy advocates worry that nefarious actors may use the technology for ill intent. Hypothetical examples include governments using the tool to identify images other than CSEM. Imagine if a regime hostile to homosexuals decided to create a database of supposedly gay-related pictures to identify LGBTQ individuals. Again, the software is indifferent to the actual content depicted and focuses only on identifying specific hash values; nothing stops a country from developing a database like NCMEC’s compiled only of “gay” images. Another example is the Chinese government employing such technology to identify Muslim Uighurs. Presently, the only thing preventing countries from using Apple’s technology in these ways is the company’s promise to prohibit such uses. However, critics question whether Apple can maintain that prohibition in the face of economic pressure and the threat of being ousted from a country for noncompliance.

The debate surrounding Apple’s new child safety protocols highlights the delicate balance between protecting children and safeguarding privacy. It is also a precursor to what will likely be many future complex discussions involving these noble, but sometimes competing, interests. Outlets such as TechCrunch and Technology Review have done excellent work covering this evolving situation, and the details outlining the new technology are listed on Apple’s website.
