Notice of Exempt Solicitation

Pursuant to Rule 14a-103 | February 12, 2025

 

Name of Registrant: Apple, Inc.

Name of person relying on exemption: Bowyer Research

Address of person relying on exemption: P.O. Box 120, McKeesport, PA 15135

 

Written materials are submitted pursuant to Rule 14a-6(g)(1) promulgated under the Securities Exchange Act of 1934. Filer of this notice does not beneficially own more than $5 million of securities in the Registrant company. Submission is not required of this filer under the terms of the Rule but is made voluntarily in the interest of public disclosure and consideration of these important issues.

 

 

 

Apple, Inc. (AAPL)

Combating Online Child Sex Abuse & Reputational Risk: Vote YES on Proposal 5

Contact: Gerald Bowyer | jerrybowyer@bowyerresearch.com

      Resolution

 

Bowyer Research urges Apple shareholders to vote YES on Proposal 5, “Report on Costs and Benefits of Child Sex Abuse Material‐Identifying Software & User Privacy.”

      Background

 

No shareholder ever wants to see their company described as the “greatest platform for distributing child porn.” Yet that’s exactly how one of Apple’s top executives has described the company’s relationship with online child exploitation.


In messages1 unearthed during the company’s 2021 legal battle with Epic Games, Apple Fraud Engineering Algorithms and Risk head Eric Friedman delivered2 a shocking assessment of Apple’s record when it comes to protecting kids online. Citing Apple’s commitment to protecting user privacy, Friedman admitted the dark consequence of that commitment: “we are the greatest platform for distributing child porn, etc.” In the same documents,3 Friedman provided further evidence in the form of an upcoming Apple trust and safety presentation that listed “child predator grooming reports” as both an “active threat” and an “under-resourced challenge.”

 

But not everyone needs Apple executives to explain how the company’s approach to child sexual abuse material (CSAM) has holes. “Amy” and “Jessica,” the plaintiffs in a $1.2 billion class-action lawsuit4 against Apple, experience this reality every time they learn that imagery of the sexual abuse they suffered as children has surfaced on yet another Apple device. Law enforcement notifies them when this CSAM, video documentation of the rapes and molestation they endured as children, is discovered5 on iCloud accounts or on a MacBook in Virginia.

      Legal & Reputational Risk

 

Amy & Jessica go by pseudonyms in the lawsuit to protect the innocent, even as they raise questions about Apple’s ability to do just that. The lawsuit alleges that Apple knowingly allowed child sex abuse material to proliferate under its watch through its choice not to deploy CSAM-scanning software, citing privacy concerns. The company’s 2022 decision6 not to deploy NeuralHash7, a technology that would detect CSAM for further analysis and reporting if certain thresholds were met, sparked celebration from online privacy hawks, criticism from child safety advocates, and controversy8 across the board. And that controversy has only grown.

 


1 https://www.theverge.com/c/22611236/epic-v-apple-emails-project-liberty-app-store-schiller-sweeney-cook-jobs

2 https://www.theverge.com/c/22611236/epic-v-apple-emails-project-liberty-app-store-schiller-sweeney-cook-jobs

3 https://embed.documentcloud.org/documents/21044004-2020-february-fear-friedman-admits-in-feb-2020-that-app-store-greatest-platform-for-child-porn-predator-grooming/#document/p11

4 https://www.jamesmarshlaw.com/wp-content/uploads/2024/12/Amy-et.-al.-v.-Apple-Case-24-cv-08832-Doc-1-Complaint-12-07-2024.pdf

5 https://www.nytimes.com/2024/12/08/technology/apple-child-sexual-abuse-material-lawsuit.html

6 https://www.cnn.com/2022/12/08/tech/apple-csam-tool/index.html

7 https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf

8 https://www.wired.com/story/apple-csam-scanning-heat-initiative-letter/


Apple has been named to the National Center on Sexual Exploitation’s ‘Dirty Dozen’ list for the second year in a row,9 due to its CSAM-scanning policies and a host of other concerns, including evidence of Apple’s App Store promoting AI-powered “nudify” apps rated as appropriate for ages four and up. While this App Store issue is distinct from the decision to shelve CSAM-scanning protocols, it too creates controversy, further fueling shareholder concerns that the balance between user privacy and child protection at Apple is tilting in a way that creates ever-increasing levels of reputational risk.

 

No Apple shareholder wants to see the company described by its own executives as the “greatest platform for distributing child porn.” No Apple shareholder wants to see articles in mainstream news outlets describing how the company “helped nix parts of a child safety bill.” No Apple shareholder wants to see lawsuits10 like the one stemming from Amy & Jessica’s horrific abuse as children, alleging that the company knew the risks of CSAM and refused to act. Yet all of these things happened. The push for increased transparency around Apple’s approach to combating online exploitation, as exemplified in Proposal No. 5, is an entirely rational response from any shareholder concerned about this growing area of reputational risk.

      Rebuttal to the Board’s Statement of Opposition

 

In direct contrast to the rational nature of this approach, Apple’s statement of opposition11 (SOP) contains several notable errors, ranging from the tangential to a mischaracterization so blatant that it calls into question whether Apple’s Board of Directors even understands the proposal it is writing to oppose.

 

Firstly, the tangential. The SOP references Apple’s Communication Safety protocols, including blurring photos and videos that contain nudity, and notes in particular that such features are on by default for “accounts of children under 13.” While this is a laudable feature, Apple thereby implies that it is not on by default for accounts of users between the ages of 13 and 18. Does the company believe that children should not be protected, by default, from exposure to sexually exploitative content after their 13th birthday? Why is this feature not on by default for all users under the age of 18? If the Board’s statement is accurate, children older than 13 using Apple messaging have less default protection against exposure to explicit sexual content than adults on many dating apps (which blur explicit media by default for users over 18). When a 14-year-old using iMessage is less protected against sexual exploitation than a 24-year-old on Bumble, Apple’s reputational risk becomes abundantly clear.


9 https://endsexualexploitation.org/apple/

10 https://www.cnet.com/tech/services-and-software/apples-abandonment-of-icloud-csam-scanner-is-hurting-victims-lawsuit-alleges/

11 https://s2.q4cdn.com/470004039/files/doc_financials/2025/Proxy_Statement_2025.pdf


Secondly, the Board mischaracterizes the essential ask of the Proposal: a cost-benefit analysis of CSAM detection software, not the implementation of any specific policy. Apple’s SOP repeatedly references the “universal surveillance suggested in the proposal,” as if the Board cannot differentiate between asking for a cost-benefit analysis of software and asking for the implementation of specific software. It may benefit Apple rhetorically to paint the implementation of CSAM detection software as universal surveillance (although Apple was certainly singing a different tune when it announced12 the software and described it as having a “1 in 1 trillion” error rate). But the Proposal does not ask for any specific implementation, and Apple’s inability to distinguish between the basic and separate categories of risk analysis and implementation is absurd and only further discredits the company’s opposition to this Proposal.

      Conclusion

 

The sophistry of the Board’s statement of opposition aside, the problem the Proposal seeks to address has not changed: Apple has a reputational problem when it comes to its stance toward online child exploitation. This isn’t mere shareholder opinion or twisted corporate activism, but an objective statement of the facts. Apple may have an honest and intelligible rationale for not deploying CSAM detection protocols, but shareholders deserve to know what that rationale actually is.

 

Apple has the opportunity to cast additional light on the path it took to arrive at its decisions surrounding CSAM detection software, and by doing so to offer shareholders something tangible in the face of mounting reputational risk, rather than a request to trust the company without verification. In the name of providing such verification, we urge a vote in favor of Proposal No. 5.


12 https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf


 

      Disclosures/Media

 

The foregoing information may be disseminated to shareholders via telephone, U.S. mail, e-mail, certain websites and certain social media venues, and should not be construed as investment advice or as a solicitation of authority to vote your proxy. The cost of disseminating the foregoing information to shareholders is being borne entirely by the filer.

 

The information contained herein has been prepared from sources believed reliable but is not guaranteed by us as to its timeliness or accuracy, and is not a complete summary or statement of all available data. This piece is for informational purposes and should not be construed as a research report. Bowyer Research is not able to vote your proxies, nor does this communication contemplate such an event. Proxy cards will not be accepted by us. Please do not send your proxy to us. To vote your proxy, please follow the instructions on your proxy card.

 

For questions, please contact Gerald Bowyer, president of Bowyer Research, via email at jerrybowyer@bowyerresearch.com.

