Apple was a role model, until it wasn’t
Apple shaped how entire generations think about technology. For many of us, its products symbolized creativity and progress. The company taught us to “Think Different” and to believe technology could make life better. But today, Apple’s actions tell a different story — when ethics collide with revenue, Apple folds.
In 2014, Tim Cook came out as gay and positioned it as a moral stand. Apple wrapped itself in Pride flags, marketed inclusion, and sold us the idea that it stood for something bigger than profit. The company embraced its status as the first Fortune 500 company with an openly gay CEO, and Cook even authored a piece in Bloomberg Businessweek stating that he is “proud to be gay.”
Fast forward to now: Apple quietly removed two of the largest gay dating apps in China, Blued and Finka, at Beijing’s request. No statement. No defense of queer communities. Just silent compliance.
This isn’t an isolated decision. It’s a pattern.
When ethics collide with revenue
Apple’s support of marginalized communities seems to collapse under pressure. Take child sexual abuse material (CSAM). In 2021, Apple admitted that verified images and videos of children being sexually abused were stored on iCloud. Because it knew this was a problem, it developed a privacy-protecting detection system, vetted by independent experts, to stop it. Apple proudly announced the plan in August 2021, then paused the rollout 30 days later.
Apple commissioned its own cryptography experts to confirm the system safeguarded privacy. Independent reviewers like David Forsyth and Benny Pinkas agreed: no innocent user data would be exposed. Yet Apple abandoned the plan after backlash over privacy concerns, retreating to arguments it had previously dismantled.
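To make the design concrete, here is a toy sketch of the threshold idea in Apple’s published summary, assuming its stated figure of roughly 30 matches. The real system enforced the threshold with cryptography (private set intersection and threshold secret sharing); this sketch illustrates only the control flow, in Python, and every name in it is hypothetical.

```python
# Toy illustration of threshold-gated matching, not Apple's actual protocol.
# In the real 2021 design, the server could learn nothing about an account's
# images until the number of matches crossed the threshold; here a plain
# counter stands in for that cryptographic guarantee.

MATCH_THRESHOLD = 30  # Apple's published figure was roughly 30 matches


class AccountScanner:
    """Counts matches against a known-hash list for one account."""

    def __init__(self, known_hashes: set[str]):
        self.known_hashes = known_hashes
        self.match_count = 0

    def record_upload(self, image_hash: str) -> bool:
        """Record one upload; return True only once the threshold is crossed."""
        if image_hash in self.known_hashes:
            self.match_count += 1
        # Below the threshold, nothing is surfaced for human review.
        return self.match_count >= MATCH_THRESHOLD
```

Even in this simplification, the property the reviewers signed off on is visible: an account with no matching images can never trigger review, no matter how many photos it uploads.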
Apple’s pivot to services like iCloud has made subscriptions a core revenue driver, generating nearly $100 billion annually with gross margins around 75 percent. Despite this profitability, Apple still has not implemented a meaningful solution to stop the spread of known CSAM, leaving iCloud as one of the few major cloud platforms that does not proactively detect it.
This failure has sparked lawsuits from thousands of survivors who argue Apple’s decision enables predators to pay for storage of abuse imagery, effectively monetizing their trauma. By contrast, companies like Google deploy industry-standard safeguards, combining hash-matching against NCMEC databases and AI to detect and report CSAM at scale. Apple’s refusal to implement similar measures underscores a gap: while profiting from cloud services, it has not ensured those services are free from exploitation.
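As a rough sketch of that industry-standard approach, the following Python stub matches uploads against a known-hash list and files a report on a hit. The hash list and report function are placeholders; production systems rely on perceptual hashes (PhotoDNA-style) drawn from NCMEC’s database so that resized or re-encoded copies still match, whereas the plain SHA-256 digest used here catches only byte-identical files.

```python
import hashlib

# Placeholder list; in production these values come from NCMEC's database of
# verified CSAM, and matching uses perceptual rather than cryptographic hashes
# so that altered copies of a known image still match.
KNOWN_HASHES: set[str] = {
    "0" * 64,  # dummy entry for illustration
}


def report_match(digest: str) -> None:
    """Placeholder for filing a CyberTipline report with NCMEC."""
    print(f"known-image match, reporting digest {digest}")


def scan_upload(file_bytes: bytes) -> bool:
    """Check one uploaded file against the known-hash list; report on a hit."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    if digest in KNOWN_HASHES:
        report_match(digest)
        return True
    return False
```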
This isn’t just complacency. It’s negligence.
Ethics shouldn’t be optional
It’s easy to do the right thing when it sells. Pride campaigns drive revenue when the White House is lit up in rainbow colors and consumer trends reward ethics. But standing up for queer communities in China, where the government demands you side with oppression? That’s harder. Tackling child abuse on your own platform? That’s riskier. Apple will remove LGBTQ+ apps to appease Beijing without putting up a fight, but it won’t take decisive action against child predators.
Apple doesn’t “Think Different” anymore. It thinks profit. And until we demand better, it will keep choosing power over people.
What needs to change
Apple has the resources and expertise to lead on both fronts — protecting vulnerable communities and safeguarding children online. It could implement proven, privacy-conscious CSAM detection tools developed by experts at Thorn, NCMEC, and Johns Hopkins’ MOORE Center. It could take a public stand against censorship that erases LGBTQ+ lives. Instead, it has chosen silence and inaction.
Regulators, investors, and consumers must hold Apple accountable. Tech companies should not be allowed to monetize harm while hiding behind branding campaigns. Ethics cannot be optional in the digital age.
This article reflects the opinions of the writers.
Lennon Torres is a Public Voices Fellow on Prevention of Child Sexual Abuse with The OpEd Project. She is an LGBTQ+ advocate who grew up in the public eye, gaining national recognition as a young dancer on television shows. With a deep passion for storytelling, advocacy, and politics, Lennon now works to center her own lived experience and that of others as she builds her professional career in online child safety at Heat Initiative. The opinions reflected in this piece are those of Lennon Torres as an individual and not of the entities she is part of. Lennon’s Substack: https://substack.com/@lennontorres
Sarah Gardner is Founder and CEO of the Heat Initiative. With more than 13 years of technical and policy expertise in online child safety, she is an internationally recognized voice advocating for the rights of children and survivors of child sexual abuse. Heat Initiative is an organization of technology experts, parents, survivors, and advocates who believe strongly that tech companies like Apple and Meta need to remove CSAM from their platforms and implement policies that will keep children safe online.