
Apple was a role model, until it wasn't

By Eric, December 3, 2025

Apple Inc. has long been a symbol of innovation and creativity, shaping how generations perceive technology. Under the leadership of Tim Cook, who came out as gay in 2014, the company embraced a narrative of inclusivity and moral fortitude, famously encouraging customers to “Think Different.” Cook’s public stance on LGBTQ+ rights positioned Apple as a champion for marginalized communities, especially during Pride Month when the company draped itself in rainbow flags and actively marketed its support for diversity. However, recent actions reveal a stark contradiction to this narrative. In a troubling move, Apple quietly removed two prominent gay dating apps, Blued and Finka, from its platform in China at the request of the Beijing government, showcasing a significant retreat from the values it once championed. This decision was made without any public statement or defense of the LGBTQ+ community, raising questions about the company’s commitment to ethics when faced with financial pressures.

This pattern of prioritizing profit over principles is not new for Apple. A notable example occurred in 2021 when the company announced a plan to implement a system for detecting child sexual abuse material (CSAM) on its iCloud platform. Despite initial enthusiasm and support from cryptography experts, Apple backtracked just a month later due to privacy concerns raised by the public. Critics argue that this retreat reflects a negligence that enables the exploitation of vulnerable children, as Apple remains one of the few major cloud service providers without proactive measures to combat the spread of CSAM. While competitors like Google have integrated industry-standard safeguards to detect and report such material, Apple’s inaction has led to lawsuits from survivors who claim the company’s decisions allow predators to exploit its services, effectively monetizing their trauma. The juxtaposition of Apple’s lucrative subscription model—generating nearly $100 billion annually—against its failure to safeguard against abuse illustrates a troubling disconnect between its corporate ethics and business practices.

The article emphasizes that ethics should not be optional for tech giants, particularly in an era where their influence is profound. It argues that Apple has the resources and expertise to lead in both protecting LGBTQ+ rights and ensuring child safety online but has chosen silence and inaction instead. The authors call for accountability from regulators, investors, and consumers, insisting that tech companies must not profit from harm while masking their failures with branding campaigns. The piece serves as a poignant reminder that standing up for ethical principles often comes with risks, and it challenges Apple to reclaim its commitment to “Think Different” by prioritizing people over profits. As the tech landscape continues to evolve, the demand for ethical responsibility in the digital age remains more crucial than ever.

https://www.youtube.com/watch?v=6zBE8WDbM4Y

Apple shaped how entire generations think about technology. For many of us, its products symbolized creativity and progress. The company taught us to “Think Different” and to believe technology could make life better. But today, Apple’s actions tell a different story — when ethics collide with revenue, Apple folds.
In 2014, Tim Cook came out as gay and positioned it as a moral stand. Apple wrapped itself in Pride flags, marketed inclusion, and sold us the idea that it stood for something bigger than profit. The company seemed to embrace being the first Fortune 500 company with an openly gay CEO; Cook even authored a piece in Bloomberg Businessweek stating that he is “proud to be gay.”
Fast forward to now: Apple quietly removed two of the largest gay dating apps in China, Blued and Finka, at Beijing’s request. No statement. No defense of queer communities. Just silent compliance.
This isn’t an isolated decision. It’s a pattern.
When ethics collide with revenue
Apple’s support of marginalized communities seems to collapse under pressure. Take child sexual abuse material (CSAM). In 2021, Apple admitted that verified images and videos of children being sexually abused were stored on iCloud. Because it knew this was a problem, Apple developed a privacy-protecting detection system, vetted by independent experts, to stop it. It proudly announced the plan in August 2021, then paused the rollout 30 days later.
Apple commissioned its own cryptography experts to confirm the system safeguarded privacy. Independent reviewers like David Forsyth and Benny Pinkas agreed: no innocent user data would be exposed. Yet Apple abandoned the plan after backlash over privacy concerns, retreating to arguments it had previously dismantled.
Apple’s pivot to services like iCloud has made subscriptions a core revenue driver, generating nearly $100 billion annually with gross margins around 75 percent. Despite this profitability, Apple has still not implemented a meaningful solution to stop the spread of known CSAM, leaving iCloud as one of the few major cloud platforms that does not proactively detect it.
This failure has sparked lawsuits from thousands of survivors who argue Apple’s decision enables predators to pay for storage of abuse imagery, effectively monetizing their trauma. By contrast, companies like Google deploy industry-standard safeguards, combining hash-matching against NCMEC databases with AI to detect and report CSAM at scale. Apple’s refusal to implement similar measures underscores a gap: while profiting from cloud services, it has not ensured those services are free from exploitation.
This isn’t just complacency. It’s negligence.
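For readers unfamiliar with the technique, hash-matching works roughly like this: the service computes a digital fingerprint of each uploaded file and compares it against a database of fingerprints of already-verified abuse imagery, so only previously identified material is flagged. The short Python sketch below is a minimal illustration of that idea, not a description of any company’s actual system; the file and database names are hypothetical, and it uses a plain cryptographic hash where production tools rely on perceptual hashes (such as Microsoft’s PhotoDNA) that also catch resized or re-encoded copies.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for a vetted database of fingerprints of
# already-identified material (in practice supplied and maintained by
# clearinghouses such as NCMEC, not hard-coded in an application).
KNOWN_HASHES = {
    "0" * 64,  # dummy placeholder entry, not a real fingerprint
}


def file_fingerprint(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_known_match(path: Path) -> bool:
    """True only if this exact file already appears in the known-hash set."""
    return file_fingerprint(path) in KNOWN_HASHES
```

The point of this design, and of the authors’ argument, is that matching against known fingerprints flags only material investigators have already verified, which is why its proponents say it can coexist with strong user privacy.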


Ethics shouldn’t be optional
It’s easy to do the right thing when it sells. Pride campaigns drive revenue, but only when the White House is lit up in rainbow colors or consumer trends favor ethics. But standing up for queer communities in China, when the government is pressuring you to stand on the side of oppression? That’s harder. Tackling child abuse on your own platform? That’s riskier. Apple will remove LGBTQ+ apps to appease Beijing without putting up a fight, but won’t take decisive action against child predators.
Apple doesn’t “Think Different” anymore. It thinks profit. And until we demand better, it will keep choosing power over people.
What needs to change
Apple has the resources and expertise to lead on both fronts — protecting vulnerable communities and safeguarding children online. It could implement proven, privacy-conscious CSAM detection tools developed by experts at Thorn, NCMEC, and Johns Hopkins’ Moore Center. It could take a public stand against censorship that erases LGBTQ+ lives. Instead, it has chosen silence and inaction.
Regulators, investors, and consumers must hold Apple accountable. Tech companies should not be allowed to monetize harm while hiding behind branding campaigns. Ethics cannot be optional in the digital age.
This article reflects the opinions of the writers.
Lennon Torres is a Public Voices Fellow on Prevention of Child Sexual Abuse with The OpEd Project. She is an LGBTQ+ advocate who grew up in the public eye, gaining national recognition as a young dancer on television shows. With a deep passion for storytelling, advocacy, and politics, Lennon now works to center the lived experience of herself and others as she crafts her professional career in online child safety at Heat Initiative. The opinions reflected in this piece are those of Lennon Torres as an individual and not of the entities she is part of. Lennon’s substack: https://substack.com/@lennontorres
Sarah Gardner is Founder and CEO of the Heat Initiative. With more than 13 years of technical and policy expertise in online child safety, she is an internationally recognized voice in advocating for the rights of children and survivors of child sexual abuse. Heat Initiative is an organization of technology experts, parents, survivors and advocates who believe strongly that tech companies like Apple and Meta need to remove CSAM from their platforms and implement policies that will keep children safe online.


