Anthropic’s Claude overtakes ChatGPT on App Store as Pentagon deal sparks user backlash

Anthropic’s AI assistant Claude has surged to the top of Apple’s App Store, overtaking OpenAI’s ChatGPT just days after OpenAI secured a Pentagon contract worth up to $200 million. The deal replaced Anthropic on the defense project, which Anthropic had refused over concerns about military applications of its AI technology.

Over the past weekend, Claude climbed to the No. 1 spot on Apple’s App Store while ChatGPT slipped to second place; on Google Play, Claude reached No. 5, with ChatGPT holding second. The rise follows Anthropic’s recent Super Bowl advertising campaign, which criticized OpenAI for introducing ads into ChatGPT, and coincides with a wave of user migration after the Pentagon news. Claude’s free active users have grown by more than 60% since the start of the year, and daily signups have quadrupled.

OpenAI CEO Sam Altman announced that his company had replaced Anthropic on the Pentagon deal, allowing OpenAI’s models to be deployed on the Department of War’s classified network. Altman stated that the agreement includes the safety measures Anthropic had requested, such as bans on domestic mass surveillance and the use of autonomous weapons without human oversight. However, the move prompted protests online, with ChatGPT users urging others to “Cancel ChatGPT” and migrate to Claude, framing OpenAI’s decision as opportunistic.

Anthropic CEO Dario Amodei had rejected the Pentagon’s terms, which required Anthropic to allow its AI to be used for “all lawful purposes,” including surveillance and autonomous weapons. The Pentagon responded by labeling Anthropic a national security risk and directing federal agencies to phase out its technology. Amodei described the designation as “retaliatory and punitive” and reiterated that Anthropic remains open to military collaboration within its ethical red lines.

President Donald Trump also criticized Anthropic on Truth Social, calling it a “Radical Left AI company” and backing the Pentagon’s decision to remove the firm from defense work. Despite having initially echoed Anthropic’s safety concerns, Altman positioned OpenAI as a compliant alternative that would implement strict safety guardrails while fulfilling the government’s requirements.

The App Store surge highlights how user sentiment and ethical stances can significantly influence digital platform rankings. Many users appeared motivated by concerns over AI ethics, transparency, and military use, directly affecting adoption patterns. Claude’s rise demonstrates the growing importance of public perception in AI deployment, especially when tied to sensitive government contracts.

As the debate unfolds, the market dynamics for AI assistants are shifting. OpenAI’s Pentagon contract may provide long-term financial and strategic advantages, but it has triggered immediate consumer backlash. Meanwhile, Anthropic’s principled stance against unrestricted military use has strengthened its position among ethically conscious users, translating into higher downloads and engagement.

Analysts note that this episode reflects the broader tension between AI innovation, public trust, and military applications. Companies that can navigate these concerns while maintaining transparency and safety standards are likely to gain competitive advantage in an increasingly scrutinized AI ecosystem.

Claude’s top App Store ranking underscores how user preferences, ethical positioning, and marketing campaigns interact to shape adoption in real time. OpenAI and Anthropic now face contrasting challenges: one balancing regulatory and government collaboration, the other leveraging ethical credibility to attract users. The long-term impact on AI market share, user loyalty, and public trust will be closely watched by industry observers and policymakers alike.
