Millions of AI chat messages exposed in app data breach
A popular mobile app called Chat & Ask AI has more than 50 million users on the Google Play Store and Apple App Store. Now, an independent security researcher says the app exposed hundreds of millions of private online chatbot conversations.
The exposed messages reportedly included deeply personal and disturbing requests. Users asked questions about how to commit suicide painlessly, how to write suicide notes, how to make methamphetamine, and how to hack other apps.
These were not isolated prompts. They were entire chat histories linked to real users.
Sign up to receive my FREE CyberGuy report
Get my best tech tips, urgent security alerts, and exclusive offers delivered straight to your inbox. Plus, you’ll get instant access to my Ultimate Guide to Surviving Scams, free when you join my CYBERGUY.COM newsletter.

Security researchers say Chat & Ask AI exposed hundreds of millions of private chatbot messages, including entire conversation histories linked to real users. (Neil Godwin/Getty Images)
What exactly was exposed?
The issue was discovered by a security researcher known as Harry. He found that Chat & Ask AI had a misconfigured backend built on Google Firebase, a popular mobile app development platform. Because of that misconfiguration, it was easy for outsiders to gain access to the application’s database. Harry says he was able to access approximately 300 million messages linked to more than 25 million users. He analyzed a smaller sample of about 60,000 users and more than a million messages to confirm the scope of the exposure.
The exposed data allegedly included:
- Complete chat histories with the AI
- Timestamps for each conversation
- The custom names users gave the chatbot
- How users configured the AI model
- Which AI model was selected
This is important because many users treat AI chats as private diaries, therapists, or brainstorming partners.
How this artificial intelligence app stores so much sensitive user data
Chat & Ask AI is not a standalone artificial intelligence model. It acts as a wrapper that lets users talk to large language models built by larger companies. Users can choose from models made by OpenAI, Anthropic, and Google, including ChatGPT, Claude, and Gemini. While those companies operate the underlying models, Chat & Ask AI handles the storage. That’s where things went wrong. Cybersecurity experts say this type of Firebase misconfiguration is a well-known weakness. It’s also easy to find if someone knows what to look for.
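To illustrate the general class of mistake experts describe (the actual rules used by this app are not public, and the paths below are purely illustrative), a Firebase Realtime Database becomes readable by anyone when its security rules look like the first example rather than the second:

```
// INSECURE: any client on the internet can read and write everything
{
  "rules": {
    ".read": true,
    ".write": true
  }
}

// SAFER: each signed-in user can read and write only their own records
{
  "rules": {
    "users": {
      "$uid": {
        ".read": "auth != null && auth.uid === $uid",
        ".write": "auth != null && auth.uid === $uid"
      }
    }
  }
}
```

The first version is sometimes left in place during development and forgotten before launch, which is why researchers can find exposed databases simply by probing for them.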
We reached out to Codeway, which publishes the Chat & Ask AI app, for comment but did not receive a response before publication.

The exposed database reportedly included timestamps, model configurations, and the names users gave their chatbots, revealing much more than isolated prompts. (Elisa Schu/Getty Images)
Why this is important for everyday users
Many people assume that their chats with AI tools are private. They write things they would never publish publicly or say out loud. When an application stores that data insecurely, it becomes a gold mine for attackers. Even without names attached, chat histories can reveal mental health issues, illegal behavior, work secrets, and personal relationships. Once exposed, that data can be copied, extracted, and shared forever.

According to the researcher, because the app handled data storage itself, a simple Firebase misconfiguration made sensitive AI chats accessible to outsiders. (Edward Berthelot/Getty)
Ways to stay safe when using AI applications
You don’t have to stop using artificial intelligence tools to protect yourself. Some informed choices can reduce your risk while still allowing you to use these apps when they are useful.
1) Be aware of sensitive topics
AI chats can feel private, especially when you’re stressed, curious, or looking for answers. However, not all apps handle conversations securely. Before sharing deeply personal struggles, medical concerns, financial details, or questions that could pose legal risks if exposed, take the time to understand how the app stores and protects your data. If those protections are unclear, consider safer alternatives, such as trusted professionals or services with stricter privacy controls.
2) Research the app before installing it
Look beyond download counts and star ratings. Check who operates the app, how long it has been available, and whether its privacy policy clearly explains how user data is stored and protected.
3) Assume conversations can be stored
Even when an app claims privacy, many AI tools record conversations to troubleshoot or improve the model. Treat chats as potentially permanent records rather than temporary messages.
4) Limit account linking and logins
Some AI apps let you sign in with Google, Apple, or an email account. While convenient, this can directly connect chat histories to your real identity. When possible, avoid linking AI tools to primary accounts used for work, banking, or personal communications.
5) Review app permissions and data controls
AI applications may request access beyond what is necessary to function. Review permissions carefully and disable anything that is not essential. If the app offers options to delete chat history, limit data retention, or turn off syncing, enable those settings.
6) Use a data deletion service
Your digital footprint extends beyond AI applications. Anyone can find personal information about you with a simple Google search, including your phone number, home address, date of birth, and Social Security number. Marketers buy this information to target ads. In more serious cases, scammers and identity thieves breach data brokers, leaving personal data exposed or circulating on the dark web. Using a data removal service helps reduce what can be linked to you if a breach occurs.
While no service can guarantee complete removal of your data from the internet, a data removal service is a smart choice. They aren’t cheap, but neither is your privacy. These services do all the work for you by actively monitoring and systematically removing your personal information from hundreds of websites. It’s what gives me peace of mind and has proven to be the most effective way to erase your personal data from the internet. By limiting the information available, you reduce the risk of scammers cross-referencing leaked data with information they can find on the dark web, making it harder for them to target you.
Check out my top picks for data removal services and get a free scan to find out if your personal information is already available on the web by visiting Cyberguy.com.
Kurt’s Key Takeaways
AI chat apps are advancing rapidly, but security still lags behind. This incident shows how a single configuration error can expose millions of deeply personal conversations. Until stronger protections become standard, you should treat AI chats with caution and limit what you share. The convenience is real, but so is the risk.
Do you assume your AI chats are private, or has this story changed how much you’re willing to share with these apps? Let us know your opinion by writing to us at Cyberguy.com.
Copyright 2026 CyberGuy.com. All rights reserved.
Kurt “CyberGuy” Knutsson is an award-winning technology journalist with a deep love for technology, gear and devices that improve lives, contributing to News and News Business since his mornings on “News & Friends.” Have a tech question? Get Kurt’s free CyberGuy newsletter, share your voice, a story idea or a comment at CyberGuy.com.


