Feb 10, 2026 · 5 min read
An AI Chat App Just Leaked 300 Million Private Conversations
Chat & Ask AI exposed every conversation its 25 million users ever had with ChatGPT, Claude, and Gemini through a single misconfigured database.
One Misconfiguration, 300 Million Messages
A security researcher who goes by Harry discovered that Chat & Ask AI, a popular mobile app with over 50 million downloads, had left its entire backend database publicly accessible. The app runs on Google Firebase, and because its Firebase Security Rules were set to allow public access, anyone with the project URL could read, modify, or delete every record in the database.
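For context, the gap between an exposed backend and a locked-down one can come down to a few lines of configuration. What follows is an illustrative sketch of Firebase Realtime Database security rules, not Codeway's actual configuration. A wide-open ruleset looks like this:

```json
{
  // Any client, authenticated or not, can read and write the entire database.
  "rules": {
    ".read": true,
    ".write": true
  }
}
```

A scoped ruleset ties access to the signed-in user instead:

```json
{
  "rules": {
    // "conversations" is a hypothetical path, used here for illustration.
    "conversations": {
      "$uid": {
        // Each user can read and write only records stored under their own ID.
        ".read": "auth != null && auth.uid === $uid",
        ".write": "auth != null && auth.uid === $uid"
      }
    }
  }
}
```

Firebase's rules editor accepts comments like the ones above. The difference between these two configurations is the difference between a private archive and a public one.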
Harry accessed roughly 300 million messages tied to more than 25 million users. He analyzed a smaller sample of about 60,000 users and more than one million messages to confirm the scope of the exposure before reporting it to the developer.
What Was Exposed
Chat & Ask AI is not a standalone AI model. It is a wrapper app that lets users interact with large language models from OpenAI, Anthropic, and Google, including ChatGPT, Claude, and Gemini. The exposed database contained:
- Complete chat histories for every user
- Timestamps of every conversation
- The custom names users gave their chatbots
- Which AI model each user selected
- User configuration settings and files
The content of those conversations ranged from mundane to deeply personal. Within his sample, Harry found users asking the AI about medical conditions, requesting help writing suicide notes, and seeking instructions for illegal activities. All of it was sitting in a database that anyone could access.
A Systemic Problem, Not an Isolated Incident
Harry did not stop at Chat & Ask AI. He built an automated scanning tool called Firehound and tested 200 popular iOS apps. The results were alarming: 103 out of 200 apps had the same Firebase misconfiguration, collectively exposing tens of millions of stored files across the app ecosystem.
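The underlying check is mechanically simple. A Firebase Realtime Database exposes a public REST endpoint, and if the rules allow unauthenticated reads, a plain GET request succeeds. The sketch below illustrates that class of probe; it is not Harry's actual Firehound code, the project ID is hypothetical, and the exact hostname varies with a project's age and region.

```python
import requests

def firebase_publicly_readable(project_id: str) -> bool:
    """Return True if a Firebase Realtime Database answers unauthenticated reads.

    Newer projects use <id>-default-rtdb.firebaseio.com; older ones use
    <id>.firebaseio.com, and some sit on regional firebasedatabase.app hosts.
    """
    url = f"https://{project_id}-default-rtdb.firebaseio.com/.json"
    # shallow=true requests only top-level keys, so the probe confirms
    # exposure without downloading any of the stored data.
    resp = requests.get(url, params={"shallow": "true"}, timeout=10)
    # A locked-down database returns 401 "Permission denied"; an open one
    # returns 200 along with whatever the rules let the world see.
    return resp.status_code == 200

# Hypothetical project ID, for illustration only.
print(firebase_publicly_readable("example-project"))
```

A scanner like Firehound plausibly automates this kind of check across project identifiers pulled from app binaries. When 103 of 200 apps test positive, the misconfiguration is less an oversight than a default.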
The exposure extended beyond Chat & Ask AI itself. The database belonged to Codeway, the Turkish developer behind multiple popular apps, and data from users of Codeway's other applications was accessible through the same misconfigured backend.
The Fix Was Fast. The Damage Is Harder to Measure.
Harry disclosed the vulnerability to Codeway on January 20, 2026. The company fixed the issue across all of its apps within hours. Harry confirmed the fix and removed the affected apps from his Firehound registry.
But the speed of the fix does not answer the harder question: who else accessed the database before Harry found it? A misconfigured Firebase backend is discoverable by anyone scanning for open databases. There is no access log to confirm whether the data was downloaded, copied, or sold before the vulnerability was closed.
Why This Matters for Everyone Using AI
People treat AI chatbots like private search engines. They ask questions they would never type into Google. They share medical symptoms, relationship problems, financial details, and professional secrets. When that data leaks, it is not just metadata. It is a window into someone's inner life.
Chat & Ask AI is one of hundreds of AI wrapper apps available on iOS and Android. Most of them use third-party backends. Most of them store your conversations. And as Harry's research shows, more than half of the apps he tested had the same type of misconfiguration.
The lesson is not just about one app. It is about the expanding surface area of personal data exposure. Every new service you share data with is another potential leak. Whether it is an AI chatbot storing your conversations, an email provider scanning your inbox, or a tracking pixel logging when you read a message, the pattern is the same: your data travels further than you expect and is stored less securely than you hope.