Mar 21, 2026 · 5 min read
Sears AI Chatbot Recorded 3.7 Million Conversations—Then Left Them on the Open Internet
Three unprotected databases contained every customer interaction with Sears Home Services' AI assistant, including voice recordings that kept running for up to four hours after the service call ended.
When you call a company's customer service line, you expect the call to be recorded. You do not expect the recording to keep running for four hours after the conversation ends, capturing your private household conversations, background television, and ambient noise. You definitely do not expect all of that to be stored in a database anyone on the internet can access.
That is exactly what cybersecurity researcher Jeremiah Fowler discovered when he found three publicly accessible, unencrypted databases belonging to Sears Home Services. The databases contained 3.7 million records spanning two years of customer interactions with "Samantha," an AI-powered customer service chatbot.
What Was Exposed
The three databases were neither password protected nor encrypted. Anyone who found them could browse their contents through a web browser. Inside were:
- 2.1 million text files containing AI scheduling conversations with customer names, addresses, and contact details
- 1.4 million audio recordings of phone calls, totaling 3.9 terabytes of data
- 207,381 spreadsheet files with scheduling logs and operational data
- 54,359 complete chat transcripts, stored start to finish in a single CSV file
The data spanned from 2024 to 2026. Personal information included names, phone numbers, home addresses, email addresses, details about appliances owned, and scheduling information for deliveries and repairs.
The Four-Hour Recordings
The most disturbing finding was that some voice recordings ran for up to four hours. A typical customer service call lasts minutes, not hours. These extended recordings suggest the system continued capturing audio long after the actual service conversation ended, recording whatever was happening in the customer's home.
Fowler noted that the recordings potentially captured private conversations, background television audio, and ambient household sounds that customers had no idea were being recorded. For anyone who has ever left a phone on speaker during a service call, the implications are unsettling.
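A well-designed recording pipeline would make this failure mode impossible by enforcing hard stop conditions. As an illustrative sketch (this is not how Sears' system works, and the 30-minute cap and frame format are assumptions), a recorder might capture audio only while the call is confirmed active and under a maximum duration:

```python
# Illustrative guardrails for a call recorder. The 30-minute cap and the
# (timestamp, frame) format are assumptions, not details from the incident.
MAX_CALL_SECONDS = 30 * 60  # hard cap: no service call should run for hours

def record_call(audio_frames, call_active):
    """Collect frames only while the call is active and under the hard cap.

    `audio_frames` yields (timestamp_seconds, frame) pairs;
    `call_active` is a callable returning False once the call has ended.
    """
    captured = []
    start = None
    for ts, frame in audio_frames:
        start = start if start is not None else ts
        if not call_active() or ts - start > MAX_CALL_SECONDS:
            break  # stop capturing the moment the call ends or the cap is hit
        captured.append(frame)
    return captured
```

Either check alone would have prevented a four-hour recording: the hang-up signal stops capture when the call ends, and the duration cap bounds the damage even if that signal is missed.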
Voice data carries risks beyond what text records do. Fowler emphasized that voice is a biometric identifier, a unique fingerprint that cannot be changed like a password. With as little as a few seconds of audio, criminals can now create convincing voice clones using AI tools, a growing threat vector in voice phishing attacks that have already breached identity protection companies.
The AI Systems Behind It
The exposed databases referenced two AI systems. "Samantha" is the customer-facing chatbot that handles scheduling and service inquiries. "KAIros" is a broader AI platform that Sears Home Services uses for operational support, including appliance diagnostics through a tool called "Fix Genius" and recruitment through "HireHub."
The exposure reveals something most customers never consider: when you interact with an AI customer service bot, your conversation does not disappear when you close the chat window. Every word you type and every second of audio gets stored, processed, and retained, often with minimal security controls. These AI systems are data collection machines, and the databases they feed grow indefinitely unless someone deliberately limits them.
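Deliberately limiting that growth means a retention policy enforced in code. As a minimal sketch (the 90-day window and record schema are assumptions for illustration, not anything from the Sears systems), a scheduled cleanup job might keep only records newer than the retention window:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention window; a real value would come from legal/compliance.
RETENTION_DAYS = 90

def purge_expired(records, now=None):
    """Return only records newer than the retention window.

    `records` is a list of dicts with a `created_at` datetime --
    an assumed schema for illustration, not an actual data model.
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=RETENTION_DAYS)
    return [r for r in records if r["created_at"] >= cutoff]
```

Run daily, a job like this bounds the blast radius of any future exposure: a misconfigured database holds at most 90 days of conversations instead of two years.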
Sears' Response
Fowler followed responsible disclosure procedures, notifying Transformco, the parent company that manages the Sears brand after its 2018 bankruptcy. The databases were restricted from public access the following day. Beyond that initial action, Transformco provided no substantive response about how the data came to be exposed, how long it was accessible, or whether anyone else accessed it before the researcher.
The lack of transparency is a pattern. Companies that suffer database exposures from misconfiguration rather than hacking often avoid public disclosure entirely, since most data breach notification laws require evidence of unauthorized access rather than mere exposure. Whether anyone besides Fowler found these databases during the unknown window of exposure remains an open question.
What This Means for AI Customer Service
Every major company is racing to deploy AI chatbots for customer service. The appeal is obvious: AI handles more calls at lower cost, operates around the clock, and generates structured data that companies can analyze. But every one of those conversations becomes a liability if the data is not properly secured.
The Sears exposure is unlikely to be unique. Thousands of companies are deploying AI customer service systems, and the speed of deployment often outpaces the security review process. If Sears' databases were sitting on the open internet for months or years without anyone noticing, how many other companies have similar exposures?
How to Protect Yourself
You cannot control how companies store your data after you share it. But you can limit what you share in the first place:
- Minimize what you share with chatbots. Do not provide your Social Security number, full address, or financial details through AI chat interfaces unless absolutely necessary.
- End calls promptly. Hang up as soon as your service interaction is complete. Do not leave the call connected while you do other things.
- Use text over voice when possible. Text chat logs carry privacy risks of their own, but they do not include your biometric voice data.
- Ask about data retention. When interacting with AI customer service, ask how long your conversation data is stored and whether it can be deleted.
- Monitor for phishing. If you have used Sears Home Services in the past two years, be alert for phishing attempts that reference your appliance details or service history, since that information may now be exposed.