Sears Home Services uses an AI phone assistant called Samantha, built on a technology platform named kAIros. It was meant to make scheduling repairs and appointments smoother. Instead, a security researcher discovered that conversations with Samantha were sitting in plain sight on publicly accessible servers.
What was exposed
Researcher Jeremiah Fowler, who works with Black Hills Information Security, found three unsecured databases that together contained:
- 3.7 million chat logs
- 1.4 million audio files and plain-text transcripts
- Logs spanning 2024 to this year, covering conversations in both English and Spanish
- A single CSV file that by itself held 54,359 complete chat logs
The records included clear personal details: customer names, phone numbers, home addresses, what appliances people owned, and information about delivery appointments and repairs.
Why this matters
Fowler summed it up plainly: this is real data of real people. That matters for two big reasons.
- Phishing and scams. The data gives scammers material to build convincing fraud attempts, like warranty or delivery scams, because it contains specific details about customers and their homes.
- Long, private recordings. Many audio files kept recording well after the call seemed to end. Some files ran for hours; one example was a call lasting 76 minutes. Those extra minutes could capture private conversations or background details customers assumed were not being recorded.
Samples from the logs
The exposed files show two predictable things: people getting frustrated, and chatbots failing to be perfect.
In one long audio call, the caller asked to speak to a human just two minutes in. The AI insisted it could handle the issue, then later admitted it was encountering errors and offered to transfer the call to a live agent. In a text transcript spanning roughly 11:00 a.m. to 1:30 p.m., a caller repeated "Where's my technician?" 28 times and eventually wrote, "You're a computer. You're a computer. You're a computer."
What happened after the discovery
Fowler reported the exposed databases to Transformco, the company that owns Sears and Sears Home Services, and they were secured after he reached out. It is not known how long the data was exposed before Fowler found it, or whether anyone else accessed the files while they were public. Transformco did not respond to a request for comment.
Fowler says his initial report drew a reply offering to connect him with a Samantha AI chatbot manager, but that contact never responded to his follow-up message.
Security and responsibility
Fowler emphasized that companies deploying AI must protect the data they collect. At a minimum, he said, the files should have been password protected and encrypted. Cutting corners on basic security leaves customers vulnerable.
Expert perspective
Carissa Véliz, an author and associate professor at the University of Oxford, noted that sometimes people feel safer talking to a machine because they assume a machine will not cause harm. But she warned that customers often have little choice about sharing sensitive information.
Véliz said companies should offer more options, including the choice to speak with a human and the choice to opt out of recording. Customers should feel safe and respected, not exposed or exploited.
Bottom line
Sears Home Services left millions of customer conversations accessible on public servers until a researcher found them. The exposed material included personal details and long audio recordings that may have captured private moments. The data was secured after being reported, but the incident highlights how easily AI systems can put sensitive information at risk when basic protections are not in place.