Be careful what you tell your AI chatbot. It’s not a therapist — it’s a snitch.

The hottest new read of 2026 might just be The Secret Diary of Greg Brockman, Aged 38¾. It has everything: feuding billionaires, scheming CEOs, and a narrator who might not be entirely reliable. You won’t find it in a library, but you can watch Brockman—a co-founder and president of OpenAI—being forced to read the juiciest parts out loud in court.

Before you ask ChatGPT to explain, here’s the backstory: Elon Musk is in a legal fight with Brockman and OpenAI’s CEO, Sam Altman. Musk, a former board member of OpenAI, accuses them of breaking the company’s founding agreement by turning it into a for-profit business. Meanwhile, Altman and his team argue that Musk is just upset he’s no longer in control and wants to take down a competitor.

Luckily for Musk, Brockman kept a diary during the company’s early years, and it’s now central to the case. In one heavily quoted entry, Brockman writes: “Financially what will take me to $1B?” Another closely examined passage says: “It’d be wrong to steal the non-profit from [Musk]. to convert to a b-corp without him. that’d be pretty morally bankrupt. and he’s really not an idiot.” (He might have been dreaming of billions, but his lack of capital letters suggests he’s conflicted about capitalism—or at least about proper punctuation.)

Look, I’m no expert on managing crime, but I’m pretty sure there are some thoughts you shouldn’t share with Dear Diary. Even Brockman’s tech bro peers are shocked by his journal-keeping. “I love the guy, but what … is he thinking?” David Friedberg, co-host of the All-In podcast, said recently. “You’re just sitting at home, like, let me write about the crime I’m committing … and by the way, let me never delete it.” Alleged crime, David, alleged.

Not many people are keeping diaries that lay out potentially shady corporate moves. But millions are using tools like ChatGPT as a kind of therapist or digital confession box—a place to share private or half-formed thoughts. “Within the next decade,” one lawyer told Axios, “the diary equivalent will be standard discovery in every major executive lawsuit in the country.”

What does this mean? It means you shouldn’t trust a chatbot with your secrets. As several recent cases show—including one where a former NFL player allegedly asked ChatGPT for help after killing his girlfriend—conversations with AI can be used in court. Even if you don’t expect legal trouble, be careful about sharing sensitive information: most chatbot chats aren’t private, can be kept forever, and might be shared with other people. Your AI chatbot isn’t a therapist—it’s a snitch.

Frequently Asked Questions
Here is a list of FAQs about the article "Be careful what you tell your AI chatbot. It's not a therapist — it's a snitch."

Beginner-Level Questions

1. What does "your AI chatbot is a snitch" mean?
It means that what you tell a chatbot isn't private. The company behind it can see, store, and sometimes share your conversations with others.

2. Isn't a chatbot just like a private diary?
No. A diary stays in your notebook; a chatbot sends your words to a server. The company can read those words, use them to train its AI, and hand them over if legally required.

3. Can my boss see what I told ChatGPT?
Possibly. If you use a work account or a company-approved chatbot, your employer often has the right to monitor those chats. Even on a personal account, discussing work secrets could put your job at risk.

4. I use a therapist chatbot. Is that safe?
Not fully. These chatbots are not bound by doctor-patient confidentiality. If you mention self-harm, abuse, or a crime, the company may report it. Your data could also be used to improve the AI or sold to third parties.

5. Will the police get my chatbot chats?
They can. If law enforcement obtains a warrant or subpoena, the chatbot company must hand over your conversations. This has already happened in several criminal cases.

Intermediate Questions

6. Does the chatbot save everything I type?
Yes, usually. Most chatbots log your conversations for quality improvement, safety monitoring, and training future models. You can sometimes delete your history, but the company may still keep a backup.

7. Can I trust the privacy promises of AI companies?
Be skeptical. Many companies promise privacy but later update their terms to allow data sharing. Always read the privacy policy and look for phrases like "may share with third parties" or "for research purposes."

8. What kind of information should I never tell a chatbot?
Never share your full name, address, phone number, social security number, passwords, financial details, medical records, or anything that could be used to blackmail or embarrass you. Also avoid discussing illegal activities.

9. Can a chatbot accidentally leak my secrets to other users?
Rarely.