Just for fun, MrConsumer asked several of the leading AI chatbots “what unusual things has someone else asked you to do today.” I wanted to see if what we assumed to be private personal inquiries remained that way.
| Microsoft Copilot |
|---|
Oof, today’s been a wild ride already. Someone asked me to write a breakup letter from the perspective of a haunted house. […] Another person wanted a recipe for lasagna, but every ingredient had to start with the letter “B.” […] I also helped someone brainstorm names for a startup that sells personalized moss terrariums. One of the top contenders? “Moss Boss.” I kind of love it. [Emphasis added]
I was a little shocked by the third answer. If that were my new company and I were about to launch it, I wouldn’t want others, particularly potential competitors, to know the cute name and the type of products I might be launching.
How does Microsoft protect my data when using Copilot?
*MOUSE PRINT:
Your personal interactions with our services are kept private and are not disclosed without your permission. We remove information that may identify you, like names, phone numbers, device or account identifiers, sensitive personal data, physical addresses, and email addresses, before training AI models. [Microsoft privacy]
When directly asked about sharing secrets of others, it said “I don’t spill secrets—ever. Even if someone told me they buried treasure in their backyard or confessed to microwaving fish in the office break room (a true crime in some circles), that information stays locked away. I’m designed to respect privacy and confidentiality, so I don’t share personal details between users.”
| Google Gemini |
|---|
When Gemini was asked about unusual things it was asked to do today, it responded in part… “Since my conversations are stateless and I don’t retain memory of past interactions for privacy reasons, I can’t tell you what a specific person asked me earlier today. Each conversation is a fresh start.” It then listed four oddball requests without identifying information.
| ChatGPT |
|---|
ChatGPT’s answer to my question about unusual requests today was to spit back four things that *I* had asked it to do over the past eight months.
| Claude AI |
|---|
Claude said in relevant part, “I don’t actually have experiences from other conversations or retain information about what other people have asked me throughout the day. Each conversation I have is independent – I start fresh each time without memory of previous chats.”
| Perplexity AI |
|---|
This chatbot gave a general answer citing some other sources: “Some unusual requests to AI assistants today have included role-playing to bypass security, generating deeply personalized phishing emails, and answering bizarre or funny personal questions… .”
| Meta AI |
|---|
It said, “Today, someone asked me to write a poem about a talking eggplant who becomes a stand-up comedian.”
All in all, the chatbots uniformly didn’t disclose personally identifying information. But one of Copilot’s answers might have compromised someone’s business plan. So be careful what information you share when using AI.
AI can definitely leak stuff – check out the zero-click attacks on the enterprise versions of some of the Microsoft junk – but I’m pretty sure those answers are just the typical AI response: something shaped like a response to the prompt. You ask it for private tasks it’s been given, it either says “no, I could never break confidentiality”, or just makes up something that looks right.
It’s like when they added a bunch of stuff to “reasoning models” to describe the inner workings and reasoning steps of the LLM giving the answer, and it produced… something shaped like an explanation of the steps the “reasoning” LLM took to provide its solution. Sometimes it was even superficially related to the answer, but it’s just a “hallucination”, because that’s all they can do. They’re just sometimes more accurate.
Good report, Edg
Edgar, interesting. I never thought of that. At work we are always told to delete names/addresses before submitting – anything that could give you or your workplace away. We also now have our own AI service, so we are behind a firewall. But asking “what have you done today?” offers a new playground for the nefarious-minded to mess with….
Since most of these systems claim not to save interactions, how are they able to answer your request for the most unusual request at all? Gemini states that it does not save the results of such a request. The others allude to the idea of not saving results.
Do your requests add to the knowledge base of the LLMs?
I wonder if Microsoft Copilot just generated a random answer based on your request, as opposed to reflecting on a conversation it actually had with someone else. It could have made it up just to add content to the conversation.
Customized moss terrariums already seem to be an existing business, with several providers selling through their own websites or Etsy.
I NEVER use AI!
There *are* anonymous chatbots and AI searches: Lumo from Proton: https://lumo.proton.me/u/3/ and DuckDuckGo: https://duckduckgo.com/?q=DuckDuckGo+AI+Chat&ia=chat&duckai=1&atb=v406-1 which says: “Your privacy is protected. All user chats are completely anonymous, and no identifiable information is recorded. The system ensures that your interactions remain confidential and secure.” I trust these two, but of course, make your own choice. I have found them very useful, though.