People turn to ChatGPT for all kinds of things — couples therapy, help with writing a professional email, turning pictures of their dogs into humans — letting the artificial intelligence platform in on some personal information.
And apparently, there are a few specific things you should never share with the chatbot.
When you type something into a chatbot, “you lose possession of it,” Jennifer King, a fellow at the Stanford Institute for Human-Centered Artificial Intelligence, told the Wall Street Journal.
“Please don’t share any sensitive information in your conversations,” OpenAI writes on its website, while Google urges Gemini users not to “…enter confidential information or any data you wouldn’t want a reviewer to see.”
On that note, here are the five things no one should tell ChatGPT or an AI chatbot.
Identity information
Don’t reveal any identifying information to ChatGPT: your Social Security number, driver’s license or passport number, date of birth, address, and phone number should never be shared.
Some chatbots redact such details automatically, but it’s safer not to share them at all.
“We want our AI models to learn about the world, not private individuals, and we actively minimize the collection of personal information,” an OpenAI spokeswoman told WSJ.
Medical results
While the healthcare industry values patient confidentiality to protect personal information and guard against discrimination, AI chatbots typically aren’t covered by those confidentiality protections.
If you feel the need to ask ChatGPT to interpret lab work or other medical results, King suggested cropping or editing the document before uploading it, keeping it “just to the test results.”
Financial accounts
Never reveal your bank or investment account numbers. If exposed, they can be used to monitor or access your funds.
Login information
With chatbots increasingly able to perform useful tasks on your behalf, it may be tempting to hand over your account usernames and passwords. But these AI agents aren’t vaults and don’t keep credentials secure; that information belongs in a password manager instead.
Proprietary corporate information
If you’re using ChatGPT or other chatbots for work — such as for drafting emails or editing documents — there’s the possibility of mistakenly exposing client data or non-public trade secrets, WSJ said.
Some companies subscribe to an enterprise version of AI or have their own custom AI programs with safeguards against these issues.
If you still want to get personal with an AI chatbot, there are ways to protect your privacy. According to WSJ, protect your account with a strong password and multi-factor authentication.
Privacy-conscious users should delete every conversation when it’s over, Jason Clinton, Anthropic’s chief information security officer, told the outlet, adding that AI companies typically purge “deleted” data permanently after 30 days.