
Five things you should never reveal to ChatGPT if you want to protect your privacy

People rely on artificial intelligence platforms for all sorts of things, from couples therapy to drafting professional emails to turning photos of their dogs into humans, and they hand over personal information in the process.

But there are a few specific things you should never share with your chatbot.

When you type something into a chatbot, "you lose possession of it," Jennifer King, a fellow at the Stanford Institute for Human-Centered Artificial Intelligence, told the Wall Street Journal.

There are ways to protect your privacy when using AI chatbots. Getty Images

"Don't share sensitive information in conversations," OpenAI writes on its website, while Google encourages Gemini users not to "enter sensitive information or data they don't want reviewers to see."

With that in mind, here are five things you should never tell ChatGPT or any AI chatbot.

ChatGPT has a "temporary chat" mode that works much like a web browser's incognito mode. Future Publishing via CFOTO/Getty Images

ID information

Do not reveal your identifying information to ChatGPT. Never share details such as your Social Security number, driver's license or passport number, date of birth, address, or phone number.

Some chatbots attempt to redact such details automatically, but it's safer not to share them at all.
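For readers who want to scrub text themselves before pasting it into a chatbot, here is a minimal local-redaction sketch. It assumes common US-style formats for Social Security numbers, phone numbers, and dates of birth; the patterns are illustrative, not exhaustive, and are not part of any chatbot's own tooling:

```python
import re

# Illustrative patterns only: they cover common US-style formats,
# not every way these identifiers can appear.
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\(?\d{3}\)?[-. ]\d{3}[-. ]\d{4}\b"),
    "DOB": re.compile(r"\b\d{1,2}/\d{1,2}/\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace recognizable identifiers with placeholder tags."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(redact("SSN 123-45-6789, call 555-867-5309, born 01/02/1990."))
# -> SSN [SSN], call [PHONE], born [DOB].
```

Running the redaction locally, before anything leaves your machine, is the point: the chatbot only ever sees the placeholder tags.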

"We want our AI models to learn about the world, not private individuals, and we actively minimize the collection of personal information," an OpenAI spokesperson told the WSJ.

Medical results

The healthcare industry prizes patient confidentiality to protect personal information and guard against discrimination, but AI chatbots are generally not covered by those special protections.

If you feel you must ask ChatGPT to interpret lab work or other medical results, King suggests cropping or editing the document before uploading it, keeping it "just test results."

There are five specific things you shouldn't tell ChatGPT or AI chatbots. REUTERS/Dado Ruvic/Illustration

Financial accounts

Never reveal your bank or investment account numbers. That information could be used to hack into your accounts, monitor them, or access your funds.

Login information

As AI agents grow more capable of performing useful tasks, it may seem reasonable to hand a chatbot your account usernames and passwords, but these agents are not vaults and do not keep credentials safe. Put that information in a password manager instead.

Confidential company information

If you use ChatGPT or other chatbots for work tasks, such as drafting emails or editing documents, you could accidentally expose client data or private trade secrets, the WSJ said.

Some companies subscribe to enterprise versions of AI tools, or run their own custom AI programs, with built-in safeguards against these issues.

If you still want to get personal with an AI chatbot, there are ways to protect your privacy. According to the WSJ, you should secure your account with a strong password and multi-factor authentication.

Privacy-conscious users should delete each conversation once it's over, Jason Clinton, Anthropic's chief information security officer, told the outlet, adding that the company permanently purges "deleted" data after 30 days.

