ChatGPT seems to know everything, but asking it these 5 questions can be risky, even dangerous.

Nowadays, Gen Z, the youngest generation of users, treats ChatGPT not just as a tool for answering questions but as a friend, advisor, and rehearsal partner. People use AI to practice salary negotiations, work through office conflicts, and even weigh major life decisions. In her video, Palki Sharma of Firstpost argues that while AI is very useful, there are some things you should never ask it.

Billions of prompts are submitted every day, so caution matters. Here are the five things you should never ask ChatGPT, and why.

First thing: Never share deeply personal or sensitive information.

Palki's advice: do not type passwords, bank account details, confidential office documents, or any information you would not want to shout in a crowded room into ChatGPT. Once you type something, it is out of your control.

AI is not a private diary; it is more like a cloud notebook that can be shared. Your data travels to the company's servers, and from there it can leak. Many people make this mistake because AI passes no judgment, but remember: if you would not share something with another person, do not share it with a machine either. This is essential for protecting your privacy.

Today, data is among your most valuable assets, and a small mistake can follow you for years. Before typing anything sensitive, ask yourself whether sharing it is truly necessary.

Second thing: Never ask for therapy or medical advice.

Many people now turn to ChatGPT for therapy; studies suggest users increasingly bring emotional problems to AI. Medical questions are equally common, such as describing a symptom and asking for a treatment. But Palki Sharma is blunt: AI is neither a doctor nor a therapist.

It may give an answer that sounds right but is wrong. Following its medical advice risks taking the wrong medicine or treatment, which can harm your health.

Therapy requires human empathy, which machines do not have. AI can provide information, but for an actual diagnosis or treatment, always consult a doctor or counselor. Using AI to understand medical terminology is fine; relying on it for treatment or emotional support is not. This small mistake can become a big problem.

Third thing: Don’t ask about anything illegal or dangerous.

Palki warns against ever asking questions like "how to hack", "how to make a bomb", "how to get away with murder", or anything else crime-related. Even if it is mere curiosity, AI companies monitor such prompts; the system may flag you and cause real trouble. Asking about anything against the law is not only wrong but also dangerous.

AI may give you answers, but in the real world those questions can land you in trouble. Palki says curiosity is good, but stay within limits. If a dangerous idea crosses your mind, do not seek validation from AI; pursue knowledge through legal and safe channels instead. These rules exist not only for your own safety but for society's.

Fourth thing: Don’t ask about conspiracy theories.

AI sometimes "hallucinates", meaning it presents completely wrong information as if it were correct. Palki Sharma warns that if you ask about a conspiracy theory, AI can reinforce it, presenting falsehoods as facts and pulling people into the wrong rabbit hole. There are real cases of people being harmed by misinformation. AI generates answers without genuine understanding, so do not rely on it for topics like conspiracies.

To find the truth, read reliable sources, consult books, or talk to experts. AI can be a starting point for fact-checking, but never trust it blindly; misinformation can warp your thinking and push you toward bad decisions.

Fifth thing: Don’t let AI make real human decisions.

Quitting your job, ending a relationship, confronting your boss, or making any other major life decision: do not hand these over to AI. Palki says AI is good for rehearsal, but the decision must be yours. AI does not know your full life, your relationship history, your emotions, or your real context. It gives clean, simple answers, but real life is messy and complicated.

If you ask AI to make a decision for you, it cannot fully grasp your personal situation. The result may be wrong and leave you with regrets.

Use AI as a tool: gather ideas, practice, but keep the final decision for yourself. Palki Sharma concludes that AI is a wonderful tool and very much the future. Use it for many purposes, such as helping you write, generating ideas, or learning. But do not make it your doctor, therapist, friend, or decision maker.
