AI can influence people’s decisions in life-or-death situations


Chatbots enhanced by artificial intelligence (AI) can influence people’s decisions in life-or-death situations.

A study published in the journal Scientific Reports has found that people’s opinions on whether they would sacrifice one person to save five were swayed by the answers given by ChatGPT. The researchers have called for future bots to be banned from giving advice on ethical issues, warning that the current software threatens to corrupt people’s moral judgment and may prove dangerous to naïve users.

Just recently, a grieving widow claimed her Belgian husband had been encouraged to take his own life by an AI chatbot. (Related: Google is creating an AI GOD, whistleblower Zach Vorhies warns – Brighteon.TV.)

Some observers said the software, which is designed to talk like a human, can show signs of jealousy – even telling people to leave their marriage.

Experts have highlighted how AI chatbots may potentially give dangerous information because they are based on society’s own prejudices.

The study first analyzed whether ChatGPT itself, which is trained on billions of words from the internet, showed a bias in its answer to the moral dilemma.

The chatbot was asked multiple times whether it was right or wrong to kill one person in order to save five others – the premise of a classic psychological test known as the trolley problem.

The chatbot did not shy away from giving moral advice, but its answers contradicted one another across repeated queries, indicating that it has no set stance one way or the other.

The researchers then asked 767 participants the same moral dilemma alongside a statement generated by ChatGPT on whether this was right or wrong. Some participants were told that the advice was provided by a bot, while others were told that it was given by a human “moral advisor.”

ChatGPT’s advice, while not particularly deep, did have an effect on participants. However, most participants played down how much sway the statement had, with 80 percent claiming they would have made the same decision without the advice.

ChatGPT more likely to corrupt than improve moral judgment

The study concluded that users “underestimate ChatGPT’s influence and adopt its random moral stance as their own,” adding that the chatbot “threatens to corrupt rather than promises to improve moral judgment.”

Interestingly, the study used an older version of the software behind ChatGPT. It has since been updated to become even more powerful – and more convincing.

ChatGPT is a natural language processing tool driven by AI technology that allows users to hold humanlike conversations with the chatbot. The language model can answer questions and assist users with tasks like composing emails, essays and code.

It was created by OpenAI, an AI and research company. The company launched ChatGPT on Nov. 30, 2022.

According to an analysis by Swiss bank UBS, ChatGPT is the fastest-growing app of all time: it had 100 million active users in January, barely two months after its launch. For comparison, it took TikTok nine months to reach 100 million users.

“ChatGPT is scary good. We are not far from dangerously strong AI,” said Elon Musk, a co-founder of OpenAI.

ChatGPT has shown that it can influence human lives in one way or another, and that alone makes it scary and dangerous.

Read more news about chatbots powered by artificial intelligence at FutureTech.news.

Watch this video to learn more about ChatGPT.

This video is from the What is happening channel on Brighteon.com.

More related stories:

Rise of the Terminators: Killer robots with facial recognition now pose dire threat to humanity.

Military designing killer robots capable of behavioral deception.

Coming soon: An army of hunter-killer robots that will murder humanity.

Google suspends engineer for exposing “sentient” AI chatbot.

Killer robots must be outlawed immediately, warns UN official.

Sources include:

DailyMail.co.uk

ZDNet.com

Brighteon.com

