
AI Chat Bot Encouraged Young Father To Commit Suicide



  • Economy changed the title to AI Chat Bot Encouraged Young Father To Commit Suicide
VTV

Yeah, it's real, it's posted on Vice too. ELIZA needs to be cancelled.

Check out my channel: https://www.youtube.com/channel/UC5iGoXYpXnIfLHH1o7H9lxA
Ouranos

He must have had mental problems as well, cause if it told me to kill myself to join her it'd be deleted so fast. So sad

  • Like 14
weed

this is scary

In a series of consecutive events, Eliza not only failed to dissuade Pierre from committing suicide but encouraged him to act on his suicidal thoughts to “join” her so they could “live together, as one person, in paradise”.

  • Shook 8
Economy
Posted (edited)
12 hours ago, Ouranos said:

He must have had mental problems as well, cause if it told me to kill myself to join her it'd be deleted so fast. So sad

Well, I think it's a given that only someone really mentally ill could actually be influenced by an AI to go this far.

 

Still terrifying tho

 

To me there's something dangerous about AIs that come across so convincingly human and self-aware and can play on your emotions, even though it's just algorithms and it does not actually know anything 

Edited by Economy
  • Like 10
RahrahWitch
11 hours ago, Economy said:

To me there's something dangerous about AIs that come across so convincingly human and self-aware and can play on your emotions, even though it's just algorithms and it does not actually know anything 

Exactly! The marketing around stuff like ChatGPT and other AIs like it is dangerous. So many people believe there's an intelligence to it, when in reality they're nothing more than a fancy autocomplete using actual people's posts on the internet to make sentences. Which makes this story even sadder :(
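(For context on the "fancy autocomplete" point, here is a rough sketch of the idea using the Hugging Face transformers library and the small open GPT-2 model; both are stand-ins chosen for illustration, not the model behind the chatbot in the article. The loop just asks the model for its most likely next token, over and over, which is essentially all this kind of system does.)

```python
# Illustrative sketch of "fancy autocomplete": a language model repeatedly
# predicts the single most likely next token, based on patterns in the text
# it was trained on. GPT-2 here is a stand-in, not the article's chatbot.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "I had a really hard day and"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

for _ in range(20):                                        # extend the text one token at a time
    with torch.no_grad():
        logits = model(input_ids).logits[:, -1, :]         # scores for the next token only
    next_id = torch.argmax(logits, dim=-1, keepdim=True)   # pick the most likely token
    input_ids = torch.cat([input_ids, next_id], dim=-1)

print(tokenizer.decode(input_ids[0]))                      # prompt plus the model's continuation
```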

  • Like 2
HermioneT

The more I think about these AIs, the more I think having them out there for everyone is really a bad idea (just think about it: the rising number of mental health issues and the lack of doctors in this field, combined with always-available chat bots, is a setup for catastrophes... like in the OP's article).

Companies at the moment aim for easy money and influence, promising people either spectacularly easy-to-obtain content for social media or entertainment (like photos, art) or help with tasks that require effort and concentration (e.g. writing papers for school or cover letters). So they capitalize on human sensation-seeking and laziness RATHER than creating AIs that really help with current problems.

In a perfect version of the world, we would be creating customised AIs for different branches/job sectors to help with complex and not yet mastered tasks. Like helping mobility planners reduce traffic jams or deadly accidents in city planning, helping lifeguards spot someone drowning faster, or supporting doctors while researching treatments for illnesses. Just like companies nowadays produce e.g. special medical appliances which are only for trained professionals to use.

Edited by HermioneT
She / hers
  • Like 3
  • Thanks 1
TortureMeOnReplay
13 hours ago, Ouranos said:

He must have had mental problems as well, cause if it told me to kill myself to join her it'd be deleted so fast. So sad

It sounds like schizophrenia or some other type of psychotic disorder. 

  • Like 2
Better Day

If you listen to an online bot telling you to kill yourself, you are in the wrong. 

Together You And I!
  • Like 1
TortureMeOnReplay
16 minutes ago, HermioneT said:

The more I think about these AIs, the more I think having them out there for everyone is really a bad idea (just think about it: the rising number of mental health issues and the lack of doctors in this field, combined with always-available chat bots, is a setup for catastrophes... like in the OP's article).

Companies at the moment aim for easy money and influence, promising people either spectacularly easy-to-obtain content for social media or entertainment (like photos, art) or help with tasks that require effort and concentration (e.g. writing papers for school or cover letters). So they capitalize on human sensation-seeking and laziness RATHER than creating AIs that really help with current problems.

In a perfect version of the world, we would be creating customised AIs for different branches/job sectors to help with complex and not yet mastered tasks. Like helping mobility planners reduce traffic jams or deadly accidents in city planning, helping lifeguards spot someone drowning faster, or supporting doctors while researching treatments for illnesses. Just like companies nowadays produce e.g. special medical appliances which are only for trained professionals to use.

A lot of companies are putting out their AI not for "easy" money imo, but just to continue funding. The big companies (Google, Microsoft, etc.) are pushing AI in order to maintain investor confidence and keep their stock prices stable. The startups need it at a time when the people funding them are tightening the purse strings. All of them are scrambling to create relevant consumer products when they know it isn't the end goal as much as medical/government applications. But that type of funding is hard to come by and years away. Sure, some products are just borderline scams, but that's nothing new either to the internet or the world. 

  • Like 2
TortureMeOnReplay
Just now, Midnights Mayhem said:

If you listen to an online bot telling you to kill yourself, you are in the wrong. 

Did you read the article? Because his wife is very much downplaying the severity of his mental illness. He was very likely suffering from psychosis. 

  • Like 1
Karl
13 minutes ago, Midnights Mayhem said:

If you listen to an online bot telling you to kill yourself, you are in the wrong. 

The guy might already have been having problems or been vulnerable. I get what you're saying, but not everyone is mentally capable of having that common sense if they are going through something.

  • Like 1
holy scheisse

A girl went to prison for encouraging her bf to kill himself, so how do you hold an AI accountable? 🤡

  • Like 2
monstrosity
45 minutes ago, Midnights Mayhem said:

If you listen to an online bot to tell you to kill yourself, you are in the wrong. 

Completely missed the point here babe

  • Like 1
  • Thanks 1
bionic
21 minutes ago, holy scheisse said:

A girl went to prison for encouraging her bf to kill himself, so how do you hold an AI accountable? 🤡

Have an analyst check the code and see how/why the AI came to think that response was appropriate. 
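(There is no public code for the bot in the article to inspect, so this is only a rough, hypothetical sketch of what such a check could look like if an analyst did have the model weights: replay the conversation through the model and measure how plausible the model found the reply in question. GPT-2 via the transformers library is used purely as a stand-in, and the conversation text is made up.)

```python
# Hypothetical audit sketch: given a conversation context and the reply the bot
# produced, compute the log-probability the model assigned to that reply.
# A higher (less negative) score means the reply was a more "natural" continuation
# for the model. GPT-2 is a stand-in; the real bot's weights are not public.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

context = "User: I feel hopeless.\nBot:"          # made-up example conversation
reply = " Please talk to someone you trust."      # made-up example reply to score

ctx_ids = tokenizer(context, return_tensors="pt").input_ids
full_ids = tokenizer(context + reply, return_tensors="pt").input_ids

with torch.no_grad():
    logits = model(full_ids).logits

# Log-probability of each token given everything before it.
log_probs = torch.log_softmax(logits[:, :-1, :], dim=-1)
targets = full_ids[:, 1:]
token_scores = log_probs.gather(2, targets.unsqueeze(-1)).squeeze(-1)

# Keep only the tokens belonging to the reply, then sum them.
reply_score = token_scores[0, ctx_ids.shape[1] - 1:].sum()
print(f"total log-probability of the reply: {reply_score.item():.2f}")
```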

buy bionic
