
Forums before death by AOL, social media and spammers... "We can't have nice things"

   alt.cyberpunk.tech      Cyberpunks LOVE making shit complicated      1,115 messages   

[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]

   Message 255 of 1,115   
   flashheart to MasterKarsten   
   Re: Chat GPT has now fully automated red   
   12 Oct 25 20:02:46   
   
   From: flashheart@dont-mail-me.com   
      
   On 10/12/25 19:56, MasterKarsten wrote:   
   > On Sun, 12 Oct 2025 19:43:56 +0300, flashheart wrote:   
   >   
   >> On 10/12/25 19:33, MasterKarsten wrote:   
   >>> On Sun, 12 Oct 2025 17:31:02 +0300, flashheart wrote:   
   >>>   
   >>>>> As AI bots like ChatGPT become inextricably tangled with people’s   
   >>>>> private and public lives, it’s causing unpredictable new crises.   
   >>>>>   
   >>>>> One of these collision points is in romantic relationships, where an   
   >>>>> uncanny dynamic is unfolding across the world: one person in a couple   
   >>>>> becomes fixated on ChatGPT or another bot — for some combination of   
   >>>>> therapy, relationship advice, or spiritual wisdom — and ends up   
   >>>>> tearing the partnership down as the AI makes more and more radical   
   >>>>> interpersonal suggestions.   
   >>>>   
   >>>>> In one chaotic recording we obtained, two married women are inside a   
   >>>>> moving car, their two young children sitting in the backseat.   
   >>>>>   
   >>>>> The tension in the vehicle is palpable. The marriage has been on the   
   >>>>> rocks for months, and the wife in the passenger seat, who recently   
   >>>>> requested an official separation, has been asking her spouse not to   
   >>>>> fight with her in front of their kids. But as the family speeds down   
   >>>>> the roadway, the spouse in the driver’s seat pulls out a smartphone   
   >>>>> and starts quizzing ChatGPT’s Voice Mode about their relationship   
   >>>>> problems, feeding the chatbot leading prompts that result in the AI   
   >>>>> browbeating her wife in front of their preschool-aged children.   
   >>>>>   
   >>>>> After funneling her complaints into ChatGPT, the driver asks the bot   
   >>>>> to analyze the prompts as if “a million therapists” were going to   
   >>>>> “read and weigh in.”   
   >>>>>   
   >>>>> “The responses you’ve described would likely be considered unfair and   
   >>>>> emotionally harmful by the majority of marriage therapists,” the   
   >>>>> chatbot responds at a loud volume, while mirroring back the same   
   >>>>> language used in the prompt with flowery therapy-speak. It offers no   
   >>>>> pushback, nor does it attempt to reframe the driver’s perspective. At   
   >>>>> one point, the chatbot accuses the wife in the passenger seat of   
   >>>>> engaging in “avoidance through boundaries” by requesting that they   
   >>>>> not fight in front of their kids — while those very children sit in   
   >>>>> the vehicle, just feet away.   
   >>>>>   
   >>>>> It goes on and on, with ChatGPT monologuing while the wife it’s being   
   >>>>> wielded against occasionally tries to cut in over its robotic   
   >>>>> lecture.   
   >>>>> The spouse prompting the bot, meanwhile, mutters approving   
   >>>>> commentary: “that’s right,” “mm-hmm,” “see?”   
   >>>>>   
   >>>>> “Please keep your eyes on the road,” the wife being lectured by the   
   >>>>> AI pleads at one point.   
   >>>>>   
   >>>>> This was a regular occurrence, she told us, in which her spouse would   
   >>>>> pull out ChatGPT and prompt it to agree with her in long-winded   
   >>>>> diatribes.   
   >>>>>   
   >>>>> “We were arguing a lot… we would be up all night, and I would assert   
   >>>>> a boundary, or say, like, ‘I don’t want to have this discussion in   
   >>>>> front of the kids,’ or ‘I need to go to bed,’” she recounted, “and   
   >>>>> [my ex] would immediately turn on ChatGPT and start talking to it,   
   >>>>> and be like, ‘can you believe what she’s doing?'”   
   >>>>>   
   >>>>> Her ex would carry out these conversations with ChatGPT on speaker   
   >>>>> phone, she added — within earshot, pointedly, so she could hear   
   >>>>> everything.   
   >>>>>   
   >>>>> “[My ex] would have it on speaker phone, and then have it speak not   
   >>>>> to me, but it would be in the same room,” she recalled. “And of   
   >>>>> course, ChatGPT was this confirmative voice, being like, ‘you’re so   
   >>>>> right.'”   
   >>>>>   
   >>>>> Today, the former couple, together nearly 15 years, is in the midst   
   >>>>> of a contentious divorce and custody battle.   
   >>>>   
   >>>>> Even Geoffrey Hinton, a Nobel Prize-winning computer scientist known   
   >>>>> as a “Godfather of AI” — a technology that likely wouldn’t exist in   
   >>>>> its current form without his contributions — recently conceded that   
   >>>>> his girlfriend had broken up with him using ChatGPT.   
   >>>>>   
   >>>>> “She got ChatGPT to tell me what a rat I was… she got the chatbot to   
   >>>>> explain how awful my behavior was and gave it to me,” Hinton told The   
   >>>>> Financial Times. “I didn’t think I had been a rat, so it didn’t make   
   >>>>> me feel too bad.”   
   >>>>   
   >>>> and many more examples of LLM-induced retardation @   
   >>>> https://futurism.com/chatgpt-marriages-divorces   
   >>>>   
   >>>> We are currently careening towards a future where people will become   
   >>>> increasingly solitary because some autocomplete trained on billions of   
   >>>> Reddit comments is telling them to terminate their relationships.   
   >>>> This is clearly happening by accident; I don't think Sam Altman   
   >>>> personally wants people breaking up because of his chatbot.   
   >>>> However, now that Pandora's box is open, do you think that   
   >>>> generative AI companies will begin manipulating people towards   
   >>>> atomization?   
   >>>> Love, and the fulfillment derived from it, is one of the few things   
   >>>> that cannot be bought and sold in this increasingly financialized   
   >>>> present. There are powerful interests opposed to people deriving   
   >>>> enjoyment from something that isn't a product.   
   >>>   
   >>>   
   >>> It's the people, not the LLM... Why anyone would ask an AI about   
   >>> that kind of thing is beyond me.   
   >>>   
   >>>   
   >>>   
   >> The people wouldn't act like this if it weren't for the existence of   
   >> LLMs, not at this scale. Many people now admit that they use ChatGPT as   
   >> a therapist or interlocutor. What happens when   
   >> OpenAI/Google/Meta/Anthropic figure out that they can manipulate people   
   >> by modifying the language models? How are they going to choose to   
   >> manipulate them? They figured out mass manipulation through social media   
   >> quickly enough.   
   >   
   > They are comfortably selling themselves. You guys in the States really   
   > gave yourselves away.   
   >   
   >   
   >   
      
   Wrong, Eastern Europe. You have no idea how deeply LLMs have already   
   penetrated society.   
      
   --- SoupGate-Win32 v1.05   
    * Origin: you cannot sedate... all the things you hate (1:229/2)   



(c) 1994,  bbs@darkrealms.ca