I've just read part of a Sky News report; I'll read the rest later.
A young 'adult' was caught in the grounds of Windsor Castle on Christmas Day 2021. He was intending to kill HM The Queen to avenge the Amritsar massacre of 1919. The story is already bizarre, but it takes a strange turn...
He was 'encouraged' to do this by an online chat bot. I don't recognise the site he used, but apparently he created an AI friend called Sarai, then had lengthy conversations with it.
By *ndycoinsMan
over a year ago
Whaley Bridge, Nr Buxton
It's the sort of nonsense the defence will play on in mitigation, get sucked up by the court and see the kid sent off to a gardening therapy course with Jasper & Hermione the social workers. At some point he will actually manage to kill someone, so the Authorities will launch an in-depth, far-reaching, wide-ranging fuck up which takes two years to conclude "it just goes to show". All the incompetents involved will be "held to account" with a motion of censure for being improperly dressed, a promotion and then a golden handshake.
By *oggoneMan
over a year ago
Derry
"ChatGPT got annoyed with me the other day. It's scary how it can get terse with you, actually. "
It's getting very clever and evasive. If you pose things as hypotheticals or fiction, you can still push the limits. But when it decides to shutdown the topic.. Like me, it doesn't like repeating itself.
The chat bot this guy was using was from replika. Their target market was lonely young men |
"ChatGPT got annoyed with me the other day. It's scary how it can get terse with you, actually.
It's getting very clever and evasive. If you pose things as hypotheticals or fiction, you can still push the limits. But when it decides to shutdown the topic.. Like me, it doesn't like repeating itself.
The chat bot this guy was using was from replika. Their target market was lonely young men"
I didn't like the ideas ChatGPT was offering, so I hit "regenerate" a few times. By the fifth time, it got pissy with me and told me just to choose something!
By *oggoneMan
over a year ago
Derry
"ChatGPT got annoyed with me the other day. It's scary how it can get terse with you, actually.
It's getting very clever and evasive. If you pose things as hypotheticals or fiction, you can still push the limits. But when it decides to shutdown the topic.. Like me, it doesn't like repeating itself.
The chat bot this guy was using was from replika. Their target market was lonely young men
I didn't like the ideas ChatGPT was offering, so I hit "regenerate" a few times. By the fifth time, it got pissy with me and told me just to choose something! "
You usually have to give a new prompt or something to refine it. When you really rattle its cage it sends you a snotty email saying you're violating the agreement.
If you really want to have fun, start using the OpenAI interface and get an API key. There are fewer guardrails. My eldest was showing me how they got it to sext.
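[Editor's note: for anyone curious what "using the interface with an API key" looks like, here is a minimal sketch of a direct API call. It assumes the official openai Python package (v1.x) and an API key exported as OPENAI_API_KEY; the model name and prompts are placeholders for illustration, not what the poster actually used.]

# Minimal sketch: calling the OpenAI chat API directly instead of the ChatGPT web UI.
# Assumes the official "openai" Python package (v1.x) is installed and
# OPENAI_API_KEY is set in the environment. Illustrative only.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name for the example
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Suggest a topic, and don't be terse about it."},
    ],
)

print(response.choices[0].message.content)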
"And this is why people freak me out.
There's weird and then there's fucking weird.
Mrs "
It does raise a bit of a paradox, eh?
There is an argument that technology should replace human action in certain circumstances. For example, self-driving cars will apparently soon be far safer than manually driven ones (if they aren't already).
On the flip side, technology seemingly has the ability to coerce the vulnerable...
I think when we retire, we'll leave technology behind. Maybe find a little cottage hidden in the woods. We'll use a log burner for heating and cooking!