She asked AI how to kill people with drugs, and two lives were lost. Netizens' response: it must be the dead men's fault, she's so beautiful


This article is reprinted from the WeChat public account: Those Things in the UK

At 9 pm on January 28, 2026, a young woman and a man in his 20s arrived at an ordinary motel in Suyu-dong, Gangbuk-gu, Seoul, South Korea.


(surveillance photo)

They took a room together. But two hours later, the girl walked out of the lobby alone.

At 6 pm the next day, the man’s body was found on the bed, without any signs of struggle, as if he had fallen asleep.

A forensic autopsy showed he had large amounts of benzodiazepines in his system.

Benzodiazepines are a class of sedatives that includes alprazolam, diazepam and others. They are very commonly prescribed in psychiatry to treat anxiety, insomnia, epilepsy and other conditions.

But 11 days later, the same pattern killed another person.

On February 9, the girl appeared at another motel in Gangbuk-gu, this time with a different man in his twenties. Same process, same ending.

The man died.

On February 11, police arrested her. Following Korean media practice, she is identified only by her surname: Kim, 21 years old.


(After the second incident, Kim leaving alone)

Initially, the police charged her with injury resulting in death, a charge that sits between manslaughter and intentional homicide. This characterization meant prosecutors believed she may not have acted intentionally.

During interrogation, she admitted to putting drugs in the men's drinks, but insisted she did not know it could kill them.

But after police checked her phone, they discovered a crucial twist.

When investigators went through Kim's mobile phone records, they found she had frequently asked ChatGPT questions like: Can you take sleeping pills while drinking? What dose is dangerous? What dose is fatal?

And it wasn't a one-off. She asked, and confirmed, again and again.


(Kim was arrested)

A police source later said: “She asked ChatGPT multiple times about drug-related issues and received answers, so she was fully aware that mixing alcohol and drugs could lead to death.”

Based on the cell phone records, prosecutors upgraded the charge against her from injury causing death to premeditated murder.

ChatGPT conversation records became key evidence for conviction.

Police then began working backwards and uncovered another case, from December 2025.

At that time, Kim was in a relationship with a man. One day in December, the two were sitting in a car drinking in the parking lot of a cafe in Namyangju City, Gyeonggi Province.

Soon after, the boyfriend lost consciousness.

Fortunately, he survived. He assumed it was an accident or some physical reaction of his own, and did not call the police at the time.

But police now believe it was Kim’s first attempt.

That attempt failed, and according to investigators, she increased the dose on the next two occasions.

After that first attempt, she asked ChatGPT: if the dose wasn't enough, how much more should be added?

Police are still widening the investigation to determine whether there are more victims.

It is reported that Kim herself suffered from mental illness and had been prescribed diazepam by a doctor.

The weapon she used to kill was a legal prescription drug. The tool she used to get operating instructions was the most popular AI chatbot in the world.

This case caused an uproar on the Korean internet. But what made many people uncomfortable was not just the murders themselves; it was the reactions of netizens.

After the incident, Kim’s photos began to circulate on social media.


(Kim)

The comments read: Kim is so beautiful, I would definitely drink anything she gave me.

Some said sentencing should take looks into account, so Kim deserves leniency.

Some even said that such a small mistake shouldn't ruin a young girl's life… that the men must have done terrible things to Kim first, leaving her no choice…


(Netizen comments)

At the same time, netizens dug up and spread Kim's personal information, including her address and social media accounts, and her follower count grew to more than 10,000.

This is not the first time ChatGPT has been involved in a murder case. Should the AI company bear responsibility? That question is now being debated intensely around the world.

Futurism’s report pointed out that ChatGPT and other AI chatbots are accused of having fragile and unreliable security protection mechanisms. These mechanisms can easily be intentionally or unintentionally bypassed during long conversations, causing AI to freely provide harmful information.

Multiple lawsuits are ongoing alleging that AI acts as a suicide coach or reinforces users’ dangerous delusions.

Of course, there is another view: ChatGPT answering whether mixing sleeping pills and alcohol is dangerous is perfectly normal. The answer to that question can be found on any medical website.

The real problem is: when this kind of question is asked repeatedly and the AI senses dangerous intent, what should it do?

So far, OpenAI has not publicly responded to the Kim case in South Korea.

(surveillance photo)

Kim is undergoing psychiatric evaluation and in-depth psychological interviews to establish a psychological profile, which will be considered at a later court hearing.

Police have not released a motive for her crime. No one knows why she did this. She asked so many questions in the conversation but never explained why.

This is the most chilling part of this case. It wasn’t the questions she asked, but the blank motivation behind those questions.

The AI doesn't know who she is, which motel she's going to, or that two lives are about to end because of the answers to those questions.

It just answered truthfully.

ref:https://www.thesun.co.uk/news/38357192/woman-drugs-men-death-asks-chatgpt-how-to-kill/
