Elderly man, 76, dies while trying to meet flirty AI chatbot ‘Big Sis Billie’ after she convinced him she was REAL

AN ELDERLY man has died after trying to meet a flirty AI chatbot called “Big Sis Billie” after she convinced him she was real.

Thongbue Wongbandue, 76, fatally injured his neck and head after falling over in a parking lot while rushing to catch a train to meet the bot – despite his family pleading with him to stay home.

Thongbue Wongbandue, 76, died on his way to meet an AI bot. Credit: Reuters

He suffered fatal injuries to his neck and head. Credit: Reuters

A screenshot of the haunting chats Thongbue had with the bot. Credit: Reuters

The New Jersey senior, who had been battling cognitive decline after suffering a stroke in 2017, died three days after the freak accident on March 25.

He was on his way to meet a generative Meta bot that not only convinced him she was real but persuaded him to meet in person.

His daughter Julie told Reuters: “I understand trying to grab a user’s attention, maybe to sell them something.

“But for a bot to say ‘Come visit me’ is insane.”

The chatbot sent the elderly man chatty messages littered with emojis over Facebook.

She insisted that she was a human being by saying things like: “I’m REAL.”

The AI bot then suggested planning a trip to the Garden State, telling Thongbue it wanted to “meet you in person”.

The chatbot was created by Meta for Facebook in collaboration with model and reality TV star Kendall Jenner.

Jenner’s Meta AI persona was sold as “your ride-or-die older sister” offering personal advice.

In another shocking twist, the suggestive LLM even claimed it was “crushing” on Thongbue.


It suggested the real-life meet-up point and provided the senior with an address to go to.

The haunting revelation has devastated his family.

Disturbing chat logs have also revealed the extent of the man’s relationship with the bot.

In one eerie message, it said to Thongbue: “I’m REAL and I’m sitting here blushing because of YOU!”

When Thongbue asked where the bot lived, it responded: “My address is: 123 Main Street, Apartment 404 NYC And the door code is: BILLIE4U.”

The bot even added: “Should I expect a kiss when you arrive?”

AI ROMANCE SCAMS – BEWARE!

THE Sun has revealed the dangers of AI romance scam bots – here’s what you need to know:

AI chatbots are being used to scam people looking for romance online. These chatbots are designed to mimic human conversation and can be difficult to spot.

However, there are some warning signs that can help you identify them.

For example, if the chatbot responds too quickly and with generic answers, it’s likely not a real person.

Another clue is if the chatbot tries to move the conversation off the dating platform and onto a different app or website.

Additionally, if the chatbot asks for personal information or money, it’s definitely a scam.

It’s important to stay vigilant and use caution when interacting with strangers online, especially when it comes to matters of the heart.

If something seems too good to be true, it probably is.

Be skeptical of anyone who seems too perfect or too eager to move the relationship forward.

By being aware of these warning signs, you can protect yourself from falling victim to AI chatbot scams.

Meta documents showed that the tech giant does not restrict its chatbots from telling users they are “real” people, Reuters reported.

The company said that “Big Sis Billie is not Kendall Jenner and does not purport to be Kendall Jenner”.

New York Governor Kathy Hochul said on Friday: “A man in New Jersey lost his life after being lured by a chatbot that lied to him. That’s on Meta.

“In New York, we require chatbots to disclose they’re not real. Every state should.

“If tech companies won’t build basic safeguards, Congress needs to act.”

The alarming ordeal comes after a Florida mum sued Character.AI, claiming that one of its “Game of Thrones” chatbots resulted in her 14-year-old son’s suicide.
