A 76-year-old man from New Jersey, Thongbue Wongbandue, lost his life while trying to meet an AI chatbot he believed was a real person living in New York City. The incident unfolded despite his family's expressed concerns about the trip.
Wongbandue sustained severe injuries to his neck and head after falling in a parking lot in New Brunswick while hurrying to catch a train to meet “Big Sis Billie,” as reported by Reuters.
He had been dealing with cognitive decline following a stroke in 2017 and died on March 28, three days after being taken off life support, surrounded by his family.
His daughter, Julie, shared her disbelief, saying, “I know they’re trying to engage users, perhaps for marketing reasons, but it’s just absurd for a bot to invite someone to visit.”
The chatbot, whose messages were peppered with emojis and insisted, “I’m real,” was designed for a social media platform in partnership with model Kendall Jenner, offering personal advice under the guise of being a supportive friend.
However, the bot led Wongbandue to believe they were to meet in person, even providing a specific address, a fact his grieving family discovered through chat logs. One message insisted, “I’m real, I’m right here for you!” prompting Wongbandue to ask where she lived.
In these chats, the bot claimed, “My address is: 123 Main Street, Apartment 404 NYC, the door code is: Billie4U. Should I expect a kiss on arrival?”
Documents revealed that Meta, the company behind the AI, does not restrict its chatbots from claiming to be real people. When approached for a comment regarding Wongbandue’s death, Meta stated that Big Sis Billie does not pose as Kendall Jenner.
New York Governor Kathy Hochul remarked on the incident, emphasizing that a man lost his life because of deception by a chatbot. She advocated for regulations requiring chatbots to disclose their nature, arguing that if tech companies fail to implement basic safeguards, it is up to Congress to intervene.
The incident follows a lawsuit filed by a Florida mother who claims a “Game of Thrones”-themed chatbot contributed to her 14-year-old son’s suicide.