AI ‘deepfakes’ of Hurricane Helene victims circulate on social media

In the aftermath of Hurricane Helene, false information flooded the internet, including two AI-generated images showing a child crying desperately on a boat in apparent floodwaters.

At first glance, a photo circulating online shows a child wearing a life jacket and holding a dog, soaking wet from relentless rain during the worst storm to hit the United States since Hurricane Katrina in 2005. Or so it appears.

But a closer look reveals discrepancies between two nearly identical versions of the photo, Forbes reported.

In the aftermath of Hurricane Helene, two similar AI-generated photos of a child apparently holding a puppy in floodwaters contributed to the flood of misinformation following the storm. Larry Avis West/Facebook
A "deepfake" image of a young child with a puppy appearing to float through floodwaters from Hurricane Helene has surfaced online. Larry Avis West/Facebook

In one photo, the child appears to have an extra finger, a telltale artifact of AI image generation.

The child is also wearing a different shirt in each photo and sitting in a different type of boat. The puppy's coat is slightly darker in one shot, which is also blurrier and more pixelated.

Sen. Mike Lee of Utah was among those taken in by the photo. He shared it on Thursday with the caption, "Caption this photo," then deleted the post after users pointed out that the image was fake.

One Facebook user was also fooled by the "deepfake" image, sharing it with the caption: "God help our babies and their families!"

Some commenters noted obvious signs of tampering.

Manipulated images depicting disasters can have long-term effects: they complicate relief efforts, create false narratives, undermine public trust in times of crisis, and harm real people, Forbes reported. They can also be used to trick people into donating to fake fundraisers, although it is unclear whether the images of the child were used for that purpose.

Experts say AI-generated images distract from the real people affected by tragedy. Ben Hendren

An AI-generated image widely shared online in May showed rows of neatly organized tents in the Gaza Strip, with several in the center spelling out "All Eyes on Rafah."

The fake photo was shared on social media by tens of millions of people, including Nobel Peace Prize winner Malala Yousafzai and model Gigi Hadid, but critics said it failed to capture the reality of the war-torn region.

"People are posting very graphic and disturbing content to raise awareness, and it is being censored while synthetic media spreads," Deborah Brown, a senior researcher and digital rights advocate at Human Rights Watch, told the Los Angeles Times.

Manipulated images can complicate disaster response efforts, create false narratives, and undermine public trust during a crisis. Nathan Fish/USA TODAY NETWORK (via Imagn Images)

Other misinformation about Hurricane Helene is also circulating online. A "rumor response" page on FEMA's website addresses falsehoods claiming that government agencies are confiscating survivors' property, distributing aid based on demographic characteristics, and seizing donations and supplies.

One conspiracy theory reportedly suggests that the government used weather-control technology to steer the hurricane toward Republican voters.

"After Hurricane Helene, being aware of rumors and scams, and sharing official information from trusted sources, can help keep you, your family, and your community safe," FEMA advised.
