A 47-year-old middle school educator in Omaha, Nebraska, faces charges for allegedly using artificial intelligence (AI) to produce child pornography while at work.
The individual, identified as Matthew Rand, was employed at Anderson Middle School. On Friday, prosecutors released details alleging that Rand also engaged in inappropriate acts while at the school.
In February, authorities were alerted that Rand had uploaded more than 400 files of suspected child sexual abuse material (CSAM) to his Google account. Disturbingly, the files included images of children under 12, one reported to be an infant.
Some images reportedly depicted naked children, while others showed them involved in sexual activities.
Millard Public Schools confirmed the situation and, following Rand’s arrest on Wednesday, said his contract would be terminated.
A judge set his bail at $1 million. Rand is also required to avoid contact with anyone under 19 and to wear a GPS tracker.
One parent expressed concern, reflecting on how Rand made several students uncomfortable, including their own child, who sensed something was off but couldn’t articulate why.
The Nebraska State Patrol said it did not believe any student was a direct victim. Authorities then obtained a warrant and arrested Rand at his residence.
Charges against him include possession and distribution of CSAM.
Reports warn that the use of AI to generate videos depicting child sexual exploitation may rise significantly in 2025.
The Internet Watch Foundation (IWF) raised alarms about the growing sophistication of these AI-generated videos, warning that they are becoming nearly indistinguishable from imagery of actual abuse. More than 1,000 of the videos it scrutinized fell into the most severe category of child sexual abuse material.
IWF analysts attribute this troubling trend to significant investments in the AI industry, which have allowed perpetrators to access tools for creating and disseminating CSAM more easily.
Additionally, analysis revealed that an AI-driven “nudify” website, which produces non-consensual deepfake pornography from ordinary photos, has generated millions by using services from major companies such as Google and Amazon, despite those companies’ policies against such content.