Growing up, my parents always cautioned me against sharing photos online. Back then, before the social media boom, avoiding it was pretty straightforward. However, once I got an Instagram account, I realized that posting pictures was acceptable—as long as they met a certain standard. We all know that once something is uploaded, it really isn’t “ours” anymore. Even if you try to delete it, it can still resurface, which is a little unsettling.
We now inhabit a vastly different world. With the rise of AI, social media, and ever-advancing technology, explicit images have become alarmingly accessible. They appear in front of you even when you weren't searching for them. Now imagine discovering compromising images of yourself online, images you never consented to and never even knew existed. It's terrifying, isn't it?
Omny Miranda Martone, the founder and CEO of the Sexual Violence Prevention Association (SVPA), notes that this troubling trend has escalated over the last two years. In the past, creating non-consensual AI-generated deepfakes was a complex task, requiring sophisticated technology and multiple images of the victim's face. We may have seen this happen to public figures, but AI-generated pornography has now reached everyday people. Martone emphasizes that a staggering 98% of deepfakes online are pornographic. She argues that this isn't just a tech issue: digital sexual violence is a serious matter that demands our attention.
If we believe that every human carries the image of God, then how do we reconcile the use of technology that distorts and exploits that image? What obligations do we have to ensure that laws protect those who are vulnerable?
The Dangers of AI
While we shouldn't underestimate technology's potential, it's crucial to recognize that AI tools can now fabricate realistic, explicit images without consent. The victims come from all backgrounds: women, men, and teenagers alike. The most distressing aspect? These images can spread like wildfire, leaving victims no anonymity, and once they're out there, they're nearly impossible to erase.
The impact on victims can be profound: emotional, relational, professional, and psychological. Unlike many traditional crimes, this kind of violation often remains invisible yet feels everlasting. "Now, anyone can generate these images in mere seconds; all you need is one or two photos of the victim's face," Martone explains. The SVPA's mission is rooted in justice, rights, consent, dignity, respect, and protection, not just fear or caution.
Legal Gaps
At the moment, there is no clear federal civil remedy for those affected by non-consensual deepfake pornography. This gap is where the SVPA has focused its efforts, for two reasons: victims find it difficult to remove their images or pursue justice, and the absence of penalties allows the problem to continue. Technology evolves faster than our laws can catch up, leaving vulnerable individuals to suffer unfairly.
To advocate for justice, the SVPA urges people to contact Congress and support the DEFIANCE Act. This legislation was developed in collaboration with victims and lawmakers to grant survivors the right to sue and seek justice. Aiming to deter the unauthorized creation and distribution of deepfake pornography, the bipartisan bill won overwhelming Senate approval as of January 13, 2026. Importantly, the law is not intended to limit free speech or creativity; it is designed to protect individuals from digital exploitation.
Why This Matters
This issue resonates with everyone, regardless of identity or political stance. Each person is made in the image of God, and using technology to exploit someone without their consent distorts that image. It's our duty to advocate for the vulnerable and push for laws that safeguard them, particularly those harmed by AI-generated pornography.
In Matthew 25:31-46, Jesus highlights that our actions toward the “least of these” reflect our service to Him. He references various vulnerable groups as a reminder that we must care for those in need. Our bodies are not to be objectified, ridiculed, or exploited. Consent is integral to God’s design for dignity and agency; however, sexual exploitation—whether physical or digital—twists that divine intention.
While deepfake pornography is a growing concern, it continues an age-old pattern of degrading individuals for selfish gratification. Christians—regardless of age or party affiliation—are called to protect the vulnerable, advocate for justice, and raise their voices when silence allows harm to occur.
Today, it’s worth considering both the benefits and drawbacks of technology and AI in our lives. They can be sources of beauty and progress, but we must navigate their use with strict moral guidelines to prevent exploitation. Laws like the DEFIANCE Act won’t resolve everything, yet they may establish necessary boundaries.
Now is the time to reach out to your representatives, keep victims in your thoughts, and remain engaged in discussions surrounding policies that impact human dignity. Personally, politics isn’t my cup of tea; I prefer to stay out of it. But when lives are at stake, it’s clear something needs to change.
“These actions carry profound psychological effects. They violate the victim’s autonomy and can lead to further sexual violence, reputational harm, and trauma. Our aim is to confront and halt the creation of this pornography and safeguard the victims. We’ve had individuals seek help only to be told that what happened to them was ‘legal.’ That’s what fueled our drive for justice,” Martone reflects.
The DEFIANCE Act has been approved by the Senate and will soon move to the House for further consideration. If enacted, it will give survivors recourse under federal law and hold accountable those responsible for image-based sexual abuse, including those who leverage AI. As Satya Nadella, CEO of Microsoft, has noted, this kind of legislation could catalyze meaningful societal change:
“This is about uniting around shared norms, and ensuring that laws, law enforcement, and tech platforms work in harmony.”
