A recent study indicates that the number of Hollywood films directed by women has hit a seven-year low, and it happened during the Democrats' time in power.
It makes one wonder: wasn't the film industry supposed to be run by far-left progressives?
So, why would an industry so strongly aligned with progressive values allow a decline in diversity and inclusion?
Well, you can guess, can’t you? Hollywood seems to thrive on hypocrisy.
“The University of Southern California Annenberg Inclusion Initiative’s yearly report shows that, in 2025, only nine out of the top 100 grossing films were directed by women, marking the lowest level since 2018.” The report continues: “Women directed only 8.1% of the highest-grossing films this year, down from 13.4% last year but still better than 4.5% in 2018.”
It’s interesting; I can’t lay the blame at Trump’s feet, though I’m sure many would…
“The findings suggest that the decline in opportunities for women behind the camera was evident even before Trump’s presidency emphasized a rollback on diversity and inclusion,” the report highlights.
“The 2025 statistics indicate that any progress women directors made was fleeting,” stated Stacey L. Smith, Ph.D. “While it’s tempting to link these changes to the administration, the reality is that the decisions affecting these films were made well before the DEI policies were rescinded.” Many films were already in development prior to the 2024 election.
It's striking: women make up more than half the U.S. population, yet they directed only 8.1% of the top 100 films of 2025.
For decades, Hollywood has lectured everyone else about sexism and discrimination, yet here it is, still grappling with the same issues well into the 21st century. It's baffling that, even now, women direct so small a share of top films.
This points to a larger, systemic pattern of discrimination. The film industry works like a sports franchise, with a pipeline in which directing talent is supposed to be scouted and developed. Women, it seems, are being left out of that pipeline.
Or…?
Dare I say…?
Are men simply better suited to directing? There are genuine differences between men and women, and one argument holds that men may be better equipped to handle the myriad pressures that come with the job.
Just asking the question.
What's clear is that there is no shortage of women who want to direct. If Hollywood truly valued diversity and inclusion, wouldn't at least 50 of the top 100 films have women at the helm? Surely there are countless women eager for those opportunities.
Yet Hollywood keeps choosing male directors over female ones. There appears to be a prevailing belief that men do the job better, even on material traditionally aimed at women.
