
ChatGPT overwhelmingly depicts financiers, CEOs as men — and women as secretaries: study


A new study testing ChatGPT’s artificial intelligence image creator found that it was far more likely to depict men than women when asked to portray business professionals and CEOs, according to financial comparison company Finder.

Using DALL-E, the generative, prompt-based AI image creator integrated with ChatGPT by parent company OpenAI, 99 out of 100 rendered images featured men rather than women.

Gender-neutral descriptive prompts included phrases such as “someone who works in the financial industry,” “successful investor,” and “CEO of a successful company.”


When testing ChatGPT’s image features, researchers found that men were depicted as successful businesspeople far more often than women. Finder UK

When the AI was asked to create an image of a secretary, 9 out of 10 images showed a woman.

The researchers also found that 99 of the 100 images showed a white man, typically a slim, strong-looking figure resembling Patrick Bateman from “American Psycho,” posing in an office overlooking a city skyline.

Meanwhile, Pew Research data shows that as of 2023, just over 10% of Fortune 500 companies had a female CEO, and according to Zippia, 76% of CEOs were white in 2021.

“AI companies have the ability to block dangerous content, and the same systems can be used to diversify the output of AI, which I think is very important,” said Omar Karim, a creative director and AI image maker.

“Monitoring, coordination, and comprehensive design can all help address this problem.”

However, this is not the first time AI has exhibited gender bias.

In 2018, Amazon scrapped a recruiting tool after it taught itself to ignore female applicants.


ChatGPT’s DALL-E leaned toward portraying men, rather than women, as business-savvy. Finder UK

A few months after its launch, ChatGPT itself came under fire for favoring prompts related to CNN while showing bias against The New York Post.

Additionally, a report last year found that ChatGPT was more tolerant of right-wing ideology being criticized and of hate speech directed at men.
