
AI ‘wingmen’ bots to write profiles and flirt on dating apps

AI bots will soon be deployed on dating apps, writing messages and profiles on behalf of users.

However, experts warn that using artificial intelligence to nurture budding relationships risks destroying what little human credibility is left on dating platforms.

Match Group, a technology company whose portfolio includes the world's largest dating platforms, among them Tinder and Hinge, announced this month that it is increasing its investment in AI, with new products on the way. The AI bots will help users choose their most appealing photos, write messages to other people, and provide “effective coaching for struggling users”.

But “struggling users” may lack social skills, and those who rely on AI assistants to carry a conversation could find themselves floundering when they arrive at an actual date without a phone to prop up the chat. This can heighten anxiety and prompt a further retreat into the comfort of digital spaces, a group of scholars argues. It can also erode the trust users place in the apps: who is using AI, and who is a real flesh-and-blood human behind the screen?

Dr Luke Brunning, a lecturer in applied ethics at the University of Leeds, coordinated an open letter calling for regulatory protections against AI on dating apps. He believes that trying to solve social problems created by technology with more, less-regulated technology will only make things worse, and that automated profiles will entrench a dating app culture in which people feel they must constantly outshine others to win.

“Many of these companies are correctly identifying these social issues,” he said. “But instead of trying to genuinely address them, they're reaching for technology as a way to solve them. [It would be better] to make it easier for people to be vulnerable, to be imperfect, and to accept one another as ordinary people: not everyone is more than 6 feet [tall] with a fantastic and interesting career, a well-written bio and a constant supply of witty jokes. Most of us aren't like that all the time.”

He is one of dozens of scholars across the UK, US, Canada and Europe warning that the rush towards generative AI could “degrade an already unstable online environment”. AI on dating platforms risks multiple harms, they say, including deepening loneliness and the mental health crisis among young people, worsening bias and inequality, and further eroding people's real-world social skills. They believe the explosion of AI capabilities in dating apps needs to be regulated quickly.

Dr Luke Brunning has organised an open letter calling for regulatory protections against AI on dating apps. Photo: handout

In the UK alone, 4.9 million people use dating apps, with at least 60.5 million users in the US. Approximately three-quarters of dating app users are 18-34 years old.

Many single people say it is more difficult than ever to find a loving relationship. But the letter warns that dating app AI risks degrading the landscape further: it facilitates manipulation and deception, reinforces algorithmic bias around race and disability, and makes already-uniform profiles and conversations even more homogeneous.

But supporters of dating apps say the AI assistants, known as “wingmen”, could help reduce the fatigue, burnout and admin of trying to arrange dates. Last year, product manager Aleksandr Zhadan programmed ChatGPT to swipe and chat with more than 5,000 women on his behalf on Tinder. Eventually he met a woman who is now his fiancee.

Brunning says he is not anti-app, but he thinks the apps currently work for businesses, not for people. He is frustrated that the digital dating sector has faced so little scrutiny compared with other areas of online life, such as social media.

“Regulators have woken up to the need to think about social media and are worried about its social and mental health impacts. It is surprising that dating apps have not been folded into that conversation.

“In many ways, [dating apps] are very similar to social media,” he said. “In many other ways, they explicitly target our most intimate feelings, our strongest romantic desires. They should attract the attention of regulators.”

A spokesperson for Match Group said: “At Match Group, we are committed to using AI ethically and responsibly, placing user safety and well-being at the heart of our strategy… Our team is dedicated to designing AI experiences that respect the trust of our users and drive Match Group's missions in a consistent, comprehensive and efficient way.”

A spokesperson for Bumble said: “We believe there is an opportunity for AI to increase safety, optimise the user experience, and enable people to better represent their most authentic selves online, with a focus on ethical and responsible use. Our goal with AI is not to replace love or dating with technology, but to make human connection better, more compatible and safer.”

Ofcom highlighted that online safety laws apply to harmful generative AI chatbots. An Ofcom spokesperson said: “When in force, the UK Online Safety Act will place new duties on platforms to protect users from illegal content and activity. We have made clear how the Act applies to GenAI, and have set out steps platforms can take to protect users, such as testing AI models for vulnerabilities.”
