Meta Chief Executive Mark Zuckerberg faces a possible rebellion at the company’s annual meeting on Wednesday as shareholders pressure the company to be more transparent about its efforts to protect children online.
A group led by Lysette Cooper, vice chair of Franklin Templeton subsidiary Fiduciary Trust International and the parent of a child sexual abuse victim, is supporting a non-binding resolution calling on Meta’s board to issue an annual report tracking the company’s performance on child safety and its progress in protecting young users from harm on its platforms.
The report would be required to include “quantitative metrics suitable to assess whether Meta has improved its performance globally in terms of its impact on child safety on the platform and the reduction of actual harm to children.”
“If we want to reassure advertisers, parents, lawmakers and shareholders that we’re making a difference in addressing this problem that’s harming kids, we need transparency,” Cooper told The Washington Post in an interview. “We need better metrics.”
The resolution comes to a vote at a time when Meta is under intense scrutiny, facing legal and regulatory crackdowns in the U.S. and abroad for failing to keep children safe on Instagram and Facebook.
Zuckerberg himself recently apologized to the families of victims of online sexual abuse at a high-profile congressional hearing.
Meta’s board of directors opposes the resolution, asserting in its April proxy statement that the requested reports “are unnecessary and do not provide any additional benefit to shareholders.”
Cooper and her supporters point to the numerous pending lawsuits against Meta over child safety: Last October, Meta was hit with lawsuits from dozens of states alleging it “ignored the devastating damage these platforms have caused to people’s mental and physical health,” including sleep deprivation, disruption to schoolwork, anxiety and depression.
A separate lawsuit by the New Mexico Attorney General alleges that Meta exposed underage users to sexual predators.
As The Post reported earlier this month, Meta is spearheading a major lobbying effort to repeal or weaken two New York state bills aimed at protecting children online.
“Kids are going to be the users of the future. If they have a bad experience on the platform, they’re never going to come back. That makes a big difference to us as investors,” Cooper added.
Two of the largest proxy advisory firms, Institutional Shareholder Services and Glass, Lewis & Co., recommended that shareholders vote in favor of the resolution.
Glass Lewis said of the proposal: “We believe that adopting and reporting on the requested reports and targets will provide valuable information to shareholders and allow them to better understand this sensitive issue in the context of Meta’s efforts to minimize harmful content on its platforms.”
ISS determined that “shareholders would benefit from additional information about how the Company manages risks related to child safety.”
The shareholder resolution is virtually doomed to fail without the support of Zuckerberg, who controls 61% of the company’s voting power through his ownership of so-called super-voting Class B shares.
Proxy Impact, which filed the resolution on Cooper’s behalf, noted in the filing that a similar proposal received the support of about 54% of shares not controlled by Meta management at last year’s annual meeting.
“Getting data is a fundamental first step in any business plan,” said Michael Pusoff, CEO of Proxy Impact. “What gets measured gets managed, and they’re not doing that, or if they are, they’re just not sharing it with anyone.”
In its proxy filing, Meta cited a number of steps the company has taken to address concerns about child safety online, including the creation of “more than 30 tools across our apps to support teens and their families” as well as existing policies banning harmful content that seeks to exploit children.
“We want people, particularly young people, to foster their online relationships in a safe, positive and supportive environment and we work closely with a wide range of stakeholders to inform and enforce our safety commitments,” the company said.
Meta’s board also recommended shareholders reject several other resolutions, including one that would have called for the submission of a third-party report assessing the “potential risks and benefits” of raising the minimum age for users of social media products.
Meta’s legal and regulatory woes over child safety are not limited to the United States.
Earlier this month, the European Commission revealed it was investigating whether Meta had violated a sweeping new law, the Digital Services Act, which requires big tech companies to police the content on their platforms.
European watchdogs have expressed concern that Facebook and Instagram “could fuel behavioural addictions in children” and could create a “rabbit hole effect” in which kids become glued to the apps despite harmful health consequences.
If Meta is found to be in violation of the DSA, the company could be subject to fines of up to 6% of its annual revenue.
