In an effort to combat the rise of “fake news” on social media, Facebook has rolled out a new system of verification known as “trust indicators.” The feature went live November 16.

These indicators will attach to articles shared on Facebook and offer the user information about the publishers, including their ownership structure and policies on fact-checking and ethics. Users can access the indicators by pressing the small “i” that accompanies articles in their News Feeds.

The indicator was created in partnership with The Trust Project, a consortium of news companies committed to developing transparency standards in journalism.

“We believe that helping people access this important contextual information can help them evaluate if articles are from a publisher they trust, and if the story itself is credible,” Facebook said in a statement, reported by The Verge. “This step is part of our larger efforts to combat false news and misinformation on Facebook — providing people with more context to help them make more informed decisions, advancing news literacy and education, and working to reinforce indicators of publisher integrity on our platform.”

The announcement has inspired a fair share of skepticism. According to Mashable, few publishers are taking part in the initial launch. Facebook says nine news outlets currently have access to the trust indicator tool, with more to be added. Emails between Facebook and Mashable confirmed that Vox Media and the Associated Press are among those first nine.

Maya Kosoff in Vanity Fair laments that “tech’s new trust indicators ultimately rely on users deciding whether or not to trust a publisher—the crux of the fake-news problem in the first place.”

The announcement of the trust indicators comes a year after Facebook launched a fact-checking collaboration with journalists and third-party fact-checkers. Several of those partners expressed frustration with the initiative to The Guardian.

“I don’t feel like it’s working at all. The fake information is still going viral and spreading rapidly,” said one journalist, speaking on condition of anonymity. “It’s really difficult to hold [Facebook] accountable. They think of us as doing their work for them. They have a big problem, and they are leaning on other organizations to clean up after them.”

“The relationship they have with fact-checking organizations is way too little and way too late. They should really be handling this internally. They should be hiring armies of moderators and their own fact-checkers,” said a fact-checker.

Kosoff believes extensive damage has already been done, and that making a publisher’s financial-backing information available will do little to sway users who are already entrenched. “At what point,” she asks, “do investors (and regulators and journalists) concede that ‘fake news’ is part of a larger cultural and epistemological war that Mark Zuckerberg cannot win?”