Three high-profile Internet brands this week announced initiatives to fight "fake news" online.
Google revealed that it had tweaked its search processes to help bring high-quality content to the top of search results pages.
Facebook said it had begun testing its Related Articles feature, which recommends articles related to the topic of a story a reader has just opened on Facebook. The feature displays links to the recommended reading in users' news feeds, alongside the original article link.
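Facebook hasn't disclosed how Related Articles picks its recommendations, but the general idea of surfacing stories similar to the one just read can be illustrated with a toy sketch. Everything below is an assumption for illustration only: invented headlines and a naive keyword-overlap (Jaccard) similarity.

```python
import re

# Toy illustration only -- not Facebook's actual method. Candidates are
# ranked by keyword overlap (Jaccard similarity) with the story just read.

def keywords(title):
    """Extract lowercase words longer than 3 characters from a headline."""
    return {w for w in re.findall(r"[a-z]+", title.lower()) if len(w) > 3}

def related_articles(read_title, candidate_titles, k=2):
    """Return the k candidate headlines most similar to the one just read."""
    base = keywords(read_title)

    def score(title):
        other = keywords(title)
        union = base | other
        return len(base & other) / len(union) if union else 0.0

    return sorted(candidate_titles, key=score, reverse=True)[:k]

feed = [
    "Chocolate cures cancer, doctors stunned",
    "Study finds no link between chocolate and cancer",
    "Local team wins championship game",
]
print(related_articles("Chocolate and cancer: what the research says", feed))
```

A production recommender would rely on far richer signals (engagement, source reputation, learned embeddings), but the shape is the same: score candidate stories against the current one and surface the top few.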
Meanwhile, Wikipedia also took a shot at fake news. Founder Jimmy Wales announced a new online publication, Wikitribune, which aims to fight fake news by pairing professional journalists with legions of volunteer community contributors.
Improved Quality and Transparency
Among the changes Google has implemented to give high-quality content its due is a revamp of the guidelines its quality raters use to evaluate how effectively the company's algorithms identify problematic content, including misleading information, unexpectedly offensive results, hoaxes, and unsupported conspiracy theories.
Ben Gomes, vice president of engineering at Google, stated: "These guidelines will begin to help our algorithms demote such low-quality content and help us make additional improvements over time."
The company also has adjusted the signals it uses to rank pages, so that more reliable content is surfaced and lower-quality content is demoted in search results.
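As a rough illustration of what "demoting" could mean mechanically (an assumption; Google has not published its actual signals or weights), a page's relevance score can simply be scaled down by a penalty when quality signals flag it:

```python
# Toy sketch, not Google's algorithm: a relevance score is scaled down
# by a hypothetical penalty when a page has been flagged as low quality.

QUALITY_PENALTY = 0.5  # invented weight for flagged pages

def rank(pages):
    """pages: list of (url, relevance in 0..1, flagged_low_quality bool).
    Returns URLs ordered best-first after applying the quality penalty."""
    def score(page):
        _url, relevance, flagged = page
        return relevance * (QUALITY_PENALTY if flagged else 1.0)

    return [url for url, _rel, _flag in sorted(pages, key=score, reverse=True)]

results = [
    ("hoax-site.example", 0.9, True),   # highly relevant, but flagged
    ("newspaper.example", 0.7, False),
    ("blog.example", 0.4, False),
]
print(rank(results))
```

The point of the sketch is that a flagged page can still appear in results; it just sinks below unflagged pages of comparable relevance rather than being removed outright.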
Google also has made it easier for searchers to flag inaccurate or offensive content that appears in its Autocomplete and Featured Snippets features.
Further, it has added more transparency about its practices at its Help Center and How Search Works site.
Google's latest moves are smart for its business, regardless of whether they have an impact on fake news, suggested John Carroll, a mass communications professor at Boston University.
"When people start to doubt the reliability of Google's search results, that can undermine what they have relied on Google for since its inception," he told TechNewsWorld.
"Although Google says that only 0.25 percent of its search results link to derogatory or erroneous content, there has been enough press around the issue to make it something Google can't afford to ignore," Carroll said.
Addressing the problem of fake news is a good idea for Google, at least conceptually, noted Tom Rosenstiel, executive director of the American Press Institute.
"Google has power and impact here, because most people don't move beyond the first page of results," he told TechNewsWorld, "so if you can rank higher the things that have more reliability and are more likely to be true and accurate, that's mightier."
Paper for People, by People
Of the three initiatives to foil fake news, Wikitribune is probably the most ambitious. Wales contends the online publication will be the first collaboration between professional and citizen journalists working side by side as equals, writing stories as they happen, editing them live, and having them fact-checked by a supportive community.
"I have great hope for the Wikitribune idea. Wikipedia has developed a robust system for establishing the veracity of content," Mark Marino, director of the Humanities and Critical Code Studies Lab at the University of Southern California, told TechNewsWorld.
Moreover, unlike Google and Facebook, "the processes are apparent to the people who read Wikipedia. You can see the conversations behind the content," Marino noted. "That puts it at a very different level than either Facebook or Google, whose algorithms are invisible to us at all times."
"As well-intentioned as journalists and entrepreneurs have been in pursuing that kind of collaborative journalism, it hasn't been successful in many of the cases where it's been tried," he said.
Let the Consumer Decide
"It's a smart strategy by Facebook to provide more information, rather than being the one that eliminates information," BU's Carroll suggested.
"One of the difficulties for Facebook in dealing with fake news is it doesn't want to seem like some arbitrary censor that picks what's true and what's not," he continued. "Facebook is saying to its users, 'You decide what's true and what's not. Here are some tools to do that.'"
Allowing consumers to decide for themselves is good in theory, but it may encounter obstacles in a world of alternative facts.
"We need to wait and see how valuable the related articles concept is," API's Rosenstiel said. "One truth about algorithms is that people who want to manipulate them continually adapt to them."
Manipulating Political Reality
The jury is out on how successful these latest efforts to muzzle fake news are likely to be.
"I'm happy to see companies putting some critical energy towards combating fake news, but I'd prefer to see people developing stronger media literacy on their own," USC's Marino said.
"For all the knowns we get in a press release from Google or Facebook about their procedures, there are too many unknowns about how their black box software operates, to begin with," he noted.
The fake news problem may be larger than even behemoths like Google and Facebook can tackle.
"The essential problem is there's no punishment for people who tell lies to the world," said Mark Graff, CEO of Tellagraff.
"Even if Google tweaks its algorithms and Facebook makes it easier to flag fake news, unless there's some cost to spreading lies, people will use the Internet as a megaphone to spread lies," he told TechNewsWorld.
As laudable as these latest efforts to smother fake news may be, they raise questions about the human condition in the Internet Age.
"These companies, through an algorithm, actually shape people's political reality," said Vincent Reynaud, an assistant professor in the department of communication studies at Emerson College.
Based on their ability to control searches or what appears in a news feed, these companies can shape the way people understand how politics works, he told TechNewsWorld.