Since the election last November, Google has been burdened by the existence and spread of "fake news" (opinion-based content masquerading as fact) filtering through and, in some cases, landing at the top of the company's search engine results.

Last week, however, Google took its first major steps to address the issue by launching an internal initiative titled "Project Owl," an effort to improve search results and eventually eradicate fake news from its searches.

Fake news has been an increasingly pesky issue for Google as its product lineup has expanded, including the Google Assistant feature available on Android phones and Google Home. When asked a question, the automated Assistant recites answers drawn from the top Google search results, even if that answer could be considered offensive or factually inaccurate. One such example surfaced last December, when a Google device answered the question "are women evil" with a response that all women have "a little evil" in them.

But while it has become Google's problem, the company can't be held responsible for situations like this.

Beyond the growing number of people creating content to reaffirm particular opinions or views, a separate subculture perpetuates the popularity of such content by continuously searching for it, whether for harmful or humorous reasons, despite knowing the content isn't factually based.

These constant searches signal to Google's search algorithms that the content is popular and performing well in terms of traffic, pushing those results to the top.
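The feedback loop described above can be pictured with a toy ranker (purely illustrative; nothing here reflects Google's actual algorithm): if results are ordered by raw search traffic alone, repeated searches for a dubious page, sincere or not, push it to the top.

```python
from collections import Counter

def rank_results(results, search_counts):
    """Order results by raw search traffic, highest first.

    A naive traffic-only ranker: every search for an item,
    regardless of intent, pushes it toward the top.
    """
    return sorted(results, key=lambda r: search_counts[r], reverse=True)

# Invented traffic numbers for two hypothetical pages.
counts = Counter({"reliable-article": 40, "dubious-page": 10})

# A small subculture repeatedly searching for the dubious page.
for _ in range(50):
    counts["dubious-page"] += 1

print(rank_results(["reliable-article", "dubious-page"], counts))
# → ['dubious-page', 'reliable-article']
```

On traffic alone, the dubious page now outranks the reliable article, which is exactly the dynamic the new algorithm is meant to counteract.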

To battle these issues, Project Owl is being rolled out in three parts:

1) a new feedback form for search suggestions, backed by published policies covering removals

2) a new feedback form for "Featured Snippets" answers

3) a new search algorithm that places stronger emphasis on authoritative content.

While the new feedback forms will allow the users most directly affected by undesired results to report issues, Google’s new search algorithm should greatly improve the consistency of quality results and hopefully eradicate fake news from searches for good.
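One way to picture that third part, weighting authoritativeness alongside raw popularity, is a small blended score (the weight and the per-page numbers below are invented for illustration and are not Google's):

```python
def score(traffic, authority, authority_weight=0.7):
    """Blend raw traffic with a source-authority signal.

    Both inputs are assumed normalized to [0, 1]; authority_weight
    is a made-up knob showing how emphasizing authoritative content
    demotes pages that are merely popular.
    """
    return (1 - authority_weight) * traffic + authority_weight * authority

# A heavily searched but low-authority page vs. a moderately
# searched, high-authority page (all numbers hypothetical).
viral_hoax = score(traffic=0.9, authority=0.1)
news_report = score(traffic=0.5, authority=0.9)
print(news_report > viral_hoax)
# → True: the authoritative page wins despite lower traffic
```

The design point is simply that once authority carries real weight, gaming the traffic signal alone is no longer enough to reach the top of the results.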

“We are super energized by this, I have to say, super energized to fix these problems,” said Pandu Nayak, a Google Fellow who works on search quality. “People [at Google] came out of the woodworks offering to help us with this. People felt really passionate about helping.”

That being said, filtering out undesired results is less simple than it seems, considering the nearly 6 billion searches Google processes every day. Even though Google revealed that only approximately 0.25% of these searches return problematic results, that still leaves a staggering number: roughly 15 million queries a day.

“There’s already been a significant amount of progress, but there’s a long way to go. And we don’t believe it will ever be solved fully. It is in some ways like spam. There’s a little bit of an effort of people trying to game the system, while we try to stay one step ahead of them,” said Ben Gomes, vice president of engineering for Google Search.

It won’t be an immediate fix, that’s for sure. But Google is continuously taking significant steps to improve its search algorithms, steps that should result in a better and more informative Internet experience in the coming years.