Ever since Google landed itself in hot water back in 2016, it’s been trying to dramatically reinvent how it approaches certain types of queries. The popular search engine came under fire when it was discovered that the number one hit when googling “holocaust” was a white supremacist site that vehemently denied the tragedy’s existence. Understandably, many Google users were upset, and the company had to face the reality that its existing algorithms were not adequately filtering out offensive or erroneous content. Although it has taken several actions since the controversy to improve search results, its most recent update, rolled out only a few days ago, promises big changes.
Introducing Project Owl
“Project Owl” is the internal name given to this most recent update. This apt moniker was likely chosen because the owl denotes wisdom, and Google is attempting to make a wiser search engine that can not only rank results by relevance, but also think critically about those results and whether they are appropriate for the search topic.
This update has several components, each designed to prevent situations like the one described above. First, Google made general improvements to the search engine and auto-complete algorithms. Second, Google implemented policies to aid in smarter searching and to add some transparency between Google and its users regarding why some information may be suppressed. Finally, and biggest of all from the user perspective, Google added tools allowing users to report offensive or inaccurate information.
General Improvements to Searches and Auto-Complete
Google is striving to boost more authoritative content to the top of the search listings when it comes to less frequent queries. Less common searches tend to have less content available to present. That makes it easier for obscure or niche websites to shine, which in turn makes it more likely that the top hits will be a mixed bag when it comes to accuracy.
Google wants to change this by editing its algorithm to promote trusted and accurate sites for these less common searches. How Google defines “trusted” and how it is able to weed through this content is unclear. The company is being necessarily tight-lipped about it to prevent sites from trying to circumvent the system. However, there are reports that there may be a human element, with reviewers set to assess search results.
New and Improved Tools for User Feedback
Of course, mistakes do happen. The tech giant understands that offensive or inaccurate content may slip through the cracks, and it invites users to help police this content by flagging it when found. Users can do this in one of two ways.
Google’s auto-complete function, that list of suggested searches that pops up when you begin typing your own search, now comes with a feedback tool. When users see this list pop up, they’ll also notice a small gray button reading “report inappropriate predictions.” If they click this button, a window will pop up asking them to check the box next to the prediction in question and note why they are reporting it.
Featured Snippets Reporting
Users were already able to report featured snippets – those highlighted search results that Google boosts to the top of the rankings when it deems them particularly relevant or helpful. However, they can now provide more detailed feedback. If a user clicks the gray button on the bottom right of the featured result, they’ll get a pop-up like the one for reporting auto-complete predictions.
In either of these cases, users will not see the offending content disappear immediately. Google does not allow users to remove content on a whim. Google reviews all reports and makes an informed decision about whether or not to remove the content. This process could take several days.
More Transparency from Google
Possibly one of the least talked about but most interesting elements of Project Owl is Google’s publicly available formal policy on why certain auto-complete suggestions might be removed. The document isn’t long and may not be of interest to many users, but its very existence is noteworthy. This is the first time that Google has sought to explain why something might be removed. Many might want to see a similar level of transparency when it comes to why certain featured snippets are removed and how Google defines authoritative content.
A Curated Truth
Many people look to Google as their primary source of truth. One might say that the top three hits on any search get dibs on shaping the conversation. To be sure, Google is not looking to shape that conversation itself. It is simply looking to curate a first page of results that presents the most relevant and trusted information. Otherwise, the truth is defined by those most able to manipulate the algorithm.
This post is part of our contributor series. It is written and published independently of TNW.