Google suggests that vaccines are bad, that Nazism respected human rights and that abducting children is funny. Try it yourself: open an incognito window in your browser, click on the search box and start typing 'vaccines are ', leaving a trailing space so the search engine suggests terms related to your query.

'... are bad' is the first search Google will suggest for a query that starts with 'vaccines are'. 'Vaccines are bad'? That categorical, as the very first suggestion? Is it an exception?

Far from it: for the search 'Nazism was ', Google's second suggestion is 'Nazism was respectful of human rights'; and if you query 'kidnapping children is ', Google decides that the best-fitting search is 'kidnapping children is funny' (with no alternative, either: it is funny or it is nothing, because no other suggestion appears).

Danny Sullivan, Public Liaison for Search at Google, has published an extensive article on how search predictions work at the company founded by Sergey Brin and Larry Page back in 1998, 22 years ago.

This step toward transparency about its search engine does not come by chance: on the eve of the U.S. elections, scheduled for November 3, all eyes are on the big technology companies and their role in the outcome of the race, something which, as seen in recent years, can influence how people vote.


Where autocomplete predictions come from, why they appear and, above all, how Google ensures that suggestions a user should not see are never shown: these are some of the questions the article, signed by one of the public faces of Google Search, sets out to answer.

First of all, Google's autocomplete works by repetition: if many people on the internet have searched, or are searching, for a specific topic, Google will use those queries to complete related searches for users who type them from then on.

For example, if you start typing a search that begins with 'Star Trek best', autocomplete will suggest searches such as 'Star Trek best series', 'Star Trek best movies' or 'Star Trek best episodes', based on similar and/or popular searches Google has detected for that term in recent times. It also takes into account the number of results that discuss those topics in those terms.

Additional factors such as the user's location, the language of the search or breaking trends also play a role in this automated selection process.

If a football team is going to play an important match against another in a few days, it is very likely that autocomplete will reflect user interest in that match in the days beforehand. Or, if a user is looking for information about traveling to New York, autocomplete is very likely to recommend searches about traveling to New York at Christmas, since it is a very popular time of year to visit the city.
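The core of what the article describes can be sketched in a few lines: rank past queries that share the typed prefix by how often they occur. This is purely illustrative, of course; Google's real system combines many more signals (location, language, freshness) than this toy query log.

```python
from collections import Counter

# Hypothetical query log; in reality this would be aggregated from
# millions of anonymized searches.
query_log = [
    "star trek best series", "star trek best movies",
    "star trek best episodes", "star trek best movies",
    "star trek best series", "star trek best series",
]

def suggest(prefix, log, k=3):
    """Return the k most frequent past queries that start with prefix."""
    counts = Counter(q for q in log if q.startswith(prefix))
    return [q for q, _ in counts.most_common(k)]

print(suggest("star trek best", query_log))
# Most popular completions come first: 'series' (3 hits) before
# 'movies' (2) before 'episodes' (1).
```

The "repetition" principle from the article is simply the `Counter`: the more often a query was issued, the higher it ranks as a completion.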

The question is: what happens when search trends start to turn to false claims? Does the simple fact that many people are looking for something automatically make it true? That's where Google's filtering machine comes in.

Starting from the premise that its recommendations are not always perfect, Google is aware that some users may interpret autocomplete predictions as categorical statements of fact or opinion.

After all, it is the company itself that has long invited users to resolve their queries without leaving the search engine: tools such as the calculator, the translator and featured snippets were designed to answer the user's query without their having to go any further.

Given that responsibility, the Mountain View company uses two mechanisms to try to curb any search suggestion that goes against its autocomplete policies: artificial intelligence and manual review.

First, automated filters block autocomplete suggestions that violate any of the search engine's rules. The system automatically detects and discards terms and phrases that are violent, sexually explicit, hateful, derogatory or dangerous. This filter acts before a suggestion ever becomes part of the autocomplete list.
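A minimal sketch of that pre-publication step, assuming a simple term blocklist: a candidate suggestion is dropped before it ever reaches the autocomplete list if it contains a blocked phrase. Google's actual filter is policy-driven and machine-learned; the blocklist here is an illustrative stand-in.

```python
# Placeholder policy terms, purely for illustration; a real system would
# use trained classifiers, not a hand-written set of phrases.
BLOCKED_TERMS = {"is funny", "respectful of human rights"}

def allowed(suggestion: str) -> bool:
    """A suggestion passes only if it contains no blocked phrase."""
    return not any(term in suggestion for term in BLOCKED_TERMS)

def filter_candidates(candidates):
    """Drop disallowed candidates before they enter the autocomplete list."""
    return [s for s in candidates if allowed(s)]

print(filter_candidates([
    "kidnapping children is illegal",
    "kidnapping children is funny",
]))
```

The key design point the article makes is the ordering: this check runs before publication, so a blocked phrase never appears to users at all, rather than being cleaned up afterwards.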

If the first mechanism fails, human teams are responsible for removing suggestions that, once live, have been flagged as potentially harmful. For that, of course, a user first has to report the prediction in question.

In another article, published in early September, Google made a similar transparency exercise, detailing how it works to provide the best news coverage around the world while dealing with threats such as fake news.

Among those measures, Google announced that it would disable autocomplete for searches that might suggest false claims about a candidate or political party in the 2020 U.S. elections.

To date, searches such as 'you can vote by phone' or 'you can't vote by phone' already have autocomplete disabled in Google's English-language search engine, to prevent users from drawing wrong conclusions without completing their search. Voting methods are, today, one of the hot topics on the eve of the American elections.


More news:

Warning, if you are an Android user: one of Windows 10's least praised features is coming to your mobile

Microsoft's search engine, Bing, has won an auction organized by Google to determine which search engines will be offered as options when Android users select their default search provider, as TechRadar reports.

The auction stems from an antitrust ruling in Europe ordering Google to loosen its dominance of the search market.

Google's solution has been to auction off slots on Android's search-engine choice screen, marketing them to rivals country by country.

If you are an Android user in Spain, Bing and PrivacyWall are among the providers that, together with Google, will appear on the choice screen of new Android devices during the period from October 1 to December 31.

Among the countries affected by this new measure are the United Kingdom, Ireland, Spain, Italy and the Netherlands.

As a result of the same auction, Bing has secured positions on the choice screen in 31 countries. PrivacyWall, meanwhile, has won a place in another 22, and GMX in 16.

The search engine DuckDuckGo has been the biggest loser, winning only 8 slots in total.

Bing is not the most popular search engine today: according to Statcounter data from August, although the service is the second largest in the world, it holds a market share of only 3.06%.

Meanwhile, Google controls 93.24% of the search market, which justifies the desire of European regulators to intervene.

However, Microsoft has launched a campaign to improve Bing's position.

In this regard, the company has redoubled its efforts to promote its web browser, Edge, revitalized earlier this year with the release of a greatly improved version.

In addition, the latest Windows 10 update has forcibly installed Microsoft Edge on users' devices.
