Finally, Google is Looking for an Alternative to Ranking Sites Mostly on Backlinks
Google has traditionally evaluated the trustworthiness of a website based primarily on the incoming links (backlinks) pointing at it. The theory is that if many sites link to a page or a site, then that page or site must carry high authority to have attracted that many external citations. This seems logical, but the system can be gamed in a variety of ways by people who know how to manufacture false backlinks.
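To make the link-counting idea concrete, here is a minimal sketch of link-based authority scoring, in the style of a simplified PageRank iteration. The three-site link graph and the damping factor are invented for illustration; this is a toy model of the concept, not Google's actual algorithm.

```python
# Toy illustration of link-based authority: pages pass a share of their
# score to the pages they link to, so heavily linked pages score highest.
# The link graph below is made up for illustration.
links = {
    "a.com": ["b.com", "c.com"],
    "b.com": ["c.com"],
    "c.com": ["a.com"],
}

def pagerank(links, damping=0.85, iterations=50):
    """Simplified PageRank: iterate until scores settle."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start everyone equal
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)  # split score across outlinks
            for target in outlinks:
                new[target] += damping * share
        rank = new
    return rank

ranks = pagerank(links)
# c.com is linked to by both other sites, so it ends up with the top score,
# regardless of whether its content is actually any good.
```

That last comment is the whole problem in miniature: the score rewards incoming links, not content quality, which is exactly what link sellers and blog networks exploit.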
This has long been the bane of website owners. People want their sites to rank well on the strength of their content, but they are outranked both by those who game the system and by those so big and rich (news corporations and other mega sites that naturally attract lots of incoming links) that smaller sites can't ever hope to compete.
People use a variety of methods to gain rank dishonestly: buying backlinks, link-getting and link-building schemes, private blog networks, and the list goes on.
Google tried to combat this with its Webspam team, but that is a losing proposition: since you can't catch everyone, many who end up playing the game anyway (because Google leaves them no other choice) get hurt or made an example of, losing rank, being deindexed, or being sandboxed, while the majority run free.
Google does shuffle its search results, which gives other sites a shot at higher rank for varying amounts of time, but the whole thing is still based primarily on backlinks. Google is trying to work other signals into the algorithm, such as social signals, but backlinks remain the linchpin. All of these factors get rolled into the search algorithms that determine how websites are ranked for your queries, algorithms with funny names like Panda, Penguin, and Hummingbird.
Well, there's a new method that Google is embracing in a system that isn't live yet, being worked on by one of its research teams, and it's said to work pretty well. Only an organization with a database as huge as Google's could accomplish something this ambitious.
Google has been building a huge fact database using its bots (automated crawler programs). The bots check the validity of claims against other fact repositories and then catalog what Google considers reasonable truths in this huge database.
This is so Orwellian and funny and scary at the same time, but if you’ve gone through the hellish experience of SEO then you end up crazy like me and curious as to whether this will actually be a better system than what they have now.
Google is amassing these facts so it can use them to evaluate claims on web pages. This way, if a page has many links but is harboring false science or other falsehoods, those false claims may cost it search rank once the facts are checked.
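Loosely, the idea amounts to scoring a page by how well its extracted claims agree with the fact store. Here is a hypothetical sketch of that scoring step; the fact store, the claims, and the triple format are all invented for illustration and are not Google's actual method.

```python
# Hypothetical sketch: score a page by checking its extracted claims
# against a fact store mapping (subject, predicate) -> accepted object.
# The store and the claims below are invented for illustration.
fact_store = {
    ("obama", "born_in"): "honolulu",
    ("earth", "shape"): "oblate spheroid",
}

def trust_score(claims, facts):
    """Fraction of checkable claims that match the fact store.

    Claims whose (subject, predicate) pair is unknown to the store
    are ignored rather than penalized.
    """
    checkable = [(s, p, o) for (s, p, o) in claims if (s, p) in facts]
    if not checkable:
        return None  # nothing we can verify either way
    correct = sum(1 for (s, p, o) in checkable if facts[(s, p)] == o)
    return correct / len(checkable)

page_claims = [
    ("obama", "born_in", "honolulu"),  # agrees with the store
    ("earth", "shape", "flat"),        # contradicts the store
    ("mars", "color", "red"),          # unknown to the store: ignored
]
score = trust_score(page_claims, fact_store)  # 1 of 2 checkable claims correct
```

Even this toy version shows where the controversy lies: the score is only as good as the fact store itself, which is exactly the objection raised later in this post.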
I’m sure that backlinks will still play a role in the new system, but it won’t be based solely on them, which is a step in the right direction.
It's not right that web pages are ranked almost entirely on how many links they have and how much link juice they receive from external sites. Too many shady deals and other tricks are being employed, and this makes for a very crooked playing field.
Up until now, Google has pretty much taken the stance that incoming links are a reasonable enough barometer of quality, and that if you do anything to game that system you run the risk of punitive action against your sites. But that all breaks down because even a behemoth like Google can't police the internet thoroughly or dish out punishment equally. What you get is an uneven-handed approach, and people end up feeling trodden upon. Many decide to just be as sneaky as they can and take their chances.
Going this new route, Google would put the focus on real quality and factuality, and would also short-circuit many of the rumor-mongering and false-claim sites out there.
Of course, this new system will invite challenges to Google's knowledge base: how reliable is it really, and is it being swayed by governments and politics? Still, it does seem better than the backlink system it seeks to replace.
Just to play devil's advocate, I will say that there is some danger in relying on a database where facts you disagree with are used to rank sites; if your site disputes those "facts," then you won't rank. What if you are in a business, for example, that Google's database labels in a biased fashion, and your site doesn't rank because a search algorithm relying on that biased data deems it untrustworthy and thus unworthy of a high search rank? How can one corporation have all of this power to basically define Truth on the Internet?
Yes, there are dangers. No system is perfect.
I have to say, though, that I am very curious how this new system will reshuffle search results and if it will give higher quality sites and pages a better chance at ranking and also prospering. It would be great if the little guys who write factual and quality content could rank without relying on link scheming.
So what do you think about all this? Share this out and comment below.