For people outside the digital marketing industry, Artificial Intelligence (AI) and Search Engine Optimisation (SEO) are two esoteric terms of the modern age, both sounding geeky and high-tech. And even for many people working in the SEO industry, the two might seem unrelated. You might ask yourself: what does Artificial Intelligence have to do with SEO?
You may think that the AI we commonly see in movies, performing superhuman computing feats, has nothing to do with optimising a website's presence to improve search engine results. But you would be mistaken: it actually has everything to do with SEO.
Before going deeper into this topic, you should first know what Artificial Intelligence is.
An Overview: What is Artificial Intelligence?
A large part of the population has already become acquainted with Artificial Intelligence, thanks to Benedict Cumberbatch's fantastic portrayal of Alan Turing, the man behind modern computers, in the WWII film The Imitation Game.
The movie actually illustrates the basic definition of Artificial Intelligence (AI): a machine that can think or perform tasks like a human being. AI is the brain of the computer; it lets computers be as smart as a human, performing tasks that only human beings could do, or even outperforming us at some of them.
An exaggerated form of AI is what we see in movies, such as Tony Stark's friend, coworker, and computer JARVIS, which stands for Just A Rather Very Intelligent System.
But apart from AI's military applications and its presence in movies, did you know that you actually interact with and use AI in everyday life?
The application of AI is ubiquitous: the grammar checker on your computer is, in a way, an example of AI; so is Google Translate; and so are Apple's Siri and Microsoft's Cortana.
And have you ever wondered why the ads you see on your Facebook wall and on Amazon all seem tailored to your tastes? That is because an AI is learning what you like.
Marketers leverage this capacity of Artificial Intelligence, and this is where SEO comes in. Businesses and organisations that want to promote their offerings or brand on the internet employ this process so that the AI will pick their page for a relevant search query.
Whether you are in the field of SEO or not, it is good to know that the heart and soul of search queries on the internet is artificial intelligence. And the reason videos, images, and web pages go viral is not only that people are impulsive link sharers but also (perhaps even more) that there is an intelligent SEO guy doing his job effectively.
SEO and AI: How SEO Leverages Artificial Intelligence
The SEO process, however, varies from business to business, with the ultimate goal of making a web page rank in the search engine results. This means making a website visible and usable to the right people.
For this very reason, understanding AI is very important, as the first lesson of SEO is "How do search engines work?"
Top Page Signals for Ranking
The Google search engine employs different signals that help determine the rank of a web page; in short, these signals are ranking factors. Imagine Google's algorithms as gatekeepers: only pages that pass the signals are allowed into the top results.
SEO experts are always on the lookout for Google's signals. They take note of every possible ranking factor that Google employs in its algorithms. When Google announces an additional signal, or when experts in the field forecast one, tactics and strategies for a more optimised search engine presence change accordingly.
Some of the commonplace signals that every search engine employs are the following:
- Domain Authority. This is a grading system for websites that measures how trusted and authoritative a domain or site is. A higher DA means the site is trusted and an authority in its field. This 100-point scale combines various other factors such as domain age, quality of inbound links, popularity, etc.
- Page Relevance. A common-sense factor that Google employs in ranking a page: the web page's content must be relevant to the search query. Several factors affect this signal, including keyword density and the usefulness and originality of the content.
- Site loading speed. How the web page performs in terms of loading speed is also considered. Turtle-like site speed is a bad mark in Google's eyes.
- HTTPS. One of Google's more recent additions to its over 200 ranking signals is user security. Websites encrypted with the HTTPS communication protocol rank higher.
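To see how a gatekeeper like this might work, here is a minimal sketch of combining ranking signals into one score per page. Google's real formula and weights are secret and vary per query; the pages, signal values, and weights below are entirely made up for illustration.

```python
# Toy sketch: combine several ranking signals into a single score per page.
# All values and weights are hypothetical, not Google's actual numbers.

# Hypothetical signal values for each page, normalised to 0..1.
pages = {
    "site-a.com/shoes": {"domain_authority": 0.82, "relevance": 0.90, "speed": 0.40, "https": 1.0},
    "site-b.com/shoes": {"domain_authority": 0.55, "relevance": 0.95, "speed": 0.85, "https": 1.0},
    "site-c.com/shoes": {"domain_authority": 0.91, "relevance": 0.30, "speed": 0.70, "https": 0.0},
}

# Hypothetical weights for each signal.
weights = {"domain_authority": 0.35, "relevance": 0.40, "speed": 0.15, "https": 0.10}

def score(signals):
    """Weighted sum of a page's signal values."""
    return sum(weights[name] * value for name, value in signals.items())

# Rank pages from highest to lowest combined score.
ranking = sorted(pages, key=lambda url: score(pages[url]), reverse=True)
for rank, url in enumerate(ranking, start=1):
    print(rank, url, round(score(pages[url]), 3))
```

Note how site-c loses the top spot despite its high Domain Authority: weak relevance and no HTTPS drag its combined score down, which is exactly why SEO experts track every signal rather than optimising only one.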
How Google Worked in the Past
Google is only one of many search engines on the internet. YouTube, in its own right, is a search engine for videos, and so is Amazon, a search engine for products.
In Google's earlier years, search queries worked in exact-match mode. This means Google matched results by the exact words a user entered in the query. In those years, SEO copywriting was arduous, as it needed to match users' exact wording.
For example, to do intelligent SEO for a shoe company, you had to account for every keyword matching your business offering. So if someone searched for "affordable running shoes", you had to use the exact phrase [affordable running shoes] in the web page's content.
As we can surmise, during this period Google couldn't understand content. It read through the content but didn't grasp its meaning. In a way, it worked like a dictionary: a lexicon of keywords.
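The dictionary-like behaviour can be sketched in a few lines. This is not Google's actual code, just a toy corpus showing why exact matching misses relevant pages that use different words:

```python
# Toy sketch of exact-match retrieval, the way early keyword search worked:
# a page matches only if the literal query string appears in its text.

pages = {
    "shoestore.com": "Buy affordable running shoes online today.",
    "sneakerblog.com": "Cheap jogging trainers reviewed by experts.",
}

def exact_match(query, corpus):
    """Return pages whose text contains the query string verbatim."""
    return [url for url, text in corpus.items() if query.lower() in text.lower()]

print(exact_match("affordable running shoes", pages))
# → ['shoestore.com']
```

Only shoestore.com matches. sneakerblog.com covers the same topic ("cheap jogging trainers") but uses synonyms, so an exact-match engine never surfaces it; a semantic engine like Hummingbird, described below, would.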
Content farms and scraper content became the target of one of Google's algorithm updates: in 2011 it launched Panda. This algorithm looks into the quality of a page's content and ranks the page accordingly. It is in hot pursuit of poor content, also known as farm content, and of scraped content, meaning duplicates or scraps glued together, hence the name "scraper content".
In 2013, Google announced its new algorithm, Hummingbird. The Google team revamped the overall algorithm of their search engine. The algorithm is the system that analyses a user's query, looks into trillions of pages, and ranks and picks the top results.
As a result, the SEO process was also revamped. With this new algorithm, the Google search engine treats queries as if reading what the user really wants.
Google now analyses keywords semantically: it includes other keywords related to the search query, i.e. synonyms and contextual relevance. With Hummingbird, Google search became quicker, more intuitive, and more relevant.
Hummingbird paved the way for better content pages. There is no longer a need to be stringent about embedding exact keywords into an article; content has to be specific and relevant. In short, great content is the name of the game.
To get Google bots to crawl your pages and thus index your content, links from other pages have to be established. This is how crawling works: Google follows links from page to page.
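The link-following behaviour can be sketched as a simple graph traversal. The sites and link graph below are invented, and real crawlers fetch pages over HTTP with many more rules, but the core idea is the same:

```python
# Toy sketch of link-following crawling over a made-up in-memory link graph
# (a real crawler would fetch pages over HTTP and parse their links).
from collections import deque

links = {
    "a.com": ["b.com", "c.com"],
    "b.com": ["c.com"],
    "c.com": [],
    "orphan.com": [],  # no page links here, so the crawler never finds it
}

def crawl(seed):
    """Breadth-first crawl: index each page, then follow its outbound links."""
    seen, queue = set(), deque([seed])
    while queue:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)                      # "index" the page
        queue.extend(links.get(url, []))   # follow its outbound links
    return seen

print(sorted(crawl("a.com")))
# → ['a.com', 'b.com', 'c.com']
```

Notice that orphan.com is never reached: with no inbound links, it stays invisible to the crawler, which is exactly why establishing links from other pages matters for indexing.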
After the launch of the Penguin algorithm in 2012, link building evolved. Bought links were caught in its filters, spam links no longer worked, and pages with too many ads were penalised. From then on, websites that ranked while violating Google's quality guidelines dwindled.
The sole concern now is quality over quantity: quality meaning links that come from relevant, authoritative sites.
The Google RankBrain
The newest addition to the family of updates is RankBrain, launched in 2015. RankBrain does not replace the Hummingbird algorithm; instead, it is an addition to the algorithms that Hummingbird employs.
This algorithm is a learning technology that interprets and understands a user's query and learns the various ways users phrase the same request. It learns which mix of the core algorithm's signals is best suited to each type of search, and it can weigh any of the signals as the appropriate solution for the query.
Analysing the query got even better: RankBrain can effectively interpret long-tail queries and find the best pages for the searcher.
This also includes pattern analysis. As Google told Search Engine Land:
"It can see patterns between seemingly unconnected complex searches to understand how they’re actually similar to each other. This learning, in turn, allows it to better understand future complex searches and whether they’re related to particular topics."
This algorithm learns from search history to develop better results. Results are also specialised according to location: results for queries about weather, events, and even measurements are picked by Google based on the searcher's place.
How Is RankBrain Changing SEO and Businesses?
As Google's AI is highly iterative, it gets updated yearly. And with every big revamp it undergoes, major changes also happen in the world of Search Engine Optimisation. Thus the rule: intelligent SEO always has to keep pace with AI updates.
So now, after the latest makeover of Google's algorithm, how is the SEO industry affected? There are key points to look at to better understand the ripple effect.
Regression Analysis No More
Regression analysis is a tool for understanding the meaning of data and its relation to other data or variables. For a long time, this was a key tool for intelligent SEO, used to determine website performance, keyword ranking, etc. But with RankBrain's deep learning algorithm, regression analysis may no longer be as useful as it was. One reason is that the Google search engine now processes each query individually, employing a different mix of algorithms for each result.
TechCrunch said, "For these reasons, today’s regression analysis must be done by each specific search result."
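For readers unfamiliar with the technique, here is what a regression analysis of one ranking signal might look like. The backlink counts and rank positions below are fabricated for illustration, and real SEO datasets are far noisier:

```python
# Toy regression analysis: fit a straight line relating one signal
# (number of quality backlinks) to observed rank position.
# The data points are invented for illustration only.

backlinks = [5, 12, 20, 35, 50]   # x: backlink counts per page
positions = [9, 7, 6, 3, 1]       # y: rank position (1 = top of page one)

n = len(backlinks)
mean_x = sum(backlinks) / n
mean_y = sum(positions) / n

# Ordinary least-squares slope and intercept.
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(backlinks, positions))
         / sum((x - mean_x) ** 2 for x in backlinks))
intercept = mean_y - slope * mean_x

print(f"rank = {slope:.3f} * backlinks + {intercept:.2f}")
```

The negative slope says more backlinks associate with a better (lower-numbered) position across this sample. RankBrain's per-query mixing of signals is what undermines this approach: one line fitted across many different queries no longer describes how any single result is actually ranked, hence the per-search-result caveat in the TechCrunch quote above.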
More Intense Competition for the Top 3 Spots
With RankBrain and the previous updates Google made to its algorithm, search results are becoming increasingly relevant. Users now find what they are looking for by looking only at the top three spots. Thus, the organic-listing competition for those top three spots is getting more intense.
Unity with Neighbouring Sites
Websites should focus on their own topics to rank better; expanding their horizons might not be a good idea under the RankBrain algorithm. In practice this means mirroring the structures and other features of the authority sites in your neighbourhood or niche. Why? Because RankBrain is a learning algorithm: it can learn what a good site looks like by going over the top-quality sites in a particular niche, and any site that doesn't come close to that profile may be ranked lower. If a website wants to rank on top, it has to do well in its own niche.
Thus an intelligent SEO has to size up authority pages to improve the rank of the website they are optimising.
Alternatively, authority websites can affect and alter Google's algorithm by setting a benchmark.
RankBrain can index the backlinks to an authority site and may learn to take these backlinks as the standard for the rest of the web pages in that field. A quick example will suffice:
A catering company X is an authority site in its niche. It has backlinks from the following neighbourhood:
- Party supplies
- Events Management
- Wedding Blogs
If an intelligent SEO guy tries to optimise a local catering services website, he must account for the backlinks of company X. An addition to or subtraction from this list of backlinks might trigger a mismatch in the RankBrain algorithm. Worse still if the list of backlinks mirrors spammy websites.
Website Structure Mirroring
Similar to backlink mirroring, RankBrain can also determine what the structure of a good website is and what it is not. Thus, authority sites in the field of, say, sports will set the standard for the rest of the sites in that neighbourhood.
Now you are acquainted with the essential bits of information about AI and how it relates to optimising search engine results for businesses and organisations. With this, you can refine your SEO strategies by understanding how to work with and leverage Google's algorithms.