SEO Dictionary of the digital language: a complete SEO dictionary with all the key terms of the field, so you can resolve your digital doubts.
Within SEO there are many terms and concepts that we need to be clear about when it comes to optimizing and positioning our site or project in the best possible way.
The correct interpretation of a term is essential. For this reason, in this section we will keep adding new terms and concepts, all related to SEO, so that you are always up to date.
The Google Algorithm is the way the search engine ranks pages for a given search; that is, it is what decides whether you appear first, second, or on the second page.
This algorithm changes about 500 times a year and is difficult to keep track of. That is why it is preferable to know the important changes well, such as Panda and Penguin: how they affect SEO and how to recover from them.
Anchor text is the visible text of a link or hyperlink, and it provides information about the content to which we want to direct the user and the search engines.
Search engines have gotten better over time and use more and more factors to build their rankings. One of these metrics is the relevance of a link, which depends both on the authority of the page the link comes from and on its visible anchor text. Of course, the link must always be as natural as possible, or Google will treat it as a bad practice.
We can classify anchor text into the following types:
- Naked, or without anchor text. Only the URL itself is displayed. For example: www.ghostseo.org
- Generic. Uses words such as "this blog", "click here" or "this page".
- Keyword. Depending on which keyword we want to position, we use a different anchor text and choose the terms we want to highlight, for example "link building".
- Brand name. Made up of text other than the previous types, with the objective of linking to a brand, a website, etc. The link would read: "ghostseo".
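The four types above can be illustrated with minimal HTML anchors. This is just a sketch; the URL and link texts are placeholder examples, not real links:

```python
# Illustrative HTML anchors, one per anchor-text type described above.
# The URL and texts are placeholder examples.
anchor_examples = {
    "naked":   '<a href="https://www.ghostseo.org">www.ghostseo.org</a>',
    "generic": '<a href="https://www.ghostseo.org">click here</a>',
    "keyword": '<a href="https://www.ghostseo.org">link building</a>',
    "brand":   '<a href="https://www.ghostseo.org">ghostseo</a>',
}

for kind, html in anchor_examples.items():
    print(f"{kind}: {html}")
```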
Backlinks are the incoming links that point from other pages to your own. The number of backlinks your page receives is important because the more relevant the pages that link to you, the more notoriety your website gains in the eyes of Google. Make sure they are natural, appropriate links: always quality before quantity.
Black Hat SEO
In SEO, Black Hat refers to attempts to improve a web page's search engine positioning through unethical techniques, or techniques that contradict Google's guidelines; in other words, "cheating". These practices are increasingly penalized by Google. Some examples of Black Hat SEO covered in this dictionary are cloaking, keyword stuffing and spinning.
Keyword cannibalization occurs when several pages on a website compete for the same keywords, confusing the search engine, which cannot tell which page is the most relevant for that keyword, and causing a loss in positioning.
How do you fix this? The easiest way is to focus each page on one or two keywords at most. When that cannot be avoided, create a main product page from which the pages for the different formats are accessed, and include in each of those pages a canonical tag pointing to the main product page.
Cloaking is a widely used Black Hat SEO technique that consists of displaying different content depending on whether a user or a search engine robot is reading the page.
Google is very harsh with this practice, and although years ago it may have given results, forget it: it runs counter to what search engines are pursuing with their updates, a more natural, ethical and user-focused SEO.
Duplicate content occurs when the same content appears in multiple URLs. In principle it is not a reason for a penalty, unless a high percentage of your website consists of duplicate content. Having a few duplicate pages won't make Google mad at you, but avoiding it signals to Google that you're on the right track.
Although it does not imply a penalty, it can cause a loss of positioning potential, since search engines cannot tell which pages are most relevant for a given search.
The CTR (Click Through Rate) is the number of clicks that a link obtains with respect to its number of impressions. It is always calculated as a percentage, and it is a metric that is normally used to measure the impact that a digital campaign has had.
How to calculate the CTR?
As we said before, the CTR is calculated as a percentage. It is obtained by dividing the number of clicks a link has received by the number of times users have seen it (impressions), then multiplying by 100.
Let's see an example: imagine we have a result in Google that has been seen 2,000 times and has obtained 30 clicks. Our CTR would be calculated as follows:
- CTR= (Clicks / Impressions) x 100 = (30 / 2000) x 100 = 1.5%
- CTR = 1.5%
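The formula above can be wrapped in a small helper; this is just the arithmetic from the example, not any official tool:

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate as a percentage: (clicks / impressions) x 100."""
    if impressions == 0:
        return 0.0  # avoid dividing by zero when a link has no impressions
    return clicks / impressions * 100

# The example from the text: 30 clicks over 2,000 impressions.
print(ctr(30, 2000))  # → 1.5
```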
The keyword density is the percentage of times a word (or series of words) appears in a text relative to the total number of words.
A few years ago, keyword density was one of the most important factors in SEO positioning, since it was the method used by search engines (Google, Yahoo, Bing) to identify the main topic of a page.
However, SEO has changed, now Google guidelines recommend writing in the most natural way possible, that is, you have to write for the user instead of for the search engine.
Although some people still recommend not exceeding a 3% keyword density, there is no ideal percentage.
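As a rough sketch of how the metric is computed (single-word keywords only; multi-word phrases would need a sliding-window count instead), the sample sentence is invented for the example:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in `text` that match `keyword` (case-insensitive)."""
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words) * 100

sample = "SEO tips: good SEO content beats keyword stuffing"
print(keyword_density(sample, "seo"))  # → 25.0 (2 matches out of 8 words)
```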
The Canonical tag was introduced by Google, Yahoo! and Bing in 2009 to solve the problem of duplicate or similar content in SEO.
If the canonical tag is not present in the code of a set of pages with duplicate or similar content, search engines have to decide which URL best fits what the user is looking for. However, if we add this tag, we are the ones who tell Google and the other search engines which is our preferred page. This improves the indexing and positioning of our website in the SERPs.
Canonical tag example: <link rel="canonical" href="http://www.miweb.com/principal" />
Let's see an example: if our website is the platform from which we sell apartments in the Chueca neighborhood, in Madrid, and we have several pages with very similar content, we must choose the canonical URL for which we want to position ourselves. This may be the one that has brought us the most traffic or the one that brings the most benefit.
To use the canonical tag effectively in SEO, just follow these steps:
- Choose the canonical or main page.
- Decide which secondary pages could compete in positioning with the main one.
- Add the canonical tag on the secondary pages, pointing to the main page, between "<head>" and "</head>".
- Add a canonical tag on the main page pointing to itself, also between "<head>" and "</head>".
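The steps above can be sketched with a small helper that builds the tag; the URL is the hypothetical example used earlier in this entry:

```python
def canonical_tag(url: str) -> str:
    """Return the canonical <link> tag to place between <head> and </head>."""
    return f'<link rel="canonical" href="{url}" />'

# The main page and each of its duplicates all declare the same canonical
# URL (steps 3 and 4 above); the address is a made-up example.
main = "http://www.miweb.com/principal"
print(canonical_tag(main))
```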
The Meta Robots tag is an HTML tag used to tell search engines to treat a URL in a certain way.
This tag is necessary if we do not want our website to be indexed or positioned in search engines .
This function can also be done through the page's Robots.txt file.
The difference between using the Meta Robots tag and the Robots.txt file is as follows:
- Through the tag we indicate to Google that we do not want to index certain pages, but we do want the bots to crawl them.
- However, using the Robots.txt file tells bots not to bother entering and crawling certain pages at all.
This difference is important to keep in mind. You will understand it better with an example:
Imagine that you have two URLs that you don't want to appear in the Google index.
URL 1: blocked by the robots.txt file
This URL will not be crawled or indexed (a priori; never trust Google 100% :-P).
URL 2: blocked with the meta robots tag
By blocking this URL with the meta robots tag, it will not be indexed, but it can still be crawled by search engines, which means all its content is analyzed and the bots can follow its links to other pages.
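The two blocking mechanisms from the example look like this; the /url-1/ path and the directive values are illustrative:

```python
# URL 1: robots.txt blocks crawling entirely — the bot never fetches the page.
robots_txt = """\
User-agent: *
Disallow: /url-1/
"""

# URL 2: the meta robots tag lets the bot crawl the page and follow its
# links, but asks it not to index the content.
meta_robots = '<meta name="robots" content="noindex, follow">'

print(robots_txt)
print(meta_robots)
```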
Google Panda is a change in the Google algorithm that was released in the United States in February 2011 and in Europe in April of the same year. At launch, it affected more than 12% of all search results.
The maxim with Panda, to avoid being penalized, is to make sure your content is completely original and adds real value for your users, and to keep your pages up to date or even look for new formats to enrich what you offer. The most actionable metrics in this case are the bounce rate, the CTR of your search results, time on page and the number of page views.
Penguin is the official name for the Google algorithm update designed to fight webspam. This update was released in April 2012.
It focuses on the off-site factors of a website, rewarding those sites that have a link profile with links from high-quality and unmanipulated domains, and trying to punish those pages that have violated Google's guidelines, that have profiles of unnatural links, too many links on low quality sites, etc.
It was Google who, from the beginning, decided that links pointing to a website were a sign that its content was relevant. As a result, everyone began generating links non-stop. Google Penguin, however, was an about-face.
The improvements the algorithm introduces include better detection of low-value links, purchased links, links in article networks and directories, and basically any tactic that tries to manipulate your website's link profile. The best way to make sure Penguin doesn't penalize you is to follow Google's guidelines and earn links passively through your content.
How does SEO change with Google Penguin?
- Natural links, that is, links generated passively or through real value. Article syndication, spinning, hidden links, directories (free or paid), promotions in exchange for links, etc. are prohibited.
- Variety of anchor text: it no longer makes sense to generate links whose anchor text is the keyword you want to position. If Google detects a pattern it does not consider natural, it can penalize you.
- Search in your niche: the most valuable links come from domains and pages in your niche, or that talk about related topics.
- Quality, not quantity: it is preferable to generate a few quality links than many of little value.
Keyword (or keywords) refers to the terms through which we want to attract traffic to our website via search engines. You must take into account some factors associated with keywords (abbreviated KW), such as competition, search volume, conversion, and even their potential as a branding tool.
Keyword Stuffing is a Black Hat SEO technique that consists of the excessive use of keywords within a text, with the misguided objective of giving those words more relevance. Google very often penalizes this type of over-optimization.
To avoid any type of negative action by Google, the texts should always be written to provide value to the user, and in the way that best suits your audience profile. If the text manages to give useful, original and well synthesized information, that will be a better indicator for Google than any variation in the number of keywords in the text.
There is no percentage that defines a perfect keyword density and Google recommends naturalness above all.
Technique of attracting links organically by creating high-value content. One of the essential factors for search engine optimization is the number of links to a given page.
Link Baiting is intended for a large number of users to link to content on our site. To do this, we must create original , relevant and innovative content, such as articles, videos or infographics that attract the attention of users.
Link Building is one of the fundamentals of web positioning or SEO, which seeks to increase the authority of a page as much as possible by generating links to it.
The algorithms of most search engines, such as Google or Bing, are based on on-site and off-site SEO factors, the latter reflecting the relevance of a website, whose main indicator is the links pointing to it, or backlinks. There are other factors as well, such as the anchor text of the link, whether the link is follow or nofollow, brand mentions, and links generated on social networks.
It is important to keep in mind that good content is often linked naturally, so link acquisition happens organically and with less effort than through other methods.
Link Juice is the authority a page transmits through a link. Google positions web pages based on their authority and relevance; this is transferred from one page to another through links, and this transmitted authority is what we call link juice.
To understand it, picture a web page as a large glass of juice in which we punch several holes (links) at the base. A glass with a single hole will transmit all its link juice through that one hole. If it has 10, each hole will pass 10% of the total link juice, and so on.
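The "glass of juice" analogy reduces to a simple division. This sketch assumes an equal split across links with no damping or other factors, which is a simplification of how search engines actually weight links:

```python
def juice_per_link(page_authority: float, outbound_links: int) -> float:
    """Authority passed through each link under the simple equal-split
    model above (no damping factor, every hole the same size)."""
    if outbound_links == 0:
        return 0.0  # a glass with no holes passes nothing
    return page_authority / outbound_links

# A page with authority 100 and 10 outbound links passes 10 per link.
print(juice_per_link(100, 10))  # → 10.0
```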
Suppose your website attracts traffic through 100,000 keywords, and you focus on the 100 with the most visitors. Let's imagine that around 20% of total traffic (depending on the nature of your website) corresponds to these terms, while the remaining 80% corresponds to terms with a very small number of searches each. So the vast majority of the traffic your website attracts comes through terms you are not analyzing and do not even know about.
This is what we call the long tail: searches with more specific terms that individually generate very little traffic, but together are the largest source of visits to the web. The term applies beyond online marketing; it was popularized by Chris Anderson in an article in Wired, giving examples of companies that have succeeded thanks to the business generated by their long tails, such as Amazon, Netflix or Apple.
Meta tags are pieces of information included in web pages that are not directly visible to the user. They provide information to browsers and search engines to help them better interpret the page, and they are written in HTML within the web document itself.
Meta tags have been important for SEO due to their ability to affect search engine behavior: indicating which pages of a website should be positioned, providing a description of the site, or blocking search engine robots from accessing or indexing it.
Microformats are a simple form of code that gives semantic meaning to content so that machines can read it and understand our products or services.
If we add Microformats to our website, Google can read it and show it in search results. This information may include user votes, author's photo and name, video, audio, etc.
"Not provided" is a term used in Google Analytics that identifies all "secure" traffic within Google; in other words, all traffic that comes from users who are logged into their Google account.
What happens to this data? What can you do with it?
Off-site SEO is the part of SEO work that focuses on factors external to the web page we work on that affect our site, including external links, social signals, mentions and other metrics that reinforce the page's authority.
One of the most important tasks of off-site SEO is link building , generating links that point to your page on external websites, with which Google will give it greater relevance.
SEO On-site or SEO On-page is a set of internal factors that influence the positioning of a web page. They are those aspects that we can change ourselves on our page such as:
- The meta information, such as the title or meta description
- The URL
- The content
- The "alt" attribute on images
- The web structure
- The internal linking
- The HTML code
Optimizing SEO On-site is an essential process that every website must take care of if it wants to appear in search results.
PageRank is the way Google measures the importance of a website: the search engine rates the value of websites on a scale of 0 to 10.
When a page links to another website, it transmits value, and this value depends on the PageRank of the page containing the link.
At present, Google has stopped publicly updating PageRank, and nobody can see a website's score in the search engine anymore.
However, although Google continues to use it internally to establish its search results, it carries less and less weight within the algorithm as a whole.
PageRank is determined by factors such as the number of links and domains pointing to the website, their quality, the age of the domain, etc.
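As an illustration only (Google's real algorithm is far more complex and private), a textbook-style PageRank iteration on a tiny hypothetical three-page graph looks like this:

```python
def pagerank(links: dict, iterations: int = 50, d: float = 0.85) -> dict:
    """Simplified PageRank: each page's score is redistributed equally
    over its outgoing links, with damping factor d (textbook model)."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1 / n for p in pages}  # start with a uniform distribution
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Sum the share of rank flowing in from every page linking to p.
            inbound = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - d) / n + d * inbound
        rank = new
    return rank

# Hypothetical graph: A links to B and C, B links to C, C links back to A.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
print({p: round(r, 3) for p, r in ranks.items()})
```

C ends up with the highest score because it receives links from both A and B, matching the intuition that more (and better-linked) inbound links mean more authority.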
The English term "query" means question or inquiry. In the context of databases, a query (or query string) is a request for data stored in that database, although generically it can refer to any interaction. In the context of search engines, a query is the term we type into Google, a request that then leads to a SERP.
Search Engine Ranking
Search Engine Ranking is the position your website occupies on a search results page; that is, the position in which you appear in Google, Yahoo, Bing... when a user performs a search.
To improve our positioning we must use strategies and tools that help us optimize our website, increasing accessibility, usability and content.
Schema Markup is a specific vocabulary of tags (or microdata) that you can add to the HTML code of your website to provide more relevant information. This way, search engines understand your content better and deliver better results. It also enhances the way your page is displayed, with rich snippets that appear below the page title.
Schema.org is the reference website for this type of strategy, where you will find all kinds of hierarchies and ways to organize your content. But wait a minute, what can I structure? Hundreds of things! There is currently a wide variety of tags to structure, and surely more will come with time: places, events, movies, books, recipes, people, etc. Also, to make it much easier, Google created the "Markup Helper". Very useful.
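As an illustration, here is a minimal JSON-LD fragment using the schema.org Recipe type; the recipe name and rating figures are made up for the example:

```python
import json

# A made-up recipe marked up with schema.org vocabulary. The resulting
# string would go inside a <script type="application/ld+json"> tag.
recipe = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Tortilla de patatas",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.7",
        "ratingCount": "213",
    },
}

print(json.dumps(recipe, indent=2))
```

The `aggregateRating` block is what allows search engines to display the star-rating rich snippet mentioned above.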
SERP (Search Engine Results Page) refers to the results page of a search engine such as Google or Bing.
It is the page that appears after performing a search, where the results are displayed in order.
The more a website is optimized according to the quality criteria of search engines, the more likely it is to position itself better in the SERPs.
A sitemap or website map is an XML document that is submitted to search engines. This document gives search engines a complete list of the pages that make up a website, so they can index pages their robots could not otherwise reach because there are no direct links, the content is behind a form, etc.
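A minimal sitemap can be generated in a few lines; this is a sketch following the sitemaps.org format, and the two URLs below are hypothetical:

```python
from xml.etree import ElementTree as ET

# Build a minimal sitemap for two hypothetical URLs.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
for url in ["https://www.miweb.com/", "https://www.miweb.com/contacto"]:
    loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
    loc.text = url

sitemap_xml = ET.tostring(urlset, encoding="unicode")
print(sitemap_xml)
```

The resulting document would typically be saved as sitemap.xml at the site root and submitted through the search engines' webmaster tools.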
Spinning is a Black Hat SEO technique that refers to the creation of an article by reusing different original texts.
In this way, content generation is accelerated in a simple way. It can be carried out with software that automates the modification of the content, or manually, making the texts appear different through synonyms or changes in word order.
Although this technique has been widely used, doing it automatically is one of Google's penalty factors. Since its now famous Penguin, Google detects these practices more frequently.
White Hat SEO
White Hat SEO are those ethically correct techniques that meet the guidelines set by search engines to position a website.
Its goal is to make a page more relevant to search engines, and there are some characteristics you must take into account to achieve good White Hat SEO.
White Hat SEO is the most beneficial way to optimize the positioning of a website in the medium-long term.