HOW TO DO ON-PAGE SEO ANALYSIS
On Page SEO Optimization
When a new SEO project lands in our hands, we feel an irrepressible urge to start doing things: as soon as we spot something that could be slightly better, we want to fix it. But this is not a good idea. It is best to calm down, take a deep breath, and start with good planning based on a thorough study of the website and the competition.
Before starting any analysis or study, the first thing to do is browse the site to get as deep an understanding as possible of what we are facing.
On-site analysis can be grouped into four areas:
- Website Accessibility
- Website Indexability
- Have you been penalized?
- On-site positioning factors
If Google or users cannot access a page, it might as well not exist. So the first thing to check is whether our website is visible to search engines.
- ROBOTS.TXT FILE: The robots.txt file is used to prevent search engines from accessing and indexing certain parts of your website. Although it is very useful, there are situations where we can block access to our own site without realizing it. In this extreme case, the robots.txt file blocks access to the entire site:

User-agent: *
Disallow: /

What we need to do is manually verify that the robots.txt file (usually found at www.example.com/robots.txt) is not blocking any important part of our website. We can also check this through Google Webmaster Tools.
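This check can be sketched with Python's standard-library robots.txt parser. The domain and the list of "important paths" below are placeholders; substitute your own site and the sections you care about:

```python
from urllib.robotparser import RobotFileParser

def check_robots(robots_lines, important_paths, site="https://www.example.com"):
    """Return the important paths that this robots.txt blocks for all crawlers."""
    parser = RobotFileParser()
    parser.parse(robots_lines)
    return [p for p in important_paths if not parser.can_fetch("*", site + p)]

# The worst case from the article: a robots.txt that blocks the whole site.
blocked = check_robots(
    ["User-agent: *", "Disallow: /"],
    ["/", "/products/", "/blog/"],
)
print(blocked)  # every important path comes back as blocked
```

If the returned list is non-empty for pages you want indexed, your robots.txt needs fixing.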
- Meta Robots Tag: Meta tags tell search engine robots whether they can index a page and follow the links it contains. When analyzing a page, we must check whether any meta tag is mistakenly blocking robots. Here is how these tags look in the HTML code:
<meta name="robots" content="noindex, nofollow">
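Finding these tags across many pages is easy to automate. A minimal sketch with Python's built-in HTML parser (the sample page is made up):

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the content of every <meta name="robots"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", ""))

def robots_directives(html):
    finder = RobotsMetaFinder()
    finder.feed(html)
    return finder.directives

page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
print(robots_directives(page))  # ['noindex, nofollow']
```

Any page where the result contains "noindex" is invisible to Google, so make sure that is intentional.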
- HTTP Status Codes: If a URL returns an error (404, 502, etc.), neither users nor search engines will be able to access that page. During this step, all URLs that return errors must be identified so they can be fixed later; we recommend Google Webmaster Tools for this. We will also use this step to find any redirects on the site. If a redirect is a temporary 302, we will turn it into a permanent 301; if it is a permanent 301 but it points to a page that has nothing to do with the original, we will redirect it to where it should go.
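Once you have a list of URLs and their status codes (from your crawler or a Webmaster Tools export), sorting them into the buckets above is mechanical. A sketch with invented sample data:

```python
def triage(crawl_results):
    """Sort (url, status) pairs into the three buckets the article describes.

    crawl_results is hypothetical input -- plug in your real crawl export.
    """
    errors    = [u for u, s in crawl_results if s >= 400]
    temporary = [u for u, s in crawl_results if s == 302]  # turn these into 301s
    permanent = [u for u, s in crawl_results if s == 301]  # check each target makes sense
    return errors, temporary, permanent

results = [("/old-page", 301), ("/promo", 302), ("/missing", 404), ("/", 200)]
errors, temp, perm = triage(results)
print(errors, temp, perm)
```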
- SITEMAP: The sitemap is literally a map for Google, with which we make sure Google finds all the important pages of our website. There are several important points to keep in mind: if the sitemap does not follow the protocol, Google will not process it properly. Is the sitemap uploaded to Google Webmaster Tools? Are there pages on the site that do not appear in the sitemap? If so, the sitemap must be updated. If you find pages in the sitemap that Google has not indexed, it means Google could not reach them: make sure at least one link on the site points to each of them and that we are not blocking access by mistake.
- WEB ARCHITECTURE: Make an outline of the whole site where you can easily see the levels from the home page down to the deepest page, and how many clicks are needed to reach each one. Also check that every page has at least one internal link pointing to it; this can easily be verified with Open Site Explorer by filtering links to internal only. Depth matters because Google has limited time to crawl a site: the fewer levels it has to jump through to reach the deepest pages, the better.
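Click depth is just a breadth-first search from the home page over the internal link graph. The graph below is a made-up miniature; build the real one from your crawl:

```python
from collections import deque

def click_depths(links, home="/"):
    """BFS from the home page: the depth of a page = clicks needed to reach it."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

links = {
    "/": ["/category/", "/about/"],
    "/category/": ["/category/product-1/"],
}
print(click_depths(links))
# any page absent from the result has no internal path from home at all
```

Pages missing from the result are orphans: no chain of internal links reaches them, which is exactly what this step is meant to catch.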
- Website Speed: It is well known that one of the factors that increases bounce rate is page loading speed, but keep in mind that the Google robot also has limited time when crawling our site: the faster each page loads, the more pages the robot will reach. You can use several tools to measure loading speed; Google Page Speed offers a wealth of data about what slows down the site, as well as tips on how to improve it.
We have already identified which pages the search engines can access; now we have to find out how many of those pages they have actually indexed.
- SITE SEARCH COMMAND: Google offers the possibility of searching with the "site:" command. This command restricts a search to a specific website, which gives us a very close approximation of the number of pages Google has indexed. We can compare that number with the real number of pages on the site, which we already know from the sitemap and from browsing the site earlier. Three things can happen:
- The two numbers are very similar: great!
- The number in the Google search is lower, which means Google is not indexing many of the pages.
- The number in the Google search is higher, which almost always means the site has duplicate content.
In the second case we will review all the accessibility points in case we missed something. In the third case we will check that our site does not have duplicate content, which I will explain later.
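The comparison itself is simple enough to express as a tiny helper. The 10% tolerance is an illustrative threshold, not an official figure:

```python
def indexation_verdict(indexed_count, real_count, tolerance=0.1):
    """Compare Google's `site:` count with the real page count."""
    if abs(indexed_count - real_count) <= tolerance * real_count:
        return "ok"                      # both numbers are very similar
    if indexed_count < real_count:
        return "under-indexed"           # recheck every accessibility point
    return "possible duplicate content"  # review the duplicate content section

print(indexation_verdict(95, 100))   # ok
print(indexation_verdict(40, 100))   # under-indexed
print(indexation_verdict(250, 100))  # possible duplicate content
```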
- RELEVANT PAGES: To know whether the best-positioned pages are also the most important ones, we just search again with the "site:" command. It would be logical for the home page to appear first, followed by the most important pages of the site; if that is not the case, we must investigate.
- BRAND SEARCH: After checking that the important pages are well positioned, we have to find out whether the site ranks well for a search on its own brand name. If the site appears in the top positions, everything is fine; if it does not appear anywhere, we have a problem: Google may have penalized it, and it is time to find out.
HAS GOOGLE PENALIZED YOUR WEBSITE?
If you have stopped at this point, you may have a problem with Google. It is relatively easy to tell whether you have been completely penalized by Google, but ideally we want to detect even the slightest penalty Google may have given us.
Keep in mind that traffic losses can also be due to changes in Google's algorithm, so follow these two steps to make sure it really is a penalty:
- MAKE SURE WHAT IS HAPPENING: The first thing to do is search Google for our brand. If we do not appear anywhere, a penalty is possible, but we may also be blocking Google's access to the site; if so, follow the steps explained above again. Once we have verified that everything is fine on our side, we will search Google for the site's URL (without the "site:" command) and check that all the important pages appear on the first page. Then we will search for the URLs of each type of page; for an e-commerce site these would be category, sub-category, product, manufacturer, and so on.
- IDENTIFY THE CAUSE: Once you are sure that some part of the site is being penalized, you have to get down to work and find the reason. This requires a thorough analysis of every internal and external factor: the same anchor text in a multitude of internal or external links, keyword stuffing in page content, inbound links of questionable quality... the reasons can be many; you just have to be patient. The last step would be to fix it and send a reconsideration request to Google through Google Webmaster Tools, but that will be discussed later in the Blog section.
ON-SITE POSITIONING FACTORS
We have already analyzed the accessibility and indexability of the site; now it is the turn of the factors that directly affect ranking in Google. This step is very interesting because it is where we will find the most opportunities for improvement.
URLS: To analyze the URLs of the pages we must ask ourselves 5 questions:
- Is the URL too long? It is advisable not to exceed 115 characters.
- Does the URL have relevant keywords? It is important that the URL describes the content of the page, since it is Google's gateway to it.
- Are there too many parameters in the URL? It is best to use static URLs; if that is not possible, as can happen in e-commerce, avoid excessive parameters and register them in Google Webmaster Tools (Configuration > URL parameters).
- Are underscores being used instead of hyphens? Google recommends using hyphens to separate the words in a URL.
- Does the URL redirect correctly? By this I mean, for example, whether it redirects from "non-www" to "www", or from ".html" to "non-.html". This avoids duplicate content issues.
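The first few questions on this checklist can be answered automatically for a whole list of URLs. A sketch using the standard library; the parameter threshold of two is a judgment call, not a Google rule:

```python
from urllib.parse import urlparse, parse_qs

def url_issues(url):
    """Flag the checklist problems above for a single URL."""
    issues = []
    parsed = urlparse(url)
    if len(url) > 115:
        issues.append("longer than 115 characters")
    if "_" in parsed.path:
        issues.append("uses underscores instead of hyphens")
    if len(parse_qs(parsed.query)) > 2:  # "too many" is an illustrative threshold
        issues.append("too many parameters")
    return issues

print(url_issues("https://www.example.com/red_shoes?id=1&ref=2&sort=3"))
print(url_issues("https://www.example.com/red-shoes"))  # a clean URL: no issues
```

Run it over your crawl export and review every URL that comes back with a non-empty list.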
THE CONTENT: Google has made it pretty clear that content is king, so let's give it a good throne.
To analyze the content of our website we have a few tools at our disposal, but in the end the most useful one is the page cached by Google: viewing the text-only version shows us what content Google is actually reading and in what order it is arranged.
When analyzing the content of the pages, we ask ourselves several questions to guide the process:
- Does the page have enough content? There is no standard measure of "enough", but it should be at least 300 words.
- Is the content relevant? It has to be useful to the reader; simply asking ourselves whether we would read it gives us the answer.
- Does the content have important keywords? Do these keywords appear in the first few paragraphs?
- Does the content have keyword stuffing? If the page has too many keywords, Google will not be pleased.
- Is the content well written? That is, are there spelling, syntax, or punctuation errors?
- Is the content easy to read? If reading it is not tedious, it is fine.
- Is the content well laid out? Focus on what the perfect layout would look like.
- Are there multiple pages targeting the same keywords? If so, we are falling into what is called "keyword cannibalization"; what we have to do is add a rel="canonical" tag pointing to the most important page.
The best way to check all this, besides being patient, is to create a spreadsheet recording, for each page, the keywords it targets and all the errors we find in it; when the time comes to review and fix them, it will be much easier.
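A couple of the checks above (word count, keyword presence, stuffing) are easy to script. The 5% density threshold is purely illustrative; there is no official number:

```python
def content_report(text, keyword):
    """Run the mechanical parts of the content checklist on one page's text."""
    words = text.lower().split()
    count = len(words)
    hits = sum(1 for w in words if keyword.lower() in w)
    density = hits / count if count else 0.0
    return {
        "enough_content": count >= 300,       # the article's 300-word minimum
        "keyword_present": hits > 0,
        "keyword_stuffing": density > 0.05,   # 5% is an illustrative threshold
    }

report = content_report("blue widgets are great. buy blue widgets today.", "widgets")
print(report)
```

The subjective questions (relevance, readability, layout) still need a human, of course; this only covers the countable ones.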
DUPLICATE CONTENT: Having duplicate content means that the same content is served at several URLs; the duplication may be internal or external.
To detect duplicate content, we can use the notes on possible cases that we made while browsing the site, then copy a paragraph of the text and search Google for it in quotation marks; this will return all the pages, both internal and external, that contain it. Tools like Copyscape are also useful.
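For internal duplicates, a classic cheap technique is comparing pages by overlapping word n-grams ("shingles"). This is a generic sketch, not what Copyscape does internally; the sample texts are invented:

```python
def shingles(text, size=5):
    """Overlapping word n-grams -- a cheap fingerprint of a text."""
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(a, b, size=5):
    """Jaccard overlap of shingle sets: values near 1.0 mean duplicate content."""
    sa, sb = shingles(a, size), shingles(b, size)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page_a = "our red widget is the best red widget on the market today"
page_b = "our red widget is the best red widget on the market today period"
print(round(similarity(page_a, page_b), 2))  # close to 1.0: near-duplicates
```

Run every pair of suspect pages through this and investigate anything with a high score.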
HTML TAGS: Do not underestimate the page's code when doing the analysis; it contains a good portion of the most important ranking factors.
Before getting into specific points of the HTML code, we must make sure it meets the standards; the W3C offers an HTML validator that will show us any failure in the code.
TITLE: The page title is perhaps the most important element. It is the first thing that appears in Google's results, and it is what is shown when people share the link on social networks (except Twitter).
- To analyze the title we will, of course, follow a series of steps:
- The title should not exceed 70 characters; otherwise it will be cut off.
- The title must be descriptive of the content of the page.
- The title must contain important keywords, ideally at the beginning, as long as it still reads naturally.
- We should never stuff keywords into the title; this will make users suspicious and make Google think we are trying to deceive it.
Another thing to keep in mind is where to put the brand, i.e. the name of the site. It is usually placed at the end to give more weight to the important keywords, separated from them by a hyphen or a vertical bar. We can easily review all the page titles with crawling tools.
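The title checklist above can be sketched as a small function. The sample title, keyword, and brand are placeholders:

```python
def title_issues(title, keyword, brand):
    """Apply the title checklist: length, keyword presence, brand at the end."""
    issues = []
    if len(title) > 70:
        issues.append("longer than 70 characters -- Google will cut it off")
    if keyword.lower() not in title.lower():
        issues.append("missing the page's main keyword")
    if not title.lower().rstrip().endswith(brand.lower()):
        issues.append("brand is not at the end")
    return issues

print(title_issues("Handmade Leather Wallets - ExampleShop",
                   "leather wallets", "ExampleShop"))
# []  (this title passes every check)
```

Whether the keyword placement "reads naturally" still needs human judgment; the function only catches the mechanical failures.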
META DESCRIPTIONS: Although not a ranking factor, the meta description significantly affects the click-through rate in the search results.
For the meta description we will follow the same principles as for the title, except that its length should not exceed 155 characters. For both titles and descriptions we must avoid duplicates, which we can check in Google Webmaster Tools (Optimization > HTML improvements).
OTHER HTML ELEMENTS:
- Meta keywords: simply delete them if they exist; Google does not use them at all.
- rel="canonical" tag: if any pages use this tag, we must make sure it points to the correct page.
- rel="prev" and rel="next" tags: the same applies; make sure they point to the appropriate pages.
- H1, H2, H3... tags: we must check whether all pages use these tags and whether they include keywords relevant to the content.
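Checking the heading structure page by page is tedious by hand; extracting all the headings is easy with the standard library's HTML parser (the sample page is made up):

```python
from html.parser import HTMLParser

class HeadingCollector(HTMLParser):
    """Records every h1-h6 tag and its text, in document order."""
    def __init__(self):
        super().__init__()
        self.headings, self._current = [], None

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self._current = tag

    def handle_data(self, data):
        if self._current:
            self.headings.append((self._current, data.strip()))

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

def headings(html):
    collector = HeadingCollector()
    collector.feed(html)
    return collector.headings

page = "<h1>Leather Wallets</h1><p>intro</p><h2>Why leather?</h2>"
print(headings(page))  # [('h1', 'Leather Wallets'), ('h2', 'Why leather?')]
```

From the output you can spot pages with no H1, multiple H1s, or headings without relevant keywords.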
IMAGES: When analyzing the images we must take into account 3 factors:
- The weight of the image: by this we mean its size in KB; check whether images are compressed, properly resized, etc. Google Page Speed will provide this information.
- The "alt" attribute: it has to be descriptive of the image and of the content surrounding it.
- The image title: the same as the previous point.
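Finding images with a missing or empty alt attribute across a page is another check worth scripting. A sketch with the built-in HTML parser and an invented sample page:

```python
from html.parser import HTMLParser

class AltAuditor(HTMLParser):
    """Collects the src of every <img> whose alt attribute is missing or empty."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "?"))

def images_without_alt(html):
    auditor = AltAuditor()
    auditor.feed(html)
    return auditor.missing

page = '<img src="wallet.jpg" alt="brown leather wallet"><img src="logo.png">'
print(images_without_alt(page))  # ['logo.png']
```

Whether the alt text is actually *descriptive* still has to be judged by a person; this only finds the ones that are absent.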
RELATED LINKS: It is very important to analyze the number of links, both external and internal, and where they point. For this (yes, another list, I know) we will list the most important factors:
- The number of links per page should not exceed 100 and all pages should have a similar number of links
- External links should point to sites that are important and relevant to web content
- The anchor text of the links should have important keywords, but we should not use the same keywords for all links
- There should be no broken links or links pointing to pages with 404 errors.
- There may be links pointing to pages with redirects; if so, the URL should be changed to the final landing page.
- There should be a good ratio between follow and nofollow links; not all should be follow, nor all nofollow.
You should also make sure that important pages receive links from other important pages and that every page has at least one link pointing to it.
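The per-page counting parts of this checklist can be automated once you have each page's links with their follow/nofollow status. The input format and sample data here are hypothetical; collect the real list with your crawler:

```python
def link_report(links):
    """Audit one page's links, given (href, is_nofollow) pairs."""
    total = len(links)
    nofollow = sum(1 for _, nf in links if nf)
    return {
        "too_many_links": total > 100,  # the 100-link guideline above
        "follow": total - nofollow,
        "nofollow": nofollow,
        "all_follow": total > 0 and nofollow == 0,      # a suspicious extreme
        "all_nofollow": total > 0 and nofollow == total,  # the other extreme
    }

page_links = [("/about", False), ("/terms", True), ("https://partner.example/", False)]
print(link_report(page_links))
```

Broken links and redirect chains still need the HTTP status checks from the accessibility section; this report only covers the counts and the follow/nofollow balance.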
Following all these steps, we can carry out an on-site SEO analysis of any website: it may mean hours in front of the screen, but it will leave the site flawless. We really want to stress this point, because no matter how much link building you do, if those links point to pages that make us cry, they will be worth nothing.