Website Audit Reports
Here are our Website Audit Reports packages, designed to help your business succeed. You may download this page as a PDF, with or without the extended descriptions at the bottom of this document. You may also view or download our primer on the marketing benefits of a website audit for your business.
Manual Keyword Mining
Before launching a marketing campaign we need to know which keywords to target. A “keyword” is the word or phrase typed into a search engine to return search results. Using software plus several man-hours of filtering through Google data, we produce a full keyword report. The report shows not only keyword traffic, but also a competition analysis and where the website currently ranks in Google, Yahoo, and Bing for each keyword. The keywords chosen determine the direction of the entire marketing campaign, which is why we place such importance on keyword research.
Keyword Ranking Report
Keyword rankings, or where your site ranks in search engines for its keywords, have a major impact on your web traffic, lead generation, and conversions. Research shows more than 75% of all search engine users click on a result on the first page, so the higher you rank in the search engine results pages, the better your chances of gaining more traffic.
Using specialized software, we generate and tabulate the rankings of the website's target keywords. The results serve as the reference point for each keyword's initial ranking before the marketing campaign commences.
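As a simplified illustration of what "tabulating" a baseline looks like (this is not our actual ranking software, and the result lists below are made up; in practice they come from a rank-tracking tool, since automated scraping of search results is restricted by the engines' terms of service):

from urllib.parse import urlparse

def rank_position(result_urls, your_domain):
    """Return the 1-based position of your domain in a result list, or None."""
    for position, url in enumerate(result_urls, start=1):
        if urlparse(url).netloc.endswith(your_domain):
            return position
    return None

# Hypothetical SERP data: keyword -> ordered list of ranking URLs.
serp = {
    "pest control": ["https://competitor.com/", "https://example.com/services/"],
    "termite inspection": ["https://other.com/", "https://another.com/"],
}
for keyword, urls in serp.items():
    print(keyword, "->", rank_position(urls, "example.com"))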
The keywords chosen for the SEO campaign will be analyzed and grouped together by relevancy, with each keyword classified as primary or secondary. The most competitive keyword in each group becomes the primary keyword; all others are secondary keywords.
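A minimal sketch of this grouping logic, assuming each keyword already carries a competition score from the keyword report (the keywords and scores below are made up for illustration):

# Each relevancy group gets one primary keyword (the most competitive);
# the rest become secondary keywords.
groups = {
    "pest control": {"pest control": 87, "pest control services": 62,
                     "local pest control": 45},
}
for group_name, keywords in groups.items():
    primary = max(keywords, key=keywords.get)
    secondary = [kw for kw in keywords if kw != primary]
    print(group_name, "| primary:", primary, "| secondary:", secondary)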
We check each keyword to establish its baseline ranking status and determine the ranking URLs. Manually checking your keyword rankings lets us confirm exactly where your website currently ranks, further gauge each keyword's competition level, and help determine the target URL for each keyword.
We analyze the competitors for each keyword you want to target. Looking at the competition gives us a better understanding of each keyword's competition level.
Websites are built on many different platforms: pure HTML, ASP, PHP, or a content management system such as WordPress. Not all of them are SEO or user friendly, so it is important to identify the platform first and make sure nothing will hinder or limit the adjustments required during optimization. In rare instances the website is built on a platform that does not allow us to perform SEO, in which case we would recommend redesigning the website before moving forward.
If a website is infected with malware, its search engine rankings can suffer. It is important to make sure the website has no malware installed, because malware can also infect visitors to your website.
When we analyze the website we also look at its design components. This is not an in-depth analysis but a light overview; it is typically clear when a website requires a "face-lift" to reduce bounce rates. The website should be attractive, clearly display its purpose and the nature of the products or services, and be easy to navigate.
Site performance will be checked in terms of page load time. Page load time is an important ranking factor, which is why this check is necessary: if a page loads too slowly, visitors may leave your website, and Google has stated that slow load times can negatively impact rankings as part of its ranking algorithm. When we find a website with a slow page load speed we take the necessary steps to investigate; often the hosting provider is the cause.
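A rough first-pass check can be done with a simple timed request, as in the sketch below. This only measures server response plus HTML download, not full page rendering; tools such as Google's PageSpeed Insights give a fuller picture. The URL is a placeholder.

import time
import urllib.request

def load_time(url):
    """Time a full HTML download as a rough proxy for page speed."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=30) as response:
        response.read()
    return time.monotonic() - start

print(f"{load_time('https://example.com/'):.2f} seconds")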
The main navigation is checked to confirm it is crawlable. Preferably, the navigation should be built in plain HTML text rather than images or JavaScript, because search engines cannot reliably read text embedded in images or generated by scripts. Since the main navigation labels serve as the primary titles for whole categories or themes, it is important that search engines can read the text within the navigation.
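One quick way to verify this is to parse the raw HTML and confirm that each navigation link carries real anchor text. A minimal sketch using only the Python standard library (the URL is a placeholder); links that never print any text are likely image- or script-only:

from html.parser import HTMLParser
import urllib.request

class LinkTextChecker(HTMLParser):
    """Collect each link's href and its visible anchor text."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
    def handle_data(self, data):
        if self._href and data.strip():
            self.links.append((self._href, data.strip()))
    def handle_endtag(self, tag):
        if tag == "a":
            self._href = None

html = urllib.request.urlopen("https://example.com/").read().decode("utf-8", "ignore")
checker = LinkTextChecker()
checker.feed(html)
for href, text in checker.links:
    print(href, "->", text)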
The main article content is the most important part of a page, and it must be crawlable by search engines. Bots can be blocked from crawling the content when the site is built in Flash or the content is hidden behind JavaScript.
The hosting is checked so we can confirm the website is safe from potential malware attacks. We can also determine whether a group of sites is hosted on the same server with the same IP address, which is not preferable for SEO, especially when the sites share the same niche.
We will check whether Google Webmaster Tools and analytics are installed on the website. If no analytics or webmaster service is installed, we will go ahead and install Google Analytics and Google Webmaster Tools, since our analysis of the website requires access to both.
There are two types of sitemaps: the user-facing sitemap found on the website, which helps guide visitors, and the XML sitemap, which is our primary concern. An XML sitemap is generated for search engines to make sure their crawler bots can reach all of your pages; if a search engine cannot find a page, that page can never be indexed. XML sitemaps are extremely important, so we will verify that yours is installed properly, or note that this must be accomplished during on-page optimization.
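A sketch of how the sitemap check can be automated, assuming the sitemap lives at the conventional /sitemap.xml location (the site URL is a placeholder):

import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(site):
    """Fetch /sitemap.xml and return every <loc> entry it lists."""
    with urllib.request.urlopen(site.rstrip("/") + "/sitemap.xml") as response:
        root = ET.fromstring(response.read())
    return [loc.text for loc in root.iter(SITEMAP_NS + "loc")]

urls = sitemap_urls("https://example.com")
print(len(urls), "URLs listed in the sitemap")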
Organizing the URLs also helps organize the entire website. A proper URL structure lets users determine what page they are on just by looking at the URL. Most websites lack a proper URL structure and simply put every page as a direct extension off the homepage.
Let's look at an example of a pesticide company that has 5 different services.
Wrong URL structure:
homepage.com
homepage.com/service1
homepage.com/service2
homepage.com/service3
homepage.com/service4
homepage.com/service5
Correct URL structure:
homepage.com
homepage.com/services/service1
homepage.com/services/service2
homepage.com/services/service3
homepage.com/services/service4
homepage.com/services/service5
The search engines can only interpret a website as well as it is built. Having properly structured URLs is essential to ensure maximum rankability of your website.
Broken links are links that lead to pages that do not exist. When clicking on a broken link, the page you land on is called a 404 error page, a standard HTTP response that indicates that the requested URL doesn’t exist.
What do you do when you are happily surfing the web and suddenly come across a 404 error? For most of us, the immediate response is to leave the current site in favor of another, because both people and search engines consider broken links unprofessional.
404 errors and broken links also have negative effects on your search engine rankings so it is quite reasonable to be proactive in avoiding them to improve exposure and increase site traffic. We will be looking for all the broken links on your website and reporting back to you what will be required to fix all of them.
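A minimal single-page broken-link check using only the standard library (the URL is a placeholder; a real audit crawls the whole site, follows redirects, and rate-limits its requests):

import urllib.error
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class HrefCollector(HTMLParser):
    """Gather every link href found on a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []
    def handle_starttag(self, tag, attrs):
        if tag == "a" and dict(attrs).get("href"):
            self.hrefs.append(dict(attrs)["href"])

page = "https://example.com/"
collector = HrefCollector()
collector.feed(urllib.request.urlopen(page).read().decode("utf-8", "ignore"))
for href in collector.hrefs:
    url = urljoin(page, href)
    if not url.startswith("http"):
        continue  # skip mailto:, tel:, and similar links
    try:
        urllib.request.urlopen(urllib.request.Request(url, method="HEAD"), timeout=10)
    except urllib.error.HTTPError as err:
        if err.code == 404:
            print("Broken link:", url)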
The 404 error page is what you see when you visit a page that does not exist. We will check that the page returns the correct error code, 404, and whether you have a custom 404 error page so visitors do not leave your website when they run into one. A customized 404 page is essential to improving the user experience on the website.
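Checking the status code matters because a misconfigured server can return 200 OK for a missing page (a "soft 404"), which search engines may then index. A quick check requests a path that should not exist (the gibberish path below is arbitrary, and the site URL is a placeholder):

import urllib.error
import urllib.request

def check_404(site):
    """Request a path that cannot exist and report the status code."""
    url = site.rstrip("/") + "/this-page-should-not-exist-xyz123"
    try:
        urllib.request.urlopen(url, timeout=10)
        print("Soft 404: server returned 200 for a missing page")
    except urllib.error.HTTPError as err:
        print("Server correctly returned", err.code)

check_404("https://example.com")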
The robots.txt file is a text file uploaded to the website (and referenced in webmaster tools) that instructs bots on what to crawl or not crawl, and which pages to include in or exclude from the index. It is helpful for blocking irrelevant pages or directories that are not part of the site's main informative pages. We will check whether one already exists on the site and give the proper recommendations based on our findings. A misconfigured robots.txt can de-index the entire website from the search engines, so this is a very important step in our auditing process.
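Python's standard library can read a robots.txt directly, which makes it easy to confirm that important pages are not accidentally blocked. A minimal sketch; the site URL and paths are examples:

from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://example.com/robots.txt")
parser.read()
# Confirm that Google's crawler is allowed to fetch key pages.
for path in ("/", "/services/", "/contact/"):
    allowed = parser.can_fetch("Googlebot", "https://example.com" + path)
    print(path, "->", "crawlable" if allowed else "BLOCKED")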
The meta robots tag is another form of robots.txt; the only difference is how the instructions are placed on the website. It is part of a page's meta tags and tells crawlers whether to follow or nofollow links and whether to index or noindex the page. It works like robots.txt, but as a page-specific instruction rather than a site-wide one. We will check whether it already exists on the site and make sure it is configured properly.
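A per-page check can be as simple as scanning the page's meta tags; a sketch using the standard HTML parser (the URL is a placeholder):

from html.parser import HTMLParser
import urllib.request

class MetaRobotsFinder(HTMLParser):
    """Report the content of any meta robots tag on the page."""
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            print("meta robots:", attrs.get("content"))

html = urllib.request.urlopen("https://example.com/").read().decode("utf-8", "ignore")
MetaRobotsFinder().feed(html)  # a "noindex" here keeps the page out of the index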
We will check how many pages are already indexed in Google versus the total number of pages. This also helps determine which version of the URL (www or non-www) is better to optimize. If your website has 500 actual pages but Google reports only 100 indexed, we need to start our detective work. It is important for Google to index 80%+ of your pages; if it is indexing less than that, we need to investigate why it is not indexing more of them.
These checks cover the components of each page's meta tag group, primarily the title tag and meta description. Each page should have a unique set of meta tags, since each should represent that page's contents. We will check for duplicates and short meta descriptions and provide proper recommendations for improvement. Good meta tags help increase the click-through rate of your pages on the SERP and increase their likelihood of ranking higher in the search engines.
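Once titles and descriptions have been collected by a crawl, the duplicate and too-short checks reduce to simple bookkeeping. A sketch with hypothetical crawl data; the 50-character floor is an illustrative threshold, not a fixed rule:

from collections import Counter

# Hypothetical output of a site crawl: URL -> meta description.
descriptions = {
    "/": "Family-run pest control serving the metro area since 1995.",
    "/services/termites": "Pest control services.",
    "/services/rodents": "Pest control services.",
}
counts = Counter(descriptions.values())
for url, desc in descriptions.items():
    if counts[desc] > 1:
        print(url, "-> duplicate description")
    elif len(desc) < 50:
        print(url, "-> description too short")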
This check counts the website's existing backlinks so that we know the baseline and can determine how many backlinks are built during the active SEO campaign. The count will be documented to show the difference and improvement.
Note: For websites with existing backlinks, it is possible that Google will de-index or devalue some of those backlinks. We have no control over this, since we did not build them; please be aware that it can cause ranking fluctuations.
We will check your website for duplicate content and provide a duplicate content report if any exists. If you hired someone to write your website, are you 100% sure the content is unique? Many copywriters simply rewrite or blatantly copy content from other websites, and it is extremely important to have original, unique content. If we detect significant duplicate content, we will require that content to be rewritten to ensure maximum rankability of the website.
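Internal duplicate checks can be approximated by comparing overlapping word sequences ("shingles") between pages. A simplified sketch with made-up page text; real duplicate detection, including finding copies elsewhere on the web, uses more robust tooling:

def shingles(text, size=5):
    """Split text into overlapping word sequences of a fixed size."""
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(a, b):
    """Jaccard similarity between the shingle sets of two texts."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

page_a = "We offer termite inspection and treatment for homes and businesses."
page_b = "We offer termite inspection and treatment for commercial properties."
print(f"{similarity(page_a, page_b):.0%} overlap")  # pages well above ~30% warrant review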
This is where we take the agreed-upon target keywords and group them into sets of 1-3 keywords, which are then assigned to target pages during the URL mapping stage. Keywords are grouped based on, but not limited to, similarity, related terms, and geo-targets.
If an SEO campaign has 30 keywords, they will not all go on the same page. We usually target one keyword per page, with the homepage as the main exception. URL mapping is the process of determining which page should target which keywords.
Factors such as theme relevance and page rankings come into play when URL mapping is performed. We prioritize target pages that can convert and/or catch a user's attention, engaging them and encouraging them to interact and browse through the site. The homepage, the most highly valued page of all, is always targeted. If no page with a matching theme exists for certain keywords, a new page with fresh content will have to be created.
The target URLs are determined during keyword URL mapping. These are simply the pages we primarily target with our on-page and off-page optimizations. When we track keyword rankings, these are the pages you should see ranking for the keywords targeted on each page.
This is a set of instructions and recommendations generated from the URL structure analysis. It contains a step-by-step process for properly architecting your URLs, identifies the pages that need to be fixed, and includes instructions on how to optimize the final target pages identified during the analysis.
When we analyze your website during the URL mapping process, we determine the target page for each keyword and then analyze that page's content to see whether it has enough words (400 words minimum) and whether it is relevant to the keyword. Often no page exists yet for a particular keyword, in which case we recommend creating a new page. Most websites require a significant amount of additional content, which can be written for you or provided by you. We will supply a report showing which pages need content and which keywords that content should be written around.
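The word-count portion of this check is straightforward once the visible page text has been extracted; a sketch against the 400-word minimum, with hypothetical page data:

MINIMUM_WORDS = 400

def needs_more_content(page_text):
    """Flag pages whose visible copy falls under the 400-word minimum."""
    return len(page_text.split()) < MINIMUM_WORDS

# Hypothetical extracted page copy mapped to its target keyword.
pages = {"/services/termites": ("termite inspection", "Short stub text here.")}
for url, (keyword, text) in pages.items():
    if needs_more_content(text):
        print(url, "needs more content for keyword:", keyword)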
When we finish running all of our checks on your website, we will produce an easy-to-interpret summary report and provide it along with the rest of the audit in a zip file. The error reports, duplicate meta report, and duplicate website content will be in separate documents inside the zip file. We will do our best to make the summary informative and to the point; we understand that you do not want to sort through all the data we send over, so the summary will call out all the important aspects to look at. It will also include further recommendations, which could be anything from suggesting you add a blog to advising that you redesign your website before starting a marketing campaign.
After performing the website audit, if we have further concerns about the keywords you want to target we will contact you and go over our concerns in a consulting session.