What Is A Manual Site Review?
A manual site review is a human review of a website, performed in addition to automated site review tools. It consists of manually visiting the pages of a site to catch issues that automated site audit tools may not uncover.
Automated site audit and review tools are great and necessary in today’s ever-changing organic search environment. However, there are issues they may simply not uncover. Recently, a new client came to us for search engine optimization campaign management. Site audits were run. Keyword research was performed.
The results of the automated site audits delivered the usual suspects:
- Missing meta descriptions
- Duplicate meta descriptions
- Meta descriptions too short or too long
- Missing page titles
- Duplicate page titles
- Page titles too short or too long
- Missing h1 tags
- Matching page titles and h1 tags
- More than one h1 tag on page
- Missing alt attributes
- Pages with low word counts
All basic items to be expected. The last item, however, helped explain the results of our keyword research.
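Most of the checklist above can be reproduced with a few lines of scripting. As a rough sketch (the length thresholds below are illustrative assumptions, not official limits), here is how a single page's title, meta description, and h1 tags could be audited with Python's standard library:

```python
from html.parser import HTMLParser

# Illustrative thresholds only; real audit tools use their own ranges.
TITLE_RANGE = (30, 60)
META_DESC_RANGE = (70, 160)

class AuditParser(HTMLParser):
    """Collects the title, meta description, and h1 count of one page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.h1_count = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit(html):
    """Return a list of basic on-page issues found in the given HTML."""
    p = AuditParser()
    p.feed(html)
    issues = []
    if not p.title:
        issues.append("missing page title")
    elif not TITLE_RANGE[0] <= len(p.title) <= TITLE_RANGE[1]:
        issues.append("page title too short or too long")
    if not p.meta_description:
        issues.append("missing meta description")
    elif not META_DESC_RANGE[0] <= len(p.meta_description) <= META_DESC_RANGE[1]:
        issues.append("meta description too short or too long")
    if p.h1_count == 0:
        issues.append("missing h1 tag")
    elif p.h1_count > 1:
        issues.append("more than one h1 tag on page")
    return issues
```

Running `audit()` over every crawled page and comparing titles and descriptions across pages would also surface the duplicate-title and duplicate-description items in the list above.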
The keyword research indicated that this client was a product and part supplier in a particular industry. It said nothing about what they really do, which is design, develop, and manufacture products and parts for that industry, catering to wholesalers, distributors, installers, contractors, and retailers; not to the general public, and not as a single-part supplier.
Pages With Low Word Counts
The keyword results, combined with the pages with low word count results, prompted a manual review of the on-site content before preparing a strategic plan to move forward.
The manual review revealed that only the parts and products pages had content on them, and that content consisted solely of part descriptions and specifications. As a result, the keyword research leaned heavily toward products and parts only.
Further review, clicking on each and every menu item and every link throughout the site, found that all of the client’s content had been converted to .pdfs and hosted on a content delivery network. This was not just images and videos, but all of their on-page, written-word content. The only content that was ranking (poorly, by the way) was on the product pages, the only pages on the site with real on-page content. The keyword tool could not make the hop over to the .pdf content on the content delivery network to give us the keyword data we needed to deliver the client’s message online.
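A manual check like this can be partly scripted. The sketch below (the CDN hostname and site hostname are hypothetical examples) flags links on a page that point to .pdf files hosted on a different domain, the exact pattern described above:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkParser(HTMLParser):
    """Collects all href values from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs.append(href)

def offsite_pdf_links(html, site_host):
    """Return links whose targets are .pdf files hosted off-site (e.g. a CDN)."""
    p = LinkParser()
    p.feed(html)
    flagged = []
    for href in p.hrefs:
        parsed = urlparse(href)
        if (parsed.path.lower().endswith(".pdf")
                and parsed.netloc
                and parsed.netloc != site_host):
            flagged.append(href)
    return flagged
```

Run against every page of a crawl, a report like this would have revealed immediately that nearly all of the written content lived off-site as .pdfs.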
Multiple Versions Of The Website
Another item revealed during the manual review was that multiple versions of the site were running live on the internet. The website resolved in a browser as a non-secure, http version, which is problematic in its own right, and Google was crawling and indexing only that http version.
We found this multiple-version issue after we created new Google Analytics and Google Search Console accounts for the site. When setting up the Google Search Console account, we discovered that in addition to the non-secure, http version of the site, there were also a secure, https version; a www version; and a non-www version, all of which would resolve in a browser.
All of these are problematic, but not noticeable to any of the automated tools. Only a human eye caught this.
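Checking for this by hand means typing each variant into a browser. A small script can enumerate the variants to test; here is a minimal sketch (the domain is a placeholder, and a follow-up fetch of each URL, e.g. with `urllib.request`, would show whether they redirect to one canonical version or all answer independently):

```python
def site_variants(domain):
    """Return the four protocol/host variants of a bare domain
    (e.g. 'domainname.com', an assumed example) that can resolve
    independently if redirects are not configured."""
    host = domain.removeprefix("www.")
    return [
        f"http://{host}/",
        f"http://www.{host}/",
        f"https://{host}/",
        f"https://www.{host}/",
    ]
```

If more than one of these returns a 200 response instead of redirecting, the site has a multiple-versions problem like the one described above.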
Multiple live versions split crawl and link signals across duplicates of the same content, which can dilute the site’s standing in search results.
We also found that when the site was developed, the administration system had been set to allow a crawl of only the non-secure, http version of the site, even though there was clearly an option to crawl the https version only. That https option was marked as preferred by the administration system, but someone had deliberately selected ‘only crawl the http version’.
We resolved the multiple-versions issue by simply selecting the preferred option, ‘crawl only the secure, https version of the site’. A redirect was put in place to point the www and non-www versions of the site to the secure https://www.domainname.com version.
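The redirect logic just described can be expressed as a small function. This is a sketch of the mapping, not the actual server configuration, and `domainname.com` is the placeholder domain from above: any variant is sent to the single canonical https://www version, preserving path and query string.

```python
from urllib.parse import urlsplit, urlunsplit

# Placeholder canonical host, matching the example domain used in the text.
CANONICAL_HOST = "www.domainname.com"

def canonical_url(url):
    """Map any http/https, www/non-www variant of the site to the
    canonical secure URL, keeping path, query, and fragment intact."""
    parts = urlsplit(url)
    host = parts.netloc.removeprefix("www.")
    if host != CANONICAL_HOST.removeprefix("www."):
        return url  # a different site entirely; leave untouched
    return urlunsplit(("https", CANONICAL_HOST, parts.path, parts.query, parts.fragment))
```

In practice the same mapping is implemented as 301 redirects at the web server, so that both visitors and search engine bots are consolidated onto one version.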
Problem solved and moving on to the next manual review discovery.
Missing Policy Pages And Other Important Content
Punching through the rest of the pages on the site, we also found issues with the website’s privacy policy, terms and conditions, and other important content. All of these pages are necessary if you want to get anywhere in the search results pages, especially in Google’s SERPs.
The first issue was with the terms and conditions. The only link to them was on a page reached from a single top menu item, not in the footer or top menu where a visitor could view the terms and conditions from any page on the site. A visitor would have to find that top menu link by chance in order to reach the terms and conditions page. Further, the link pointed (yes, you guessed it) to a .pdf located on the content delivery network, not on the client’s website.
More general information can be found at SearchEngineLand here: https://searchengineland.com/how-many-google-privacy-policies-are-you-violating-50182
There were also two other important pages that were not on the site but linked instead to content delivery network .pdfs: the website’s limited warranty page and its MAP (minimum advertised price) pricing policy page. These two pages could add content to the site and provide valuable information for its visitors. Both links were found on the same single page as the terms and conditions link. In short, if a visitor did not land on that one page, they would never find this very important content, and search engine bots could not crawl it either.
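A simple crawl-time check can catch this class of problem: every page should link to the site’s policy pages. Here is a hedged sketch (the required paths are assumptions chosen for illustration) that reports which expected policy links are missing from a page:

```python
from html.parser import HTMLParser

# Assumed policy paths for illustration; a real site would use its own URLs.
REQUIRED_PATHS = {
    "/privacy-policy",
    "/terms-and-conditions",
    "/limited-warranty",
    "/map-pricing-policy",
}

class HrefParser(HTMLParser):
    """Collects the set of href values from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs.add(href)

def missing_policy_links(html):
    """Return the required policy paths this page does not link to."""
    p = HrefParser()
    p.feed(html)
    return sorted(REQUIRED_PATHS - p.hrefs)
```

Applied to every page of a crawl, any non-empty result would flag pages (or an entire site) where policy content is unreachable, exactly the situation found during this manual review.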
Automated tools are great, but a manual inspection can uncover critical issues they miss. No automated tool is perfect, and many deliver false positives, clouding the view of what the critical issues really are. The results of an automated tool still require a human being to review them and confirm whether each reported issue is, in fact, a problem or only a false positive.
Contact Us For The Best Search Engine Optimization Advice at SEO Company, SEOCompany.com