This post covers many, but not all, of the web development issues that we have experienced over our years in business. Each of these can become a hazard that undermines search engine optimization success.
- Blocks In The robots.txt File
- Ownership Of The Core Files/Full Server Access
- No Metadata Access
- Forms Not Working
- Not Mobile Friendly
- Broken Internal Links
- XML Sitemaps
Blocks In The robots.txt File
Too many times we have discovered pages blocked in a website’s robots.txt file that should not be blocked at all, but rather allowed to be properly crawled and indexed by the search engines. In many cases, a block is put in place in the robots.txt file to keep Google and other search engine crawlers away from a new website that is under development. When the website is launched, we have seen that the crawler block was never removed from the robots.txt file, leaving the site with no ranking results no matter how many search engine optimization solutions have been implemented. One of the first elements I always check, whether it is a new website or a new search engine optimization campaign for an existing site, is the robots.txt file. Believe it or not, this happens quite often.
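If you want a quick way to verify this yourself, a minimal sketch is shown below using Python’s standard urllib.robotparser: it loads the live robots.txt and asks whether Googlebot may fetch a page that should rank. The domain and page path are placeholders, so swap in your own.

```python
from urllib.robotparser import RobotFileParser

# Placeholder domain and page -- substitute your own live site and a page that should rank.
ROBOTS_URL = "https://www.example.com/robots.txt"
PAGE_URL = "https://www.example.com/services/"

rp = RobotFileParser(ROBOTS_URL)
rp.read()  # fetches and parses the live robots.txt

# False here means a development-era block is still in place for Googlebot.
print("Googlebot allowed:", rp.can_fetch("Googlebot", PAGE_URL))
```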
Ownership Of The Core Files/Full Server Access
All too often we see this issue when taking on a new client for a search engine optimization campaign. You have had a site created, but when you need to make changes you have to contact the original web developer to make them. Most websites require multiple rounds of search engine optimization implementations and/or revisions, and someone has to be able to make those changes. If you do not have access to your core files and have to rely on your original developer, you can face implementation delays and/or additional costs. You need to have access to your core files and content management system.
As a recent example, a client whose multiple online properties we have managed since 2013 was acquired by another entity within the last year. One of the core missions was to create a new corporate identity that brought all of its core industry software solutions and equipment solution brands together under one identity. One of the tasks to complete this mission was to create a new website that spoke to all of the corporate solutions for their specific industries. The development was contracted to a web development provider that the parent company already had a relationship with.
The problem with this arrangement is that none of the web development provider’s existing websites for the client’s other online entities had any search engine optimization elements applied, which is why those other online assets rank poorly. Our task for this project was to provide upfront search engine optimization recommendations in two different languages. The client has full access to the content management system, so our search engine optimization recommendations could be put in place. However, our site quality audits revealed issues that required programming changes to the core files.
During this period, the original web developer was, for lack of a better word, fired, and a new web development firm was hired to correct other programming issues that were found outside of search engine optimization. Keep in mind that this site has been released to the web but remains blocked in the robots.txt file to keep Google and other search engines from crawling and indexing it until all issues are worked out. We run a new site quality audit at least once per week to catch any new search engine optimization issues. We have been performing this task for over three months as of the time of writing this post.
Within the last week, a new error showing over 2,000 temporary redirects appeared in our site quality audit report. What was discovered, and made known to us just two days ago by the web developer contact employed by our client, is that the newly hired web development firm had resolved the issue on their development web server, but to push those changes live they had to rely on the original web development company.
We are now on hold on this project, as the temporary redirect errors keep us from unblocking the search engines from crawling the live site. This is just another example of the importance of owning your core file access. And yes, believe it or not, this issue still happens today, as the example above shows.
No Metadata Access
How many times have we taken on a new client, or had an existing client build a new website, only to find that the content management system does not allow revisions to the page <title> or meta description? No matter what, you will not succeed with your search engine optimization campaign if you have no access to revise the page <title>, the meta description, or both.
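As a quick illustration, here is a minimal sketch using only Python’s standard library that pulls a page and reports whether a <title> and a meta description are present at all. The URL is a placeholder; point it at one of your own pages.

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class MetaAudit(HTMLParser):
    """Collects the page <title> and the meta description, if any."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Placeholder URL -- replace with a page from your own site.
page = urlopen("https://www.example.com/").read().decode("utf-8", errors="ignore")
audit = MetaAudit()
audit.feed(page)
print("Title:", audit.title.strip() or "MISSING")
print("Meta description:", audit.description or "MISSING")
```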
Believe it or not, this happens more than you would think. If you don’t want to be a website with missing metadata, you should consider using our white label services for your online marketing needs.
Forms Not Working
This one happens quite often as well. A site is built and contact/conversion forms are created, but they are never tested for functionality. Your client engages in a search engine optimization campaign and does not get any leads. You may see the conversion pages being recorded in the site’s Google Analytics account, but the client never receives the submitted form information.
Many times they blame the search engine optimization company when, in truth, this is a web development issue. There is also the reverse situation, where the Google Analytics tracking code was not properly implemented: you report to your client that you are not seeing any conversions while they report that they are getting leads. The same applies to eCommerce shopping cart functionality. Again, this is another in the long line of web development setup issues. Believe it or not, it happens all of the time.
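One low-effort safeguard is to script a test submission against each form after launch and confirm both that the request succeeds and that the notification email or CRM entry actually arrives. The sketch below is only illustrative: the endpoint and field names are hypothetical and need to match how your own form posts its data.

```python
from urllib.error import HTTPError
from urllib.parse import urlencode
from urllib.request import Request, urlopen

# Hypothetical form endpoint and field names -- inspect your own form's HTML for the real ones.
FORM_URL = "https://www.example.com/contact"
payload = urlencode({
    "name": "Test Lead",
    "email": "test@example.com",
    "message": "Form functionality test - please disregard.",
}).encode("utf-8")

try:
    with urlopen(Request(FORM_URL, data=payload, method="POST"), timeout=10) as resp:
        print("Form endpoint responded with:", resp.status)
except HTTPError as err:
    # A 4xx/5xx response usually means the form handler is broken or missing.
    print("Form endpoint returned an error:", err.code)
```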
Not Mobile Friendly
In this day and age of Google’s mobile-first indexing, your site must be mobile-friendly. This can set back any search engine optimization campaign if the site was not built with mobile-friendliness in mind from the start. If you have any concerns, you can test your individual pages here.
Broken Internal Links
Many times during a rebuild, URLs change and/or pages are removed. These require 301 (permanently moved) redirects. Creating proper 301 redirects is a resource-intensive process, and quite often we find that this step gets overlooked.
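One way to catch overlooked redirects is to run the old URL list from the pre-rebuild sitemap through a script and confirm that each one answers with a 301 pointing at its new home rather than a 302 or a 404. Here is a minimal sketch using Python’s standard library; the retired URL is a placeholder.

```python
import http.client
from urllib.parse import urlparse

def redirect_status(url):
    """Return the raw status code and Location header without following the redirect."""
    parsed = urlparse(url)
    conn = http.client.HTTPSConnection(parsed.netloc, timeout=10)
    conn.request("HEAD", parsed.path or "/")
    resp = conn.getresponse()
    status, location = resp.status, resp.getheader("Location")
    conn.close()
    return status, location

# Placeholder retired URL -- feed in every URL that changed or was removed during the rebuild.
status, target = redirect_status("https://www.example.com/old-page/")
print(status, target)  # 301 plus the new URL is the goal; a 302 or 404 needs attention
```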
XML Sitemaps
The step of creating a correctly formatted XML sitemap is also often overlooked. The XML sitemap provides a list of your website’s URLs and is used by search engines to crawl your website’s pages more efficiently. Properly done, your XML sitemap should be submitted to your Google Search Console account for processing. Most current content management systems generate an XML sitemap programmatically; however, you must still manually submit your website’s XML sitemap to Google Search Console. This issue happens more than you would think.
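For sites whose content management system does not generate one, the sketch below builds a minimal, correctly namespaced sitemap.xml with Python’s standard library. The URL list is a placeholder and would normally come from your CMS or a crawl of the live site.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

# Placeholder URLs -- in practice, pull these from your CMS or a crawl of the live site.
urls = [
    "https://www.example.com/",
    "https://www.example.com/services/",
    "https://www.example.com/contact/",
]

ET.register_namespace("", SITEMAP_NS)  # keeps the output free of ns0: prefixes
urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
for url in urls:
    url_el = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
    ET.SubElement(url_el, f"{{{SITEMAP_NS}}}loc").text = url

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```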
Before building a website, these are all things to take into consideration, but something else you should consider is your software. Read here to see if you have the best software for web development.
Summary
This is not a complete list of web development issues that can cause your site to fail in its mission to improve ranking results, lead generation, and sales. There are more, but these are the most common ones we have experienced over the course of our business’s lifetime. Make sure that you cover these issues with your web developer when considering a new site, or even when rebuilding an existing one.
Want to avoid falling victim to common web development issues? Consider using our white label web design services. Let us know what you thought of the article and whether it helped you avoid these web development issues.