Last Updated on August 31, 2021
Those who have been in the search engine optimization industry for some time know that no two campaigns are the same.
Many times, we run into technical challenges that have a negative impact on search engine optimization results. Often, correcting them is out of our control.
Technical challenges go far beyond standard search engine optimization operating procedures and require web development involvement.
Below are several of the challenges that we find when managing a search engine optimization campaign.
- No web developer
- Non-search engine friendly content management systems
- No access to the client’s content management system
- No access to the client’s website file server
What to Do If There Is No Web Developer Available for the Search Engine Optimization Campaign?
In this case, there are only two options. Option "A" is that you gain access to both the website file server and the content management administration system and make the revisions yourself. Option "B" is that the client hires a web developer who can perform the necessary revisions to bring the site up to search engine optimization specification requirements.
The process is instrumental to SEO success and should be obvious, but incredibly, it is overlooked at times. The deployment of effective content and structure that enhances a search engine optimization campaign relies heavily on web design and development being aligned with the same goal.
What to Do If the Content Management Administration System Is Not Search Engine Friendly?
To properly perform search engine optimization, you will need the following capabilities, among others:
- The ability to generate unique <head> area page titles independently of the content’s <h1> tag
- The ability to generate <head> area meta descriptions
- The ability to generate search engine friendly URLs independently of the content’s <h1> tag
- The ability to add alt and title text to your site’s images independently of the content’s <h1> tag or the page’s <head> area page title
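As a quick sanity check of those capabilities, a short script can confirm whether a rendered page actually exposes a title distinct from its <h1>, a meta description, and alt text on images. The following is an illustrative sketch using Python's standard `html.parser`; the sample page and its content are hypothetical:

```python
from html.parser import HTMLParser

class SEOElementChecker(HTMLParser):
    """Collects the <head> title, <h1> text, meta description, and any
    images missing alt text so the title and <h1> can be compared."""
    def __init__(self):
        super().__init__()
        self._in_title = False
        self._in_h1 = False
        self.title = ""
        self.h1 = ""
        self.meta_description = None
        self.images_missing_alt = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self._in_h1 = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description = attrs.get("content")
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt.append(attrs.get("src", "(no src)"))

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False
        elif tag == "h1":
            self._in_h1 = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        if self._in_h1:
            self.h1 += data

# Hypothetical sample page: the title and <h1> differ, as they should.
sample = """<html><head>
  <title>Blue Widgets for Sale | Example Co.</title>
  <meta name="description" content="Shop durable blue widgets.">
</head><body>
  <h1>Our Blue Widget Collection</h1>
  <img src="widget.jpg">
</body></html>"""

checker = SEOElementChecker()
checker.feed(sample)
print(checker.title != checker.h1)   # the title is not cloned from the <h1>
print(checker.meta_description)      # the page has its own meta description
print(checker.images_missing_alt)    # images that still need alt text
```

A content management system that passes checks like these for every page is giving you the independent control the list above describes.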
Many content management administration systems simply do not offer these basic search engine optimization capabilities; in those cases, white label SEO services are highly recommended. When faced with these challenges, you will need to recode the content management administration system. With access to it, you can research how to implement the necessary changes and complete them.
However, you may not have access to the content management administration system. If the client has a web developer on staff, you can advise them on what needs to be done to perform search engine optimization for the client’s website. If the client does not have a web developer on staff, they will need to hire one, ideally someone proficient with the client’s content management administration system.
What to Do If You Have No Access to the Client’s Content Management Administration System at the Beginning of the Search Engine Optimization Campaign?
You are simply going to have to ask the client for this information.
Some clients are going to be leery of providing this access. In this case, you are going to need to have a clear and honest conversation with the client. Remind them how they have trusted you with this search engine optimization campaign. Certainly, they can trust you with access to their content management administration system, too.
If they refuse, then you will need to work with their web developer, if they have one, to implement any changes required for the client’s site to meet current search engine optimization specifications.
If they do not have a web developer, then provide the necessary revisions and requests to the client, and verify that they are implemented correctly.
Why Is Having Access to Your Client’s Website File Server Important?
This is critical—especially if there is no web developer on the client’s staff.
The website file server holds two elements critical to performing a successful search engine optimization campaign: the website’s .htaccess file and its robots.txt file.
The .htaccess file
The .htaccess file is where we control the site’s 3xx redirects, which are critical to preventing “404 Not Found” and other 4xx response codes. These codes cause all kinds of problems when trying to rank a client’s site in the Google (and other search engines’) Search Engine Results Pages. Google demotes ranking results for sites that return 4xx response codes. If your site is generating 4xx response codes and your competitors’ sites are not, they have a better chance of outranking your website in the Google Search Engine Results Pages.
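For illustration, permanent redirects in an .htaccess file on an Apache server typically look like the following. The paths and directory names here are hypothetical; your own site’s retired URLs and their replacements would go in their place:

```apache
# Permanently redirect a removed page to its replacement
# so visitors and crawlers never receive a 404 response.
Redirect 301 /old-services.html /services/

# Redirect an entire retired directory to a new section,
# preserving the rest of the path.
RedirectMatch 301 ^/blog-archive/(.*)$ /blog/$1
```

With rules like these in place, requests for the old URLs return a 301 response pointing at the new locations instead of a 4xx error.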
The robots.txt file
The robots.txt file is critical as well. Many times, we take on new clients whose sites have just been built or rebuilt and are struggling with their ranking results. Many times, we have existing clients who want to refresh their site’s look, feel, and navigational structure. In both cases, their web developers may use the Disallow crawl directive to block the search engines from crawling and indexing the new or refreshed site while it is under development.
In many cases, when the new or refreshed websites go live, the disallow directive has not been removed from the robots.txt file, blocking the search engines from crawling and indexing these sites.
Existing client sites with strong ranking results, visitor traffic, and conversions begin to fail because they went live without removing the disallow directive. New clients whose new or refreshed websites went live the same way face the same result.
The robots.txt file is one of the first elements I check when working with a new client that has failing ranking results, visitor traffic, and conversions. I do the same with existing clients after a refresh even before they begin experiencing ranking drops, visitor traffic decline, and conversion losses.
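This check is easy to automate. As a sketch, Python’s standard `urllib.robotparser` can show the effect of a robots.txt file before a crawler ever sees it; the directives and URL below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# A leftover development-time robots.txt that blocks the whole site.
blocking_robots = [
    "User-agent: *",
    "Disallow: /",
]

# What the file should look like once the site goes live
# (an empty Disallow value permits all crawling).
open_robots = [
    "User-agent: *",
    "Disallow:",
]

def can_crawl(robots_lines, url):
    """Return True if these robots.txt lines allow any crawler to fetch url."""
    parser = RobotFileParser()
    parser.parse(robots_lines)
    return parser.can_fetch("*", url)

print(can_crawl(blocking_robots, "https://example.com/services/"))  # False
print(can_crawl(open_robots, "https://example.com/services/"))      # True
```

Running a check like this against the live robots.txt after every launch or refresh catches a forgotten disallow directive before the rankings start to slide.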
You are going to need access to your client’s website file server to make these changes, or the client is going to need to have a web developer on staff.
You must keep this review of your client’s robots.txt file high on your list. For some reason, I have seen the crawl disallow directive reappear at random points during my search engine optimization campaign management projects.
These are just a few of the technical challenges that you may face during a search engine optimization campaign.