Sometimes you need both tools to make your case and solve a problem. For one client in particular, mobile usability errors were being reported in Search Console. Specifically, we found:
- Viewport not configured: 465 pages with errors
- Small font size: 465 pages with errors
- Content not sized to viewport: 464 pages with errors
- Touch elements too close: 439 pages with errors
- Flash usage: 396 pages with errors
We spot-checked several of the reported URLs in Google's mobile-friendly testing tool, https://www.google.com/webmasters/tools/mobile-friendly/, and each one failed the test. We screen-capped these results to supply to the client. Further, we downloaded each section's data and created a spreadsheet, with each section's data as its own tab, for the client's use in correcting the mobile usability problems. We sent an email explaining the problem, with the screen caps in the body of the email and the spreadsheet as an attachment.
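Spot checks like this can also be scripted. Below is a minimal sketch against what we understand to be the Search Console API's mobile-friendly test endpoint (`urlTestingTools/mobileFriendlyTest:run`); the API key, the example URL, and the exact response field are assumptions for illustration, not this client's data.

```python
import json
import urllib.request

# Assumed endpoint of the Search Console URL Testing Tools API
# (mobile-friendly test). An API key from the Google Cloud console
# is required; "YOUR_API_KEY" below is a placeholder.
ENDPOINT = "https://searchconsole.googleapis.com/v1/urlTestingTools/mobileFriendlyTest:run"

def build_request(url, api_key):
    """Build the POST request that tests one URL for mobile-friendliness."""
    body = json.dumps({"url": url}).encode("utf-8")
    return urllib.request.Request(
        f"{ENDPOINT}?key={api_key}",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def run_test(url, api_key):
    """Run the test and return the verdict field from the JSON response."""
    with urllib.request.urlopen(build_request(url, api_key)) as resp:
        return json.load(resp).get("mobileFriendliness")
```

Looping `run_test` over the exported Search Console URLs would reproduce the spot check in bulk instead of one page at a time in the browser tool.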
The response we received was that the references provided were for an area of the website that had older posts unlinked, and that updating these specific pages to be mobile friendly would be time-consuming and unnecessary. True, the screen caps were for one particular area of the website, and for older posts in that section. We will call that section /files. We had no prior knowledge that these pages had been unlinked, or specifically which pages in this section remain live. Hmmmm? But the spreadsheet we supplied had five tabs with a combined total of more than 1,600 pages with errors. Many of the results in that spreadsheet were from other areas that we know are live on the site: they resolve in the browser and still fail the mobile usability test, along with newer posts in the /files section itself. This includes most of their top-level navigation.
The /files area of the site, where the older posts had been 'unlinked', still resolved in a browser. Those pages are still 'live'. There are no links on these /files pages that might lead to other /files pages, so we can assume that the links to these older 'unlinked' posts are coming from somewhere other than the pages themselves. We know that Search Console only reports on pages that can be crawled via some sort of link. So, where are these links coming from?
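Whether an 'unlinked' page is still live is easy to verify in bulk. A minimal sketch, assuming a simple classification of HTTP status codes (a 2xx means the page still resolves despite being unlinked; 404 or 410 means it is actually gone):

```python
import urllib.error
import urllib.request

def classify_status(code):
    """Map an HTTP status code to a simple liveness verdict."""
    if 200 <= code < 300:
        return "live"   # unlinking changed nothing: the page still resolves
    if code in (404, 410):
        return "gone"
    return "other"

def check_url(url):
    """Fetch a URL and classify the result; unreachable hosts are flagged."""
    try:
        # urlopen follows redirects by default, so a redirected page
        # is classified by its final status code.
        with urllib.request.urlopen(url) as resp:
            return classify_status(resp.status)
    except urllib.error.HTTPError as err:
        return classify_status(err.code)
    except urllib.error.URLError:
        return "unreachable"
```

Running `check_url` over every URL in the spreadsheet would show exactly which 'unlinked' pages are still live, replacing the browser spot check with a complete inventory.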
Starting with the basics, we checked the .xml sitemap. And what did we find? You guessed it: URLs for the older 'unlinked' /files posts, right there in the sitemap, pointing to these 'unlinked' files directly. Great! Next, we checked the Links to Your Site tool under the Search Traffic tab in Search Console. Yep! You guessed it. Links pointing in from other domains: 1,375 of them, from a total of 539 domains, all pointing to these 'unlinked' but still resolving pages. To further validate our findings before responding to the client, we went into Google Analytics and took a look around there as well. Guess what? These supposedly 'unlinked' pages are being found by humans as well as by the Googlebot and other crawlers!
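The sitemap check itself takes only a few lines. A sketch using the standard sitemaps.org namespace; the sitemap fragment and its /files URLs are invented stand-ins for the client's real sitemap.xml:

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace per the sitemaps.org protocol.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# A made-up sitemap fragment standing in for the client's real sitemap.xml.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/files/old-post-1</loc></url>
  <url><loc>https://example.com/files/old-post-2</loc></url>
</urlset>"""

def urls_in_section(sitemap_xml, section):
    """Return every <loc> URL that falls under the given site section."""
    root = ET.fromstring(sitemap_xml)
    locs = (loc.text for loc in root.iterfind("sm:url/sm:loc", NS))
    return [u for u in locs if section in u]

print(urls_in_section(SITEMAP, "/files/"))
```

Any /files URL this returns is a page the site is actively telling crawlers about, 'unlinked' or not.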
In the Behavior section of GA, under the Site Content menu, we used the Content Drilldown report over the previous thirty-day period and applied a filter to view only these /files pages. There were 3,089 visits to pages in this section, both pages known to exist and those supposedly non-existent. A total of 341 unique pages were visited, many of them the older posts that had been unlinked. There was very little sign of bot crawl activity: most of these visits looked better than the 1-visit, 0:00-time-on-page, 100%-bounce-rate signature of a bot crawl.
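The bot-versus-human distinction we applied can be expressed as a simple filter over a GA export. The rows below are shaped like a Content Drilldown export and use the example figures reported further down; the bot heuristic (1 visit, 0:00 on page, 100% bounce) is the one described above:

```python
# Rows shaped like a GA Content Drilldown export:
# (page path, pageviews, avg. time on page in seconds, bounce rate %).
# Paths are hypothetical; the figures echo the examples in this post.
rows = [
    ("/files/old-post-1", 119, 126, 60.0),
    ("/files/old-post-2", 101, 258, 75.31),
    ("/files/old-post-3", 1, 0, 100.0),   # classic bot-crawl signature
    ("/files/old-post-4", 17, 148, 50.0),
]

def looks_like_bot(pageviews, avg_time, bounce):
    """Heuristic from the post: 1 visit, 0:00 on page, 100% bounce."""
    return pageviews == 1 and avg_time == 0 and bounce == 100.0

human = [r for r in rows if not looks_like_bot(*r[1:])]
print(len(human), "of", len(rows), "pages show human-like traffic")
```

Anything that survives this filter is evidence of real visitors landing on 'unlinked' pages.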
The following are examples of these ‘unlinked’ pages and their results:
ex.#1: 119 page visits; 104 unique page views; 2:06 avg. time on page; 60% bounce rate
ex.#2: 101 page visits; 95 unique page views; 4:18 avg. time on page; 75.31% bounce rate
ex.#3: 59 page visits; 47 unique page views; 1:08 avg. time on page; 44.44% bounce rate
ex.#4: 17 page visits; 16 unique page views; 2:28 avg. time on page; 50% bounce rate
So, the facts are in. Simply 'unlinking' pages from the site navigation does not resolve mobile usability issues. Nor does it remove the pages from the index if there are still ways to navigate to them.
This is an open issue with this client, so we are still deciding what tasks to implement to correct these mobile usability issues. We could advise the client to 301 redirect these pages and update the .xml sitemap. We could advise removing the pages, updating the .xml sitemap, and creating a proper 404 page to handle the 404s that the inbound links from other domains will generate.
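If the 301 route is chosen, the redirect map is easy to generate. A sketch that emits Apache mod_alias `Redirect 301` directives from an old-to-new path mapping; the paths are hypothetical, and nginx or any other server would take an equivalent map:

```python
# Hypothetical mapping from retired /files paths to their replacements.
redirect_map = {
    "/files/old-post-1": "/blog/new-post-1",
    "/files/old-post-2": "/blog/new-post-2",
}

def htaccess_rules(mapping):
    """Render one Apache 'Redirect 301' directive per retired path."""
    return [f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())]

for rule in htaccess_rules(redirect_map):
    print(rule)
```

Pairing each inbound-linked /files URL with a live, mobile-friendly destination preserves the 1,375 inbound links instead of letting them 404.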
We could advise the client to update these pages, because it is necessary for these pages to be mobile friendly. The real kicker here is that this is a very well-known company that relies on being mobile friendly, on site and off, to generate income from its customers.
Now, what would you do?
–Mark Gray, Senior SEO Manager