Want to Boost Your Search Performance? Invest in UX Design

The world of SEO is an inherently volatile one. For a while it was about having all the keywords in all the places all the time. Then it was about links — any links at all, anywhere you could get ’em, in any context you could secure. Then it was about making sure those links were quality and the keywords weren’t stuffed. With every Google algorithm update, conventional wisdom on SEO has crumbled and risen anew from the ashes. And even when it takes on a new form, it’s still mostly guesswork.

 


 

But does it have to be? Are companies chasing the wrong thing as they shell out cash for the latest and greatest SEO tacticians? Are they possibly missing the forest for the trees?

Take, for instance, the most recent suspected update. Though there’s some debate as to whether the shift was related to Penguin, Panda, or some other animal-associated algorithm change, significant swings in site performance in June left many thinking Google was moving the goal posts once more. As Search Engine Journal reports, sins that got sites punished included:

 

  • Burying content below sponsored posts: A site that stacked a number of thumbnail links to sponsored content at the top of its articles got hammered in the search results.
  • More ads than content: A site whose pages were roughly two-thirds ads and one-third content saw a significant drop in search results.
  • Thin content: A Q&A-focused site saw a drop in search results, most likely because it contained many pages with thin or irrelevant content.
  • Generic content: A site that offered only generic coverage of its topic saw a drop in search results; its content consisted mostly of rewrites of information available elsewhere on the web, and the heavy ad load probably didn’t help.
  • Indexing issues: One site that saw major fluctuations in search results had indexing issues stemming from its robots.txt file, which may have prevented Google from properly crawling the site’s content (see the sketch below).
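
To make that last point concrete, here is a minimal sketch of how a single overly broad robots.txt rule can block a crawler from an entire content directory. The article doesn’t say what the site’s actual misconfiguration was, so the robots.txt contents and URLs below are hypothetical; the check uses Python’s standard library.

```python
# Minimal sketch: testing whether a robots.txt configuration blocks a crawler.
# The rules and URLs are hypothetical examples, not taken from the article.
from urllib.robotparser import RobotFileParser

# A single stray rule like this can hide an entire content directory.
robots_txt = """
User-agent: *
Disallow: /articles/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot is denied access to every article page, so none of that
# content can be crawled, indexed, or ranked.
print(parser.can_fetch("Googlebot", "https://example.com/articles/my-post"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/about"))             # True
```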

 

Notice a common thread up there? With the possible exception of the indexing issues, the sites punished most severely were those that failed their visitors in either aesthetics or substance. In other words: sites that provided a poor user experience.

Expect that trend to continue. SEMRush explains:

 

While user behavior data isn’t the top-ranked signal just yet, Google continues to integrate machine learning and artificial intelligence into its algorithm to interpret SERPs like a human would.

Moz’s Dan Petrovic wrote, “Google has designed and patented a system in charge of collecting and processing user behavior data.” Put simply, because Google is smart enough to know whether users like your website and content, it’s now accounting for this in search rankings.

 

That makes things a bit tougher, doesn’t it? SEO strategy has forever been focused on words, links, and combinations of links and words. But as Google starts folding in metrics like bounce rate, time on site, and pages per visit, it’s becoming clear that old SEO methods are going to have to evolve.

Then again, maybe it’s not about the latest search fads and “best practices.” Maybe, just maybe, if you stop focusing on how to satisfy the most recent iteration of Google’s algorithm and instead focus on being the sort of result Google is trying to elevate, you’ll come out ahead instead of barely keeping pace. What does that mean? Giving the user a good experience.

 

If you need help with that, we know some creatives here at crowdSPRING who might have some ideas for you.

 

Image Credit: The Next Web