On Nov. 3, 2011, the SEO community received a wake-up call whose aftershocks may well carry through the weekend and perhaps for weeks to come.  Anyone interested in Search Engine Optimization (SEO), and internet marketing specialists in particular, will find the following information quite useful.  Google’s “Freshness” update was discussed by none other than SEOmoz CEO and co-founder Rand Fishkin (@randfish).  This time around, in a special Whiteboard Friday, Rand was joined by Mike King (@iPullRank) to share the new changes in Google’s algorithm, which affect approximately 35 percent of searches in the Search Engine Result Pages (SERPs). So, what’s all the fuss about?  As you may already know, Google is attempting to dominate the “freshness” factor, something it has, quite frankly, been on top of for the last 10 years, so technically nothing here is unexpected.   The recent aftermath of the Panda 2.5.2 update makes clear that Google is interested only in domains that provide quality, fresh content.  I highly suggest that you watch the video with Mike King and Rand Fishkin to get a firm understanding of how this change may affect your website.

Takeaways From Google’s “Freshness” Update

As a website owner, you should treat Search Engine Optimization (SEO) as a primary marketing technique for creating visibility for your brand on the web.  Taking it lightly may be costing your business potential leads and hundreds, maybe even thousands, of dollars in untapped revenue.  That’s right!  Get educated on the issue before it’s too late and get to the bottom of it.  In the meantime, let’s break down the effects Google’s “Freshness” update may have on your website.

  1. Search Engine Results Pages (SERPs) – the 35 percent figure is easy to misread.  Just to set the record straight, as Mike King outlines in the video, it applies to search queries in the SERPs, not to individual keyword terms.  Let’s rewind that for an instant replay: Google’s “Freshness” update impacts 35 percent of search queries only.  Don’t be a victim of this change; make sure your website follows SEO best practices such as quality content, meta data, keyword density and anchor text, just to name a few.
  2. Query Deserves Freshness (QDF) Terms – in the video, Mike and Rand talk about Query Deserves Freshness keyword terms such as “Kim Kardashian wedding”.  Quite honestly, I hope the buzz about her botched wedding goes away sooner rather than later.  Still, a large share of search users, some 10%+ in this instance, are Googling it insanely.  The new algorithm update, which runs on the Caffeine infrastructure, therefore fluctuates the search results.  When you think about it logically, it makes perfect sense: volume and demand for such a keyword phrase become abnormal because of increased searcher interest, so each time a user performs a search query, Google will attempt to serve up the most recent and relevant result.  How does this affect your website?  Let’s say you wrote a blog post a week ago on a given subject.  In Google’s eyes it’s already outdated today, and that’s why it will fall off the SERPs ladder.
  3. RSS Feeds – they’re more important than ever before.  Under the new “freshness” update, based on the Caffeine infrastructure, GoogleBot will attempt to check your domain for an RSS feed.  Does your website have one?  Better get one then, and here’s why.  Most of the indexed results in the Search Engine Results Pages (SERPs) contain content from domains that have an RSS feed.  In Google’s eyes, relevance and freshness can be found fastest through an RSS feed as opposed to contextual targeting.  Still following along?  If your domain does not have an RSS feed that broadcasts your latest content, then consider getting one.  Open-source solutions such as WordPress or Joomla provide built-in RSS functionality that can be customized to your liking.  Furthermore, once you have a feed, consider burning it through Google FeedBurner.  It will help ensure proper communication between your domain and GoogleBot for proper crawling of content.
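For reference, a bare-bones RSS 2.0 feed looks something like the sketch below (the site name, URLs and dates are placeholders, not a real feed).  The `<pubDate>` element on each item is what signals recency to a crawler:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>http://www.example.com/</link>
    <description>Latest posts from an example blog</description>
    <!-- One <item> per piece of content; newest entries go first -->
    <item>
      <title>Understanding Google's Freshness Update</title>
      <link>http://www.example.com/blog/freshness-update</link>
      <pubDate>Fri, 04 Nov 2011 12:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```

Platforms like WordPress generate this for you automatically (typically at /feed/), so you would rarely hand-write it; the point is simply that each item carries a timestamp.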
  4. XML Sitemaps – now you may wonder what this technical term really means.  Frankly, it shouldn’t be a new term to begin with, but rather something you’re already quite aware of.  An XML sitemap is a file that contains URL links to the pages of your website.  Typically it can be found in the root folder of your domain on the server that holds all the files for the domain in question.  It’s an extremely important file, used by all search engine crawlers to identify the structure of your domain.  Some basics about XML sitemaps will give you a brief understanding of their importance, in addition to demonstrating the creation of an actual file.  One crucial detail that Mike and Rand discuss in this video pertains to the timestamp associated with each URL in your XML sitemap file, something GoogleBot now takes strongly into consideration when indexing the content of your domain into the Search Engine Results Pages (SERPs).  Google’s Webmaster Tools offers everything necessary for submitting your XML sitemap, in addition to proper configuration for ensured visibility.
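To make the timestamp point concrete, here is a minimal sketch of an XML sitemap (the URLs and dates are hypothetical examples).  The `<lastmod>` element on each URL is the per-page timestamp Mike and Rand are talking about:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/blog/freshness-update</loc>
    <!-- lastmod is the timestamp crawlers can weigh when judging freshness -->
    <lastmod>2011-11-04</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/about</loc>
    <lastmod>2011-01-15</lastmod>
  </url>
</urlset>
```

Once the file sits in your domain’s root (e.g. http://www.example.com/sitemap.xml), you can submit it through Google’s Webmaster Tools so the crawler knows where to find it.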

The above-mentioned items are something every website owner should take serious notice of in order to stay ahead of the SEO curve.  As if the Google Panda update, which impacted some 10-12% of search queries on the internet, wasn’t enough of a change already.  As always, keep in mind that content is king.  If you think about it from a user’s perspective and put yourself in their place, you’ll understand exactly what I’m referring to.  Every searcher expects to access the latest information on a given subject or topic, and nobody likes stale, outdated content.  When it comes to organic and on-page optimization of your content, strongly consider the items above.