Over the past few weeks Google has taken some serious measures to eliminate web spam from its organic search results. In early February, JCPenney was hit with a manual and algorithmic penalty for “buying” links targeting very specific keywords. More recently, Overstock and Forbes were penalized for “buying” and “selling” links, respectively.
We knew it would not be long before Google released a major algorithm update to combat the very prevalent web spam and link farms we have seen growing over the past couple of years. Well, the time has come: today Google’s Matt Cutts and Amit Singhal unveiled an algorithmic change that Google says impacts 11.8% of search queries.
According to Singhal, this update is designed to “reduce the rankings for low quality sites while increasing the ranking for high quality sites.”
What exactly is Google’s definition of “low” quality and “high” quality? The official definitions from Google are:
“Low-quality sites – sites which are low-value add for users, copy content from other websites or sites that are just not very useful.”
“High-quality sites – sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.”
Google also says the update does not rely on feedback from the “Personal Blocklist” Chrome extension. They do, however, claim to have compared the update against the blocklist data gathered to date, and the affected sites show a staggering 84% overlap. Coincidence?
Finally, the update is currently being rolled out in the United States only; other countries will follow over time.
You can read the official blog post from Google here.
My previous post describes Google Instant and the new search results user interface. Now that folks have had several hours to play with it, certain realizations begin to set in. What does this mean for search engine optimization? What does this mean for my traffic?
All good questions. In this post I will address the first question that came to my mind: what about analytics? How do I track Google Instant partial queries? Now that Google is presenting real-time, or instant, results, there is a high chance that the query string passed to Google Analytics is incomplete, or rather partial, because the link was displayed before the user even finished typing the query!
For example, an Instant result for “weather” may pass along only “w” as the query parameter to Analytics, since Google displays the link to weather after the user types just “w”. To understand what a user needed to type to find the result they were looking for, an additional parameter is included in the result URL: “oq=”, which will give you the information you are looking for.
To track partial queries, and their position in Google Instant, you will need to create a new profile along with a new filter in your Google Analytics report. It is pretty straightforward; below is a sample filter you can use to start tracking.
- Create a new Filter name: “New Instant Ranking Filter”
- Set Filter type: “Custom filter – Advanced”
- Field A -> Extract A: Referral, ^https?://www\.google\.(co\.uk|com)/(?!custom|m/).*[?#&]cd=([^&]+).*&q=([^&]+).*&oq=([^&]+)
- Field B -> Extract B: Medium:^organic$
- Output To -> User Defined: $A4 (position: $A2)
You may have to play with the filter a little for your specific requirements, but this should give you a good start.
Let me know if you have any other suggestions or comments.
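Before committing the filter to a live profile, it helps to verify that the Extract A pattern actually captures what you expect. The sketch below applies the same regular expression to a hypothetical Instant referral URL in Python; the URL and its parameter values are made up for illustration only.

```python
import re

# The Extract A pattern from the filter above, written as a Python regex.
# Capture groups: 1 = Google domain, 2 = cd (result position),
# 3 = q (the possibly partial query), 4 = oq (the query you want to report).
FILTER_RE = re.compile(
    r"^https?://www\.google\.(co\.uk|com)/(?!custom|m/)"
    r".*[?#&]cd=([^&]+).*&q=([^&]+).*&oq=([^&]+)"
)

# Hypothetical referral: the user typed "w", Google completed it to
# "weather", and the clicked result sat in position 3.
referral = "https://www.google.com/search?cd=3&q=w&oq=weather"

m = FILTER_RE.match(referral)
if m:
    domain, position, partial_query, typed_query = m.groups()
    print(f"{typed_query} (position: {position})")  # weather (position: 3)
```

If the pattern fails to match your real referral strings, print a few raw referrers from your logs and adjust the parameter order in the regex accordingly.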
The big anticipated announcement from Google this morning is “Google Instant”.
Google is moving away from traditional HTML-based results to a more robust AJAX-based application delivering real-time search results. Marissa Mayer noted that Google has already made approximately 500 changes to search ranking and the user interface (UI) in 2010.
It takes a user on average 9 seconds to enter a search query, followed by a few hundred milliseconds on Google’s servers to render a search result. The user then spends about 15 seconds looking at the results. Google Instant claims to save users 2–5 seconds per query, which across all users adds up to an aggregate of 11 hours saved per second.
Google displays the characters the user has typed in black, followed by shifting grey predicted characters as the user continues to type. Why even keep the search button at this point? Because it forces Google to search for exactly what you’ve typed, without predicting how you’ll finish the query.
Instant will begin rolling out on Google domains in the US, UK, France, Germany, Italy, Spain and Russia for users of the following browsers: Chrome v5/6, Firefox v3, Safari v5 for Mac, and Internet Explorer v8.
For more information from Google you can visit their brief description over at:
Google’s announcement this past Friday (August 21st, 2010) has many SEOs talking. Everyone is well aware that Google makes approximately three hundred algorithm changes a year, roughly one change per day. This time Google accompanied an algorithm update with an announcement on the Google Webmaster Central Blog.
So what’s the announcement? Here it is: “Showing More Results From A Domain.” Google announced a tweak designed to surface multiple pages from a single site for relevant queries.
“For queries that indicate a strong user interest in a particular domain, like [exhibitions at amnh], we’ll now show more results from the relevant site,” says Google software engineer Samarth Keshava. “Prior to today’s change, only two results from www.amnh.org would have appeared for this query. Now, we determine that the user is likely interested in the Museum of Natural History’s website, so seven results from the amnh.org domain appear. Since the user is looking for exhibitions at the museum, it’s far more likely that they’ll find what they’re looking for, faster. The last few results for this query are from other sites, preserving some diversity in the results.”
This change does not come without controversy; many SEOs are pointing out similarities to the “Mayday” update. What are your thoughts? I would love to hear from you.