As of Monday 21st November 2011, Yahoo Site Explorer has been shut down. Yahoo made the official announcement on 18th November on their blog. This marks the loss of a vital tool for the SEO industry.
Yahoo Site Explorer was an incredibly valuable tool as it allowed users to perform searches and explore the link profile of any website. Users could evaluate their competitors' incoming links and devise strategies to gain the same or similar links to boost their own rankings. Similarly, SEO consultants were able to view the highest-performing sites in a particular industry, analyse their incoming link portfolios and strategize with their clients on methods of gaining similar links.
Since the move to Bing Webmaster Tools, most of this functionality has been lost. Both Bing and Google Webmaster Tools only provide link data for sites that have been claimed within their systems, so neither provides information about a site the user does not own.
Google has deliberately suppressed link information and does not support comprehensive link reporting through any of its services; the loss of Yahoo Site Explorer therefore equates to the loss of all free, comprehensive link reporting tools.
Third-party tools such as SEOmoz's Open Site Explorer and Majestic Site Explorer are available; however, they are subscription based and their link databases are not exhaustive. Blekko is a search engine that offers link data reporting tools, although it is much smaller than the search engine giants Google, Yahoo and Bing.
Do you sometimes wonder why your website ranks highly one day and is almost nowhere to be found the next? Or why Google's results shift subtly from one day to the next? Don't worry, you are not alone, and for the most part the cause will be a recently introduced algorithm change from Matt Cutts and his team at Google.
In most cases Google doesn't announce a change to its algorithms and there is generally no prior warning, but it doesn't take a good search engine optimization company long to undertake some speedy research in internet land to find out what changes have been made. The change is often followed by panic from clients wondering what went wrong with their SEO plans. Depending on your relationship with your search engine optimization consultant, you should be advised of a Google algorithm change that will affect the indexing of your site before your monthly report arrives in your inbox!
Why Google makes changes and improvements to algorithms
Google wants a search engine that allows users to find what they are really looking for the first time. Before Google makes algorithm changes, it invests heavily in understanding what works for users. Every year Google implements over 500 improvements to its search algorithms to make sure it achieves the best results for each user. Scott Huffman, Engineering Director at Google, says "we really analyse each potential change very deeply to try and make sure that it's the right thing for users".
The process for changing algorithms
If Google sees a set of motivating searches that are not performing as well as it would like, the engineers come up with a hypothesis about what signal or data can be changed in the algorithm.
A change or improvement to the algorithm may also start with a creative idea, but it always goes through a testing process. "All ideas are tested through rigorous scientific testing," says Amit Singhal, Google Fellow.
After a change has been made to the algorithm, the first test is with "Raters": external people trained to judge whether one set of rankings is more relevant and of higher quality than another. Results are laid out side by side on screen for queries that the engineers' experiment might be affecting.
Once this has been tested, Google uses what it calls a "sandbox", sending a small fraction of live Google traffic through the experimental change so that many different metrics can be computed. In 2010 Google ran over 20,000 different experiments. Amit Singhal says "if scientific testing says this is a good idea for Google users, we will launch it on Google".
The importance of an optimized website
Taking all of the above into consideration, it's not difficult to see why SEO companies have to work extremely hard to keep your site optimized and why they really need to be on the ball. With over 500 changes to the algorithms each year, it shows how important a continued relationship with your SEO company is!
I am continuously asked by clients whether they should register domain names containing their target keywords. Many are advised by providers to register tens or hundreds of keyword-laden domain names to create multiple websites, or to 301 redirect those domains to the primary domain.
Our usual advice at Keyword Intent is that creating multiple websites under keyword domains is not as effective as building the authority and trust of a single website. Redirecting multiple keyword domains back to a primary domain also provides no benefit: it does not transfer the keyword signals from one domain to another and is therefore wasted effort.
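For readers unfamiliar with the mechanics, a 301 redirect of this kind is typically a one-line server rule. The sketch below uses nginx and entirely made-up placeholder domain names; Apache users would achieve the same with a Redirect 301 directive:

```nginx
# Hypothetical example only: permanently redirect a keyword-laden
# domain to the primary branded domain. Domain names are placeholders.
server {
    listen 80;
    server_name cheap-widgets-online.example www.cheap-widgets-online.example;

    # 301 = "moved permanently"; passes visitors (and any direct type-in
    # traffic) through, but per Google it does not carry across the
    # keyword-in-domain signal discussed above.
    return 301 http://www.primarybrand.example$request_uri;
}
```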
Matt Cutts, a leading spokesperson at Google, shared in a video post on YouTube (March 7th, 2011) that Google has tweaked its algorithm so that keyword-laden domains are not given as much weight as they used to be.
Matt recommends using "brandable" names like Twitter, YouTube and Digg, as this ultimately helps you stand out amongst all the other players that use generic keywords within their domains. He states that it is possible to succeed without keywords in the domain name.
Having said that, it is basic marketing advice that if you do not have a large advertising budget, using words in the business name that describe the activity of the business can help communicate clearly to potential customers what you do. A balance between brand names and keywords is recommended.
It is also important to seek legal advice around brand names that can be trademarked. In many cases, generic keywords cannot be trademarked.
Google users will soon have the ability to block whole domains from their search results. Next to the "Cached" link within Google search results will be a text link option, "Block all [site.com] results". The feature is currently being rolled out on Google.com in English for specific browsers, including Chrome 9+, Internet Explorer 8+ and Firefox 3.5+, and will soon reach other regions, languages and browsers.
This is a move by Google as part of their ongoing campaign to improve the quality of sites delivered within their search results to users. It allows users to personalize their experience on Google and improve the search results they receive.
In order to block a site, the user will need to be logged into their Google Account. Users will also have the option to unblock a blocked site. If a blocked site would normally appear within the search results page, a message will display informing the user that a site has been blocked. They can then go into their Google Account to manage their blocked site list further.
So what does this mean for website owners? At the moment Google states that the "blocked sites" data will not be used as a ranking signal; however, they will use the data for analysis and to help improve future results.
Google announced today that they have launched a large change to their algorithm in order to substantially improve the quality of search results. They say it noticeably affects 11.8% of queries, which will change rankings for many websites. Many website owners will not be happy, although the algorithmic change has been a long time coming, with much discussion on the Internet over the last year providing prior warning.
Google believes in a healthy web ecosystem and therefore wants to reward high-quality content websites. This particular algorithmic change is being rolled out in the United States first; however, it will be implemented around the world in time.
So what is a low-quality website?
Provides low-value add to users
Content is copied from other websites
Sites that are not very useful
Google is interested in displaying high quality sites that possess the following criteria…
Provides original content, such as research and in-depth reports
Thoughtful analysis, etc.
Although Google has listed "original content" as a marker of high quality, what other factors will be used within the algorithm to determine the most relevant, authoritative and "best quality" content to display in search results? Does length of copy determine high quality? What about choice of words? Or the use of contrast and comparison? The number and quality of comments, or the number of times content is shared by influential users?
Time will tell as SEOs around the world examine, test and report ranking results.
Google recently rolled out a new search filter within its Advanced Search function called “Reading Level”. It allows you to find websites aimed at a Basic, Intermediate or Advanced reading level.
For example, if you are a teacher wanting to find web based materials on a topic targeted at juniors, you can select “show only basic results” to return results aimed at a basic reading level. In contrast, if you are a scientist, you are able to filter results to display websites aimed at an advanced reading level.
If Google can filter results by reading level, then it should also be able to assess the level of each website. To run this assessment, simply go to Google's Advanced Search page and type in a site: query for a website, e.g. site:www.keywordintent.com
Next, select “annotate results with reading levels” and then click the Advanced Search button.
Google will return the reading level for the website tested. In this case KeywordIntent.com has been assessed by Google as having a 100% intermediate reading level. Because the result is a percentage breakdown, a single website could actually target various reading levels across its content.
So, what about your website? Go on, do the test for yourself.
Not only does Google allow you to perform the reading level test on a website using the site: query tool, but you can also get a sense of the reading levels of websites focused on a topic. For example, if you type in the word “news” into Google, the results are 36%, 62% and 1% respectively for basic, intermediate and advanced reading levels across websites.
If you type in the word “photography” into Google, the results are 61%, 23% and 14% respectively for basic, intermediate and advanced reading levels across relevant websites.
Knowing the reading level of your customers will help you to write content at their preferred level. Allowing for different levels of reading across your site will help you to cater to the various reading levels of your customers.
For example, do all your customers want information at a basic level or do some of them want more detailed and substantial content? Understanding the range of content that is available online for a specific keyword can guide you as to the level of text required for your own website. It also helps you to understand if there are any gaps in the market.
Even though the Reading Level search filter is only on the Advanced Search page, you never know when Google will consider it as a mainstream function in the future. For now, it is an interesting tool to experiment with.
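Google has not published how it classifies pages into Basic, Intermediate and Advanced, but you can get a rough sense of your own copy with a classic readability formula. The sketch below uses the Flesch reading-ease score; the syllable counter and the score-to-label thresholds are my own simplifications, not Google's method:

```python
import re

def count_syllables(word: str) -> int:
    """Naive syllable estimate: count groups of consecutive vowels."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text: str) -> float:
    """Classic Flesch reading-ease score: higher means easier to read."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (syllables / n_words)

def reading_level(text: str) -> str:
    """Map the score to Google-style labels (thresholds are guesses)."""
    score = flesch_reading_ease(text)
    if score >= 70:
        return "basic"
    if score >= 50:
        return "intermediate"
    return "advanced"
```

Running reading_level() over a page's paragraphs gives a quick, if crude, picture of whether your copy leans basic or advanced, which you can then compare against what Google reports for your competitors.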
Google over the last few days has updated its search results pages for Google Places. With a touch of Bing inspiration, the new search results enable you to preview a website page before clicking through to it by clicking on the magnifying glass.
Relevant text from the website page is displayed in the side pop-out. Click on any magnifying glass to view the website summary pop-out. Another change is that the map has now moved to the right hand column above Sponsored Links (Google AdWords).
Google Place pages and reviews are also displayed within search results, encouraging users to learn more about the business in a trusted environment. This signals the importance of small and large businesses alike generating positive reviews about their products and services.
The comments from the Search Media industry are that the new Google Place search results indicate another step towards the death of Yellow Pages and other directory businesses around the world.
Google claims that they “clustered search results around specific locations so you can make comparisons and right the best sites”. I think they meant “rate the best sites” as star ratings are also displayed alongside search results.
The “Places” link is available in the left hand navigation links to display only local website listings.
The downside from a user and a business perspective, in my opinion of course, is that the maps box with 10 listings has been removed and now local search results dominate the page. Organic search results are pushed even further down the page, although if Places results are not as relevant to the search query, they too can be pushed down the results page. This may also have an impact for advertisers who rely upon Google AdWords as a way to generate traffic for local terms.
Overall, I find this change quite encouraging as it presents a range of opportunities to businesses.
Google Instant, a new method from the search giant for delivering search results, is being rolled out to the US, UK, France, Germany, Italy, Spain and Russia over the next several days. Search results automatically appear on screen as you type words into Google, allowing users to refine their search queries as they are being typed.
Google likes to think of it as being “search before you type” rather than “search as you type” functionality.
Google claims that Google Instant will not have an impact on the ranking of search results. However, Google Instant really is a fundamental shift in search because results are localized: Google identifies where the user is located and, as terms are being typed, displays predicted results relevant to the local area first.
Google Instant can be accessed by logging into your Google Account and using specific browsers, including Chrome v5/6, Firefox v3, Safari v5 for Mac and Internet Explorer v8. This means that web history and personalized results will influence which websites are returned within predicted search results, so websites that have been searched for and visited previously are more likely to appear within the new results.
Even though there may not be a change in Google’s algorithm to determine relevant results, search user behavior is likely to change as search terms can be adapted on the fly. This will reduce the “search buying cycle” time as any irrelevant results can be weeded out quickly.
The upside is that websites that normally do not appear within search results may actually be displayed earlier in the keyword query stem, enabling the user to stop or go back to results that catch their eye. Impulse clicking may encourage users to visit websites that they may not have found normally.
The downside, which is also a benefit (depending how you look at it) is that users are more likely to click on results that are even more relevant to their search queries. Even though some websites may not generate as much traffic from search engines because the initial results are irrelevant, this may help to generate better quality traffic because the visits will be even more qualified. The challenge for site owners is creating content pages that are highly relevant to search users’ needs.
Google Instant will be rolled out to other countries including Australia and New Zealand over the next several months.
Google is preparing to bring Keyword Tool out of beta, as per their announcement on Monday. The intention behind the latest update was to combine all the features of the original Keyword Tool and the Search-based Keyword Tool into one tool.
Both the original Keyword Tool and the Search-based Keyword Tool will redirect to the new Keyword Tool by the end of August 2010. Currently you can still view these tools online; however, in a few weeks' time the URLs will be redirected and the "beta" label will be removed.
The updated Keyword Tool provides flexible search options, keyword refinement and advanced options. New features have also been added to Google Keyword Tool including the addition of negative keywords and the removal of duplicate keywords.
Google promises to continually develop and improve upon the latest update of Keyword Tool.
On Tuesday, Yahoo Japan announced that it will use Google search technology for its search engine and search advertising platform. This is a somewhat different arrangement from that of its sister company, Yahoo in the United States, which has a partnership with Microsoft to implement Bing search technology by the end of 2010.
Yahoo Japan is currently the search leader in Japan with a 53.2% share of search queries, while Google, even though its share is growing, remains at 37.3%. Microsoft's MSN and Bing garner a 2.6% market share.
Combining both Yahoo Japan and Google technology would mean the shared platform serves approximately 90% of total search queries in the country.
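The roughly 90% figure follows directly from the shares quoted above:

```python
# Market shares quoted above, as a percentage of Japanese search queries.
yahoo_japan = 53.2
google = 37.3

# Combined share of queries the shared Google platform would serve.
combined = yahoo_japan + google
print(round(combined, 1))  # 90.5, i.e. approximately 90%
```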
The deal involves Yahoo Japan paying for Google’s search technology in addition to supplying content to Google. However, both search engines will still compete with each other.
Yahoo US and Softbank own 34.8% and 38.6% of Yahoo Japan respectively. Softbank also owns shares in Alibaba Group, which runs Yahoo China. Yahoo US supports the deal between Yahoo Japan and Google.