Google recently rolled out a new search filter within its Advanced Search function called “Reading Level”. It allows you to find websites aimed at a Basic, Intermediate or Advanced reading level.
For example, a teacher wanting to find web-based materials on a topic targeted at juniors can select “show only basic results” to return results aimed at a basic reading level. In contrast, a scientist can filter results to display websites aimed at an advanced reading level.
If Google can display results based upon reading level, then it should also be able to assess the level of an individual website. To run this assessment, simply go to Google’s Advanced Search function and type a site: query for the website, e.g. site:www.keywordintent.com
Next, select “annotate results with reading levels” and then click the Advanced Search button.
Google will return the reading level for the website tested. In this case KeywordIntent.com has been assessed by Google as having a 100% intermediate reading level. This indicates that a single website could actually target various reading levels for its content.
So, what about your website? Go on, do the test for yourself.
Not only does Google allow you to perform the reading level test on a website using the site: query tool, but you can also get a sense of the reading levels of websites focused on a topic. For example, if you type in the word “news” into Google, the results are 36%, 62% and 1% respectively for basic, intermediate and advanced reading levels across websites.
If you type in the word “photography” into Google, the results are 61%, 23% and 14% respectively for basic, intermediate and advanced reading levels across relevant websites.
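The percentage breakdown Google displays can be thought of as a simple tally over the annotated results. As a rough illustration of that idea only — the function name and the sample data below are hypothetical, not Google’s actual output or method:

```python
from collections import Counter

def reading_level_breakdown(levels):
    """Return the percentage of results at each reading level.

    `levels` is one label per search result, e.g. "basic",
    "intermediate" or "advanced".
    """
    total = len(levels)
    counts = Counter(levels)
    return {level: round(100 * n / total) for level, n in counts.items()}

# A hypothetical sample of annotated results for a query:
sample = ["basic"] * 60 + ["intermediate"] * 30 + ["advanced"] * 10
print(reading_level_breakdown(sample))  # {'basic': 60, 'intermediate': 30, 'advanced': 10}
```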
Knowing the reading level of your customers will help you write content at the level they prefer, and allowing for different levels of reading across your site will help you cater to all of them.
For example, do all your customers want information at a basic level or do some of them want more detailed and substantial content? Understanding the range of content that is available online for a specific keyword can guide you as to the level of text required for your own website. It also helps you to understand if there are any gaps in the market.
Even though the Reading Level search filter is only on the Advanced Search page, you never know when Google will consider it as a mainstream function in the future. For now, it is an interesting tool to experiment with.
Over the last few days, Google has updated its search results pages for Google Places. With a touch of Bing inspiration, the new search results let you preview a website page before clicking through to it, by clicking on the magnifying glass.
The following search for “turf supplier brisbane” shows the new format of the search results page.
Relevant text from the website page is displayed in the side pop-out. Click on any magnifying glass to view the website summary pop-out. Another change is that the map has now moved to the right-hand column above Sponsored Links (Google AdWords).
Google Place pages and reviews are also displayed within search results, encouraging users to learn more about the business in a trusted environment. This signals the importance of small and large businesses alike generating positive reviews about their products and services.
The comments from the search media industry are that the new Google Places search results indicate another step towards the death of Yellow Pages and other directory businesses around the world.
Google claims that they “clustered search results around specific locations so you can make comparisons and right the best sites”. I think they meant “rate the best sites” as star ratings are also displayed alongside search results.
The “Places” link is available in the left-hand navigation to display only local website listings.
The downside, from a user and a business perspective (in my opinion, of course), is that the map box with 10 listings has been removed and local search results now dominate the page. Organic search results are pushed even further down the page, although if the Places results are not relevant to the search query, they too can be pushed down. This may also have an impact on advertisers who rely upon Google AdWords to generate traffic for local terms.
Overall, I find this change quite encouraging as it presents a range of opportunities to businesses.
Google Instant, the search giant’s new method of delivering search results to its users, is being rolled out in the US, UK, France, Germany, Italy, Spain and Russia over the next several days. Search results appear on screen automatically as you type words into Google, allowing users to refine their search queries as they are being typed.
Google likes to think of it as being “search before you type” rather than “search as you type” functionality.
Google claims that Google Instant will not have an impact on the ranking of search results. However, Google Instant really is a fundamental shift in search because results are localized: Google identifies where the user is located and, as terms are being typed, displays predicted results relevant to the local area first.
Google Instant can be accessed by logging into your Google Account and using specific browsers, including Chrome v5/v6, Firefox v3, Safari v5 for Mac and Internet Explorer v8. This means that web history and personalized results will influence which websites are returned within predicted search results, so websites that have been searched upon and visited previously are more likely to appear within the new search results.
Even though there may not be a change in Google’s algorithm to determine relevant results, search user behavior is likely to change as search terms can be adapted on the fly. This will reduce the “search buying cycle” time as any irrelevant results can be weeded out quickly.
The upside is that websites that normally do not appear within search results may actually be displayed earlier in the keyword query stem, enabling the user to stop or go back to results that catch their eye. Impulse clicking may encourage users to visit websites that they may not have found normally.
The downside, which is also a benefit (depending how you look at it) is that users are more likely to click on results that are even more relevant to their search queries. Even though some websites may not generate as much traffic from search engines because the initial results are irrelevant, this may help to generate better quality traffic because the visits will be even more qualified. The challenge for site owners is creating content pages that are highly relevant to search users’ needs.
Google Instant will be rolled out to other countries including Australia and New Zealand over the next several months.
Social media and search complement each other and are increasingly becoming more intertwined than ever before. Google has launched its newest search product “Google Realtime” in an effort to provide real time content from comprehensive sources.
So far when I search within Google Realtime, Twitter results are primarily displayed.
Search results can be refined by search type (everything, blogs, news, discussions, videos, maps, shopping, books and more), time period and location.
Twitter retweets and replies to an initial tweet can be viewed as an entire conversation with additional comments indented for easier viewing. Messages are also organized from oldest to newest.
In addition, Google Realtime is integrated with Google Alerts, allowing for updates of specified topics to be emailed to you.
Google is preparing for Keyword Tool to come out of beta as per their announcement on Monday. The intention behind the latest update of Google’s Keyword Tool was to combine all the features of the original Keyword Tool and the Search-based Keyword Tool into one tool.
Both the original Keyword Tool and the Search-based Keyword Tool will redirect to the new Keyword Tool by the end of August 2010. Currently you can still view these tools online; however, in a few weeks’ time the URLs will be redirected and the “beta” label will be removed.
The updated Keyword Tool provides flexible search options, keyword refinement and advanced options. New features have also been added to Google Keyword Tool including the addition of negative keywords and the removal of duplicate keywords.
Google promises to continually develop and improve upon the latest update of Keyword Tool.
On Tuesday, Yahoo Japan announced that it will use Google search technology for its own search engine and search advertising platform. This is a somewhat different arrangement from that of its sister company Yahoo in the United States, which has a partnership with Microsoft to implement Bing search technology by the end of 2010.
Yahoo Japan is currently the search leader in Japan with a 53.2% market share of search queries, while Google, even though its share is growing, remains at 37.3%. Microsoft MSN and Bing garner a 2.6% market share.
Combining both Yahoo Japan and Google technology would mean that the company would represent approximately 90% of total search queries in the country.
The deal involves Yahoo Japan paying for Google’s search technology in addition to supplying content to Google. However, both search engines will still compete with each other.
Yahoo US and Softbank own 34.8% and 38.6% of Yahoo Japan respectively. Softbank also owns shares in Alibaba Group, which runs Yahoo China. Yahoo US supports the deal between Yahoo Japan and Google.
Source: NY Times
Twitter co-founder Biz Stone reported that Twitter’s search volume has increased by 33% since April 2010, growing from 19 billion searches per month to 24 billion in June 2010.
Most of this traffic, however, does not occur on the Twitter.com property itself, but rather through API requests from Twitter clients such as TweetDeck and Seesmic. No single third party delivers the main share of API calls.
In April 2010, Danny Sullivan spoke with Twitter’s director of search, Doug Cook, who said that queries at times reach 750 million per day, and that he expects Twitter to handle 1 billion searches per day in the coming months.
Walter Isaacson interviewed Biz Stone and Evan Williams, co-founders of Twitter, at the Aspen Ideas Festival in July 2010.
They cited the following metrics for Twitter:
- 130 million registered users
- 70 million tweets per day
- 200 million users visit the site every day
- 800 million search queries a day
Biz describes Twitter as an “information network, to get information needed now”, rather than a social network. Their positioning statement “what are you doing now” has changed over time to “what’s happening”.
Evan says that Twitter Search is still in its infancy. Twitter messages are provided to Google, Bing and Yahoo to display within their search results; however, even they say it is a search problem that is yet to be solved. Essentially, search engines like Google use “freshness” as one of their signals to find the most relevant information, but since real-time information has no document history, it is currently very difficult to deliver the best answer.
New functionality on Twitter allows users to tag tweets with location metadata such as venue name, neighbourhood or city. The exciting thing about this aspect of Twitter, if it is used widely enough, is that it can provide users with extended search capability to find out what is happening around them.
View the full video interview with Biz Stone and Evan Williams.
I have a love-hate relationship with Facebook. On a personal front, I love being connected to my friends and hate the idea of businesses marketing to me. On my professional side, I am a “search” person at heart, because I like being in control of what advertising is presented to me when I’m searching specifically for a product or service. Being found at the moment a customer is searching for you, in my opinion, is an ethical way of doing business. You are not bombarding people with messages that are irrelevant to their needs.
Since the inception of Facebook, there have been shifty marketers who create applications and trick people into providing information. I absolutely despise that type of behaviour and it has rubbed off on my perception of what Facebook is about, even though there have been many changes made by the company in regards to privacy policies and user data access.
However, I cannot ignore the ever-growing popularity of Facebook, and now the numbers stare me in the face: its community is using the internal search engine to search for products and services.
In February 2010, ComScore announced a 10% month-on-month increase in the number of searches conducted on Facebook.com, a jump from 395 million searches to 436 million.
Jumping forward to April 2010, Facebook experienced 624 million searches, although there was a slight decline of 2% in May 2010 to 609 million searches. It will be interesting to watch the search growth in the coming months.
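The month-over-month percentages quoted above follow from simple percentage-change arithmetic. A minimal sketch (the function name is my own; the inputs are the ComScore figures cited in the text):

```python
def month_over_month_change(previous, current):
    """Percentage change in search volume from one month to the next."""
    return 100 * (current - previous) / previous

# ComScore figures cited above (searches per month):
print(round(month_over_month_change(395e6, 436e6), 1))  # 10.4 (the ~10% rise)
print(round(month_over_month_change(624e6, 609e6), 1))  # -2.4 (the ~2% decline)
```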
Even though there has been substantial growth in the use of Facebook’s internal search engine, we need to analyze this further. Do people use the search engine in the same way that people use Google or Bing? What are people searching for? A keyword research tool for Facebook would be highly beneficial, but I imagine this will only be made available if it is of benefit to Facebook itself, in the same way that Google has their Keyword Suggestion Tool for the purpose of encouraging marketers to advertise with them.
AimClear Consulting Services conducted a Facebook SEO Ranking Factors study. The following is a summary of some of those results, posted by Marty Weintraub on June 24th, 2010:
- The Facebook Suggest Box surfaces users whose names contain the search query
- Facebook focuses heavily on personalization, presenting results from pages you have visited previously
- Facebook will also prioritize results for relevant events that the search user or their friends are attending
- Facebook returns pages that the user or their friends like, i.e. where they have clicked the “Like” button
- Fan Pages and Applications are returned within the results ranked by friend count, highest first
- Mentions of the business name, product or service by friends of the search user increase rankings within results
It is still early days for Facebook’s internal site search function and what impact this will have on the search industry. Over the coming year we will keep an eye on developments to watch how social search unfolds, particularly with Facebook.
SEO consultants and website owners around the world will breathe a sigh of relief once they learn that Google’s new indexing infrastructure “Caffeine” is not a change to the search engine’s ranking algorithms. Caffeine, first mentioned by Google in August 2009, was fully rolled out over the last day and provides 50 percent fresher results for web searches than the last index.
The Caffeine run down is:
- It’s Google’s largest collection of web content so far
- Newly published web content is made available through Google’s index much faster than before, reduced from a couple of weeks to almost immediately.
- Google now updates their search index on a continuous basis, globally across all data centers, regions and languages. When new pages are found, they are added directly to the index.
- Google now has significantly more storage capacity, which will enable the index to scale as more content becomes available online.
- Crawl rates will remain the same.
- Ranking algorithms remain the same, with Google indexing pages and associating anchor text and the sources of external links with those pages.
Even though Caffeine is not an algorithm change, it does mean that additional information can be annotated to web content much more quickly than before resulting in possible new ranking signals emerging in the future.
The number of words people use within their search queries indicates how specific their inquiry is and where that query sits within the search life cycle.
The fewer words we use within our search query, the earlier we are within the search cycle and the less likely we are to know what we want; usually we are in “just looking” or “browse” mode. The more words used within a search query, the more specifically we know what we are searching for, and thus the query sits later within the search cycle, where there is a higher propensity to make a purchase decision (if our search is related to shopping).
The following table from Experian Hitwise monitors the number of keywords that search users from around the world type into search engines.
[Table: Percentage of U.S. clicks by number of keywords in the search query, from one word up to eight or more words, with the month-over-month percentage change for each. Note: Data is based on four-week rolling periods (ending March 27, 2010, and May 1, 2010) from the Hitwise sample of 10 million U.S. Internet users. Source: Experian Hitwise]
The data above shows that the majority of searches conducted contain between one and three keywords. A single-word query could be, for example, “cars”, “travel” or “restaurants”. Examples of two-word queries are “Toyota cars”, “travel London” or “Brisbane restaurants”. Three-word queries are even more specific, like “Toyota Prius cars”, “cheap travel London” or “Italian restaurant Brisbane”.
One- to two-keyword terms usually denote a “search head” term: words that are highly searched upon. Three- to four-keyword terms start to fall into “long-tail” territory, whereas queries that contain five or more keywords are definite long-tail terms, searched upon less frequently but with more specific intent behind the query.
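The head/long-tail distinction described above reduces to a simple word-count rule. A minimal sketch following those thresholds (the function and label names are my own, not an industry-standard API):

```python
def classify_query(query):
    """Classify a search query by word count, following the ranges
    described above: 1-2 words are search head terms, 3-4 words start
    to fall into long-tail territory, and 5+ words are definite
    long-tail terms.
    """
    words = len(query.split())
    if words <= 2:
        return "head"
    if words <= 4:
        return "approaching long tail"
    return "long tail"

print(classify_query("cars"))                                      # head
print(classify_query("Italian restaurant Brisbane"))               # approaching long tail
print(classify_query("cheap Italian restaurant in Brisbane CBD"))  # long tail
```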
Businesses need to target a mix of keywords within their keyword strategies to ensure that they are being found for at least the one- to four-keyword terms that make up 80.37% of searches.
Single word queries are likely to be highly competitive, since the lure of gaining rankings for this type of term would deliver substantial traffic. However, one word search terms are quite broad and suggest that the user is at the beginning of their search cycle. The customer has started their search journey and it will be further defined as they learn more about the topic they are searching upon.
Targeting single-word terms is akin to running a “brand marketing” campaign. Since these words fall at the beginning of the search cycle, it is beneficial to associate your brand with, and be found for, the generic single-word queries within your category. Although these words are less likely to convert to a lead or sale compared with more specific terms, they bring excellent brand-awareness advantages and give prospective customers the opportunity to learn more about the brand. Due to the competitive nature of ranking for generic one-word terms, the effort and cost required to achieve and maintain the ranking will be high.
Many small businesses are unlikely to rank for one-word terms due to competition and cost, so they are more inclined to target long-tail keywords containing three or more words. Small businesses, though, should not necessarily be put off targeting search head terms, because with persistence and commitment they can still rank for one- and two-word terms. It does mean, however, that achieving the ranking will take longer, ranging from six months to years. Business owners need to decide what level of investment they are prepared to commit to and select their keywords for organic rankings accordingly.
So, with the number of keywords people use within their search queries, what does it say about us? I believe it says that people rely on search engines to discover information about the topic they are interested in. For the majority, Google is their “first search” destination, leaving other information sources as secondary options to explore.