Rob Cottingham from ReadWriteWeb commented on a cartoon, Search Engine Pessimized. He states, “It’s happening to more and more of the blogs I read: the personality, quirkiness and unique voice that once made them so appealing to me are fading. In their place, an SEO-driven uniformity that puts keyword placement ahead of pretty much everything.”
As an SEO consultant, I would have to agree with Rob. I've been working with a few publishers recently, and the qualities that make their headlines so magnetic and hook people into reading their articles are at risk of fading into blandness.
A message I have heard so often in the SEO industry is that we should optimize our websites not only for search engine spiders, but also for humans. Real people, folks: those who feel real emotions, who laugh, cry, shout or grumble when something they read or watch touches them.
The balance between SEO and creativity is being lost. Leaning too far towards "keyword rich" content will surely miss the point of what we are trying to achieve. In most cases, our goal is to generate qualified traffic that converts into an action such as buying a product, subscribing to an email newsletter, or generating repeat usage through brand engagement.
An audience member or customer will only want to establish a longer-term relationship with a brand if it is engaging and touches the core of what is relevant to them.
Unlike PPC, a tap that turns traffic on for as long as you shovel money into the Google slot machine, SEO exists to generate sustainable business. Sustainability for a website, whether a blog, a newspaper or an ecommerce site, means ensuring that when people arrive, they want to return.
Can SEO and creativity go hand in hand?
Sure they can! SEO is both a science and an art. If SEO leans too far towards science, you will certainly lose the sparkle that makes a piece of content go viral. We want people to find our content, click on it, enjoy it and spread it around the world to like-minded friends.
Getting our pages ranked within search engines is only the first part of the challenge. On a search engine results page with 10 possible listings a person could click on, I want them to click on my website or my client's. Content headlines need to combine SEO and creativity. A well-written, intriguing headline will certainly earn more clicks than one optimized from a purely robotic perspective.
Winning the click is the second part of the challenge. A quirky headline that is also optimized for search engines increases the chance of that listing being clicked. Keyword-rich content that is also highly engaging and thought-provoking increases the chance of people blogging about it and linking back to it. Natural link building is the best form of inbound linking and an integral aspect of SEO.
Even if there is a homogenous trend in the SEO industry, it needn't be that way. SEO guys and gals need to step up their creativity. I for one am not the most creative person or the best writer; however, creatives who can make a piece of content stand out from the rest are certainly worth their weight in gold.
SEOs and creatives make highly effective teams, however, balance is the key.
One thing you have to get used to in the Search Marketing industry is the continual change implemented by search engines. What might have worked a year ago may not work right now. Case in point is PageRank Sculpting.
Lisa Barone attended SMX Advanced in early June and, during a Q&A session with Matt Cutts, an official spokesperson for Google, learned that he (and Google) does not support PageRank Sculpting. This doesn't mean, however, that your site will get penalized if you have implemented PageRank Sculpting on it.
Essentially, Google is saying that PageRank Sculpting is no longer useful and that one's time is better spent fixing the website's architecture in the first place.
PageRank Sculpting is a technique in which rel="nofollow" is applied to internal links within a site to prevent spiders from crawling them or passing PageRank from one page to another. If a page had 10 links and half of them were nofollowed, the theory was that all of its PageRank would flow to the 5 links without nofollow. This essentially boosts the importance of the pages that have not been nofollowed and increases their chances of ranking in search results.
Matt Cutts says, "Your leftover PageRank will now evaporate". Essentially, you can no longer direct PageRank where you want it to go; PageRank Sculpting just isn't effective anymore. It is far more effective to create new content for your website than to bend and shape the flow of PageRank.
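To make the arithmetic concrete, here is a small sketch of the change. This is my own illustration, not Google's actual formula; the function names and the clean before/after split are simplifications of how PageRank passing through nofollowed links was understood to behave.

```python
# Simplified illustration (not Google's real formula) of why PageRank
# Sculpting stopped working. A page has `pagerank` to pass on and `total`
# outgoing links, of which `nofollowed` carry rel="nofollow".

def share_before_2009(pagerank, total, nofollowed):
    # Old behaviour: nofollowed links were ignored entirely, so the
    # remaining followed links split the full amount between them.
    return pagerank / (total - nofollowed)

def share_after_2009(pagerank, total, nofollowed):
    # New behaviour: PageRank is divided across *all* links first, and
    # the portion assigned to nofollowed links simply evaporates.
    return pagerank / total

# With 10 links and 5 of them nofollowed, each followed link used to
# receive twice as much as it does now; the other half is lost rather
# than redirected to the followed links.
print(share_before_2009(1.0, 10, 5))  # 0.2
print(share_after_2009(1.0, 10, 5))   # 0.1
```

The point of the sketch is that nofollowing extra links no longer concentrates PageRank on the remaining ones; it only destroys the share the nofollowed links would have carried.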
In my opinion, it is best to follow Matt Cutts' advice. I would rather focus my energy and time on developing a great website with engaging content and useful functionality. This will grab the attention of users and the media, who will want to share it with other like-minded people.
“If you have a lot of blog content for a new section of a site (100+ pages), is it best to release it over a period of time, or is it fine to just unleash 100 pages?”
Response by Matt Cutts:
I think in most cases, especially if it is high quality content, I would just unleash the 100 pages. If you are talking about 10,000 or 100,000 or a million pages, you might be a little more cautious. It’s not like this would cause any automatic penalty, but if Google sees a site that was nothing the other day and then suddenly there are 4 million pages in the index, then this could be the sort of thing where someone (from Google) will take a look at the site to confirm if it is legitimate content or automated junk that has not added value.
So, 100 pages I would not worry about, but I would make sure it is high quality content. A lot of times when you create content organically, you’ll end up with a page at a time, so go ahead and publish it when you have the new page. You do not have to wait until you have a lot of different content and batch it up and release it. It is okay to release it a page at a time. Especially if it is small scale or high quality content, I wouldn’t worry about it.
Originally named Kumo, Microsoft’s new search engine is now rebranded to Bing and is about to be revealed. Microsoft is rolling out a TV, radio, Internet and newspaper brand campaign to convince people to “use today’s search engine”.
Microsoft believes that 42% of all first query searches need to be refined which presents a huge opportunity. The goal of Bing is to reduce the time involved with searching by providing better results the first time round. They will do this by displaying related categories to help provide a better user search experience. For example, Bing will provide related categories such as Reviews and Prices for product related searches. Finding exactly what you are looking for within a shorter amount of time is quite compelling if Microsoft can pull it off.
The question remains: since Google already provides a perfectly adequate search experience, will users switch to a better engine? Or is a better search experience on another engine just a frill that the masses won't switch for?
In any case, the web needs Microsoft's Bing to succeed. Even though Google is not technically a monopoly, it dominates most markets around the world and is certainly beginning to feel like one. Google still demonstrates its ability to innovate, specifically with its recent Rich Snippets release and support of RDFa and Microformats. However, we need competition in the search industry, as it provides choice for users and stimulates innovation. It is obvious that Google's new use of data was spurred on by competitive moves from Wolfram Alpha. These new services from Google are proof that competition keeps the incumbent innovating.
Can Microsoft innovate to a level that keeps Google on its toes? I hope so. There are plenty of smart people who work for Microsoft. Contrary to popular belief, the brightest people in the world do not all work for Google. Let’s hope that the Microsoft Bing team have the ability to experiment and develop search technologies that can make a difference.
Mobile web usage, including mobile search, is growing faster than its desktop counterpart. Users expect the mobile search experience to match the performance Google provides on the desktop web: fast, relevant, comprehensive and fresh results.
There are huge challenges facing the mobile web in general, let alone mobile search. Hundreds of devices with widely varying capabilities, following no set standards, make it difficult for website owners and search providers to deliver simple, easy-to-use web services. On top of this, the mobile web is reinventing itself at the same time that people are discovering these new devices and providing feedback.
Google knows that search needs to be easy and effortless to get answers on a mobile device. It is their objective to provide “all of Google” on mobile.
The cornerstone of mobile web usage is that mobile is inherently local. The mobile phone is where the user is, and that specific location can inform results based upon locality. No longer do you need to type in the city or suburb where you are; your mobile phone and Google already know this.
Usability of Google's mobile search product is incredibly important. For example, when you tap to search, the search field zooms to the top of the screen. As you start typing, Google Suggest provides recommended results that match your search query. These are more than suggested search terms; they are actual local search results, including business name, address, phone number, post code and a link to a map. The most relevant results closest to the user's current location are displayed.
Beside each individual search listing, Google displays buttons to "click to call" and "get directions".
With image search, depending on the device, the user is able to slide from one picture to the next with ease.
Product Search for mobile renders images nicely with an application-like feel. The user can drill down several layers to gain further details such as detailed reviews or technical specs.
The mobile web is a subset of the wider web, and Google searches both to return suitable results to the user's search query.
Some websites do not have a purpose-built mobile website, but instead apply CSS to render the site for mobile devices. An example of this is CNN.com.
Some website owners, however, particularly those in Japan and China, build websites specifically for the mobile web.
Even though Google blends mobile rankings with websites from both the mobile web and the wider web, there is a bias towards the former. If your website is built specifically for the mobile web it is likely to rank higher than a “CSS rendered mobile site”, even if both are equally relevant to the user’s search query.
Source: Mobile Search Quality, Searchology Presentation, Scott Huffman, Engineering Director, Google.
I viewed Google's last Searchology event a couple of years ago, when they launched Universal Search. That single event was a game changer in the search industry: vertical searches by media type (video, images, maps, books, web pages, etc.) were blended together within the main search results.
Well, Google has done it again! Yesterday at the second Searchology event, Google announced the adoption of Microformats and RDFa, a structured format of data. This is another game changer that search marketers need to be aware of.
Similar to Yahoo's SearchMonkey, website owners who use Microformats or RDFa to format their content may have it displayed within Google search results. Google is on a never-ending quest to deliver results that most closely match the intent of the user's search query and to offer other relevant results that could further benefit their search.
Google has enhanced search result snippets by adding content from website pages such as Reviews, People Contact Information, Business and Organization Information and Products. Let's take a closer look at what data types are supported for each of these and how they can be implemented.
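As a concrete illustration, a review marked up with the hReview microformat (one of the formats the Rich Snippets program reads) can be assembled like this. The product, rating and reviewer values are invented examples, and the helper function is my own; only the class names (hreview, item, fn, rating, reviewer) come from the microformat.

```python
# Illustrative sketch: building a minimal hReview microformat snippet of
# the kind Google's Rich Snippets can pick up. The values are made up.

def hreview(item, rating, reviewer):
    """Return minimal hReview markup for a product review."""
    return (
        '<div class="hreview">'
        f'<span class="item"><span class="fn">{item}</span></span> '
        f'rated <span class="rating">{rating}</span> out of 5 '
        f'by <span class="reviewer vcard"><span class="fn">{reviewer}</span></span>'
        '</div>'
    )

print(hreview("Acme Coffee Grinder", 4, "Jane Example"))
```

The key idea is that the visible text stays human-readable while the class attributes tell the crawler which part is the item, which is the rating and which is the reviewer.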
Dee Barizo poses the question, "how many links is enough?" This is a very good question, as most clients, when they learn about link building, want to quantify the effort involved in undertaking such a task. Usually SEO professionals answer, "it depends".
It depends on what words you want to rank for and how competitive those words are.
In Dee's article, he makes a very good point that most other sites were not actively building links. It surprised him that SEO is not as competitive as he first thought, simply because most sites aren't doing it. This in turn made it much easier to rank for competitive terms through link building, although Dee's client had a head start by already being on page 2 or 3 of search engine results for specific terms.
This reinforces my observations that many marketing managers and website owners are not committed to a long term SEO strategy. Most people see SEO as a tactical implementation, rather than an ongoing program of work that realizes the full benefits. This is largely due to, in my mind at least, marketing managers being focused on “campaigns”. Once a campaign is completed, the microsite or content is thrown away, until the agency comes up with the next brilliant idea.
SEO is more than a one-off project to tick off the to-do list. It is a foundation, a philosophy within which all online activities are carried out. Campaigns should revolve around an "SEO'd platform" that can leverage the efforts and results of previously implemented tasks.
During times such as a recession, SEO is a reliable and low-cost way of generating qualified traffic to any website. All it takes is a commitment to see the strategy through.
So, how many links is enough? There is no magic answer here, but I do recommend that website owners seek at least one backlink per day. Soon enough, your online presence will be a force to be reckoned with.
Who says that Twitter and SEO do not go together? Wednesday night, Santosh Jayaram, the Vice President of Twitter Operations stated that Twitter Search will begin to crawl links submitted to its application and will index the content of those pages.
It will also incorporate a ranking system based upon the reputation of the person submitting the article. When a topic is gaining popularity within Twitter, it will be inserted into the Twitter sidebar.
The reputation system will certainly improve the quality of search results that Twitter Search provides, which currently displays low-value content and plenty of re-tweets.
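Twitter has not published how this reputation ranking will work, but the general idea can be sketched as follows. The scoring function, the logarithmic dampening and the sample data are all hypothetical, purely to show how submitter reputation could demote low-value content.

```python
import math

# Hypothetical sketch only: Twitter has not disclosed its ranking formula.
# The idea is to weight each submitted link by the reputation of the user
# who tweeted it, so low-value submissions and bulk re-tweets sink.

def score(relevance, submitter_reputation):
    # Dampen reputation with a log so high-reputation accounts boost
    # results without completely drowning out plain relevance.
    return relevance * (1 + math.log1p(submitter_reputation))

results = [
    ("spammy-retweet", score(0.9, 0)),      # relevant text, zero reputation
    ("quality-article", score(0.8, 500)),   # slightly less relevant, trusted submitter
]
results.sort(key=lambda r: r[1], reverse=True)
print([name for name, _ in results])  # the trusted submission ranks first
```

Any real system would of course also have to decide how reputation itself is earned, which is exactly where manipulation attempts would concentrate.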
Hopefully these changes will influence Twitterers to add more meaningful content to their tweets. Still, there will always be those who want to manipulate the ranking system, so let's hope Twitter Search has a few tricks up its sleeve to combat this.
For SEO professionals and their clients, Twitter Search has just become an essential item on our To Do lists.
Mozilla Labs has been working on a new plug-in called “Weave” for Firefox that allows users to log into any website account that supports OpenID. Users don’t even have to have an existing account with the website, thus making it easier for site owners to increase sign-ups.
From a user's perspective, it makes logging in or signing up easier than ever before.
To enable this function, users simply log into their browser so that Weave knows who they are.
Weave also supports normal username and password logins to create a single login experience. It allows the user to select whether they want to be automatically logged in next time when they click the login button on the related website.
Love the work you are doing Mozilla – keep up the excellent work!