Wednesday, March 31, 2010

KPIs and Monthly Objective Metrics for People Who Do SEO

Posted by richardbaxterseo. Today's post is inspired by a brilliant question that came up recently in Q&A. The question was based on targets and objective setting for SEOs, and it went something like this: "What metrics should an SEO's monthly objectives be based on?" Having spent a good portion of my SEO career managing SEO teams in-house, this question really reminded me how interesting the topic of organisational SEO can be…

Follow, update and search all your social networks in one place

Nsyght is a web application for Twitter and Facebook that lets you search information from your friends in real time, post directly to your networks, and search and filter your own "personal fire hose" of information. You might remember Nsyght as a bookmark-based search engine from when I wrote about it back in 2008. It was down for some time and is now reborn as a web client that allows you to aggregate, share, and search your social graph in real time.

Twitter Highlights Top Tweets, Users on its Homepage

Twitter has started rolling out a new design of its homepage, the page that loads in your browser when you visit Twitter.com and are not signed in to your Twitter account. It's not so much an aesthetic redesign as a change in what is displayed on the homepage. Previously, Twitter.com just gave you the large Twitter header containing the Twitter logo, a search box, and the famous tagline that says "Share and Discover What's Happening Right Now, Anywhere in the World,"…

SEO

The SEO industry has been plagued for years by a lack of consistency with SEO terms and definitions. One of the most prevalent inaccurate terms we hear is "duplicate content penalty." While duplicate content is not something you should strive for on your website, there's no search engine penalty for having it.

Duplicate content has been and always will be a natural part of the Web. It's nothing to be afraid of. If your site has some dupe content for whatever reason, you don't have to lose sleep every night worrying about the wrath of the Google gods. They're not going to shoot lightning bolts at your site from the sky, nor are they going to banish your entire website from ever showing up for relevant searches.

They are simply going to filter out the dupes.



The search engines want to index and show to their users (the searchers) as much unique content as algorithmically possible. That's their job, and they do it quite well considering what they have to work with: spammers using invisible or irrelevant content, technically challenged websites that crawlers can't easily find, copycat scraper sites that exist only to obtain AdSense clicks, and a whole host of other such nonsense.

There's no doubt that duplicate content is a problem for search engines. If a searcher is looking for a particular type of product or service and is presented with pages and pages of results that provide the same basic information, then the engine has failed to do its job properly. In order to supply users with a variety of information on their search query, search engines have created duplicate content "filters" (not penalties) that attempt to weed out the information they already know about. Certainly, if your page is one of those that is filtered, it may very well feel like a penalty to you, but it's not - it's a filter.
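How the engines actually deduplicate results is proprietary, but the "filter, not penalty" idea can be sketched in a few lines. The toy function below (the name `filter_duplicates` and the fingerprinting approach are my own illustration, not anything Google has published) keeps the first result it sees for each distinct piece of content and silently drops later copies from the result list:

```python
import hashlib

def filter_duplicates(results):
    """Keep the first result for each distinct content fingerprint.

    Later duplicates are filtered out of this result list, not
    penalized -- they simply don't appear for this query.
    """
    seen = set()
    unique = []
    for url, content in results:
        # Normalize whitespace and case so trivially reformatted
        # copies produce the same fingerprint.
        fingerprint = hashlib.sha256(
            " ".join(content.lower().split()).encode("utf-8")
        ).hexdigest()
        if fingerprint not in seen:
            seen.add(fingerprint)
            unique.append(url)
    return unique

results = [
    ("example.com/hats", "Fine hats in many colors."),
    ("scraper.net/hats", "Fine  hats in many colors."),  # scraped copy
    ("other.org/caps",  "Caps and beanies for winter."),
]
print(filter_duplicates(results))  # ['example.com/hats', 'other.org/caps']
```

Notice that the scraper page isn't punished in any way; it just loses the dedup coin toss and doesn't show up, which is exactly why being filtered can feel like a penalty without being one.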

Penalties Are for Spammers

Search engine penalties are reserved for pages and sites that are purposely trying to trick the search engines in one form or another. Penalties can be meted out algorithmically when obvious deceptions exist on a page, or they can be personally handed out by a search engineer who discovers the hanky-panky through spam reports and other means. To many people's surprise, penalties rarely happen to the average website. Sites that receive a true penalty typically know exactly what they did to deserve it. If they don't, they haven't been paying attention.

Honestly, the search engines are not out to get you. If you have a page on your site that sells red hats and another very similar page selling blue hats, you aren't going to find your site banished off the face of Google. The worst thing that will happen is that only the red hat page may show up in the search results instead of both pages showing up. If you need both to show up in the search engines, then you'll need to make them substantially unique.
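"Substantially unique" is fuzzy, but one common way to measure near-duplication (a standard information-retrieval technique, not Google's unpublished algorithm) is word-shingle overlap: break each page into overlapping runs of a few words and compare the sets. Two product pages that differ by one word score very high:

```python
def shingles(text, k=3):
    """Return the set of overlapping k-word runs in the text."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of the two pages' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)

red = "Our red hats are hand stitched from premium wool and ship free."
blue = "Our blue hats are hand stitched from premium wool and ship free."
print(round(jaccard(red, blue), 2))  # → 0.67
```

A score that high is well inside near-duplicate territory, so expect one of the two pages to be filtered; the fix is the same one the paragraph above suggests, which is making each page's copy genuinely different.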

Suffice it to say that just about any content that is easily created without much human intervention (i.e., automated) is not a great candidate for organic SEO purposes.