David Leonhardt’s SEO and Social Media Marketing

Tips for better SEO (search engine optimization) and website marketing …

THE HAPPY GUY MARKETING

 

Archive for the ‘rankings’ Category

How to choose a link partner

Monday, November 19th, 2007

Most webmasters are at a total loss when they try to decide whether to do a link exchange.  In fact, they are so lost that they rely on how much green is showing on the notoriously inaccurate Google Toolbar.

Here is my top-5 list of how to decide if a link exchange is worthwhile.

1. The page is cached by Google.  That is the drop-dead bottom line.  If it is not cached, Google can’t find it.  And Google is the biggest search engine by far.  If Google can’t find it, chances are that Yahoo, Ask and MSN can’t either.  And chances are that real people won’t land on the page or navigate to it.

2. Relevance. The page should not be optimized for “links”, “link exchange” or “resources”, unless those are searches you are targeting in your SEO efforts.

3. Relevance.  The page should be relevant for the specific words you are targeting.  In other words, the title tag and the heading should include at least one of the main words of the search you are targeting.

4. Relevance. The page should be on topic, regardless of specific words.  If it is full of totally unrelated websites, the search engines can see that it is just a collection of random links.

5. If you can get a link on a content page, or where yours is the only external link on the page, you have struck gold!
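As a rough illustration, the checklist above can be folded into a simple scoring sketch. The function name, weights and inputs below are my own invention for illustration, not a real tool:

```python
# Hypothetical scorer for the link-partner checklist; weights are invented.

def score_link_page(cached_by_google, title, heading, target_words,
                    on_topic, outbound_links):
    """Score a potential link-partner page against the checklist.

    cached_by_google: bool  - is the page in Google's cache?
    title, heading:   str   - the page's title tag and main heading
    target_words:     list  - the words you are targeting
    on_topic:         bool  - topically related, not a random link dump?
    outbound_links:   int   - number of external links on the page
    """
    if not cached_by_google:
        return 0  # the drop-dead bottom line: an uncached page is worthless

    score = 1
    text = (title + " " + heading).lower()
    if any(word.lower() in text for word in target_words):
        score += 2  # title/heading mention a targeted word
    if on_topic:
        score += 2  # topical relevance, regardless of exact words
    if outbound_links == 1:
        score += 3  # yours is the only external link: struck gold
    return score
```

A content page where yours is the sole relevant, on-topic link scores highest; an uncached page scores zero no matter what else it offers.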

 


Grab The Bookmarketer For Your Site

Reciprocal link heresy

Friday, October 26th, 2007

Let’s go hunt us a sacred cow today, OK?  Specifically, the sacred cow that all links must be reciprocated. 

There are a number of software programs you can purchase that periodically check whether your link partners retain their links to your website, so that you can notify them, warn them, threaten them and remove their links if they have removed yours.

Others spend time double-checking by hand.

Is this money well spent?  Is this time well spent?

No.

First, you have gained nothing by removing the links of those few who have reneged on their end of the bargain.  You have not increased your link popularity.  You have not gained additional PageRank.  You have not increased your website’s trust, relevance, content, number of pages or any other indicator that will lead to higher rankings.

Second, you have just spent money buying software that could have been spent elsewhere.  Or you have spent time checking backlinks that could have been spent creating them.

Third, you might even be doing yourself a disservice by making every outbound link on your resources page a reciprocated one.  The search engines are pretty clever.  They can detect when 100% of your outbound links are reciprocated.  They can detect when 100% of your links are part of a triangular linking pattern.  Do you think they are impressed with that?  My logic is that it is to your advantage if, over time, some of your link partners renege and you have less of a pattern (remember that when it comes to linking, patterns should be avoided, for they indicate to the search engines that the links are contrived).
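To see how trivially a search engine could spot the pattern, here is a toy sketch of computing a reciprocation rate from link data. The function and the data are invented for illustration; real engines obviously work from crawled link graphs at vastly larger scale:

```python
# Toy illustration: a 100% reciprocation rate is an easy pattern to compute.

def reciprocation_rate(outbound, backlinks):
    """Fraction of a site's outbound link partners that link back.

    outbound:  set of domains the site links out to
    backlinks: set of domains that link to the site
    """
    if not outbound:
        return 0.0
    return len(outbound & backlinks) / len(outbound)

# A site where every single outbound link is reciprocated stands out:
perfect = reciprocation_rate({"a.com", "b.com"}, {"a.com", "b.com", "c.com"})

# A more natural profile, where most outbound links are not returned:
natural = reciprocation_rate({"a.com", "b.com", "c.com", "d.com"}, {"a.com"})
```

A `perfect` score of 1.0 across an entire resources page is exactly the kind of contrived pattern the paragraph above warns about.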

So, with apologies to everyone hawking link-checking software, my recommendation is to not waste another minute of the precious few God gave you checking up on your link partners.  A nice hike in the mountains would be a much better investment for your business…and of course for you!

 



SEO Software – just say NO!

Monday, September 10th, 2007

Over at my favorite SEO forum, High Rankings, somebody asked for recommendations about SEO software.  The consensus response was, “Yes, don’t do it.”

But that response deserves a little more explanation, some of which was also posted at the forum.  There are two reasons why SEO software should be avoided.

Avoid SEO software for link-building.  What do Google and company look for in links?  They are looking for recommendations.  They seek a sign that a web page is considered a good reference on a certain topic.  They are looking for natural links, not contrived ones meant to alter their results.

What does automation do?  It creates patterns – patterns that are not natural, but contrived. What is the one strength that computers have that mere mortals like you and me and Paulina Rubio do not have?  The ability to sort through almost infinite data in almost no time at all and recognize patterns. Using SEO software is like posting a neon sign that reads, “Yoohoo! We’re trying to mess with your results.”

Think your SEO software can fool the Google algorithm?  Hmmm.

Avoid SEO software, because this is a sport.  I know a lot of web folks are techies who are used to the scientific principle that if you take certain steps, you will get certain results.  Repeat the same steps, get the same results.  SEO is not like this.  If a thousand people all repeat the same steps, there will not be 1000 websites in Google’s top 10 for “Paulina Rubio lyrics”.  There will still be just 10 results.

In any competition, the goal is not to duplicate what everybody else is doing.  The goal is to do more than everyone else.  To do better than everyone else.  And, if possible, to do what nobody else thought of.  It’s OK to study the competition.  It’s OK to study others who are not competitors.  It’s OK to take the best of what each of them is doing, but then you have to go out and do the very best you can.  No me-too software program will do that for you.

All that being said, I do use some software for SEO purposes. 

I use Internet Explorer to view websites (Yes, IE is software.)

I use Roboform to prefill forms for directory and article submissions.  But note that I manually edit important things like “title” and “description”.  Roboform just saves me from having to misspell my own name hundreds of times a day.

I use Keyword Discovery to help research the best search terms for my clients (It’s a web-based application, but it counts as software).

And of course, I use Word to compose articles and news releases, to edit source code and to do plenty of additional tasks.

And let’s not forget WordPress, which I use to blog about SEO.  :-)

So software, yes.  Software to automate SEO, no.

 



Google cracking down on paid links

Tuesday, June 12th, 2007

First, let me preface this post by saying there is nothing wrong with buying paid links, regardless of what Google says or what you think Google says.  Paid links are called advertising.  “Free” links, which are never actually free, are called public relations.  This has been going on since someone in ancient Egypt first wrote a sandwich board reading “The End is near” and someone else asked, “How much to add ‘Reserve your burial plots today!’”

However, Google does not appreciate links sold strictly to boost PageRank, specifically targeting its ranking algorithm.  This is understandable.

So what is a website owner, intent on promoting his website and his services, to do?  Go for the best links possible, whether they are paid or free, while sticking within budget.  If most of your links are paid, that sends up quite a red flag that maybe there is nothing on your website of enough value to actually earn links.  In fact, that in itself is a pretty good case for Google to demote your website in its rankings.

On the other hand, if there is a website that could send you some targeted traffic, shows high relevance and offers good link juice, why not pay for the link?  Google will not penalize you for having bought a link or two; Google will penalize you for trying to purchase a re-arrangement of its listings.

 



SEO and Length of Text

Friday, April 27th, 2007

Over at High Rankings I just responded to a question about whether it really is necessary to have 200 – 250 words of text on your page for SEO, when the title and meta description tags are optimized.  Surely a newbie question, but one that clients often bring forward based on a very quick reading of the plenty of inaccurate information floating around on the Web.  Here is the response I just posted:

I suppose it is redundant to have nine players come up to bat when you really need no more than four (in the event you load the bases), so why not just field a team of four players?  As for limiting your text to a sparse 200-250 words, why not limit your team members to anyone 5′ and under?

The answer, of course, is because only one of two teams will win any given game, and every advantage you have is good and every advantage they have is bad. Only ten of a million or two web pages will be in Google’s top ten for any given search phrase, and every advantage you have is good and every advantage they have is bad.

There is a notion that search engine rankings can be achieved primarily in a scientific manner.  Science is when repeating the same action provides the same result, no matter how many times you repeat it.  But SEO is more like a sport, where the rules and the players are constantly changing, and where you are competing for the prize and there can be only a handful of winners and many, many losers.

 



My Right to Google Rankings

Tuesday, April 17th, 2007

I have the right to Google rankings.  After all, I pay taxes to Google, don’t I?  And the Constitution says that I have rights to Google rankings, doesn’t it?

Is it just me, or is this how most webmasters think?  The latest kerfuffle (is that how you spell it?  Is kerfuffle even a word?) began when Google’s webmaster liaison Matt Cutts blogged that people should report paid links to help Google develop ways to reduce the skewing effect of paid links in its search results.

Quite frankly, it’s a little silly to expect most people to go along with this, and Matt could probably find plenty on his own, but he apparently wants some outside feedback to catch what he might have missed.  So what?  It’s his right to ask in his blog for any kind of feedback he wishes, just as it is my right to ask for any feedback I wish.  It’s up to people to decide whether they wish to provide that feedback.  Nobody is obliged to report anything.

But the debate is raging strong at Threadwatch and at WebProWorld.  Here are a few of the incredible things people are saying:

“Isn’t this somewhat hypocritical? Doesn’t Google sell links through AdWords?”
 

“It’s alright to sell links just as long as we’re the ones selling them. That’s the message I’ve been getting loud and clear from Google.”
 

“If I want to buy a link to generate traffic (not caring about SEO) or I want to sell a link because people want my traffic, who is Google to tell me I can’t or my site will be punished.”
 

“We don’t owe Google anything. Google owes us everything!”
 

AdWords are paid links, but they do not affect the content of anyone else’s site without their consent.  If I sell links on my site, it absolutely affects the content on Google’s site, so they have every reason to be concerned.  They have no right to stop me from selling links, but they have every reason to want to control for the effects those paid links would have on their results…which is what they are hoping to do. (Google is not threatening to punish any site.)
 

How about this comment:
 

“I think Google should show us the alternatives if they don’t want us to go down the paid link route.”
 

Considering that I have been doing SEO for, what, 3 or 4 years now without buying almost (I said “almost”) any links, I think we all know how many linking alternatives there are.
And now there is an article by iEntry CEO Rich Ord, 7 Reasons Google’s Paid Link Snitch Plan Sucks, that panders to the congregation (although at least his arguments make a little more sense, except for #6: the hypocrisy of being in the business of selling links and then asking others not to sell them is a bit much for many webmasters).

Here is my take:  It is my business and mine alone whether I sell links or not, and mine and mine alone whether I buy links or not.  It is Google’s business and Google’s business alone to decide which links, if any, will form part of its algorithm calculations.  And as much as everybody seems to think they own Google, they do not.  It might be silly or even useless to ask people to report paid links, but the vitriol and false entitlement are clearly  misplaced.


 



Younanimous – AfterVote

Monday, April 16th, 2007

I have a new favorite search engine: http://younanimous.com/ .  Younanimous has just changed its name to AfterVote, but it remains at the same URL. Why is this such a cool engine?  For three reasons:

1. It provides a hybrid of Google, Yahoo and MSN, so results are not skewed by one engine’s particular preoccupations.

2. It allows you to customize results more than the big ones do.  You can filter out PDF documents, for example.

3. It puts more info at your fingertips. In addition to the individual results of Google, Yahoo and MSN, it gives Alexa data, Google PageRank, Compete data and WhoIs data for each result.

Here are a few ways to use AfterVote:

1. When quoting for new SEO clients, one can more quickly determine how competitive the field is by searching for a couple of the top search terms.  At Google, I would have to click on several results to see their PageRank and how well they are optimized on-page.  Here I can quickly see the PageRank of all the top players and whether they are the same players across all the engines (800-pound gorillas) or whether the results are more fickle (much less competitive).

2. For a website owner, this can be used as a keyword research tool.  Hmmm, do I optimize for “hotel jobs” or for “hotel careers”?  This is much more intuitive and probably more reliable than KEI data.

3. It returns only one result per domain.  Note that Alexa and Compete data are domain-specific, not page-specific, so they are good for broad assessments: which sites would make good JV partners, which would be good for a link exchange, which might be good for advertising, etc.

4. People who are afraid to use Web Position or other rank-checking software can quickly see how they are doing across the three engines for their top keywords.

5. One weakness is that it returns only 10 results from Google, whereas you can set it to 100 at Yahoo and 50 at MSN.  Hopefully this will be upgraded somehow.

 



The Silly Myth About Reciprocal Links

Monday, April 2nd, 2007

“I would like to exchange links with you to maximize Page Rank on Google for both of us, but it is important not to link to the same site – they need to be different sites to count on Google” 

It’s not the first time some linking email message has lectured me about how Google ignores reciprocal links.  Of course, that’s total phony baloney.  Google values links based on their value, not on whether the other site links back.

It is actually a very natural thing for two sites in the same niche to link to each other.  It is also good marketing to exchange visible links with non-competing, related websites.  And it is totally legitimate to show visitors and search engines alike that you are related in topic to another website that Google might also value.

Google has no interest in discounting legitimate reciprocal linking.  What Google does want and even need to discount are links set up to mess up its results.  All links built solely for the purpose of cooking Google’s results are therefore discouraged.  Those that are aggressive enough to skew Google’s results must be stopped.  Google has that obligation; otherwise it will lose its clientele.

In case you, too, are tired of receiving such misinformed emails, here is how I just responded to one:

“I think you have been taken for a bit of a ride by some way-too-clever SEO charlatan who thinks that reciprocal linking is being penalized or discounted by Google.  At best, three-way link exchanges add some variation amidst two-way link exchanges; at worst, the search engines (who can easily read such schemes) would read this as an attempt to scam them.  I personally don’t think it makes a hill of beans difference whether there are two-way or three-way exchanges.  I do what makes sense for each website.”       

 

 

 



A brand-new SEO scam

Friday, March 9th, 2007

This SEO scam is so new that it hasn’t even begun yet, at least not to my knowledge.  I don’t want to give SEO scammers ideas, but I am 100% certain that this is coming and that there will be many, many, many (did I mention “many”?) unsuspecting webmasters who will fall for it, so let’s for once get the warnings about the scam out there before it begins. 

Google’s new personalized search has already begun, and within months it will start to skew Google’s rankings in two ways.  

First, data Google gathers about how people are searching will certainly start to be factored into the general algorithm.  This means that on-page relevancy and inbound links will have to share the stage with such factors as click-through rates, click-back rates (back to Google from the site), length of visit, number of pages viewed, repeat visits, etc.  In other words, Google will be better able to measure “good” content from trash.  

A whole industry will sprout up to help webmasters take advantage of this, much of it black hat (like click fraud, perhaps?), some of it white hat, mostly aimed at creating more “sticky” content, improving click-through rates and encouraging people to “vote” in some way for the site.  On the white hat side, The Bookmarketer can help you move ahead right away, as I reported in this post on how to use social bookmarking to a website’s advantage.

Second, the data it collects from each individual will be used to present more personalized results to that individual.  Exactly how this will work remains to be seen, as there are many ways that Google has hinted it can factor the information into a person’s individual results.  But one thing is for certain…as soon as SEO scammers get a sense of some of the factors that affect personalized results, the scamming will begin.  Here is exactly what the scammers will do:  

1. The scammer will tell the website owner to sign up for a Google account.  

2. The scammer will tell the webmaster to “visit your website every day” or “visit at least ten pages of your site in succession every day” or “Google bookmark your website” or “do the following ten searches and click on your site from the rankings every day”.  The precise instructions will depend on the factors that most influence personal search.  

3. The scammer will promise that the website owner will see his site move up in the rankings.  And he will see it move up in the rankings.  But only on his own computer, using his personalized search.  Even if his website shows up as #1 for “broken glass”, none of the broken-glass-buying market may ever see his site in their results.

This scam won’t fool everybody.  It is most likely to work on the little guy, who operates from one computer and would not think to compare results.  It might not work forever, but what scammer will stick around to argue the finer points once he’s sucked the money out of an unsuspecting website owner’s pockets? 

Google will surely take steps to reduce this in order to protect the integrity of its results (remember the searcher is whom Google must please), but like every game of locks and lock-pickers, there will be plenty of scams flying under Google’s radar or keeping one step ahead. 

The best protection a webmaster has against this sort of scam is to include mention of it in passing in every article posted on the Internet about personal search.  Hopefully not too many webmasters will miss it before hiring an SEO scammer.  And that’s why today I am outing the scammers before they even start!

 



KEI Formula Misleads for Keyword Competitiveness Research

Friday, February 23rd, 2007

Many SEO specialists wonder why I don’t use Keyword Effectiveness Index, or KEI, to research the right keyword phrases to target.  On the surface, the KEI formula makes sense, and it struck me as so obvious when I first learned about it. 

To the best of my recollection, WordTracker invented KEI, and their original description of the formula was, “the KEI compares the Count result (number of times a keyword has appeared in our data) with the number of competing web pages to pinpoint exactly which keywords are most effective for your campaign.”  What better way to research keyword competitiveness?

At first a fan, I did eventually come to my senses.  This formula tracks how many websites are in a given database for a searched term.  But it is not the volume that counts; it’s the distribution.  Here’s an analogy… 

Which way would you prefer to cross a city on foot:

1. A small alleyway, with a thousand thugs lounging in cafes around the city.
2. An equally small alleyway, with a dozen bloodthirsty thugs in the alley bent on stopping you.

KEI would lead you down the equally small alley…the one with very few keyword-phrase competitors, but all of them right in your way, fighting hard for their high search engine rankings.  Is that what you want?  Of course not.  Keyword popularity is not the selection criterion that matters.  The SEO game is not a democracy…at least not yet, but that’s another story.
I had a sort-of related question from a client today:

Say for instance the word “tennis” was hyperlinked all over the web on all different pages and sites yet the links could be linking to 100′s of different places. Doesn’t that make the word “tennis” more competitive because other sites are trying to use it to increase their chances in trying to get it to show up in the search engines?


On the surface, her proposal made eminent sense, but it’s not the total volume that counts, rather the distribution.  Here was my response to her:

That depends.  If there are a million links with the word “tennis” in them, pointing somewhat evenly to 100,000 sites, the most any one site might have would be, just for example, 20 or 25 links with the word “tennis”. On the other hand, there might be only 500,000 links with the word “badminton” in them, pointing to 100,000 sites, but skewed toward a dozen sites that have been battling it out for top rankings, each with 2,000 – 10,000 inbound links with the word “badminton”.  It’s not the volume that counts, but the distribution.

Look very carefully at the top 10 ranking websites for a given search term at your favorite search engine…and how well-optimized those sites are for the keyword, how many inbound links they have, what the quality of those links appears to be, etc.  Don’t rely on the KEI formula or any other web-wide aggregate figures for keyword selection.
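For the record, here is the arithmetic in a sketch. KEI is commonly described as search popularity squared divided by the number of competing pages (WordTracker’s exact formula has varied over the years), and the numbers below are invented to show how two keywords can look very different by KEI while the one KEI favors is actually the harder fight:

```python
# KEI in one commonly cited form; all traffic and link numbers are invented.

def kei(searches, competing_pages):
    """Keyword Effectiveness Index: popularity squared over competition."""
    return searches ** 2 / competing_pages

def toughest_competitor(inbound_link_counts):
    """What actually matters: how entrenched the top-ranking sites are."""
    return max(inbound_link_counts)

# "badminton" looks easier than "tennis" by KEI (half the competing pages)...
badminton_kei = kei(8000, 500_000)    # 128.0
tennis_kei = kei(8000, 1_000_000)     #  64.0

# ...but its top sites hold thousands of inbound links each, versus a
# couple of dozen in the evenly spread "tennis" market. KEI sees only the
# totals; it is blind to this distribution.
tennis_leaders = [25, 24, 22, 20, 20]
badminton_leaders = [10_000, 8_000, 5_000, 2_000]
```

By KEI, badminton is the “effective” pick; by the distribution of links among the sites actually occupying the top ten, it is the dozen bloodthirsty thugs in the alley.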
  

 



David Leonhardt’s SEO and Social Media Marketing is proudly powered by WordPress
Entries (RSS) and Comments (RSS).
