David Leonhardt’s SEO and Social Media Marketing

Tips for better SEO (search engine optimization) and website marketing …


Archive for the ‘algorithms’ Category

Google is not fair (and is not meant to be)

Tuesday, October 9th, 2012

I was asked the following question recently about Google: “I still don’t understand how other sites post articles that are not original yet they do not get penalized?”

I am sure you have been asked this question many times. Maybe you have asked it yourself many times. I have certainly heard it posed in many different ways: why does one site get Panda-slapped or Penguin-slapped and not another? As an SEO consultant myself, I have been amazed at how one site with a fairly good link profile, but with some "unnatural" links, can receive Google's dreaded "unnatural links letter", while another site with a much more questionable link profile doesn't.

But sometimes you have to hear a question posed in many ways, many times before you get that Aha! moment when The Obvious Answer is revealed.  This was that moment.

The Obvious Answer

The Obvious Answer is actually a question: Why do some smokers live 100 years, while others are struck down by lung cancer at 43 or 47 or 54?

It’s just not fair.

Which brings us to the second part of The Obvious Answer: life is not fair (as I keep telling my kids every time one of them screams out "It's not fair!"). And neither is Google.

Let’s review what Google’s ranking goal is, which I can assure you has nothing to do with fairness.  Google’s goal is to provide searchers with what will be most useful to them.  We use search engines to find what we want; Google does its best to deliver.  It does not always succeed (although it obviously does well enough, or we would all be using some other search engine).

My brother, the human search engine

I am reminded now of the purchasing habits of one of my brothers.  Once he takes an interest in buying something, he does endless research.  He is determined to find the best price.  He is determined to find the best features.  He is determined to find the most durable option.

But most of all, he is determined not to discover, six days after buying something, that there was an even slightly better option he missed.

As a result, he often gets better deals than I do.  It’s not fair.

But even with all his research and delaying, he still might not get the very best option.  It’s not fair.

Which means that a vendor or manufacturer with something slightly better might still have missed a sale.  It’s not fair.

And that also means that a vendor or manufacturer got a sale he might not have gotten.  It’s not fair (but they are not complaining, right?).

And when Google ranks web pages, it’s not fair.  And it is not meant to be.  Google’s job, to once again restate the often overlooked or ignored obvious, is to provide searchers with what will be most useful to them.

What SEO is all about

So the job of SEO practitioners is…

Come on, what is the obvious answer?

You can do it.

To make sure our websites are the most useful to searchers.

Now I know that you will say that it is the designer’s and programmer’s jobs to make sure the website is most useful, functioning well, converting well, etc.  True enough.  But it is the SEO’s job to make sure that, for a given search term, the site actually delivers.  Obviously there is some overlap and cooperation required with the designer and the programmer on the technical front, but mostly the SEO needs to make sure the content is what searchers are looking for.

Relevant.

Important.

And, above all, useful.

And the SEO consultant has one additional job, besides making sure the content is most useful – and this is key – making sure the search engines know the content is the most useful.  It is about writing.  It is also about promoting. Yes, all the “content is king” and “quality over quantity” and “avoiding bad neighbourhoods” and “backlink strategies” can be distilled down to this very simple goal.

But what happens if Google doesn’t notice the right things?  What happens if Google does notice the wrong things?  What happens if somebody else is shouting louder?  What happens if someone else makes a more useful web page?  What if Google disagrees that your perfect page is best?

Like I said, it’s not fair.  It’s not supposed to be.  That is The Obvious Answer.

The Practical Answer

Of course, if you've been hit by a penalty, such as the "unnatural links letter", or just been demoted by a Penguin slap or hit by Google's brand new EMD (exact match domain) artillery, and find yourself grumbling that it's not fair, you will probably find "It's not supposed to be" a less than satisfying answer.

It is also a less than practical answer.

The practical answer is to avoid doing anything that the search engines might one day decide is spammy.  Yes, that is a whopper.

And quite impossible.

Once upon a time, you could not be penalized for who linked to you, only for who you linked to. This made sense; it kept competitors from building piles of spammy links to your site – "negative SEO". But with Google's Penguin and the "unnatural links letter", times have changed. Despite Google's protests to the contrary, I cannot see how negative SEO can be stopped right now.

Not long ago, any publicity was good publicity.  If you could get a mention in the New York Times or the Wall Street Journal, that was amazing.  Still is.

But if you couldn’t get that kind of coverage, you could still spread your message through blog networks, article submissions, etc.  Much less targeted, much lower quality, much more of a shot in the dark.  But 100 percent legitimate.  Sometimes you market with a rifle, sometimes with a shotgun.  Fair enough?

But now if you market with a shotgun, Google will look at all those low quality repetitive links and down the sink goes your website.  No, it’s not fair.  Especially since it is retroactive, penalizing your site for doing in the past what used to make sense back then (and still would make sense if you don’t care about Google rankings).

So it is not always possible to predict what will get you in trouble, but it is pretty clear that quality over quantity is a good rule of thumb.  Stay away from anything mass-produced or mass-disseminated.  Avoid any get-rich-quick (get-links-quick) tactics.  Take the time to create original content – truly original content, not just rehashed repetition.

You still might get tripped up by suddenly changing algorithms.  Watch how Infographics get treated in a year or two. You still might find yourself at some point in the future grumbling “It’s not fair.”  But your odds of being on the winning side of the not-fairness will be much, much greater.

Additional advice? Hang on tight!

Google’s Penguin Update…

Friday, May 11th, 2012

…as experienced by more webmasters than I care to count:

Oh, yes. And this is how many of those same webmasters would like to deal with Google’s penguin (sorry, but you do have to watch the full 1:47 video to the end to see the full wrath of the webmasters).

The Newest Oldest SEO Tool

Thursday, January 28th, 2010

The latest SEO tool is not an automated submission device or some web page analyzing script.  It’s the thesaurus on your desktop.  No thesaurus?  Better get one soon.  Google has just announced that it has made great advancements in reading synonyms.

"While even a small child can identify synonyms like pictures/photos, getting a computer program to understand synonyms is enormously difficult, and we're very proud of the system we've developed at Google."

What does this mean for you?


If you are optimizing for "real estate Kentucky", you had better not leave off related search words like "homes", "property", etc. These words will be treated as synonyms of "real estate", and "real estate" will be treated as a synonym of them. More variations – in other words, more synonyms – look a lot more like natural language than the forced language of always using the same word just for SEO purposes.
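
To make that concrete, here is a rough sketch of how you might audit a page for synonym coverage. The synonym map and the page text are made-up examples – Google does not publish its synonym data – so this only illustrates the "use more variations" advice, not how Google actually matches terms.

```python
# Illustrative only: a hand-built synonym map standing in for whatever
# synonym data Google actually uses (which it does not publish).
SYNONYMS = {
    "real estate": ["property", "homes", "houses", "realty"],
}

def synonym_coverage(page_text, target_phrase):
    """Report which variations of the target phrase appear in the page copy."""
    text = page_text.lower()
    variations = [target_phrase] + SYNONYMS.get(target_phrase, [])
    return {term: term in text for term in variations}

page = "Find Kentucky homes and property listings from local real estate agents."
print(synonym_coverage(page, "real estate"))
# {'real estate': True, 'property': True, 'homes': True, 'houses': False, 'realty': False}
```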

It also means that one website can more easily dominate a niche across several searches. For instance, a page that was ranking very high for "real estate Kentucky" due to an astounding backlink profile, but was ranking at 100 for "Kentucky property", might suddenly become competitive for "Kentucky property". This is just an uneducated hunch, but I suspect that the strength of your backlink profile could help you greatly with searches for synonyms of the terms you are actually optimizing for.

In any case, this is good news for searchers, since their true intent is more likely to be satisfied. As web marketers, we also want to satisfy them, so make sure you use natural language with a generous use of that thesaurus – both on-page and in the text of your backlinks.

What PageRank Can Tell Us About SEO and Bounce Rates

Wednesday, December 31st, 2008

Last night I posted about what Google has to say on SEO and bounce rates. You can view the post here. A great question was posted by Wilson: “David, I was wondering, why Google want to have two different answer for the bounce rates…? ”

Even broken down into four parts, my response was longer than allowed for comments on this blog, so I decided to make it a post on its own.  Here is my answer to Wilson’s question:

Great question. I will get to that, but let us look at another misunderstood part of the Google algorithm.

We have been wondering for years why Google has three different measures for PageRank. The real PageRank calculation used in its algorithm is a complex logarithmic calculation. All other things being equal, a link from a PR4.12964 page is probably worth many links from a PR3.45294 page, for instance. (We have no idea to how many decimal places the real PageRank is calculated, nor whether this has remained steady over the years or whether it fluctuates over time.)

Then there is the PageRank in the Google Directory, which supposedly is on a scale of 8. I can’t find any reference to the 8-point scale in the Directory, but the Wikipedia article on PageRank is a good reference on this point. Interestingly, the Google Directory states that…

"The green ratings bars are Google's assessment of the importance of a web page, as determined by Google's patented PageRank technology and other factors. These PageRank bars tell you at a glance whether Google considers a page to be a high-quality site worth checking out."

Note the “and other factors” wording.

Finally we have the famous Toolbar PageRank, a green bar on a scale from zero to ten. This is what most webmasters mistakenly refer to as Google's PageRank calculation. However, it is just an estimation that makes a PageRank of 4.0001 look the same as a PageRank of 4.9999, even though the latter might be worth many times the former. Meanwhile, it makes a PageRank of 4.9999 look much less valuable than a PageRank of 5.0001, even though the two are almost the same. Furthermore, everyone involved in SEO can recount numerous instances where a page "should" have a much higher or much lower PageRank than another page, based on the number and value of incoming links, but the Toolbar PageRank does not reflect that. (For instance, I have noted on many sites that a "links" page with identical link juice to a PR3 content page might nevertheless show a PageRank value of zero.)
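
A toy calculation shows why the toolbar hides so much. This is purely a sketch: it assumes a logarithmic bucketing with an arbitrary base of 8, and Google has never published the real formula or base.

```python
import math

def toolbar_bar(raw_score, base=8.0):
    """Hypothetical mapping from a raw, link-based score to the 0-10 green bar.

    The logarithmic base of 8 is an arbitrary assumption for illustration;
    the point is only that truncating a log scale makes very different pages
    look identical and nearly identical pages look different.
    """
    if raw_score <= 0:
        return 0
    return max(0, min(10, int(math.log(raw_score, base))))

print(toolbar_bar(8 ** 4 * 1.01))  # bar 4, just over the threshold
print(toolbar_bar(8 ** 5 * 0.99))  # still bar 4, yet roughly 8x "stronger"
print(toolbar_bar(8 ** 5 * 1.01))  # bar 5, barely stronger than the page above
```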

What does this tell us about bounce rates?

Just like PageRank, bounce rate is a metric Google shares with its users. PageRank is viewable to everybody; bounce rates are viewable only to the website owner. In both cases, Google is showing a very simple calculation … a number people can use to quickly make comparisons between pages, between websites, between last month and this month, etc.
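
To make the point concrete, here is a sketch of that kind of simple calculation, using the common single-page-session definition of a bounce and made-up session data. It is the sort of number a site owner sees in an analytics report, not anything a ranking algorithm would rely on.

```python
def simple_bounce_rate(session_pageviews):
    """Bounce rate as analytics tools commonly report it: the share of
    sessions that viewed only one page.

    `session_pageviews` is a list of page-view counts per session
    (made-up data, purely for illustration).
    """
    if not session_pageviews:
        return 0.0
    bounces = sum(1 for views in session_pageviews if views == 1)
    return bounces / len(session_pageviews)

print(simple_bounce_rate([1, 3, 1, 5, 2, 1]))  # 0.5, i.e. a 50% bounce rate
```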

As I wrote above, "It would be a ridiculously simplistic algorithm that calculates bounces using such simple calculations." Any serious calculation of bounces applied to a search engine ranking algorithm would have to be such a complex multidimensional equation that it would be useless to you or me as humans viewing it with our eyes (unless you happen to be a mathematical genius – and I mean genius – which I am not by a long shot).

Except to the extent that a search engine chooses to reveal how it treats bounces and other actions in its algorithm, we will never know for certain what plays a role and what does not, nor how big a role each factor plays. This is par for the course with ranking algorithms.

It is also entirely possible that Google and the other search engines do not yet include bounce rates and related user actions in their algorithms. Adam Lasnik's comments quoted in my previous post are good hints, but they are hardly official. Google engineer Knut Magne Risvik, speaking in Norwegian at Digi.no, said that Google can measure how many seconds pass between a user clicking on a link and clicking back to Google, and that a short time means the visit was a failure – but that is not quite an official Google statement either. The only search engine that has released anything official is MSN, through its BrowseRank paper … and that is not a statement of current practice but of future intentions.

As this very young field matures, Google might also change its Google Analytics definition of “bounce rate”. SEO aside, the raging debate over whether a high bounce rate could sometimes be a good thing (depending on the nature of a website) makes a good case for changing the Google Analytics definition, too.

The summary to all this is that I have to answer Wilson with a simple "I don't know". But, just like defining "bounce rate" and "PageRank", such simple answers are really a lot more complex than they look.

Is an SEO’s Place in the Kitchen?

Friday, December 12th, 2008

I wrote this post as a comment on Barry Welford's blog, and it got so long and involved that I realized it would make a great blog post right here… especially since it really is the foundation on which I wrote the Sticky SEO ebook.

Bounce rate is a great measurement of performance, of the usefulness of a website.  It is not the only one, as has already been discussed, and on its own would be a poor measurement.  Leaving a site through an affiliate link (or any other link) should not be considered a bounce.  It should be considered an external referral. 

Whenever anybody clicks on a result in Google, there are four potential next actions (I sketch them in code after the list).

  1. Bouncing back to Google, especially after only 3-5 seconds, is a sign that Google had served up a less-than-useful result. Not good news for ranking well.
  2. Referring to a deeper link in the site (an interior page) is, as Barry says, "normally a confirmation that they are finding something of interest". Good job, Google; keep ranking that page for the search that was just performed.
  3. Referring to an external link is also a sign that the searcher found something useful on that page, which is why for SEO the New York Times is making a wise decision. Searcher happy, Google happy. Keep on ranking.
  4. Closing the browser window. Yes, that is the fourth option, which means simply that the searcher's wife just called, "Honey, dinner is ready." (Hopefully that won't affect rankings one way or the other, or else we'll need a kitchen-centric SEO strategy in the future.)
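
Here is that classification in code. It is only a sketch: the event fields are hypothetical, and the 5-second cutoff is my reading of the "3-5 seconds" rule of thumb above, not a number any search engine has published.

```python
from dataclasses import dataclass

@dataclass
class PostClickEvent:
    """What a searcher did after clicking a result (hypothetical event data)."""
    seconds_on_page: float
    next_action: str  # "back_to_google", "internal_link", "external_link", "closed_window"

def classify(event: PostClickEvent) -> str:
    """Rough classification of the four outcomes listed above."""
    if event.next_action == "back_to_google":
        if event.seconds_on_page <= 5:
            return "bounce: the result was probably not useful"
        return "late return to Google: a much weaker negative signal"
    if event.next_action == "internal_link":
        return "internal referral: the page held the searcher's interest"
    if event.next_action == "external_link":
        return "external referral: the page pointed the searcher somewhere useful"
    return "no clear signal: dinner was ready"

print(classify(PostClickEvent(3.2, "back_to_google")))  # bounce
```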

Sticky SEO e-Book released

Wednesday, December 3rd, 2008

After a month of working on it, and at least a month of technical delays, I have finally released Sticky SEO. This groundbreaking SEO guide will help you get prepared for the wave of algorithm changes that will sweep a lot of websites right under the rug.

Yup, a storm is coming and some websites will thrive while others crumble to dust.  It’s all about user metrics and what I call the “Usefulness Algorithm”. Sticky SEO is the answer, and this is the first eBook to give useful strategies and practical tips on how to be one of the websites that will thrive.

I should note that Sticky SEO really is not like any other SEO book. If you find this blog post searching for "SEO book" or "SEO eBook", and are expecting the same SEO 101, you won't find it here. Sticky SEO doesn't include any of that stuff. It's all good – don't stop adding relevant content and building link after link after link to your site – but this is a different, more exciting story. This is for website owners who want to pump up their profits today and power up their rankings for tomorrow.

Here is the link:
http://www.seo-writer.com/books/sticky-seo.html

The more links on a domain the better?

Monday, December 1st, 2008

Dear reader, let me be a heretic once more.

We all know, or at least assume, that having multiple links to the same URL from a domain is an exercise in diminishing returns as far as search engine rankings are concerned.  That is to say, if you score a link to your home page from one page on a domain, any additional links to your home page from other pages on that same domain are worth less.  And the more links to your home page from that domain, the less each one is worth.

This makes sense.  If a domain has 1000 pages, a sitewide link cannot be viewed as 1000 endorsements for your home page.
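
To picture the diminishing returns, here is a toy weighting in which the Nth link from the same domain is worth 1/N of a full endorsement. The harmonic decay is my own illustrative assumption, not a formula any search engine has disclosed.

```python
from collections import Counter

def weighted_endorsements(linking_pages):
    """Toy diminishing-returns model: the Nth link from a given domain is
    worth 1/N of a full endorsement, so a sitewide link across 1,000 pages
    counts for roughly 7.5 endorsements rather than 1,000.

    `linking_pages` is a list of linking domains, one entry per linking page.
    The 1/N decay is an assumption chosen purely for illustration.
    """
    total = 0.0
    for _domain, count in Counter(linking_pages).items():
        total += sum(1.0 / n for n in range(1, count + 1))  # harmonic decay
    return total

sitewide = ["example.com"] * 1000
independent = [f"site{i}.example" for i in range(1000)]
print(round(weighted_endorsements(sitewide), 1))     # about 7.5
print(round(weighted_endorsements(independent), 1))  # 1000.0
```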

But the web is a changing place, and in the past few months, services have been cropping up to submit your website to 1000 and even 2000 social bookmarking websites.  These services are similar to all those directory submission services and the article submission services, and they are often offered by the same people.  On the surface of it, there is nothing wrong, but it does require a reaction from the search engines.

But first, a personal rant.  Submitting your home page to 2000 social bookmarking sites is NOT social bookmarking.  It is bookmarking, but it is NOT social.  If it was social, these services would be promoting your page on these sites, networking with other users, and you would end up with several links at any one social bookmarking site (assuming your content is actually interesting).

OK, that was more than just a personal rant.* In fact, I'll bet the search engines are noticing the same thing and looking at the same numbers and raising one of their search engine eyebrows right now. If there are thousands of single-link entries at each social bookmarking website, most of which are essentially paid links, should each of those be worth as much as an entry that garnered, let's say, 12 Diggs or Zooms? Those dozen votes clearly are exactly the type of recommendations the search engines look for in their algorithms. Single links at social bookmarking websites clearly are not. Each Digg or Zoom should be worth more than each single entry. In fact, we might even go so far as to say that the more Diggs or Zooms, the more each one should be worth.

What should the search engines do?  Clearly, their algorithms must distinguish between sitewide links and links that appear numerous times independently on the same website.  This is true not just for social bookmarking sites, but also for forums where a resource might be cited in numerous threads over time.
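
One crude way an algorithm might make that distinction is to check whether all the links from a domain share the same anchor text and sit in the same template block (a sitewide footer or sidebar), or appear in different contexts (independent forum or blog citations). The field names and the heuristic itself are my assumptions for illustration; no search engine has documented doing it this way.

```python
def looks_sitewide(links_from_domain):
    """Heuristic: many links from one domain with identical anchor text in the
    same template block are treated as a single sitewide link, not as many
    independent citations. Fields and threshold are illustrative assumptions."""
    anchors = {link["anchor"] for link in links_from_domain}
    blocks = {link["page_block"] for link in links_from_domain}
    return len(links_from_domain) > 1 and len(anchors) == 1 and len(blocks) == 1

forum_citations = [
    {"anchor": "great SEO checklist", "page_block": "post_body"},
    {"anchor": "this checklist helped me a lot", "page_block": "post_body"},
]
footer_links = [{"anchor": "SEO Writer", "page_block": "footer"}] * 300

print(looks_sitewide(forum_citations))  # False: independent mentions
print(looks_sitewide(footer_links))     # True: one sitewide endorsement
```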

Maybe Google and Yahoo and MSN already do this.  Maybe I’m not being that heretical after all.  Naw, that just would be too out-of-character.

* It qualifies as a rant because I capitalized the "NOT". Twice.

Sticky SEO Imminent

Saturday, November 29th, 2008

I promised a couple of months ago that I would follow up the aborted series on BrowseRank with a complete ebook on the topic. Now that ebook is imminent. We'll be releasing it as soon as we clear up a few server issues. Just to whet your appetite, here's the image of the cover…