David Leonhardt’s SEO and Social Media Marketing

Tips for better SEO (search engine optimization) and website marketing …

THE HAPPY GUY MARKETING

 

Archive for the ‘Google’ Category

Google Lets Evil People Block Your Domain

Thursday, February 17th, 2011

Yeah, I thought that title would grab you.  Google announced a new extension to its Chrome browser, an extension that could truly rock the SEO world.  The extension does two things:

  1. It enables searchers to block domains from search results.
  2. It tells Google what domains have just been blocked.

Says Google anti-spam spokesman Matt Cutts: “If installed, the extension also sends blocked site information to Google, and we will study the resulting feedback and explore using it as a potential ranking signal for our search results.”
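To make the two-step mechanics concrete, here is a toy sketch in Python. Every name in it is my own invention for illustration; this is not anything Google has published.

```python
# Hypothetical sketch of the extension's two jobs: hide blocked domains
# from one user's results, and report the block back as a signal.
from collections import Counter

block_counts = Counter()  # domain -> number of users who blocked it

def record_block(domain):
    """Step 2: the extension reports the blocked domain back to the engine."""
    block_counts[domain] += 1

def personalized_results(results, user_blocklist):
    """Step 1: hide domains this user has blocked from their own results."""
    return [r for r in results if r["domain"] not in user_blocklist]

# A user blocks a spammy domain...
record_block("made-for-adsense.example")
results = [{"domain": "useful-site.example"},
           {"domain": "made-for-adsense.example"}]
print(personalized_results(results, {"made-for-adsense.example"}))
# Only useful-site.example remains for that user; the aggregated
# block_counts could later be blended into rankings for everyone.
```

The point of the sketch: the filtering is personal and immediate, but the reporting is global, and that second part is what could move rankings.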

This blog post will tell you exactly how to preserve and enhance your search engine rankings in a world where users can send explicit feedback (this Chrome extension is neither the first tool for explicit feedback, nor will it be the last; but it might just be the most powerful, so far).

I should make it clear that I have always been a big believer in both explicit and implicit user feedback.  The search engines would be fools not to pay attention to which sites please their visitors when serving up sites to new searchers.

It was just over two years ago that I released Sticky SEO, essentially detailing how you can keep more visitors longer on your website, going deeper into the site.  For the most part, this means pleasing more visitors even more than you already do, since that is what Google looks for.

So what do you do with this Chrome extension?  Well, you want to please your visitors so that they don’t swear, curse and block your domain.

PROBLEM # 1: FREE LOADERS

Searching for free tattoos?  Probably not.

There are a lot of people searching for free stuff on the Internet.  You don’t give your stuff away free, but the “free loaders” show up at your website.  “What?  They want a million bucks to dig a hole to China?  I want someone to do it for free.  Bloody rip-off scammers.  Block, block, block.”

There are probably not too many people searching for “dig a hole to China” and expecting free service.  Nor are there many people expecting to get new shoes for free.  Nor gourmet coffee or gift baskets.  Nor metal buildings or intercontinental pipeline installation.  Not even free tattoos or body piercing.  But there are many niches that do include freebie searchers, for example…

  • website templates
  • resume help
  • music downloads
  • ringtones
  • online games
  • learn Spanish

How do you make sure that people searching for freebies don’t block your website when they discover that you are one of those evil profit-seeking cannibals who wants to feed your family?  You give them what they want, of course.  You add something free to your site.  You give them a free option, or you link to a free option.  Somehow, you make sure you please them.  Remember what your mother said?  “You can never go wrong being nice to someone.”  Well, she should have said that.

PROBLEM # 2: GENERALISTS

Let’s say you sell a very specific item or service that is part of a bigger niche, but people don’t search all that specifically.  In Sticky SEO, on page 14 (until I eventually get around to updating it), I tell the tale of a client who wanted to revamp its website back in 2006.  They sold commercial fitness equipment, but their clients would search just for “fitness equipment”.  The problem was that ten times as many people looking for home gyms also searched for “fitness equipment”.

Life would be easy if people searched for “home fitness equipment”  or “commercial fitness equipment”, but life wasn’t meant to be easy.  What would they do about all this traffic from generalist searchers?

Please them, of course.  Remember what your mother said?  “You can never go wrong being nice to someone.”  Like I said, she should have said that…especially if she knew Google was going to give all those people an easy way to block your domain and tell Google your site sucks.

How to please those generalists?  No point in reprinting page 14 here.  You can read it for yourself.  (Hey, it’s a free download.  Did you think this was a sneaky sales pitch or something?)

Evil competitors want Google to eat you.

PROBLEM # 3: EVILDOERS

Yes, the world is an evil place if you look at it right.  Google’s motto is “Don’t be evil” (or something like that).  But they never said anything about not arming your competition to do evil, did they?  How much do you want to bet that across the Internet’s freelancer markets there will be an SEO arms trade: “100 domain blocks for $15 – from separate IPs in over 20 countries”?  Maybe for $25, who knows?

So how do you deal with that?  No inbound link is supposed to hurt your rankings, so that your competition can’t spam you out of the search results.  But what if coordinated groups of offshore outsourcers in China, India and Greenland gang up on you?

Sorry, I don’t have an answer for you on this one.  But I am sure Matt Cutts will be asked about it sooner or later, and maybe he will have an answer.  Hopefully.

 


Grab The Bookmarketer For Your Site

Look who follows NoFollow links!

Monday, August 31st, 2009

Earlier this year, I speculated on how the search engines treat NoFollow links.  For those who might be a little green, NoFollow links are not totally ignored by the search engines.  For those who are really, really green, NoFollow links are links carrying the rel=”nofollow” attribute in the link code, and they are widely believed to be totally ignored by the search engines.
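For anyone who has never looked at the attribute in the wild, here is a quick way to spot NoFollow links in a page, using only Python’s standard library (the sample HTML is my own):

```python
from html.parser import HTMLParser

class NoFollowFinder(HTMLParser):
    """Collect the hrefs of links carrying rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.nofollow_links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            # rel is a space-separated token list, e.g. rel="nofollow ugc"
            rels = (attrs.get("rel") or "").lower().split()
            if "nofollow" in rels:
                self.nofollow_links.append(attrs.get("href"))

page = ('<p><a href="http://example.com/a">normal</a> '
        '<a rel="nofollow" href="http://example.com/b">nofollow</a></p>')
finder = NoFollowFinder()
finder.feed(page)
print(finder.nofollow_links)  # ['http://example.com/b']
```

The attribute is all there is to it; whether engines honor it is exactly what the experiment below probes.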

So we ran a little experiment. 

A client of ours had a fully developed website that had never been used.  Not a single link pointed to this website, so in the eyes of the search engines, it should not have existed. 

It was not indexed at Yahoo. It should go without saying that Yahoo displayed no backlinks.

The site was indexed at Google.  (How, why and whether Google should index orphan sites that have not been released to the public is a topic for another post.) Google showed no backlinks, but the site did rank #8 at Google for one very important search, based primarily on the name of the domain. It did not show up in the top 100 for a few other key searches. All searches are for local terms specific to a certain city, so they are moderately low competition.

For three weeks, we posted comments on NoFollow blogs (yes, intelligent comments reflecting the specific content of the blog posts) to create a steady stream of NoFollow links, without creating any DoFollow or “normal” hyperlinks.

Were the NoFollow links followed?

At the end of week 4, we found Yahoo had indexed the website and showed 51 backlinks.  All of these are NoFollow links. The more important searches were all showing in the top 20, one as high as position #6. Remember that these are moderately low-competition, local searches, but this is all on the strength of a few weeks of exclusively NoFollow links.

Google showed no backlinks after 4 weeks.  No surprise there; Google is very sporadic with if, when, how and which sampling of backlinks it chooses to display. The ranking at position #8 had not changed, but a couple other search terms were now ranking at Google, one of them as high as position #11. Again, this is exclusively on the strength of NoFollow blog comments.

What can we conclude about NoFollow links?

NoFollow links still obviously count at Yahoo.  Do they count as much as DoFollow links?  A more complicated experiment might help answer that question.  Anyone feel like taking up the challenge?

NoFollow links also appear to count at Google.  Or perhaps some do and others don’t, depending on other factors Google might use to rate links from specific domains. However, we can be sure that Google does follow at least some NoFollow links.

The conclusion I would draw from this is that people really should not focus on the NoFollow/DoFollow issue. Build links that are officially followable when you can, but don’t let a NoFollow attribute in a page’s links dissuade you from creating a link you would otherwise pursue.

 



What PageRank Can Tell Us About SEO and Bounce Rates

Wednesday, December 31st, 2008

Last night I posted about what Google has to say on SEO and bounce rates.   You can  view the post here. A great question was posted by Wilson: “David, I was wondering, why Google want to have two different answer for the bounce rates…? ”

Even broken down into four parts, my response was longer than allowed for comments on this blog, so I decided to make it a post on its own.  Here is my answer to Wilson’s question:

Great question. I will get to that, but let us look at another misunderstood part of the Google algorithm.

We have been wondering for years why Google has three different measures for PageRank. The real PageRank calculation used in their algorithm is a complex logarithmic calculation. All other things being equal, a link from a PR4.12964 page is probably worth many links from a PR3.45294 page, for instance. (We have no idea to how many decimal places the real PageRank is calculated, nor whether this has remained steady over the years or fluctuates over time.)

Then there is the PageRank in the Google Directory, which supposedly is on a scale of 8. I can’t find any reference to the 8-point scale in the Directory, but the Wikipedia article on PageRank is a good reference on this point. Interestingly, the Google Directory states that…

“The green ratings bars are Google’s assessment of the importance of a web page, as determined by Google’s patented PageRank technology and other factors. These PageRank bars tell you at a glance whether Google considers a page to be a high-quality site worth checking out.”

Note the “and other factors” wording.

Finally we have the famous Toolbar PageRank, a green bar on a scale from zero to ten. This is what most webmasters mistakenly refer to as Google’s PageRank calculation. However, it is just an estimation that makes a PageRank of 4.0001 look the same as a PageRank of 4.9999, even though the latter might be worth many times the former. Meanwhile, it makes a PageRank of 4.9999 look much less valuable than a PageRank of 5.0001, even though the two are almost the same. Furthermore, everyone involved in SEO can recount numerous instances where a page “should” have a much higher or much lower PageRank than another page, based on the number and value of incoming links, but the Toolbar PageRank does not reflect that. (For instance, I have noted on many sites that a “links” page with identical link juice to a PR3 content page might nevertheless have a PageRank value of zero.)
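To see how much the Toolbar bar hides, here is a sketch of collapsing a fine-grained score into an integer bar. The truncation rule is my own stand-in; Google’s real mapping is not public.

```python
def toolbar_pr(internal_pr):
    """Collapse a fine-grained PageRank into an integer 0-10 bar.
    Simple truncation sketch; Google's actual mapping (and whether any
    further log-scaling happens at this stage) is unknown."""
    return min(10, int(internal_pr))

print(toolbar_pr(4.0001))  # 4
print(toolbar_pr(4.9999))  # 4 -- looks identical to 4.0001
print(toolbar_pr(5.0001))  # 5 -- looks much better than 4.9999
```

Two pages a hair apart straddle a bucket boundary and look worlds apart, while pages nearly a full point apart look identical, which is exactly the distortion described above.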

What does this tell us about bounce rates?

Just like PageRank, bounce rates is a metric Google shares with its users. PageRank is viewable to everybody; bounce rates are viewable only to the website owner. In both cases, Google is showing a very simple calculation … a number people can use to quickly make comparisons between pages, between websites, between last month and this month, etc.
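The simple number Google Analytics shows is, at heart, a ratio: single-page sessions divided by total sessions. A sketch, leaving out GA’s edge cases:

```python
def bounce_rate(sessions):
    """Google Analytics-style bounce rate: the share of sessions that
    viewed exactly one page.  Each entry is the number of pages viewed
    in one session.  (Sketch; GA's exact edge cases are not modeled.)"""
    if not sessions:
        return 0.0
    bounces = sum(1 for pages_viewed in sessions if pages_viewed == 1)
    return bounces / len(sessions)

# Five sessions; two saw only one page before leaving.
print(bounce_rate([1, 3, 1, 2, 5]))  # 0.4, i.e. a 40% bounce rate
```

That is the “very simple calculation” shared with users; whatever a ranking algorithm might do with bounces would be far richer than this one-liner.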

As I wrote above, “It would be a ridiculously simplistic algorithm that calculates bounces using such simple calculations.” Any serious calculation of bounces applied to a search engine ranking algorithm would have to be such a complex multidimensional equation that it would be useless to you or me as humans viewing it with our eyes (unless you happen to be a mathematical genius – and I mean genius – which I am not, by a long shot).

Except to the extent that a search engine chooses to reveal how it treats bounces and other actions in its algorithm, we will never know for certain what plays a role and what does not, nor how big a role each factor plays. This is par for the course with ranking algorithms.

It is also entirely possible that Google and the other search engines do not yet include bounce rates and related user actions in their algorithms.  Adam Lasnik’s comments quoted in my previous post are good hints, but they are hardly official.  Google engineer Knut Magne Risvik, speaking in Norwegian at Digi.no, saying that Google can measure how many seconds pass between a user clicking on a result and clicking back to Google, and that a short interval marks the visit as a failure, is not quite an official Google statement either.  The only search engine that has released anything official is MSN, through its BrowseRank paper … and that is not a statement of current practice but of future intentions.

As this very young field matures, Google might also change its Google Analytics definition of “bounce rate”. SEO aside, the raging debate over whether a high bounce rate could sometimes be a good thing (depending on the nature of a website) makes a good case for changing the Google Analytics definition, too.

The summary to all this is that I have to answer Wilson with a simple “I don’t know”.  But, just like defining “bounce rate” and “PageRank”, such simple answers are really a lot more complex than they look.

 



Google Leaves Questions About Bounce Rates

Wednesday, December 31st, 2008

Regular readers will know that I have been in a somewhat involved debate on this blog and over at Sphinn on the issue of bounce rates as they might apply, now or later, to SEO.  I maintain that it is a matter of business necessity that search engines would try to measure more precisely how satisfied users are with each result for each search phrase, and that bounce rates are one metric they could use.  Frequent readers will also know that I do not view “bounce rates” as a simple number or as a static pass-fail type of calculation.  It would be a ridiculously simplistic algorithm that calculates bounces using such simple calculations, in my humble opinion. 

Recently, Web Pro News reported that Google has answered questions about bounce rates.  In fact, two separate answers were provided, one that relates to SEO and the other that relates to Google Analytics.  Many webmasters will confuse the two, and we all know that’s how false rumors get started, the kind of false rumors that years from now will be reported as fact by many people calling themselves “SEO experts”. 

It is possible that Google Analytics and SEO are related or will be related, but don’t bank on it.  Here is what Adam Lasnik of Google has to say specifically about bounce rates and SEO.

If you’re talking about bounce rates in the context of Google Analytics, I’m afraid you probably know as much as I do. I love the product, but don’t know the ins-and-outs of it very thoroughly.

If you’re talking about bounce rates in the context of Google web search and webmaster-y issues, then we really don’t have specific guidance on bounces per se; rather, the key for webmasters is to make users happy so they find your site useful, bookmark your site, return to your site, recommend your site, link to your site, etc. Pretty much everything we write algorithmically re: web search is designed to maximize user happiness, so anything webmasters do to increase that is likely to improve their site’s presence in Google.

The bottom line is that you want to do all the things that we talk about in Sticky SEO to keep people on your website, to engage them in your website, to send Google and other search engines signals that they found your website to be useful.  And, of course, you want to reduce the number of visitors who send the search engines signals that your website is useless.

Just for information, here is my post on objections to ranking based partially on bounce rates.

 



Link bait lesson from Matt Cutts

Tuesday, November 18th, 2008

Matt Cutts, Google’s public face for webmasters and search engine consultants, has shown us how to do link bait.  Oops, I mean, how to do really good quality content.  Yeah, that’s what I meant to say.

Here is the link bait…I mean content:

http://www.mattcutts.com/blog/9-google-mobile-iphone-tips/ 

Note that it is a numbered list, and not a “top 10” list.  Matt chose a top 9 list, which is just a little offbeat.  Note that there are plenty of illustrations.  And the text and images combined are useful, actually demonstrating how to do something, not just silly stuff (although sometimes I like silly stuff, too).

Matt submitted it to Digg: 

http://digg.com/apple/9_Tips_for_Google_s_New_Voice_Recognition_App_for_iPhone

As of now, it has 42 Diggs. 

Study it hard, because even if your content doesn’t get more than one or two Diggs, this is how the Google guru prepares his content, so you can’t go wrong posting something like this on your website. 

There now, Matt just got a link from me as a result of his quality content.  You see?  It works. 

 



New Google RankCheck Tool is Released!

Wednesday, August 27th, 2008

Just kidding.  There is no new Google RankCheck tool.  But there should be.  A couple days ago, I reported on how Google has finally blocked automated searches by software such as WebPosition.  This creates a vacuum in the marketplace – a vacuum best filled by…Google!

Yes, I think Google should announce a new Google RankCheck tool, and here is why. 

Google says that automated rank-checking tools should be avoided because they tax Google’s servers.  OK, let’s take this argument at face value.  Automated checking does add a tremendous volume to the number of searches.  I gave an example the other day of how one website adds 1200 searches.  If that is done responsibly once every 4 to 6 weeks, and there are a mere 1000 websites searching, the burden is not too big, especially if most of the searches happen at off-hours when Google’s servers are underused.  However, if 100,000 websites are doing automated searches every day during peak usage hours, perhaps digging deeper into the SERPs, that could start taxing Google’s servers.

Let’s further assume that Google has a hidden agenda.  Let’s assume it does not like automated rank checking because people are getting a free ride from Google – conducting billions of searches without ever visiting Google and being exposed to paid search advertising.  Let’s face it, why should Google give away huge volumes of free search to webmasters without requiring them to view the PPC ads that bring in Google’s revenues?

So how would Google RankCheck help Google?

First, Google could control when and how automated searches occurred.  It could, for instance, queue the automated searches for the next available downturn in bandwidth usage, or it could simply schedule them at an appropriate hour.  Problem solved.

Second, it could make money, which is what a corporation like Google is supposed to do.  Instead of giving away tons of free search to webmasters who don’t even visit Google to make the searches, Google could sell the software.  An official Google RankCheck tool would sell much, much better than WebPosition.  Google could make a beautiful case, too:

“We are in search.  Manual search of one phrase at a time is free to everyone, including webmasters checking how they rank.  However, if you want to conduct bulk searches, you can purchase Google RankCheck for a modest fee.”

There would be advantages for webmasters, too.  Google could give people the option of viewing rankings in various locales.  For instance, if I want to see how my website is ranking in San Francisco, Chicago and Miami, I could specify that (you know how Google results differ from place to place).

Perhaps there are other advantages for Google or for webmasters.  Why not share your thoughts with readers by posting a comment below?

 



Google Blocks Automated Rank Checking

Monday, August 25th, 2008

Google has been threatening…er, promising to block rank-checking software such as WebPosition for years, and even mentions it by name in its webmaster guidelines.  It seems Google has finally decided to honor its promise and block the software.

I was last able to check rankings on August 22.  Since then, nada.  A quick survey around the Web and it sounds like a lot of other people found their automated ranking checks were blocked on August 1, August 5 or August 7.

What does this mean for SEO?  Quite a lot…and amazingly little.

It means that we cannot check dozens of keywords quickly and painlessly.  Manually checking 50 search phrases for, let’s say, a dozen clients, often going onto the second or third page of Google, means…let’s see…1200 manual searches.  Suppose there are two dozen clients.  Suppose there are 100 search terms.  You can do the math and see how time-consuming this would be.
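The arithmetic in the paragraph above, spelled out (assuming roughly two results pages checked per phrase):

```python
# Monthly manual-search burden for a small SEO shop.
phrases_per_client = 50
clients = 12
pages_per_phrase = 2   # often going onto the second page of results

manual_searches = phrases_per_client * clients * pages_per_phrase
print(manual_searches)  # 1200

# Scale it up: two dozen clients with 100 search terms each.
print(100 * 24 * pages_per_phrase)  # 4800
```

At a minute or so per search, even the smaller number is days of mindless clicking per month, which is the whole case for automation.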

However, let us for a moment suppose that we don’t do 1200 manual searches every month.  Suppose instead we do occasional searches to see where a client stands for a few major search phrases?  Or we check different searches on different months as we focus the campaign on different sub-niches?  What if we invest more effort in building rankings than in measuring them?

Yes, we do need to measure.  We need to know if we are moving forward.  We need to be able to show clients roughly the magnitude of the progress.  But perhaps we will be using a smaller basket of keywords and letting the long tail take care of itself.

For me, the main use of rank-checking across a broad range of search phrases was to determine which search phrases or family of search phrases need more focus as we ride the surf of algorithm changes, renewed competition and other happenings.

Of course, clients also require reporting…which we will no longer be able to do to the same level as we had been doing.  So the immediate effect is that over the next month or so, I need to budget a few hours to explain to clients why lists of ranking positions can no longer be the way to measure progress.

 



BrowseRank Goes Beyond PageRank

Monday, August 18th, 2008

I am just back from vacation and wading through three weeks of emails, but while I was gone a story broke that I just can’t let pass.  You might have heard me say it before, but sooner or later the search engines will shift their algorithms from focusing just on relevance and importance to include a third pillar: usefulness. 

This story, entitled Microsoft Talks about BrowseRank Beyond PageRank, shows that Microsoft is well on its way to developing just such an algorithm.  The article mentions a few ways a search engine can determine how useful searchers find a result, but there are more that the article does not mention:

  1. Click-thru rates.
  2. Number of people who bounce back to the search page.
  3. Time before a person bounces back.
  4. Number of pages a user visits before bouncing back.
  5. Time spent on the specific page clicked.
  6. Whether the person bothered to scroll down on the page.

Of course, people like me would totally mess up the algorithm; I leave my windows open forever.  And if you think that user behavior is hard to manipulate, think again.  Usability will now be more important for SEO, but so will coaxing users to spend more time on the website and go deeper in.

But the biggest change we will see is that website owners will have to focus on not letting their visitors bounce back to Google.  Suddenly having links to other useful sites will be a good thing, to the dismay of so many website owners who are terrified of placing a link to anybody else, for fear they might bleed customers, PageRank or both.

As all search engines move into measuring user behavior, new strategies will be required.  I will report on some of those shortly.

Stay tuned… 

 



Google Cache Gets Style

Friday, July 25th, 2008

OK, so it’s not much style, but it’s certainly a cleaner look than the previous mess.  Much easier on the eyes.  For those who don’t know what this is, the Google Toolbar has a handy quicklink to the cached view of any page.  Simply go to the little downward arrow next to the PageRank bar and from the drop-down menu, click cache.  This shows you what Google has on file about the page currently in your browser window.

 

No changes to the look when Google has not cached the page; just a regular search window. 



Google Indexes Flash

Tuesday, July 1st, 2008

OK, so before all you Flash-crazed developers get too excited, what Google specifically will index is two things:

1. The text content in .SWF files.
2. URLs embedded in .SWF files, which can now be followed.

This means that Flash websites can indeed be made SEO-friendly, although I am uncertain how much Flash designers would want text content in their files – specifically enough text content to really make a page SEO-friendly.

In any case, you can read more about this development in the Google blog post on Flash indexing.

 



David Leonhardt’s SEO and Social Media Marketing is proudly powered by WordPress
Entries (RSS) and Comments (RSS).
