David Leonhardt’s SEO and Social Media Marketing

Tips for better SEO (search engine optimization) and website marketing …

THE HAPPY GUY MARKETING

 

Archive for the ‘pagerank’ Category

SEO Shotgun or SEO Rifle?

Tuesday, May 10th, 2011

Or both?

For a huge website (ecommerce, directory, etc.) with many variations of the same product or service, whether by location or by brand, the effort to work individually on each one would be monumental.  For that reason, we often focus on:

a)   The home page, which is naturally where a fair number of links will have to go.

b)   A selection of the most important interior pages (such as those cities which might yield the best ROI) with a purposeful effort to help them rank better for relevant searches.

Some of the activities we do will help just those pages; some will help the entire site.  To understand this better, it helps to understand what types of ranking signals the search engines look for.  They include hundreds of specific signals, but most of them can be grouped as follows:

On-page relevance to a specific search query.

The changes we will make to the template(s) will bring benefits across the site, to every page they apply to.  In other words, even if we identify 10 city-specific pages on which to focus, every city-specific page will benefit.  If we add text or other elements on a page-by-page basis, only the pages we work on will benefit.

Off-page relevance to a specific query.

Links that we obtain to 10 city-specific pages will often (but not always) confer relevancy.  The extent to which this occurs will depend on the content of the page that is linking, the anchor text of the link itself, and a number of other factors.  This relevancy is specific only to the page being linked to.  For instance, a link to the Chicago page of the website confers no relevancy to the London page.

Off-page importance/popularity.

Inbound links to a page also convey “importance” or “popularity”.  They represent a “vote” for the page in the eyes of the search engines.  That importance or that vote is specific to the page that is being linked to.  But Google’s PageRank algorithm also spreads the link-love to other pages that are directly linked.

For instance, let us assume the Chicago page links directly to other Illinois city-specific pages, such as Rock Island, but not to any Florida city-specific pages.  If we obtain 20 links to the Chicago page, that will greatly boost the popularity of the Chicago page.  It will also boost the popularity of the Rock Island page, but not the Miami page (at least, not noticeably). 

This is why internal linking patterns for a big site like this are so important.
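To make this spill-over effect concrete, here is a toy sketch of a simplified PageRank iteration over a hypothetical site graph (the city names and link structure are invented for illustration; real PageRank involves many more refinements than this bare power iteration):

```python
# Simplified PageRank power iteration (damping factor 0.85).
# links maps each page to the list of pages it links out to.
def pagerank(links, iterations=50, d=0.85):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Each linking page passes a share of its rank to p.
            incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - d) / len(pages) + d * incoming
        rank = new
    return rank

# Hypothetical site: Chicago links to Rock Island (same state) but not Miami.
graph = {
    "home": ["chicago", "miami", "rock_island"],
    "chicago": ["rock_island", "home"],
    "rock_island": ["home"],
    "miami": ["home"],
}
# Simulate 20 external links pointing only at the Chicago page.
for i in range(20):
    graph[f"ext{i}"] = ["chicago"]

ranks = pagerank(graph)
# Chicago's new popularity flows on to Rock Island, but not to Miami.
assert ranks["rock_island"] > ranks["miami"]
```

Run it and Rock Island ends up with noticeably more rank than Miami, even though all 20 outside links pointed at Chicago.  That is the internal-linking effect in miniature.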

Domain credibility/authority/popularity

This is the exciting part.  Every quality link we build into the domain strengthens the credibility/authority/popularity of the entire domain.  Every day the domain ages, the entire domain grows stronger.  Every time a high-authority site links into the domain, every time there is a social media mention, every time the domain is renewed for a longer period of time…the entire domain – every page – benefits.

So the efforts we make for a few specific pages can benefit them all to some degree.  For a highly competitive sub-niche, that might not be enough.  For a smaller, less-competitive niche, the page might rank well without any direct attention to it.

 


Grab The Bookmarketer For Your Site

Trade Links With PR0 Pages

Monday, February 2nd, 2009

Do you trade links with PR0 pages?  Once upon a time I avoided PR0 pages.  It was usually a sign that a web page was being penalized or suffering from some contagious tropical disease.

But times have changed, and my approach has changed with the times.  In recent months I have seen a lot of “links” pages with PR0 value on the Google Toolbar.  This includes pages that are linked to from every page of otherwise PR4-PR5 websites.  PR0 in the Google Toolbar is no longer, in my opinion, a kiss of death.

If I am offered a link from a PR0 page, the first thing I do is give it the old eyeball test.  If it has a lot of the wrong kind of links and the wrong kind of spammy words all over it, that ends it there.  But if the page looks good (on topic, manually-maintained, etc.)  apart from the lack of a green bar, I take a look at the home page of the site to see if it has a green bar and to see what the path is to the link page.  Is it linked to from the home page, from the template, from a second level page?  

In other words, I’ll make my own call on roughly what the value of the page is.  And given that the toolbar is error-prone and at best an approximation anyway, I am sure my calculation isn’t that far off.

Is this process more work than just looking at the toolbar?  Yes.  But when the toolbar is blank, the only alternative to this process is to just ignore what could be a good link.  Given how hard we work to find good, on-topic links, I think the work is worth it.


 



You Need Sucky Links

Monday, November 10th, 2008

I’ve been meaning to blog about your desperate need for sucky links for some time, because I have not seen this aspect of link quality discussed anywhere.

People approach me all the time asking for high-quality links.  Not surprising – who would want low quality links?  But if you ask an SEO consultant to build you only PR6+ links, consider what message that sends to the search engines.

At worst, Google will assume you are buying links to buy PageRank…and we all know how much Google loves link-buying to boost PageRank, don’t we?

At best, the search engines will think your site appeals only to some kind of an elite.  How else would you explain that only high PageRank (high traffic, high-trust, etc.) pages link to your website?  Why do smaller blogs not link there?  How come your website is not included in any normal directories?  Why does this website have no appeal to normal people…and why should we rank it if it has no popular appeal?

No, the search engines won’t ask these questions outright.  But remember that all algorithms are created to simulate what would be normal linking and trust patterns that real people would follow.  Having links only from high-quality, top-ranked websites does not look normal.  It’s a giant red flag.

Ironically, the more high-quality links you have, the more poor quality links you need.

CAUTION: By “poor quality”, I do not mean spammy websites.  I do not mean you should be on pages full of words related to enhancing body parts and gambling away the kids’ inheritance.

But I do mean that you want links from websites with a variety of linking profiles, ones that might be new or might not be running any link-building campaigns, ones that we might consider much less significant.  In short, you want a normal linking pattern.

The ideal way to build links is still the tried and true…

Step one, create awesome content such as useful articles, instructional videos, samples and demos, all the things that are generally called “link bait”.

Step two, publicize this content.  If it really is good, many websites will link to it, including top-rated websites and many smaller less significant websites.  They will do it naturally. So you will have a natural linking pattern.

To answer the obvious question, yes you will surely want to put extra effort into publicizing your content to high-trust, authoritative websites.  But those links are the kind of links that less-significant website owners will follow, read and link to, as well.

So don’t forget to get links from a wide variety of insignificant websites as part of your link-building campaign.  With algorithms designed to simulate something like democracy, votes from “the little guys” count, too.

 



BrowseRank Strategies – Quality Web Site Design

Wednesday, September 3rd, 2008

A few days ago I reported on how BrowseRank goes beyond PageRank to rank websites according to user behavior.  Modern search engines tend to rank websites by relevancy and importance, and of course their algorithms can be gamed.  The concept of BrowseRank, which I have been mentioning to clients already for two years, would add a third and arguably more important measurement – usefulness.  This, too, can be gamed.  However, most of the gaming would also work to your visitors’ advantage, so the Web will be a better place for it.

In preparation for BrowseRank and perhaps other search engine measures of website usefulness, this is the first in a series of posts that will help you make your website appear useful in the eyes of the search engines.  You will probably find that these are things you should be doing anyway to increase conversions and profits, but that is not my area of expertise, so here we will look at them from an SEO perspective.

STRATEGY #1 – Design a website that says “Quality” the minute a visitor lands there.

This might seem soooooo obvious, but it needs to be said.  As obvious as it might seem, I come across dozens of websites daily that say “Amateur” or “Crap”.  Here are a few tips to make your website look like a professional website that can be trusted.

  1. Get a professional design that looks at least somewhat modern and in a style that suits your products and target audience.
  2. Lose the square corners.  Some corners are OK, but if your design is based on boxes, it looks like a basement job.
  3. No Adsense-type ads.  Yuck! Honestly, that is the biggest sign of a low-quality website.  A run of Adsense across the bottom is not bad, but the more prominent the PPC ads the cheaper the site appears.  By the way, ads are OK.  The more they look like content or part of the website, the better.  Adsense style ads just look cheap.
  4. Keep it clean.  Clutter looks as bad on a website as it looks here on my desk.  (But I don’t have a webcam to display this disaster to the world, so don’t display a mess on your website!)
  5. Make sure your web pages look good in various browsers and in various screen resolutions.  If 70% of people see a superb website and the other 30% see garbled images and text, they will bounce back to the search engine … which tells the engine that your website is not very useful (and it isn’t if it can’t easily be read by 30% of searchers).
  6. Make sure your website is available, which means good hosting.  I am never shy about recommending Phastnet web hosting.  This blog is hosted there and I have been migrating my sites to them over the years because of the five-star service I get when I need it.
  7. Make sure your code is working properly.  Seeing a PHP error makes the site look broken.  I don’t buy from someone who might be selling me broken goods.
  8. Avoid overly flashy design.  If your visuals call attention to themselves and distract from your message, you will lose people.
  9. Avoid automatic audio playing.  I can guarantee you that 99% of people browsing from a cubicle, as well as others in shared space, will zip back to the search engine in no time flat.  That sends a pretty bad signal to the search engines.
  10. Nix the cover page, especially one that shows a slide show on start-up.  And if you think people can easily scroll to the bottom to click the “skip intro”, it’s easier still to click the “back” button and choose a new website that does not place a barrier to its visitors.

Those are my top 10 web design tips for helping visitors see quality in your website.  Please feel free to add to this list in the comments below. Following these tips is not enough to make them stay on your website, but at least they won’t leave because the design scares them away.  In future “episodes”, I will share with you some additional strategies to help the search engines view your website as “useful”.

I would be remiss if I did not mention that we have some top quality SEO web designers on our team.  :-) 

 



BrowseRank Goes Beyond PageRank

Monday, August 18th, 2008

I am just back from vacation and wading through three weeks of emails, but while I was gone a story broke that I just can’t let pass.  You might have heard me say it before, but sooner or later the search engines will shift their algorithms from focusing just on relevance and importance to include a third pillar: usefulness. 

This story, entitled Microsoft Talks about BrowseRank Beyond PageRank, shows that Microsoft is well on its way to developing just such an algorithm.  The article mentions a few ways a search engine can determine how useful searchers find a result, but there are more that are not mentioned in the article.

  1. Click-thru rates.
  2. Number of people who bounce back to the search page.
  3. Time before a person bounces back.
  4. Number of pages a user visits before bouncing back.
  5. Time spent on the specific page clicked.
  6. Whether the person bothered to scroll down on the page.

Of course, people like me would totally mess up the algorithm; I leave my windows open forever.  And if you think that user behavior is hard to manipulate, think again.  Usability will now be more important for SEO, and so will coaxing users to spend more time on the website and go deeper in.

But the biggest change we will see is that website owners will have to focus on not letting their visitors bounce back to Google.  Suddenly having links to other useful sites will be a good thing, to the dismay of so many website owners who are terrified of placing a link to anybody else, for fear they might bleed customers, PageRank or both.

As the search engines move into measuring user behavior, new strategies will be required.  I will report on some of those shortly.

Stay tuned… 

 



Mature Domains – Ranking Advantage at Google

Tuesday, April 22nd, 2008

Those of us who have been paying attention knew about the importance of domain maturity a couple of years ago.  But it looks like 2008 might be the year that the webmaster community starts to realize the importance of the issue, with Google’s United States Patent Application: 0080086467 being publicized.

The bottom line is that it is to your advantage to hold a domain that has been around — and in your ownership — for several years.  Maturity counts, and SEO gets easier as your domain ages.  It is also to your advantage to receive links from mature domains, although I don’t think I would waste time checking the ages of every domain I hoped to get a link from (more on this in a moment).

Why are mature domains better?  Like so many things, especially on the Internet where much is ephemeral, a mature domain has stood the test of time and therefore is more likely than average to provide useful information or services.  An established domain is much, much less likely to be a spam site set up to turn a quick profit and disappear.  The bottom line is that a mature domain is more likely to be a trustworthy one.

And trust is what it is about.  When Google sends traffic to your site, it is placing trust in the site.  Maturity is one way Google can measure trust.  However, it is far from the only way.  PageRank is another.  There are likely dozens of measures of trust that Google employs, which is why I would not waste my time checking domain age.  A much better trust test is to see how well a site ranks for its own target search phrases.  If it ranks well, Google must trust it at least a fair amount, and therefore it is a good website to be associated with.

 



Link Exchanges: It’s not the size of the PR, but how you use it

Friday, February 8th, 2008

If you plan to haggle over PageRank with me…goodbye.

That’s right, I have kicked the habit.  The size of your PageRank doesn’t impress me any more.  PageRank surely is still real, but an individual page’s PR can often shrink or grow so that neither you nor I can really know its real size.

  • The Toolbar PageRank has always been at best an approximation. 
  • Pages that show with PR3 or PR4 in the Google Directory are now often showing PR0 (PR Shrinkage).
  • Whole sites are now showing PR0, even while they continue to rank as well as when their pages showed PR3 – PR4.
  • Increasingly link-pages or directory-style pages are showing PR0, sometimes after showing PR3 or PR4 just a week earlier.
  • The gray bar used to mean a page was not cached in Google – a sure sign of a penalty or a brand new page.  No longer.  Many pages with PR are now displaying the gray bar.
  • Toolbar PageRank is dead!

Until very recently, I was assuming that the Toolbar was only showing false negatives – that if a page showed PR4, it was a pretty good bet that the PR of that page was at least PR4.  But recent observations have led me to question this assumption, and perhaps I am jumping the gun, but I believe the toolbar is now showing false positives, too.

What I look for in a link exchange

Rather than PageRank, I look for a few other key items on the page where my client’s link will appear:

  • Most importantly, I want to know the page is cached by Google.  Not only is that absolutely vital for the link counting with Planet Earth’s most important search engine, but it is a fairly good indicator of whether other search engines and real human beings will find the page, too.  Not cached?  I won’t even look at any other factors.  This is the show-stopper.
  • Is the page relevant to my topic?  If not, it had better be superb in every other area.
  • Is the page relevant (optimized) to my search phrases?  Again, if not, it had better be superb in every other area.
  • Is this page optimized for words like “link exchange” or “reciprocal links”?  Why not just type “SLEAZE” in big bold letters across the top of the page?  And don’t think the search engines can’t read words like “link exchange” or “reciprocal links”.  This is another factor that comes pretty close to being a show-stopper, too.
  • Is the page part of some automated link machine script?  Let’s face it, you don’t want to send the search engines a message that, “Hey, I can’t get real links from real people who just love my site, so instead I found an automated script to keep me warm at night.”  This is usually a show-stopper, too.
  • Once I see that a page is cached and passes the four eyeball tests above, it’s time to get critical.  The first thing I look for is a page that can easily be found.  If the page is one of 50 categories in a directory whose main page is linked only from the home page, that’s not a very good sign. Two clicks deep, and sharing link-juice with 50 categories?  I don’t think so.
  • I also check that the page is not the last in a series of pages that link one from the other…that’s how many clicks deep from the home page?  Never go to dance with someone if she’ll make you stand in line to dance.
  • I like a page that is either directly linked to from the home page or is linked to from a page that is in the sitewide template.  From an internal linking perspective, this tells the search engines that the page actually counts.  And you want your link to count.
  • Of course, I also look at the quality of the website overall.  Is this a website that likely carries a lot of trust value?  Does it rank well for similar search terms to the ones I am targeting?  Is the link directory full of all sorts of totally unrelated categories, perhaps some of them even unsavory?  If this website sleeps around too much, be careful what you might catch.
  • Is the page a content page?  I can forgive a number of other items for a genuine link in the midst of a page of text.
  • Is my link last on a page with 500 links?  I really prefer pages with 50 or fewer links, but if there are more, I am fine with having our link added near the top of the list, but not at the bottom.
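The checklist above can be boiled down to a filter: a few hard show-stoppers, then a weighted judgment call.  Here is a minimal sketch of that process as code – the field names, weights, and thresholds are my own invented stand-ins, not any real API or a formula I actually use:

```python
# Hypothetical link-page evaluation: show-stoppers first, then scoring.
def evaluate_link_page(page):
    # Hard show-stoppers: fail any of these and nothing else matters.
    if not page["cached_by_google"]:
        return "reject: not cached"
    if page["optimized_for_link_exchange_terms"]:
        return "reject: screams 'link exchange'"
    if page["automated_link_script"]:
        return "reject: automated link machine"

    # Judgment factors, with invented weights.
    score = 0
    score += 2 if page["topically_relevant"] else 0
    score += 2 if page["clicks_from_home"] <= 1 else -1  # easy to find?
    score += 1 if page["in_sitewide_template"] else 0
    score += 1 if page["is_content_page"] else 0
    score -= 2 if page["outbound_links"] > 50 else 0     # crowded link page

    return f"accept (score {score})" if score >= 3 else f"pass (score {score})"

good = {"cached_by_google": True, "optimized_for_link_exchange_terms": False,
        "automated_link_script": False, "topically_relevant": True,
        "clicks_from_home": 1, "in_sitewide_template": True,
        "is_content_page": True, "outbound_links": 40}
print(evaluate_link_page(good))  # accept (score 6)
```

Of course, no script replaces the eyeball test – the sketch only shows why "is it cached?" comes before everything else.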

Tell me you have a high-trust website and a linking page that is well linked internally, relevant to my search phrases and clean from the flotsam that shouts out “sleaze”.  But don’t tell me your page has bigger PageRank than mine.  It’s not the size that counts; it’s how you use it.

 



Google Toolbar False Positives

Monday, January 28th, 2008

For some time I have been pretty much ignoring PageRank in the Google Toolbar.  I know too many sites that lost big PR on certain pages and not others, or lost it across the board, all with no noticeable effect on their rankings.

More and more I see that link pages on websites register PR0 (solid white bar) or no PR whatsoever (solid gray bar, which used to mean a site with a penalty), even though I can see from the PR of the rest of the website and from the link structure that they should count as PR2 or PR3, if not more.

But I have been assuming that the Toolbar shows PR lower than reality, never higher.  That is, it gives lots (and lots and lots and lots) of false negatives, but never any false positives.  However, lately my faith has been shaken.  There have been a couple of offers of link swaps involving pages that just intuitively should not have such a high PageRank.  Today one of them struck me as odd enough that I thought I would blog about it.

This is a home page of a website that, according to both Google and Yahoo, has 2 pages and shows fewer than 300 backlinks at Google.  Furthermore, it is a blog with just two posts, both from 5 days ago.  How would it get to be PR5, which takes a lot more links than it once did?  And why am I suddenly getting an email for a home page link swap (because the savvy owner realizes that he has something to capitalize on quickly before it turns to dust!)?

Don’t trust that green and white bar. 

 



Link Building by the Specs? No Thank You!

Friday, November 30th, 2007

So somebody needs to build links to help his search engine rankings, and has come up with a very precise list of exactly what he wants.  It includes 22 exact specifications, which perhaps he pulled from a handy article somewhere on the Web.  Here is the list he presented, but this post could be about any such list…

1. One way non-reciprocal links only, no link exchanges.
2. THREE WAY Links where all links are in the same theme is OK
3. All links must be permanent.
4. Only 10% can be in directories.
5. No blogs
6. ONLY OUR THEME , (our theme is quite common so you will not have problems).
7. NO hidden links or any site that has hidden links.
8. No directories. No link farms, link-exchange programs, forums, Google banned site, black hat website. No guestbooks, links within forums, links within newsgroups or links from link exchanges etc. and never participate in any commercial web rings.
9. No sites banned by Google.
10. Link page must have a recent Google & Yahoo cache.
11. Must be manually submitted.
12. No Automated software (e.g., Zeus, Arelis or others)
13. All links must be from a different domain and IP address (geographically diverse, different class-c IP address block).
14. Only 10 to 15 links per week per language per site
15. Link pages must be static urls (no variables or parameters in the url)
16. No blacklisted or spam sites.
17. No more than 40 outbound links per page.
18. The link text must be from our keyword list and point to that keywords target page
19. All links must be static and without “nofollow” tags, no redirects, or javascript
20. Links must be on a PAGE with a Google PR of at least 2
21. All links must be on a page of the same language
22. Links must be on domains where we have no link

This post is about why I refuse to build links according to lists like this.  First, I must note that some of the items such as #9 and #7 and #15, for example, all make perfect sense.  These are deal-breakers that make a link useless. 

However, other elements are judgment calls: stipulating how many links per page, the PageRank, that a three-way link is acceptable but not a two-way link, among other factors.  What people hire me for is to exercise that judgment.  To decide when a page might be PR1 but incredibly on-topic and worth going after.  Or when a page might have 200 links, but with PR4 and lots of real human traffic is worth its weight in gold.  Honestly, the client can just have his secretary or an offshore link-builder do the manual job of seeking out the links.  He does not need me for that.  What he needs me and my trained staff for is to exercise judgment – judgment that he is overriding with a pre-fab list.

Does the client really think we have control over how many links are built in a week?  That depends on the response rate and the amount of back-and-forth with various webmasters.

And how much does he want to pay me to track down IP addresses to make sure they are all different?  Or check that the client does not already have a link on the domain?

That’s why I turn down offers to try to fit a strategic process into so comprehensive a list of technical specifications.

 



Google cracking down on paid links

Tuesday, June 12th, 2007

First, let me preface this post by saying there is nothing wrong with buying paid links, regardless of what Google says or what you think Google says.  Paid links are called advertising.  “Free” links, which are never actually free, are called public relations.  This has been going on since someone in ancient Egypt first wrote a sandwich board reading “The End is near” and someone else asked, “How much to add ‘Reserve your burial plots today!’”

However, Google does not appreciate links sold strictly to boost PageRank, specifically targeting its ranking algorithm.  This is understandable.

So what is a website owner, intent on promoting his website and his services, to do?  Go for the best links possible, whether they are paid or free, sticking within budget.  If most of your links are paid, that sends quite a red flag that maybe there is nothing on your website of enough value to actually earn links.  In fact, that in itself is a pretty good case for Google to demote your website in its rankings.

On the other hand, if there is a website that could be sending you some targeted traffic, that can show high relevance and offers good link juice, why not pay for the link?  Google will not penalize you for having bought a link or two; Google will penalize you for trying to purchase a re-arrangement of its listings.

 


