David Leonhardt’s SEO and Social Media Marketing

Tips for better SEO (search engine optimization) and website marketing …

THE HAPPY GUY MARKETING

 

Archive for the ‘rankings’ Category

SEO Strategies for government websites

Friday, March 27th, 2009

In times like these, when companies are cutting costs, not many are hiring SEO services.  But governments aren’t cutting back (quite the contrary, but that’s a rant for another time).  I am just putting to bed a major SEO audit of a government website, so I thought I would share with you some of my observations.

 “What?” you ask.  “Why would a government website need SEO?  They don’t compete for business.  They’re an information website, like Wikipedia.”  In fact, government websites do compete in a number of ways and SEO can be a very powerful tool in reaching the right audience. 

This post recommends SEO strategies to address issues that are particularly relevant for government websites – and in some cases any major information portals, such as university or newspaper websites.  I cannot reveal who our client is, but you should know that it is a government agency that operates bilingually with both domestic and foreign audiences.

Governments have certain natural advantages – the search engines like government TLDs (top-level domains, such as .gov or .gc.ca), the sites are huge, they are link magnets because they carry so much official information (they are the authority on so many things) and they typically have a high PageRank.  Just to give you an idea, here are a few key stats for the website I have been working on (I wish this blog had stats 10% as good!):

[Image: key SEO statistics for the government website]

But government websites also have some unique challenges.  Here are two of those challenges, along with strategies to address them:

SEO against the scammers

Any government agency that has the authority to approve or reject something is a potential target for scammers.  This might apply to:

  • Licenses
  • Permits
  • Grants
  • Jobs
  • Contracts
  • Status (tax status, citizenship status, business category, etc.)
  • Appeals

Scammers will optimize their websites using words like “free”, “guaranteed” and “easy”, and other qualifiers that sound like you can somehow get past due process.  Obviously, a government website does not offer “guaranteed grants” or “free mortgages” or “guaranteed immigration” or “easy access”.  But to protect the integrity of its services, the agency must rank above the scammers for those searches.

We found several instances where scammers ranked better than our government client for such searches, so we recommended ways to out-SEO the scammers.  In such cases, the government website must rank well not just for things it wants the public to hear, but for things it would just as soon not discuss.

SEO for proper direction

Given the size of most government websites, there are usually many levels of information.  People might do searches based on very specific or broad criteria.  For instance:

  • There might be a department with several branches. 
  • One of those branches might be in charge of several areas of interest. 
  • Within one of those areas of interest there might be a number of programs. 
  • And one of those programs might include local delivery through offices in various locations. 

The bread crumb trail to get to one of these delivery points would look like this:

Home > branch > interest > program > locations > specific location

So information might go five or more levels deep, with multiple branches to each level.  This means that there are many pages with similar wording that could rank for a specific search.  Here are some of the possibilities.

A person seeking a specific item, such as “health card Bottomsville office”, would ideally land on the “Bottomsville” office page.  He might also land on the page listing all the locations, including the link to the Bottomsville page.  Or he might land on the health card program page, where he can follow the link to “health card office locations” and with a couple of clicks he gets where he wants.  These are all good scenarios.

On the other hand, someone doing a general search – let’s say a person in Sometown searching for “health card information” – would ideally land on the program page, where she might find links to “health card fees”, “health card eligibility”, “health card office locations”, etc.  However, if she lands on the “Bottomsville” office page, she might be confused (because she is from Sometown).  If she is clever, she will notice the breadcrumbs and follow them up…but don’t count on her to notice them, nor necessarily to understand their utility.

If the wrong pages are showing up for certain searches (and we found a number of those), the pages need to somehow be unSEOed, and the correct pages need to be better optimized.
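To make the depth problem concrete, here is a minimal Python sketch – the branch, program and location names are entirely made up, not taken from the client – showing how a hierarchy like the one above multiplies into many similar-sounding pages, each with its own breadcrumb trail:

```python
# Toy model of a government-style site hierarchy (hypothetical names only).
# Every combination of levels is a page that could rank for a similar search.
from itertools import product

LEVELS = {
    "branch": ["Health Services", "Licensing"],
    "area of interest": ["Health Cards", "Drug Plans"],
    "program": ["New Applications", "Renewals"],
    "location": ["Bottomsville", "Sometown"],
}

def breadcrumb(*path):
    """Render a trail like: Home > branch > interest > program > location."""
    return " > ".join(("Home",) + path)

pages = [breadcrumb(*combo) for combo in product(*LEVELS.values())]

for trail in pages[:3]:
    print(trail)
print(f"...{len(pages)} location-level pages from only two choices per level")
```

Even this toy version produces sixteen location-level pages; a real portal with dozens of branches and programs produces thousands, which is exactly why the wrong one so often outranks the right one.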

Other SEO challenges

There were other SEO challenges that are not unique to government websites.  In this case, we had to track not just domestic searches, but rankings across a variety of country-specific search engines.  And, given the bilingual nature of the website, we had challenges when certain search phrases were identical in both languages and only the English pages were showing up.
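For what it’s worth, that tracking boils down to a simple loop over engines and languages.  Here is a hedged Python sketch; get_rank() is a hypothetical placeholder for whatever rank-checking tool you actually use, not a real API:

```python
# Sketch only: check one phrase across country-specific engines and both
# languages. get_rank() is a hypothetical stand-in, not a real library call.

COUNTRY_ENGINES = ["google.com", "google.ca", "google.co.uk", "google.fr"]
LANGUAGES = ["en", "fr"]

def get_rank(phrase, engine, lang):
    """Placeholder: return the ranking position, or None if not found."""
    return None  # plug in your own rank-checking tool here

def track(phrase):
    for engine in COUNTRY_ENGINES:
        for lang in LANGUAGES:
            print(f"{engine} [{lang}]: {phrase!r} -> {get_rank(phrase, engine, lang)}")

# A phrase spelled identically in English and French can surface only the
# English page; checking each engine in both languages makes that visible.
track("transport")
```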

And don’t get me started about the restrictions one has working within the prescribed regulations of a government website.

Beyond all these issues, SEO is SEO, and we provided a report based on best practices, competitive intelligence and working within the constraints of what changes this government agency would be able to make.  Doing SEO on a government website is a complex but rewarding project.

 



Don’t Trust SEOs Bearing Pretty Packages

Thursday, February 12th, 2009

Many SEO companies advertise specific packages, such as gold, silver and bronze SEO packages costing so much per month, including so many links of so much PageRank and focusing on so many search terms.

We don’t.

SEO is not a science; it is a sport. It involves strategy. It involves balancing various aspects. It involves flexibility and responding to what others do. It involves competing against others…and each game is different.  One size does not fit all, and the size and shape sometimes has to change on a dime.

Good SEO does not come in pretty packages.

Here are some of the limitations that packages create:

  1. Building so many links each month. Come the end of the month, there would need to be a counting of links, and if the links have fallen short, a flurry of activity to create links, even if the quality of those links is poor. Don’t get me wrong, you need sucky links. But the fact is that those links might actually be negative for your site.
  2. Building links based on PageRank. Plenty of good links with low PageRank would be rejected. Why create links you are not being paid for?
  3. Monitoring the number or PageRank of links built. What should be monitored is the ranking for the search phrases being pursued. Over time, changes in strategy are required.
  4. The client knows what to purchase. How is the client to know what package to buy? Even the SEO consultant can only guess at what is required, as in any sport. As the SEO moves down the field, he is constantly re-evaluating the game and what is needed.
  5. Automation. Let’s face it, if your website attracts the exact same number of links every month, that’s a red flag for automation, even if the links were built manually.
  6. The best links will be missed. The best links are those that can’t be predicted, the ones built through link-bait and social networking. Anyone offering a package cannot afford to invest the time in these high-quality links.
  7. Limiting the number of search terms. I see the search phrases my clients have to target change all the time. Sometimes they change their product offering or target a new demographic. Sometimes the language that searchers use changes. Sometimes we see that we are getting a surprising amount of traffic from a keyword we were not targeting…and we go after it.
  8. A package is a product. Do you want to sell a product, or do you want to be part of the team? If I were to hire an SEO consultant, I would want him to be part of the team.

Whether in basketball, baseball, hockey or any other competitive sport, a custom game plan that is flexible and responsive to change is the only way to go. SEO is no different. Hire an SEO consultant who can outline a custom program and who is not shy about making changes on the run. Make sure your SEO is as flexible as you are; your market is dynamic and the search engines are even more so.

 



How Is NoFollow Data Treated By The Search Engines?

Wednesday, January 14th, 2009

In theory, the search engines don’t follow links with the NoFollow attribute attached. That’s what NoFollow means. However, anybody who has been checking backlinks for multiple websites (for example, if you have many SEO clients, prospective SEO clients, competitor websites, etc.) will notice that Yahoo lists many NoFollow links as backlinks (I wrote about this last year, too). I have seen this at Google as well (I believe in Webmaster Tools, but my memory is not certain on this point – sorry).

If the search engines index NoFollow links, it is possible they use the data (otherwise, why waste so much computing power indexing them?), despite the purpose of the NoFollow attribute being that such links should not count in their algorithms. This post speculates on how the search engines might use this data.

A Partial History of NoFollow

Seasoned SEO experts can skip this section. It is intended for newbies, and it is only partial because I am sure I am missing some details.

Before there were blogs, there were guest books. Guest books were like prehistoric Web 2.0. They allowed website owners to create some form of user interaction with otherwise pamphlet-like websites. They engaged the user. They created stickiness. Best of all, they were set-it-and-forget-it, so many website owners thought “why not?”

Spammers quickly learned that they could drop links in guest books, which were often unmonitored. This reached near-epidemic proportions, to the point that serious SEO specialists were leery of leaving any links in guest books for fear of having their websites penalized for spamming. Search engines were concerned because any mass linking scheme threatens to skew the quality of the search results they present to their clientele – the searchers.

The search engines were let off the hook by the website owners. Those who did not moderate their guest books were disgusted by the spam. Those who did moderate their guest books were frustrated by the spam. For a low- or no-maintenance tool, guest books were proving to be a pain without any obvious benefit (such as increase in sales).

In truth, blogs came along and offered a much better way to engage visitors in a two-way conversation. Blogs offered a venue for opinionated and chatty webmasters to engage with visitors, and the blog CMS was much easier to handle than an “articles” section on the website (especially because many bloggers found they could dispense with pesky technicalities like grammar and even staying on-topic). Blogs also offered a much more obvious business benefit than guest books – search engine rankings, which could be translated into increased sales.

It wasn’t long before blog comment spam had replaced guest book spam. But this time, the search engines would not be let off the hook. Blogs had so many obvious benefits and so much more invested in them that, instead of petering out, they kept proliferating. Indeed, each blog spawns hundreds or even thousands of pages, each one fertile for dropping a spammy link in a comment. And many blog owners were (and still are) lazy, allowing comments to be automatically posted without moderation. NOTE: This blog is moderated, and I use a DoFollow plugin. If your comment is worthwhile, your link will count. If your comment is not worthwhile, sorry.

Many bloggers became alarmed at all the spammy links, and were worried that they might be penalized for linking to bad neighborhoods. That’s why the search engines created the NoFollow attribute. And if you believe that, I have some superb oceanfront property on the moon that might interest you for a surprisingly reasonable price.

In fact, search engines were once again concerned because, as I said earlier, any mass linking scheme threatens to skew the quality of the search results they present to their clientele – the searchers – and mass automated blog comment spam was showing no sign of slowing down.

The search engines gave everybody, not just bloggers, a simple means to indicate when an outbound link from their website should not be followed by the search engines because it is not a link in which they have placed trust. Basically, the whole point of NoFollow is to eliminate user-generated links from the algorithms, since those links cannot be considered as “votes” for the sites being linked to by the sites doing the linking.
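For newbies following along, the attribute itself is nothing more than rel="nofollow" on a link. Here is a minimal Python sketch (standard library only, with a made-up HTML snippet) that separates followed links from NoFollow links – the kind of quick check that is handy when auditing backlinks:

```python
# Minimal sketch: separate followed links from NoFollow links in some HTML.
# Uses only the Python standard library; the sample HTML is made up.
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.followed, self.nofollow = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        rel = (attrs.get("rel") or "").lower().split()
        (self.nofollow if "nofollow" in rel else self.followed).append(href)

sample = '''<p>Great post! Visit <a href="http://example.com/spam" rel="nofollow">my
site</a> or see the <a href="http://example.org/reference">reference</a>.</p>'''

audit = LinkAudit()
audit.feed(sample)
print("Followed:", audit.followed)   # ['http://example.org/reference']
print("NoFollow:", audit.nofollow)   # ['http://example.com/spam']
```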

So Why Are Search Engines Indexing NoFollow Links?

This is a puzzle. If the search engines created NoFollow to tell their robots not to follow, obviously something has changed since then, because they are following. But do the links affect the rankings? Here are a few theories of how the search engines might be using the data. These are highly speculative, so feel free to throw your own speculations into the comments below.

  1. One obvious theory is that the search engines are not using this data at all in their rankings.
  2. A second theory is that the search engines are using the links to determine relevancy (a link from a comment on an SEO blog to my website helps the search engines confirm that my website is about SEO), but that the links do not count toward link popularity or PageRank.
  3. A third theory is that the search engines have built into their algorithms a process for selecting which NoFollow links they should include in their algorithm calculations. For instance, they might choose to follow all Wikipedia NoFollow links, but no MySpace NoFollow links.
  4. A fourth theory is that the search engines use NoFollow external links to dampen their trust rating of a website. If a website owner has lots of external links that it is not willing to trust, that is one signal that the linking website itself is not all that trustworthy. Makes sense for MySpace. Bummer for Wikipedia (but I’ve voiced my opinion on Wikipedia’s abuse of the NoFollow attribute before).
  5. A fifth theory is that the search engines use NoFollow internal links to dampen their trust rating of a website. Unlike some of these theories, this one makes sense. After briefly experimenting with NoFollow internal links on one of my websites, I removed them all. Think what message it sends the search engines about the quality of your website if you say you can’t trust your own web pages.
  6. A sixth theory is that search engines do not use NoFollow links directly in their rankings, but that they are included somehow in a link profile establishing a website’s level of activity on the Web.

I would like to hear your comments and theories. I should note that I have not researched this post in any great depth, because it really is just speculation. I wrote it while my daughters danced last weekend, and there was no WiFi there. So feel free to add your theories and enlighten me and our readers if you know of any great sources that can shed some light on this.


 



Bounce Rate SEO Fallacies

Tuesday, December 23rd, 2008

Of late there has been a lot of discussion about bounce rates and whether or not the search engines count these in their algorithms.  A few days ago I posted some pros and cons on this issue.  Today I would like to share with you 9 common objections I have seen to using bounce rates as part of the search engine algorithms, and refute 8 of those.

 

As far back as late 2007, there were reports that webmasters were seeing a difference in their rankings for major keywords within a few weeks of drastically changing their bounce rates.  However, none of the tests and reports seem to be complete enough or repeatable enough to constitute “proof”. 

 

As a result, there are plenty of naysayers who believe that such things as bounce rates are not now and probably never will be part of the search engine algorithms. 

 

I am of the opposite view; bounce rates will certainly be a major part of search engine algorithms and probably already are to some degree.  That is in large part – but not completely – the premise behind Sticky SEO.  Let us not forget that Microsoft has been spending a fair amount of energy on what has been called BrowseRank, which is in part based on bounce rates.

 

Objection 1, there is no definition of “bounce rate”. 

 

Response. This is the flimsiest of arguments.  A bounce is when someone leaves a website, going back to where they came from.

 

Objection 2, I don’t like how Google Analytics defines a bounce.

 

Response.  Sadly, Google doesn’t ask me for advice, either.  But cheer up, the bounce rate in Google Analytics might not be the same as they use in their algorithm, just as the little green bar is not necessarily the PageRank they use in their algorithm.

 

 

Objection 3, many sites don’t have Google Analytics turned on, so Google would have very incomplete data.

 

Response (scratching my head in confusion).  What does Google Analytics have to do with anything?  This is about Google (or Yahoo, or MSN, or Ask, or some other search engine) tracking their own traffic and how their own users move about and – most importantly – how their users return to their website.

 

Objection 4, what is the threshold for a bounce?  After 5 seconds?  After 10 seconds?  After 15 seconds?  This is a mess!  (This is often part of the how-do-we-define-a-bounce debate.)

 

Response.  Whether it takes a person one second or one hour to bounce back, a bounce is a bounce.  How the search engines choose to treat bounces with varying lag times is another matter.  Let’s be clear; they won’t tell you, just as they won’t tell you how many links on a page they index, how many they follow and how many they count in their ranking algorithms.  Furthermore, it is a moving target.  Just like every other algorithm input, bounce rates and bounce lag times will not be treated in exactly the same way from one day to the next.

 

Objection 5, what if people quickly click on an external link and leave my site?  They found the site useful because they found a useful link on it, but they bounced.

 

Response.  That is not a bounce, that’s a referral.  A bounce is when someone hits the back button.

 

Objection 6, what if the user quickly closes the window?

 

Response.  That could be any number of things, but it is not a bounce.  Who can guess how the search engines might treat that, or even if they treat it at all?  However, it need not be considered a bounce unless the search engines believe it should be.

 

Objection 7, doesn’t a bounce mean the person has found what they want?  Can’t a bounce sometimes be good?

 

Response.  Sometimes, perhaps, but rarely.  After 5 seconds, a person has no time to read a page.  After 30 seconds, they might have found something useful.  So lag times matter.  More importantly, the search engines can determine what a person does next.  If a person returns to the search results and clicks on another link, that is a sign they did not find what they want.  If they return to the search results and conduct a similar search, that might also be a sign they did not find what they want.  If they return to the search results and conduct an unrelated search, that might be a sign that they found what they want.  Search engines can weigh various bounces in light of the user’s next action.
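To illustrate that last point – and this is pure speculation, not anything the search engines have confirmed – here is a toy Python sketch of how a bounce might be weighted by dwell time and by what the searcher does next.  The thresholds and scores are invented for the example:

```python
# Speculative sketch: weight a bounce by dwell time and the searcher's next
# action. All thresholds and scores are invented for illustration.

def query_overlap(q1, q2):
    """Crude similarity: share of words the two queries have in common."""
    a, b = set(q1.lower().split()), set(q2.lower().split())
    return len(a & b) / max(len(a | b), 1)

def bounce_signal(dwell_seconds, original_query, next_query=None,
                  clicked_another_result=False):
    """Negative score for an 'unsatisfied' bounce, roughly zero for a harmless one."""
    similar_followup = next_query is not None and \
        query_overlap(original_query, next_query) >= 0.3
    if clicked_another_result or similar_followup:
        return -1.0   # went straight back to a similar search: likely unsatisfied
    if dwell_seconds >= 30:
        return 0.0    # had time to read, then moved on to something unrelated
    return -0.3       # quick return with no clear follow-up: mildly negative

print(bounce_signal(4, "health card office", next_query="health card locations"))  # -1.0
print(bounce_signal(45, "health card office", next_query="weather tomorrow"))      # 0.0
```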

 

Objection 8, for some searches, people look for multiple sources, such as comparing prices, comparing products, seeking varying opinions, etc.  Too many sites would be penalized if all those bounces were to be counted in the rankings.

 

Response.  This is an example of false logic.  If someone clicks on one website, then bounces, clicks on another website, then bounces, clicks on another website then bounces…all the high-ranking websites for that particular search query would be equally affected.  Nobody would suffer a ranking disadvantage because rankings are relative.  On the other hand, if one site typically bounces and the others don’t, the bouncy site clearly is less useful than the others and should be demoted.

 

Objection 9, can’t I just set up a bot to visit all my top competitors and leave their sites after varying numbers of seconds to make it appear that their sites are all bouncy?

 

Response.  Yes, you can.  And you can get very creative.  I have even heard of couriers in China travelling from one Internet café to another to click on a particular site as a means of increasing its rankings.  I have no answer for this, other than that the search engines will have to control for this, just as they have found ways to control for automated link-building.

 

So have no fear.  Good websites that provide what their visitors want, or that help them find what they want, will prosper.  Sticky SEO looks at conversions and stickiness as integral elements of SEO.

 

Cheap sites that do a lot of link-building – bouncy SEO – and count on large volumes of traffic to offset poor conversion rates will suffer, because the search engines will stop sending them that traffic. 

 

It’s just a matter of time.  Or perhaps it has already started.

 

 

 



Sticky SEO on Webmaster Radio

Thursday, December 18th, 2008

Earlier today I was on Webmaster Radio talking about Sticky SEO. You can listen here:

Show: SEO 101



 

 



Do Bounce Rates Really Count?

Tuesday, December 16th, 2008

Do Google and Yahoo include bounce rates in their algorithms?  Ever since I released Sticky SEO, it seems there has been a growing debate on whether bounce rates factor into search engine algorithms, or even whether they should in the future.  I think you know where I stand; they probably already do to some degree and they surely will count for much more in the future.  And not just bounce rates, but various other user activities.

It seems that my view is not universally held, and there is a robust debate on this topic.

Some people feel that there really is no definition of what a bounce is, which makes it difficult to determine bounce rates.  That just means the search engines have to define what a bounce is, and I gave them some tips here.

Some people feel that a high bounce rate is a good thing – the person quickly found what he wants and returns to search for something else.  To quote one observer on Sphinn: “If the page is highly relevant to what the searcher is specifically looking for, they can get their info and leave without going to any further pages – fully satisfied. A Big vote for relevance.”

On the other hand, some people feel that if Google is now using bounce rates to rank its PPC ads, why would it not use that same information in its organic listings?

Others have argued that it would be too easy to send robots to the competition’s websites and create a lot of fake bounces.

This issue is certainly not settled, but I simply cannot see the search engines ignoring what I believe is the ultimate measurement of customer satisfaction.  There is no way that a quick return to the search engine is a good thing.  At best it is neutral, if someone is doing research and visiting numerous websites.  But in that case all top-ranking sites would have their bounce rates affected equally, so no disadvantage would result for any of them – those bounces would not affect rankings.  One way or the other, user activity is an important measurement the search engines cannot afford to ignore.

 



Is an SEO’s Place in the Kitchen?

Friday, December 12th, 2008

I wrote this post as a comment on Barry Welford’s blog, and it got so long and involved that I realized it would make a great blog post right here…especially since it really is the foundation on which I wrote the Sticky SEO ebook.

Bounce rate is a great measurement of performance, of the usefulness of a website.  It is not the only one, as has already been discussed, and on its own would be a poor measurement.  Leaving a site through an affiliate link (or any other link) should not be considered a bounce.  It should be considered an external referral. 

Whenever anybody clicks on a result in Google, there are four potential next actions. 

  1. Bouncing back to Google, especially after only 3 – 5 seconds, is a sign that Google had served up a less-than-useful result.  Not good news for ranking well.
  2. Referring to a deeper link in the site (an interior page) is, as Barry says, “normally a confirmation that they are finding something of interest”.  Good job, Google; keep ranking that page for the search that was just performed.
  3. Referring to an external link is also a sign that the searcher found something useful on that page, which is why, from an SEO perspective, the New York Times is making a wise decision.  Searcher happy, Google happy.  Keep on ranking.
  4. Closing the browser window.  Yes, that is the fourth option, which means simply that the searcher’s wife just called, “Honey, dinner is ready.”  (Hopefully that won’t affect rankings one way or the other, or else we’ll need a kitchen-centric SEO strategy in the future.)
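Those four options can be restated as a tiny classifier.  The Python sketch below is just a restatement of the list above – the event fields and the signals attached to each action are hypothetical, not anything any search engine has published:

```python
# Toy restatement of the four next actions after a search click.
# Event fields and "signals" are hypothetical, for illustration only.

NEXT_ACTION_SIGNAL = {
    "bounce_to_serp": "negative - the result was probably not useful",
    "internal_click": "positive - the visitor found something of interest",
    "external_click": "positive - a referral, not a bounce",
    "window_closed":  "neutral - honey, dinner is ready",
}

def classify(event):
    """Map a post-click event onto one of the four next actions."""
    if event.get("returned_to_serp"):
        return "bounce_to_serp"
    next_url = event.get("next_url")
    if not next_url:
        return "window_closed"
    site_root = event.get("site_root", "")
    if site_root and next_url.startswith(site_root):
        return "internal_click"
    return "external_click"

event = {"site_root": "http://example.com", "next_url": "http://example.com/pricing"}
action = classify(event)
print(action, "->", NEXT_ACTION_SIGNAL[action])   # internal_click -> positive ...
```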

 

 

 

 

 



Sticky SEO e-Book released

Wednesday, December 3rd, 2008

After a month of working on it, and at least a month of technical delays, I have finally released Sticky SEO. This groundbreaking SEO guide will help you get prepared for the wave of algorithm changes that will sweep a lot of websites right under the rug.

Yup, a storm is coming and some websites will thrive while others crumble to dust.  It’s all about user metrics and what I call the “Usefulness Algorithm”. Sticky SEO is the answer, and this is the first eBook to give useful strategies and practical tips on how to be one of the websites that will thrive.

I should note that Sticky SEO really is not like any other SEO book.  If you find this blog post searching for “SEO book” or “SEO eBook”, and are expecting the same SEO 101, you won’t find it here.  Sticky SEO doesn’t include any of that stuff.  It’s all good – don’t stop adding relevant content and building link after link after link to your site – but this is a different, more exciting story.  This is for website owners who want to pump up their profits today and power up their rankings for tomorrow.

Here is the link:
http://www.seo-writer.com/books/sticky-seo.html

 



Sticky SEO Imminent

Saturday, November 29th, 2008

I promised a couple of months ago that I would follow up the aborted series on BrowseRank with a complete ebook on the topic.  Now that ebook is imminent.  We’ll be releasing it as soon as we clear up a few server issues.  Just to whet your appetite, here’s the image of the cover…

 



BrowseRank Strategies – Quality Web Site Design

Wednesday, September 3rd, 2008

A few days ago I reported on how BrowseRank goes beyond PageRank to rank websites according to user behavior.  Modern search engines tend to rank websites by relevancy and importance, and of course their algorithms can be gamed.  The concept of BrowseRank, which I have already been mentioning to clients for two years, would add a third and arguably more important measurement – usefulness.  This, too, can be gamed.  However, most of the gaming would also work to your visitors’ advantage, so the Web will be a better place for it. 

In preparation for BrowseRank and perhaps other search engine measures of website usefulness, this is the first in a series of posts that will help you make your website appear useful in the eyes of the search engines.  You will probably find that these are things you should be doing anyway to increase conversions and profits, but that is not my area of expertise, so here we will look at them from an SEO perspective.

STRATEGY #1 – Design a website that says “Quality” the minute a visitor lands there.

This might seem soooooo obvious, but it needs to be said.  As obvious as it might seem, I come across dozens of websites daily that say “Amateur” or “Crap”.  Here are a few tips to make your website look like a professional website that can be trusted.

  1. Get a professional design that looks at least somewhat modern and in a style that suits your products and target audience.
  2. Lose the square corners.  Some corners are OK, but if your design is based on boxes, it looks like a basement job.
  3. No Adsense-type ads.  Yuck! Honestly, that is the biggest sign of a low-quality website.  A run of Adsense across the bottom is not bad, but the more prominent the PPC ads, the cheaper the site appears.  By the way, ads are OK – the more they look like content or part of the website, the better.  Adsense-style ads just look cheap.
  4. Keep it clean.  Clutter looks as bad on a website as it looks here on my desk.  (But I don’t have a webcam to display this disaster to the world, so don’t display a mess on your website!)
  5. Make sure your web pages look good in various browsers and in various screen resolutions.  If 70% of people see a superb website and the other 30% see garbled images and text, they will bounce back to the search engine … which tells the engine that your website is not very useful (and it isn’t if it can’t easily be read by 30% of searchers).
  6. Make sure your website is available, which means good hosting.  I am never shy about recommending Phastnet web hosting.  This blog is hosted there and I have been migrating my sites to them over the years because of the five-star service I get when I need it.
  7. Make sure your code is working properly.  Seeing a PHP error makes the site look broken.  I don’t buy from someone who might be selling me broken goods.
  8. Avoid overly flashy design.  If your visuals call attention to themselves and distract from your message, you will lose people.
  9. Avoid automatic audio playing.  I can guarantee you that 99% of people browsing from a cubicle, as well as others in shared space, will zip back to the search engine in no time flat.  That sends a pretty bad signal to the search engines.
  10. Nix the cover page, especially one that shows a slide show on start-up.  And if you think people can easily scroll to the bottom to click the “skip intro”, it’s easier still to click the “back” button and choose a new website that does not place a barrier to its visitors.

Those are my top 10 web design tips for helping visitors see quality in your website.  Please feel free to add to this list in the comments below. Following these tips is not enough to make them stay on your website, but at least they won’t leave because the design scares them away.  In future “episodes”, I will share with you some additional strategies to help the search engines view your website as “useful”.

I would be remiss if I did not mention that we have some top quality SEO web designers on our team.  :-) 

 


