David Leonhardt’s SEO and Social Media Marketing

Tips for better SEO (search engine optimization) and website marketing …

THE HAPPY GUY MARKETING

 

Archive for the ‘rankings’ Category

New Google RankCheck Tool is Released!

Wednesday, August 27th, 2008

Just kidding.  There is no new Google RankCheck tool.  But there should be.  A couple days ago, I reported on how Google has finally blocked automated searches by software such as WebPosition.  This creates a vacuum in the marketplace – a vacuum best filled by…Google!

Yes, I think Google should announce a new Google RankCheck tool, and here is why.

Google says that automated rank-checking tools should be avoided because they tax Google’s servers. OK, let’s take this argument at face value. Automated checking does add a tremendous volume to the number of searches. I gave an example the other day of how one website can generate 1,200 searches. If that is done responsibly once every 4 – 6 weeks, and there are a mere 1,000 websites searching, the burden is not too big, especially if most of the searches happen at off-hours when Google’s servers are underused. However, if 100,000 websites are doing automated searches every day during peak usage hours, perhaps digging deeper into the SERPs, that could start taxing Google’s servers.
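To put rough numbers on the two scenarios, here is a quick back-of-the-envelope calculation in Python. The figures (1,200 queries per website, 1,000 versus 100,000 websites) are simply the illustrative numbers from the paragraph above, not measured data.

    # Back-of-the-envelope load comparison, using the illustrative
    # figures from the paragraph above (not measured data).

    QUERIES_PER_SITE = 1200  # automated rank checks generated per site, per run

    # Scenario A: 1,000 sites, each running one check every ~5 weeks (35 days)
    sites_a, interval_days_a = 1_000, 35
    daily_queries_a = sites_a * QUERIES_PER_SITE / interval_days_a

    # Scenario B: 100,000 sites, each running a check every day
    sites_b, interval_days_b = 100_000, 1
    daily_queries_b = sites_b * QUERIES_PER_SITE / interval_days_b

    print(f"Scenario A: ~{daily_queries_a:,.0f} automated queries per day")
    print(f"Scenario B: ~{daily_queries_b:,.0f} automated queries per day")
    print(f"Scenario B is roughly {daily_queries_b / daily_queries_a:,.0f}x the load of Scenario A")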

Let’s further assume that Google has a hidden agenda.  Let’s assume it does not like automated rank checking because people are getting a free ride from Google – conducting billions of searches without ever visiting Google and being exposed to paid search advertising.  Let’s face it, why should Google give away huge volumes of free search to webmasters without requiring them to view the PPC ads that bring in Google’s revenues?

So how would Google RankCheck help Google?

First, Google could control when and how automated searches occurred. It could, for instance, queue the automated searches for the next available downturn in bandwidth usage, or it could simply schedule them for an appropriate hour. Problem solved.

Second, it could make money, which is what a corporation like Google is supposed to do. Instead of giving away tons of free search to webmasters who don’t even visit Google to make the searches, Google could sell the software. An official Google RankCheck tool would sell much, much better than WebPosition. Google could make a beautiful case, too:

“We are in search.  Manual search of one phrase at a time is free to everyone, including webmasters checking how they rank.  However, if you want to conduct bulk searches, you can purchase Google RankCheck for a modest fee.”

There would be advantages for webmasters, too.  Google could give people the option of viewing rankings in various locales.  For instance, if I want to see how my website is ranking in San Francisco, Chicago and Miami, I could specify that (you know how Google results differ from place to place).

Perhaps there are other advantages for Google or for webmasters. Why not share your thoughts with readers by posting a comment below?

 



Google Blocks Automated Rank Checking

Monday, August 25th, 2008

Google has been threatening…er, promising to block rank-checking software such as WebPosition for years, and even mentions it by name in its webmaster guidelines. It seems that Google has finally decided to honor its promise and block the software.

I was last able to check rankings on August 22. Since then, nada. A quick survey around the Web suggests that a lot of other people found their automated ranking checks blocked on August 1, August 5 or August 7.

What does this mean for SEO?  Quite a lot…and amazingly little.

It means that we can no longer check dozens of keywords quickly and painlessly. Manually checking 50 search phrases for, let’s say, a dozen clients, often going onto the second or third page of Google, means…let’s see…1,200 manual searches. Suppose there are two dozen clients. Suppose there are 100 search terms. You can do the math and see how time-consuming this would be.
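For the curious, here is that arithmetic as a quick Python sketch, using the hypothetical client and phrase counts from the paragraph above.

    # Rough count of manual searches needed, using the hypothetical
    # figures from the paragraph above.

    def manual_searches(clients, phrases_per_client, pages_checked_per_phrase=2):
        """Each SERP page you scan counts as one manual search."""
        return clients * phrases_per_client * pages_checked_per_phrase

    print(manual_searches(12, 50))    # a dozen clients, 50 phrases each -> 1200
    print(manual_searches(24, 100))   # two dozen clients, 100 phrases each -> 4800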

However, let us for a moment suppose that we don’t do 1200 manual searches every month.  Suppose instead we do occasional searches to see where a client stands for a few major search phrases?  Or we check different searches on different months as we focus the campaign on different sub-niches?  What if we invest more effort in building rankings than in measuring them?

Yes, we do need to measure.  We need to know if we are moving forward.  We need to be able to show clients roughly the magnitude of the progress.  But perhaps we will be using a smaller basket of keywords and letting the long tail take care of itself.

For me, the main use of rank-checking across a broad range of search phrases was to determine which search phrases or family of search phrases need more focus as we ride the surf of algorithm changes, renewed competition and other happenings.

Of course, clients also require reporting…which we will no longer be able to do to the same level as we had been doing.  So the immediate effect is that over the next month or so, I need to budget a few hours to explain to clients why lists of ranking positions can no longer be the way to measure progress.

 



Broken Links and SEO Rankings

Wednesday, June 25th, 2008

Phew! I just finished removing all the broken links from this website. It would have been a fairly small undertaking if not for the blog. The blog creates hundreds of pages and the broken links can appear in comments, posts, sidebars and all sorts of hidden files. And since broken link checkers report all sorts of anomalies, such as RSS links, the list to wade through is quite large.

But it is worthwhile. A website that points to a lot of broken links is one that is not maintained. Put quite simply, if Google has the option of listing two equally relevant websites for a particular search, why would it list first the one that appears less up-to-date? I have no empirical evidence to show that broken links hurt rankings (if you do, please let me know), but common sense says that somewhere in the algorithm broken links play a role.
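For what it is worth, here is a minimal sketch of the kind of check involved, using only the Python standard library. The starting URL is a placeholder, and a real blog would need the crawl extended to every generated page (posts, archives, comment feeds); a dedicated broken-link checker will do a much more thorough job.

    # Minimal broken-link check for a single page (Python 3, standard library only).
    # A real blog would need this extended to crawl every generated page.
    import urllib.error
    import urllib.request
    from html.parser import HTMLParser

    class LinkCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href")
                if href and href.startswith(("http://", "https://")):
                    self.links.append(href)

    def check_page(page_url):
        html = urllib.request.urlopen(page_url, timeout=10).read().decode("utf-8", "replace")
        collector = LinkCollector()
        collector.feed(html)
        for link in collector.links:
            try:
                request = urllib.request.Request(link, method="HEAD")
                urllib.request.urlopen(request, timeout=10)
            except urllib.error.HTTPError as err:
                print(f"BROKEN ({err.code}): {link}")
            except urllib.error.URLError as err:
                print(f"UNREACHABLE: {link} ({err.reason})")

    check_page("http://www.example.com/")   # placeholder URL

A HEAD request is used to avoid downloading whole pages; some servers refuse HEAD, so falling back to GET on errors would reduce false positives.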

 



LinkedIn for SEO

Friday, May 23rd, 2008

In addition to being a great website for networking and reputation management, LinkedIn can also serve as a valuable SEO asset. Your profile allows three links to your websites. Use them. Here are a few tips for making your profile rank better within LinkedIn, and most likely with external search engines, too:

  • Complete your profile to 100%
  • Join some groups
  • Build a large contact list
  • Recommend your contacts
  • Ask your friends to recommend you
  • When commenting on blogs, sometimes use your LinkedIn profile as the URL for your comment

This is also a great way to create a very credible page that will rank well for your name, including great positive recommendations in your favor. See more about this in my post on SEO tactics for reputation management.

You can view my profile at LinkedIn: David Leonhardt. Note, I am only connecting to people I actually know and have worked with.

 



SEO tactics for reputation management

Wednesday, May 14th, 2008

There is nothing more precious than your reputation. What happens when a jealous ex-lover, disgruntled employee or unsatisfied customer decides to get nasty and posts something snarky about you on the Internet? And, horror of horrors, it shows up #4 at Google or Yahoo when somebody searches for your name or your business name?
That’s when you need an SEO campaign for reputation management. While every campaign is unique, there are a few key steps you should take.

  1. Make sure your own website comes up first. If you have more than one website, first and second is even better.
  2. Maximize the reach of your website(s); optimize two pages on each to show up in the results.
  3. Analyze those positive web pages already in the top 20 for your name and determine which ones could have an extra page optimized for your name.
  4. Analyze those positive web pages already in the top 20 for your name and determine which ones could be pushed above any negative pages through changes to the pages or through link-building.
  5. Analyze those neutral web pages already in the top 20 for your name and determine which ones could be made positive.
  6. Analyze those negative web pages already in the top 20 for your name and determine which ones could be made positive.
  7. Create, optimize and promote profile pages at popular user-generated content websites, such as Squidoo, MySpace and StumbleUpon.
  8. Create blogs in your name. If your main SEO goal is improving how your name or business name comes up in the search engines, host your blog at BlogSpot and/or WordPress.

Depending on your unique situation, there might be numerous other tactics you can use, as well. This list should help you get started if you want to do it yourself, or if you wish, we can help you with your online reputation management SEO campaign.

 



Yahoo and web design quality

Wednesday, April 30th, 2008

A recent patent application by Yahoo makes it clear that it has plans to look at the quality of a web page in terms of layout and design as part of its ranking algorithm. Careful – I did not say that it does or it will, just that it has plans.

Yahoo’s reasoning is solid. A web page that is full of clutter, where it’s hard to find where to go, is not a page that will please the searcher. And Yahoo, like all search engines, wants to please the searcher.

In its patent application, Yahoo lists 52 elements it might consider when deciding whether a web page is cluttered or not:

  • Total number of links
  • Total number of words
  • Total number of images (non-ad images)
  • Image area above the fold (non-ad images)
  • Dimensions of page
  • Page area (total)
  • Page length
  • Total number of tables
  • Maximum table columns (per table)
  • Maximum table rows (per table)
  • Total rows
  • Total columns
  • Total cells
  • Average cell padding (per table)
  • Average cell spacing (per table)
  • Dimensions of fold
  • Fold area
  • Location of center of fold relative to center of page
  • Total number of font sizes used for links
  • Total number of font sizes used for headings
  • Total number of font sizes used for body text
  • Total number of font sizes
  • Presence of “tiny” text
  • Total number of colors (excluding ads)
  • Alignment of page elements
  • Average page luminosity
  • Fixed vs. relative page width
  • Page weight (proxy for load time)
  • Total number of ads
  • Total ad area
  • Area of individual ads
  • Area of largest ad above the fold
  • Largest ad area
  • Total area of ads above the fold
  • Page space allocated to ads
  • Total number of external ads above the fold
  • Total number of external ads below the fold
  • Total number of external ads
  • Total number of internal ads above the fold
  • Total number of internal ads below the fold
  • Total number of internal ads
  • Number of sponsored link ads above the fold
  • Number of sponsored link ads below the fold
  • Total number of sponsored link ads
  • Number of image ads above the fold
  • Number of image ads below the fold
  • Total number of image ads
  • Number of text ads above the fold
  • Number of text ads below the fold
  • Total number of text ads
  • Position of ads on page

This is actually a superb website review checklist. Go through your website and see how it stacks up on these items. Keep in mind that there are reasons you might want to violate some of these principles, but in general you would want your website to meet most of these criteria in order to please your visitors and convert them into customers. And soon, you might also please Yahoo.
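As a rough illustration only, here is how a few of the simpler items on that list (total links, total words, total images, page weight) could be measured for a single page with the Python standard library. The URL is a placeholder, and the ad-related and layout-related items would need a real rendering engine, so they are not attempted here.

    # Rough counts for a few of the simpler clutter metrics above
    # (Python 3, standard library only). Layout and ad metrics would
    # need a real rendering engine and are not attempted here.
    import re
    import urllib.request
    from html.parser import HTMLParser

    class ClutterCounter(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = 0
            self.images = 0
            self.text_parts = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self.links += 1
            elif tag == "img":
                self.images += 1

        def handle_data(self, data):
            # Crude: also picks up script and style text.
            self.text_parts.append(data)

    def clutter_report(url):
        raw = urllib.request.urlopen(url, timeout=10).read()
        counter = ClutterCounter()
        counter.feed(raw.decode("utf-8", "replace"))
        words = re.findall(r"\w+", " ".join(counter.text_parts))
        return {
            "total_links": counter.links,
            "total_words": len(words),
            "total_images": counter.images,
            "page_weight_bytes": len(raw),   # a crude proxy for load time
        }

    print(clutter_report("http://www.example.com/"))   # placeholder URL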

 



Mature Domains – Ranking Advantage at Google

Tuesday, April 22nd, 2008

Those of us who have been paying attention knew about the importance of domain maturity a couple of years ago. But it looks like 2008 might be the year that the webmaster community starts to realize the importance of the issue, with Google’s United States Patent Application 0080086467 being publicized.

The bottom line is that it is to your advantage to hold a domain that has been around — and in your ownership — for several years. Maturity counts, and SEO gets easier as your domain ages. It is also to your advantage to get links from mature domains, although I don’t think I would waste time checking the ages of every domain I hoped to get a link from (more on this in a moment).

Why are mature domains better?  Like so many things, especially on the Internet where much is ephemeral, a mature domain has stood the test of time and therefore is more likely than average to provide useful information or services.  An established domain is much, much less likely to be a spam site set up to turn a quick profit and disappear.  The bottom line is that a mature domain is more likely to be a trustworthy one.

And trust is what it is about. When Google sends traffic to your site, it is placing trust in the site. Maturity is one way Google can measure trust. However, it is far from the only way. PageRank is another. There are likely dozens of measures of trust that Google employs, which is why I would not waste my time checking domain age. A much better trust test is to see how well a site ranks for its own target search phrases. If it ranks well, Google must trust it at least a fair amount, and therefore it is a good website to be associated with.

 



Location of Google Data Centers

Tuesday, April 15th, 2008

Hang around any webmaster forum long enough and you will run into the newbie question, “How come I don’t see the same results as my friend in San Francisco or Mexico City?” And the predictable answer, “Because Google serves up slightly different results from different data centers” or “Because Google has updated one of its data centers earlier than another, so just be patient until it updates all its data centers”.

But exactly where are these data centers? Today I present you with some clues, and I will explain why I use the word “clues”.

Here is a map of all the Google data centers around the world:
World map of Google data centers

Here is a map of the Google data centers in North America (Yes, there is one in Canada):
Google data centers in USA

And for our European readers, here is a map of data centers in Europe, from Russia to Ireland:
Google data centers in Europe

These maps were found through an interesting blog post on Google data centers at Pingdom.  These maps are based on a data center list at Data Center Knowledge.

Interestingly, when you search Google Maps, here is what it shows:


[Embedded Google Map]

Just another example of the search engines not delivering their own information as well as they deliver others’?

 



Don’t Waste “Useless” Traffic

Monday, April 7th, 2008

Not everybody has this happy problem, but many websites get traffic they cannot use, because the site serves only a narrow spectrum of the people who arrive from a broader search. People do a broad search, such as “marketing gimmicks”, at Google or Yahoo, find your web page about a very specific marketing gimmick for real estate agents, discover that the website does not address their need to market beauty products or metal bending or accounting, and they go.

Wait.  Stop.  Where do they go?  Back to the search engine?  No, no, no, no. 

From an SEO perspective, you don’t want to send the search engines the message that your page was a poor choice to rank well for the search term “marketing gimmicks”. If that happens, the search engines might just demote your rank, and you will lose the good prospects along with the “useless” traffic. We have no evidence that the search engines are factoring bounceback data into their algorithms, but we do know they are capable of doing so and have an interest in doing so. It’s coming.

Of more immediate concern is that all the hard-earned traffic that could be buying something from you is leaving without spending a penny. What a shame! In a case like that, it would be worth having a very prominent affiliate link to a website that sells a broader marketing package, with text like “More Surefire Marketing Gimmicks Here”. The result would be to convert some of the “useless” traffic, and both to reduce the bounceback rate and to increase the bounceback lag time of those who do go back to Google.
 

 



Link Building by the Specs? No Thank You!

Friday, November 30th, 2007

So somebody needs to build links to help his search engine rankings, and has come up with a very precise list of exactly what he wants. It includes 22 exact specifications, which perhaps he pulled from a handy article somewhere on the Web. Here is the list he presented, but this post could be about any such list…

1. One way non-reciprocal links only, no link exchanges.
2. THREE WAY Links where all links are in the same theme is OK
3. All links must be permanent.
4. Only 10% can be in directories.
5. No blogs
6. ONLY OUR THEME , (our theme is quite common so you will not have problems).
7. NO hidden links or any site that has hidden links.
8. No directories. No link farms, link-exchange programs, forums, Google banned site, black hat website. No guestbooks, links within forums, links within newsgroups or links from link exchanges etc. and never participate in any commercial web rings.
9. No sites banned by Google.
10. Link page must have a recent Google & Yahoo cache.
11. Must be manually submitted.
12. No Automated software (e.g., Zeus, Arelis or others)
13. All links must be from a different domain and IP address (geographically diverse, different class-c IP address block).
14. Only 10 to 15 links per week per language per site
15. Link pages must be static urls (no variables or parameters in the url)
16. No blacklisted or spam sites.
17. No more than 40 outbound links per page.
18. The link text must be from our keyword list and point to that keywords target page
19. All links must be static and without “nofollow” tags, no redirects, or javascript
20. Links must be on a PAGE with a Google PR of at least 2
21. All links must be on a page of the same language
22. Links must be on domains where we have no link

This post is about why I refuse to build links according to lists like this. First, I must note that some of the items, such as #7, #9 and #15, make perfect sense. These are deal-breakers that make a link worthless.
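To illustrate the difference, here is a hypothetical Python sketch that screens a candidate link page for a couple of the purely mechanical criteria (a followed link back to the client, a static URL, the 40-outbound-links cap). The domain and URL are placeholders, and none of the judgment calls discussed below can be automated this way.

    # Hypothetical screen for a few purely mechanical link criteria
    # (Python 3, standard library only). Judgment calls (topical fit,
    # real human traffic, overall page quality) cannot be automated here.
    import urllib.request
    from html.parser import HTMLParser
    from urllib.parse import urlparse

    class OutboundLinkScanner(HTMLParser):
        def __init__(self, our_domain):
            super().__init__()
            self.our_domain = our_domain
            self.outbound_links = 0
            self.followed_link_to_us = False

        def handle_starttag(self, tag, attrs):
            if tag != "a":
                return
            attrs = dict(attrs)
            href = attrs.get("href") or ""
            rel = attrs.get("rel") or ""
            if not href.startswith(("http://", "https://")):
                return
            # Crude: counts all absolute links as outbound.
            self.outbound_links += 1
            if self.our_domain in href and "nofollow" not in rel:
                self.followed_link_to_us = True

    def passes_mechanical_checks(link_page_url, our_domain, max_outbound=40):
        if urlparse(link_page_url).query:
            return False                      # spec: static URLs only, no parameters
        html = urllib.request.urlopen(link_page_url, timeout=10).read().decode("utf-8", "replace")
        scanner = OutboundLinkScanner(our_domain)
        scanner.feed(html)
        return scanner.followed_link_to_us and scanner.outbound_links <= max_outbound

    # Placeholder values for illustration only.
    print(passes_mechanical_checks("http://www.example.com/resources.html", "ourclientsite.com"))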

However, other elements are judgment calls: stipulating how many links per page, the PageRank, that a three-way link is acceptable but not a two-way link, among other factors. What people hire me for is to exercise that judgment. To decide when a page might be PR1 but incredibly on-topic and worth going after. Or when a page might have 200 links, but with PR4 and lots of real human traffic it is worth its weight in gold. Honestly, the client can just have his secretary or an offshore link-builder do the manual job of seeking out the links. He does not need me for that. What he needs me and my trained staff for is to exercise judgment – judgment that he is overriding with a pre-fab list.

Does the client really think we have control over how many links are built in a week?  That depends on the response rate and the amount of back-and-forth with various webmasters.

And how much does he want to pay me to track down IP addresses to make sure they are all different?  Or check that the client does not already have a link on the domain?

That’s why I turn down offers to try to fit a strategic process into so comprehensive a list of technical specifications.

 



