
The bad news of the 2005 search year is that link development has become a very expensive and labor-intensive task that requires outstanding knowledge and experience. The good news is that, post-Jagger, a handful of trusted inbound links count for more than a gazillion artificially traded links did in the past.

Trusted, contextually dependable linkage has become one of the most important ranking factors. TrustRank, Google's Sandbox, and reciprocal link penalties are more or less misleading catchwords used in countless publications and discussions of the major changes to Google's ranking algorithms. Although those topics each cover a part of the story, they distract our attention by isolating particular symptoms and phenomena from the big picture.

The worst example is the discussion of the infamous 'sandbox', which does not stand for an initial aging delay or another sneaky attempt to punish new Web sites, because established sites can get 'sandboxed' too. The causes that prevent a new site from instantly ranking for competitive search terms, or even for its own name (which is a money term in many cases), can under certain circumstances apply to established sites as well. Those circumstances include changes in the way a site is promoted, and improved algorithms in conjunction with more computing power, which allow Google to enforce well-known rules on a broader set of data (and which neatly explain the "I've changed nothing, but my established and well-ranking site got tanked" reports).

To gain TrustRank and to avoid 'sandboxing', (reciprocal) link devaluation with its resulting loss of search engine traffic, and other barriers as well, one needs to look at the big picture. Creating workarounds to outsmart particular filter components may be a nice hobby for SEO addicts, and it is a valuable tactical instrument for SEO experts, but it is by no means a long-term strategy.

A search engine's mission is to provide its users with relevant commercial or informational results. A search engine marketer's mission is to place Web pages at the top of the SERPs. There are only so many top spots on the SERPs, and far more pages competing for a position among the top ten search results. Outsmarting the ranking algorithms was (and, at least in the short term, still is) a cheap method to make it onto the first SERP.

The war between smart search engine optimizers and just as smart search engine engineers has raged ever since crawling search engines began generating organic search traffic. The crux is that search engines relying on clever algorithms and unbeatable computing power can be outsmarted by just as clever algorithms and far less computing power. To stop this AI escalation, search engines had to integrate human judgment into their ranking algorithms.

Unfortunately, SE engineers weren't able to connect human brains to their computer clusters, because Homo sapiens still ships without an RJ45 connector and lacks a TCP/IP implementation. The next best solution was the statistical utilization of structured, trustworthy editorial work stored in machine-readable form.

Discovering trustworthy resources on the Web is an easy task for a search engine. Gathering popular sites with an extremely low percentage of linkage from and to known bad neighborhoods, a high but reasonable overall number of outgoing links, and a natural ratio of linked to unlinked text yields a neat list to start with. Handing a checklist to a Web-savvy jury that surfs those popular resources then leads to a detailed estimation of the trust factors applied to topical authorities.
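
To make that heuristic concrete, here's a minimal sketch in Python. It is not any engine's actual procedure; the Site fields and all three thresholds (max_bad_ratio, max_outlinks, max_anchor_ratio) are assumptions invented for the illustration:

    from dataclasses import dataclass

    @dataclass
    class Site:
        url: str
        bad_links: int      # links from/to known bad neighborhoods
        total_links: int    # all inbound plus outbound links
        outlinks: int       # outgoing links on the site
        linked_words: int   # words wrapped in anchor text
        total_words: int    # all visible words

    def is_seed_candidate(site: Site,
                          max_bad_ratio: float = 0.01,
                          max_outlinks: int = 500,
                          max_anchor_ratio: float = 0.10) -> bool:
        """The three heuristics from the text: near-zero linkage from and
        to bad neighborhoods, a high but reasonable number of outgoing
        links, and a natural linked/unlinked text ratio."""
        if site.total_links == 0:
            return False
        if site.bad_links / site.total_links > max_bad_ratio:
            return False
        if not 0 < site.outlinks <= max_outlinks:
            return False
        return site.linked_words / max(site.total_words, 1) <= max_anchor_ratio

    # Filter a crawled candidate list down to a queue for the human jury.
    candidates = [Site("example.edu", 0, 1200, 300, 900, 12000),
                  Site("linkfarm.example", 400, 900, 850, 4000, 4500)]
    print([s.url for s in candidates if is_seed_candidate(s)])  # ['example.edu']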

The most important trust (and quality) factor is a resource's linking attitude, probably followed by its resistance to manipulation, its editorial capability and currency, and its topical competence, devotion, and continuity. The discovery of (more) trusted topical authorities by exploring the neighborhood of established and monitored trustworthy resources is an ongoing process in a search engine's quality assurance department.
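
As a toy illustration of how jury ratings might be condensed into a single trust estimate, consider a weighted sum; the factor weights below are pure assumptions that merely mirror the ordering above, not anything an engine has published:

    # Invented weights that merely mirror the ordering in the text.
    TRUST_FACTORS = {
        "linking_attitude":     0.30,
        "cheat_resistance":     0.20,
        "editorial_capability": 0.15,
        "currency":             0.15,
        "topical_competence":   0.10,
        "continuity":           0.10,
    }

    def trust_score(ratings: dict) -> float:
        """Weighted sum of jury ratings, each between 0.0 and 1.0."""
        return sum(w * ratings.get(f, 0.0) for f, w in TRUST_FACTORS.items())

    print(round(trust_score({"linking_attitude": 0.9,
                             "cheat_resistance": 0.8,
                             "editorial_capability": 1.0}), 2))  # 0.58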

Topical PageRank and TrustRank are related approaches and should not be interpreted as mutually exclusive ranking factors. Greatly simplified: a naturally earned related link passes PageRank, TrustRank, and topical reputation (authority, relevancy), whilst a naturally earned unrelated link passes only PageRank and TrustRank. I've stressed naturally earned because at least Google has a quite accurate judgement of the intentions behind links.
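
For readers who want to see the mechanics, here's a sketch of trust propagation in the spirit of the published TrustRank paper (Gyöngyi, Garcia-Molina, and Pedersen, 2004): a biased PageRank whose teleport mass lands only on a hand-picked seed set. The toy graph, the damping factor of 0.85, and the iteration count are illustrative assumptions, and the sketch ignores refinements such as dangling-node handling:

    def trustrank(outlinks, seeds, alpha=0.85, rounds=20):
        """Biased PageRank a la TrustRank: the (1 - alpha) teleport mass
        goes only to the trusted seed pages, so trust flows outward from
        the seeds and is split and attenuated at every hop."""
        pages = list(outlinks)
        d = {p: 1.0 / len(seeds) if p in seeds else 0.0 for p in pages}
        trust = dict(d)
        for _ in range(rounds):
            nxt = {p: (1 - alpha) * d[p] for p in pages}
            for page, targets in outlinks.items():
                for t in targets:
                    nxt[t] += alpha * trust[page] / len(targets)
            trust = nxt
        return trust

    # spam.example earns no trust, because no trusted page links to it.
    graph = {"seed.example": ["a.example", "b.example"],
             "a.example":    ["b.example"],
             "b.example":    [],
             "spam.example": ["spam.example"]}
    print(trustrank(graph, seeds={"seed.example"}))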

Artificial linkage for the sole purpose of manipulating rankings is becoming more or less useless for Joe Webmaster, and risky. Actually very risky, because along with TrustRank comes a set of filters that reliably detect and penalize artificial linkage such as systematic reciprocal link patterns, randomized triangular linkage, and similar link schemes. Whether massive loads of artificial third-party links can reduce or even eliminate a site's TrustRank and result in decreased rankings is still a subject of speculation, although some trusted sources report that link mobbing does work. I can think of effective methods to close this loophole, and hopefully the SE engineers have implemented appropriate safety measures.
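
Nobody outside the engines knows the actual filters, but spotting the crudest of these patterns in a link graph is genuinely easy, which is why such schemes are so risky. A naive sketch (the graph format, a dict of outgoing links per site, is an assumption):

    def reciprocal_pairs(outlinks):
        """Site pairs that link to each other (A <-> B)."""
        return {(a, b) for a, targets in outlinks.items()
                       for b in targets
                       if a < b and a in outlinks.get(b, ())}

    def triangular_links(outlinks):
        """Directed three-cycles (A -> B -> C -> A) that disguise trades."""
        cycles = set()
        for a, bs in outlinks.items():
            for b in bs:
                for c in outlinks.get(b, ()):
                    if c != a and b != c and a in outlinks.get(c, ()):
                        cycles.add(frozenset((a, b, c)))
        return cycles

    graph = {"a.example": ["b.example"], "b.example": ["c.example"],
             "c.example": ["a.example"], "d.example": ["a.example"]}
    print(triangular_links(graph))  # one triangle: a, b, c

A real filter would of course look at pattern density across whole networks of sites; a handful of genuine mutual links between related resources is perfectly natural.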

So what a Web site needs to rank on the SERPs is trusted authority links, preferably on-topic recommendations, amplified and distributed by clever internal linkage, plus a critical mass of trustworthy, unique, and original content. Enhancing usability and crawler friendliness helps too. Absolutely no shady tactics applied. In other words: back to the roots of the Internet, where links were used to send visitors to great resources, not crawlers to promotional targets. Search engines no longer honor conventional site promotion; instead, they reward honestly earned page recommendations from trusted places in conjunction with decent site branding.

The vital question is "How do I get on-topic authority links that pass TrustRank?", and the answer is reengineering (or rather reinventing) TrustRank to some degree, that is, retracing the procedure the engines (at least Google) have most probably applied to identify trusted Web resources. This approach includes analyzing identified trusted resources in order to adapt their valuable linkage architecture and behavior on one's own sites, and it goes far beyond making trusted link acquisition the sole beatified SEO tactic.

Let me close with a warning. Submitting a resource to DMOZ and Yahoo's directory, as well as inserting a Wikipedia link, are well-known promotional activities. That is, if a site with those inbound links lacks trusted inbound links from other sources, they may not count as sufficient signs of quality and proof of content relevancy. I mean, everybody can buy a Yahoo directory listing, and many SEOs are ODP editors ... those links are way too easy to get and sometimes not earned naturally. It may be a wise decision to hold off, submitting to DMOZ and Yahoo once the engines have picked up and processed links from other sources of TrustRank, like universities, government sites, and well-established topical authorities. Patience and pertinacity are the keys to success. Google has nullified all SEO Blitzkrieg strategies.


Related information:
The Google 'Sandbox' demystified
Defining natural and artificial linkage
Linking is all about traffic, popularity, and authority
Unrelated non-devalued links are dangerous
Automated and/or artificial link promotion has disadvantages
Enhancing a Web site's crawlability
Matt Cutts stating that TrustRank is Google's secret sauce


Thursday, November 24, 2005




Author: Sebastian
