    #1
  1. SEO Earthquake!
    SEO Chat Scholar (3000 - 3499 posts)

    Join Date
    Jun 2005
    Location
    Sacramento, CA
    Posts
    3,307
    Rep Power
    25

    What Builds Trust with Google?


    "We're always looking to find a better trust metric." - Matt Cutts
    http://www.theregister.co.uk/2002/10/08/google_explains_new_page_rankings/

    “I found out the types of phrases that [a known BH] is targeting, including a few industries.” – Matt Cutts
    http://www.mattcutts.com/blog/

    Google registered a trademark for the word "TrustRank"
    http://www.greatnexus.com/forums/post-7840.html (interesting post that also mentions "local rank").

    Stanford has a whitepaper that refers to TrustRank as an algo at: http://dbpubs.stanford.edu:8090/pub/showDoc.Fulltext?lang=en&doc=2004-17&format=pdf&compression=&name=2004-17.pdf. Here are some quotes of note:

    - Good pages seldom point to bad ones

    - The care with which people add links to their pages is often inversely proportional to the number of links on the page. That is, if a good page has only a handful of outlinks, then it is likely that the pointed pages are also good. However, if a good page has hundreds of outlinks, it is more probable that some of them will point to bad pages.

    - Interestingly, this approach leads us to a scheme closely related to PageRank
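
    Just to make that seed-and-propagate idea concrete, here's a toy sketch of how I picture it working. To be clear, this is my own illustration, not Google's code or even the paper's exact algorithm; the seed page, the 0.85 damping value and the function name are all made up.

    Code:
    # Toy TrustRank-style propagation: trust starts at hand-reviewed seed pages and
    # flows out along links, split among each page's outlinks and damped at every hop.
    # All numbers here (0.85 damping, 20 rounds) are guesses for illustration only.
    def trust_rank(links, seeds, beta=0.85, rounds=20):
        # links: dict mapping a page to the list of pages it links to
        # seeds: the hand-reviewed 'good' pages that trust starts from
        pages = set(links) | {p for targets in links.values() for p in targets}
        base = {p: (1.0 / len(seeds) if p in seeds else 0.0) for p in pages}
        trust = dict(base)
        for _ in range(rounds):
            new = {p: (1 - beta) * base[p] for p in pages}
            for page, outlinks in links.items():
                if outlinks:
                    share = beta * trust[page] / len(outlinks)  # trust splits among outlinks
                    for target in outlinks:
                        new[target] += share
            trust = new
        return trust

    # Tiny example: 'dmoz' is the trusted seed; 'newsite' only inherits trust through 'goodsite'.
    links = {'dmoz': ['goodsite'], 'goodsite': ['newsite'], 'spamsite': ['newsite']}
    print(trust_rank(links, seeds={'dmoz'}))

    Notice how a page with lots of outlinks hands each target a smaller slice, which matches the second quote above.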

    What determines trust? How does G handle it? How does this affect SEO? I strongly suspect trust has everything to do with the "third entity" that we've been missing in SEO. And the sandbox.

    Let me explain. There are two known methods of building placement in the SERPs: on-page factors and backlinks. But I think there is a third entity, which is trust. Trust is harder to pin down than the other two, but it accounts for a lot of factors that don't really fit in the other two categories yet still have an influence. Trust also overlaps with the other two areas and may supersede them both.

    I further speculate that this trust ranking (or TR) is what gets you out of the sandbox and helps you in the SERPs. I posted a note for Matt Cutts to explain a bit more about G and trust but who knows when or if he’ll reply. So what do you think? Here’s what I’ve come up with as far as trust issues and Google, so far:

    Domain Age – The older the domain is the more trust it gets. This isn’t really an onpage factor as you can’t do anything to age your site but wait.

    Length of Domain Registration - Regging a domain for 10 years instead of one shows more commitment to the project.

    "Regular" Updates - This doesn't mean every day or even every week, but a continuous improvement to the site. This could be adding content or adjusting layout.

    Backlinks - This one is a hornet’s nest because there are so many different opinions. I’ll stay pretty generic here to avoid moving the post off-topic. Remember this is about trust, not a PR score.

    My thought on this is backlinks are good. Backlinks that are paid for are not as good. Backlinks in directories are not as good. Backlinks that are reciprocated are not as good. Backlinks that are off-topic are not as good. Backlinks that are floating are not as good. "Not as good" does not mean "bad", just that they aren't as good as a link that appears natural and is surrounded by text.

    On the flip side, backlinks from high PR sites are better. As are backlinks from .edu and .gov domains. And backlinks from sites that have a high measure of trust. Human edited backlinks are also better (think: DMOZ). And older backlinks are better than new ones (there's that age thing again!).

    Unique IP - having a unique IP shows that you are serious enough and care enough about your site to pay the extra couple bucks. It also prevents you from being on a shared host that may contain "bad neighbors".

    Contact Information - This is an obvious aid to building a geographic audience that is often overlooked. It also builds trust with customers if they know you have a physical address. I think this will help you build trust with G. Sites that list their contact info get more trust. A street address may be better than a PO Box. Note: I have no proof on this *yet*, it just makes sense to me.

    Sitemaps - Including a sitemap helps G spider your site. It also helps visitors find what they are looking for. I think it also builds trust since you are considerate enough to help the viewer find what they are looking for. This isn’t about the ability to easily crawl, rather just offering the option to visitors shows you have their interests at heart. Note: Again, no proof, just an idea.

    "New" Technologies – Think: CSS, XHTML and RSS. CSS and XHTML may have some onpage effects, but I doubt they affect trust. RSS may show a strong desire to get your quality information out. Does G detect how many sites have picked up your feed?

    Reliable Uptime – Sites that yo-yo up and down lose trust.

    Targeting KWs based on Traffic over Relevance – Your business is unique and even if you sell "widgets that jump" that doesn't mean you should target "widgets". The more specific you are to your market, the better conversion rate you'll get and G won't think you are targeting a KW based on popularity instead of relevance. This may be a major factor in the sandbox.

    Spam - tricks like doorway pages, thin affiliates, (improper) cloaking, pointing multiple domains to the same site, duplicate content, KW stuffing, hidden text, comment spamming, pop-ups, pop-unders, and hidden links. Many of these are onpage factors. BH techniques would also hurt trust. Does anyone have any more examples of this stuff?

    Security - Does having an SSL certificate help?

    Trust by Purchase - What about being listed as a TRUSTe site or with BBBonline?

    Going Off-Topic – Adding pages to unrelated industries or to gain affiliate sponsorship hurts, but Cutts says it best himself: http://www.mattcutts.com/blog/step-into-my-shoes/

    “Bad” Topics - Certain topics will hurt your trust such as alcohol and gambling. Anything illegal, pornography, hate speech and anything that may get filtered by G's SafeSearch.

    Standards - Does adhering to standards help? For example W3C compliance, Bobby, etc. The idea being that you are showing care in your site to not only make sure it will display correctly across the board but also work for people with disabilities.

    Site Submission Software – I can’t recall G ever saying directly to not do this, but I definitely get the idea they don’t approve.

    Too Many Terms in the Domain Name or URL – For example: www.you-should-buy-all-this-stuff-right-now.com or www.domain.com/you-should-buy-all-this-stuff-right-now.html. This hits more with KW stuffed URLs.

    Trust In the Air

    There have been some posts that get a bit “tin foil hat” for me but I also realized I don’t have a clear answer.

    robots.txt - Can blocking pages or directories hurt your trust?

    Google API – Does applying for one hurt? Does using it hurt? Does using it at certain sites (like googlerankings.com) hurt? Using an API makes it ridiculously easy for G to track when you are breaking their TOS, especially the part about “automated queries”. Just by requesting an API you mark yourself a “more than average” web surfer and likely SEO.

    Google Searches - Does searching for your site on G hurt? Especially the link: operator?

    Posting in Forums – Just by posting in here, we often show a clear path to our SEO projects.

    G Sitemaps – By signing up for G sitemaps you also reveal you are a "more than average" surfer. Also, some of the stuff G asks for in the XML version is a bit odd.

    Meta – can some Meta tags help or hurt a site?

    Templates – Does G recognize duplicate layouts like they do duplicate content? Also, what about common layout errors when working with templates like forgetting to replace “title goes here” or removing HTML generator Meta information?

    Googlebar Tracking – And the “popularity” associated. It is in the patent.

    SERP Tracking - Click-through rates plus "hang time": if visitors back out too soon, the site gets devalued.

    Final Advice: Not Trusting Google Hurts Your Rank.

    Trust Google to do what they do. By trying to outwit them you only hurt yourself in the long run. Google has the ability to detect all sorts of link and spam schemes, and as time goes on their detection will only get better.

    Also, Cutts mentions that Google is aware of who the SEOers and BHs are and has even started approaching them to present how they work to the Google engineers!

    Comments on this post

    • raz agrees : Excellent Post; should spur a healthy discussion
    • Jocelyn agrees : Nice list of things to think of...
    • randfish agrees : Nice work RMC
    Last edited by rmccarley; Sep 13th, 2005 at 04:55 PM. Reason: Made it pretty!
    14th Colony: The hardest working websites online!

  2. #2
  3. Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Dec 2004
    Location
    Northern NJ, USA
    Posts
    322
    Rep Power
    15
    Lots of good issues for discussion here. In my view, trust is as important as content and BLs.

    I do think that there are different aspects to G 'trust'. My speculation is that there is a trust factor that figures into SERPs, and that separately TrustRank affects (or will affect) the trust factor.

    I suspect that each site has a default trust factor (0 to 1) and that known keywords for the site each have their own trust factor.

    We know that each document (page) has a document score as it relates to a keyword, or phrase. Let's assume my phrase is 'ink jet cartridge' and that my default page has a doc score of 20,000 for the phrase which puts it at a rank of # 9. Let's also assume my trust factor is 1.

    My guess is that G multiplies the doc score by the trust factor to get a 'trusted doc score'. So, if my trust factor fell to .6 my doc score would drop to 12,000 which might now put me at #65 for example.
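
    To make that guess concrete, the arithmetic would be something like this (the 0-to-1 range, the numbers and the function name are all hypothetical, of course):

    Code:
    # Pure speculation: rank is driven by doc_score scaled by a site/keyword trust factor.
    def trusted_doc_score(doc_score, trust_factor):
        # trust_factor assumed to run from 0 (no trust) to 1 (full trust)
        return doc_score * trust_factor

    print(trusted_doc_score(20000, 1.0))  # 20000 -> roughly #9 in my example
    print(trusted_doc_score(20000, 0.6))  # 12000 -> maybe #65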

    I also think that the sandbox is no more than a low trust factor assigned to all new sites. Further, I think the trust factor can improve automatically after 3 to 6 months, but if link building is aggressive this could negatively impact the trust factor and result in what looks like a one year sandbox.

    I feel that questioning everything we do in terms of trust is already very important to SEO.

    Of course, I could be all wrong on this and some G engineer may be having a good laugh at my expense. If so, I'm happy I cheered up your day! :-)

    Comments on this post

    • dougedoug agrees : think you could make a tool that follows your formula?
    Last edited by longcall911; Sep 12th, 2005 at 09:07 PM.
  4. #3
  5. SEO Earthquake!
    SEO Chat Scholar (3000 - 3499 posts)

    Join Date
    Jun 2005
    Location
    Sacramento, CA
    Posts
    3,307
    Rep Power
    25
    Originally Posted by longcall911
    We know that each document (page) has a document score as it relates to a keyword, or phrase. Let's assume my phrase is 'ink jet cartridge' and that my default page has a doc score of 20,000 for the phrase which puts it at a rank of # 9. Let's also assume my trust factor is 1.

    My guess is that G multiplies the doc score by the trust factor to get a 'trusted doc score'. So, if my trust factor fell to .6 my doc score would drop to 12,000 which might now put me at #65 for example.
    According to the TrustRank whitepaper I read, you are in the right direction, but way oversimplified. They use an algo for the trust value of BLs and assign a score that is then combined with the regular G algo. Sites that get links from untrustworthy sites can be affected by this, but not as much as those that link to untrustworthy sites. What the whitepaper didn't get into was what the determining criteria are to be considered "untrustworthy" besides linking to "bad" sites.

    I also think that the sandbox is no more than a low trust factor assigned to all new sites.
    I'm starting to think that too. This would explain how some sites get out quickly. Especially since KW competition seems to dig a site in deeper. The question from G's point of view would be: are they targeting their market, or the traffic of that KW?

    Further, I think the trust factor can improve automatically after 3 to 6 months, but if link building is aggressive this could negatively impact the trust factor and result in what looks like a one year sandbox.
    This I'm not so sure about. G's algo seems to know which sites are easy to get links from (directories, etc.) and devalue those links. Also, many places that are easy to get links from probably have bad neighborhoods linking to and/or from them, again causing a trust penalty.

    I do think that time is an important factor and there may be a -12 factor applied to new sites (one point per month). If a site is clean it will naturally get out of the box in a year (seemingly the average hold on new sites). If it is special, it can get a positive trust factor that gets it out sooner. And of course if they're up to no good, the penalty sticks longer.
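
    Putting that guess into rough numbers (the -12 starting point, the one-point-per-month recovery and the adjustment values are all pure speculation on my part):

    Code:
    # Speculative sandbox model: a new site starts at -12 and earns back one point per
    # month, plus or minus whatever trust it picks up (or loses) along the way.
    def months_in_sandbox(trust_adjustment=0):
        penalty = -12 + trust_adjustment
        months = 0
        while penalty + months < 0:
            months += 1
        return months

    print(months_in_sandbox(0))   # 12 months for an average clean site
    print(months_in_sandbox(6))   # 6 months for a site that earns extra trust early
    print(months_in_sandbox(-6))  # 18 months if they're up to no good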

    Of course, I could be all wrong on this and some G engineer may be having a good laugh at my expense. If so, I'm happy I cheered up your day! :-)
    Agreed. In fact, I think there are some SEOers in here that may be getting a kick out of this too. Or maybe the low response is due to the length of the post.
  6. #4
  7. Permanently Banned
    SEO Chat Skiller (1500 - 1999 posts)

    Join Date
    Feb 2004
    Posts
    1,935
    Rep Power
    23
    Originally Posted by rmccarley
    Or maybe the low response is due to the length of the post.
    literally this is the only line i read in this entire post...

    but to answer the main question...what builds trust with google? ADSENSE

    j/k

    Comments on this post

    • rmccarley agrees : That would be much funnier if it wasn't true...
    It's been fun everyone... Time to leave for good. See this for more details.
  8. #5
  9. Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Dec 2004
    Location
    Northern NJ, USA
    Posts
    322
    Rep Power
    15
    Originally Posted by rmccarley
    According to the TrustRank whitepaper I read, you are in the right direction, but way oversimplified.
    Yes, I'm definitely oversimplifying: 1) I sure don't know the details, and 2) I didn't want to wind up with a post longer than yours :-)

    I don't think we know for sure what builds trust most, but agree the more important likely reasons are a well behaved site, time, and natural links from already trusted sites. By well behaved I mean no link schemes, hidden text, cloaking, etc. The problem is that we don't know where G might draw the line when it comes to linking and what is considered a scheme versus what might be ok.

    If I buy one or two text ads, that might be ok. If I buy a sitewide, regardless of my reasons, that might be a strike against me.

    One way to learn more though is to study the top 3 sites for a given keyword/phrase. Assuming we are looking at at least moderately competitive terms, it is probably safe to assume that the top 3 are fully (or near fully) trusted. Doing this across a broad range of kws should reveal a pattern.

    I suspect that what we would ultimately see is older sites with only poor to moderate on-page SEO, an average or below-average number of BLs, a high percentage of BLs from news articles, press releases, .edu (or .gov), and a few BLs from major directories. Of course, these BLs would all have accurate anchor text.

    /*tom*/
  10. #6
  11. Contributing User
    SEO Chat Hero (2000 - 2499 posts)

    Join Date
    Mar 2004
    Location
    Los Angeles, CA
    Posts
    2,071
    Rep Power
    262
    Originally Posted by rmccarley
    ....................Or maybe the low response is due to the length of the post.
    Sorry rmccarley, but your post was simply not as intellectually challenging as "Help, went from 65th down to 501st position", which would have prompted 85 thought-provoking responses like "Get more back links"...

    BTW, I could only agree with your post early on without rep! I must spread some love around first, I was told!

    raz

    Comments on this post

    • rmccarley agrees : That's ok - I'll rep you!
    #include <Cognac.h> -The only code I know How to Write!
  12. #7
  13. No Profile Picture
    Contributing User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    May 2005
    Posts
    62
    Rep Power
    15
    Come on everyone.
    This thread deserves more input.

    Unfortunately I don't have the knowledge or experience to contribute, but I have kept checking back to see if anyone is sharing their thoughts.
    5 posts is not enough to fuel my learning curve
  14. #8
  15. No Profile Picture
    Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Aug 2005
    Posts
    121
    Rep Power
    14
    IMO, ignore the flaming thing: Google brings up more SPAM/useless sites than MSN and Yahoo put together (it used to be the other way round).

    Personally, I don't think it's worth dissecting this trust rank algo; the more you bring on board, the harder SEO becomes. What's the world coming to? ...How can a search engine trust a website? An automated "thing" (soz Googlebot, I mean bot) has to use a pattern. It may have some so-called smart way of guessing, but unlike a human it doesn't have initiative, and even humans have their trust betrayed!!

    So using robots.txt to block the spider will make Google trust you less? Possibly.
    Having a dedicated IP will make it trust you more? The IP doesn't determine whether a site can be trusted; these spam sites own whole ranges of IPs.
    Reliable Uptime? Your server gets too many hits and too much traffic, plus a deep Google spider, and it goes down... why would Google trust it less? I mean, if someone died you could trust them more than a living person; if a site can't be viewed on the web at that moment in time, you could trust that site just because you can't access it.
    SSL Certificate? If you don't run an https site you're unlikely to have one, so just because it's not an https site your page is going to appear several pages further down the SERPs?
    Google API + Sitemap - surely these are services provided by Google, so why would using them mean you're trusted less?

    I love it: how many algos has Google got? And both Yahoo and MSN have a lot better results even though, in comparison, those engines are simpler.
  16. #9
  17. Disregard Logic, Be Happy
    SEO Chat Adventurer (500 - 999 posts)

    Join Date
    May 2004
    Location
    USA
    Posts
    502
    Rep Power
    16
    here's some input on your nice post:

    Domain Age – Half-true. A quality site is benefited by an aged domain. It is an accent to an established winner. For a poor site, in the least, this may be the only benefit albeit a diminished one.

    "Regular" Updates - Again, a quality site with weekly (monthly?) updates trumps a low-quality site that is updated three times a day. The main goal of updates is to attract crawlers and/or offer new content for users.

    Backlinks - Obviously relevant are tops, with slightly relevant close behind and off-topic only in place of nothing at all. Also, how does google "know" you have purchased links? very difficult to determine. a huge indicator is off-topic content coming into a site. when 1+1 isn't adding up to 2, perhaps it receives more attention...

    Unique IP - common way of validating site stands on its own. basic and good technique.

    Contact Information - Required and basic again, but surprisingly some sites enjoy making you work to contact them, for unknown reasons.

    Sitemaps - For the crawlers mostly, and I devalue sitemaps for the simple reason that if you're arriving at a site through google it is page specific. they don't take you to the sitemap. they take you to the relevant page.

    "New" Technologies – Most people instantly give any new tech instant credit for boosting rank, turning the site around etc. but it's just new, not neccesarily better. time and experimentation will always hold truth to the fire and see if it burns.

    Reliable Uptime – If you don't have this, stop opting and move to a new host.

    Targeting KWs based on Traffic over Relevance – Traffic and relevance should be intertwined. relevance brings and maintains traffic. they should not be separated, but instead work together.

    Spam - just search the web. plenty of fodder for young spammers to learn...and lose out on quality sites. unless of course you're really good at it...

    Security - probably doesn't hurt

    Trust by Purchase - see above.

    Going Off-Topic – what is related. baseball - hats - beer. seemingly different things go together well in the real world. but it's google's world we're living in, so be obvious. like baseball - balls - bats.

    “Bad” Topics - unless that's your market

    Standards - good simply for the sake of good technique.

    Site Submission Software – disapprove. if it seems too easy to work, it probably is.

    Too Many Terms in the Domain Name – let the page do the talking, not the URL. and spread it out over the site.

    robots.txt - Don't trust the robots.txt. Google is looking regardless.

    Google API – this is a common tool not known to hurt (well, that I know of)

    Google Searches - google is a search engine, right?

    Posting in Forums – it's rumored google watches these forums closely...and that I'm good looking. both are....undetermined.

    G Sitemaps – brownnosing.

    Meta – not much help, if any.

    Templates – ugh.

    Googlebar Tracking – widespread and potent. most likely more than most know. just look at the response to the outdated patent...

    SERP Tracking - now we're getting somewhere!
    The Source of All Things Must Be Understood.
  18. #10
  19. SEO Earthquake!
    SEO Chat Scholar (3000 - 3499 posts)

    Join Date
    Jun 2005
    Location
    Sacramento, CA
    Posts
    3,307
    Rep Power
    25
    Originally Posted by dws_Dan
    How can a search engine trust a website? <snip> unlike a human it doesn't have initiative, and even humans have their trust betrayed!!
    I don't think Google the algo trusts; it assigns a score based on what the human engineers tell it trust is.

    Having a dedicated IP will make it trust you more? The IP doesn't determine whether a site can be trusted; these spam sites own whole ranges of IPs.
    Right, but by purchasing a unique IP you are no longer associated with the company you keep. I've seen many posts on this, and many hosting companies don't allow "adult content" because they don't want to be associated with it.

    Reliable Uptime? Your server gets too many hits and too much traffic, plus a deep Google spider, and it goes down... why would Google trust it less? I mean, if someone died you could trust them more than a living person; if a site can't be viewed on the web at that moment in time, you could trust that site just because you can't access it.
    Actually G's webmaster guide does specifically state that a site that goes down will be pulled from the SERPs. I'm not sure if this is a trust issue per se, but definitely a reliability one. If the site is down and G posts a link to it, the user gets ticked. It's just poor service.

    SSL Certificate? If you don't run an https site you're unlikely to have one, so just because it's not an https site your page is going to appear several pages further down the SERPs?
    I'm wondering if having more security helps with the trust impression. I know it does for human visitors. It shows establishment and bankroll. However, G may consider such sites commercial which could hurt you.

    Google API + Sitemap - surely these are services provided by Google, so why would using them mean you're trusted less?
    Mainly because only SEOers would know about them or use them. If G decided they don't want SEO in the picture one day and banned all sites connected to an API, a lot of us would feel it. Do we have our heads in the lion's mouth with this?
  20. #11
  21. SEO Earthquake!
    SEO Chat Scholar (3000 - 3499 posts)

    Join Date
    Jun 2005
    Location
    Sacramento, CA
    Posts
    3,307
    Rep Power
    25
    Originally Posted by okguyjames
    Backlinks - Obviously relevant are tops, with slightly relevant close behind and off-topic only in place of nothing at all. Also, how does google "know" you have purchased links? very difficult to determine. a huge indicator is off-topic content coming into a site. when 1+1 isn't adding up to 2, perhaps it receives more attention...
    G is getting better and better at determining which links are natural and which are purchased. They are also quite good at detecting patterns so even 3-way links are "easy" to spot. Cutts talked about this on his blog.
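
    Just to picture how a pattern detector could spot those schemes (a toy of my own making, not anything Google has described): walk the link graph looking for short cycles, since reciprocal links are 2-cycles and "3-way" links are 3-cycles.

    Code:
    # Toy link-scheme detector: report reciprocal pairs (2-cycles) and 3-way rings
    # (3-cycles) in a directed link graph. Purely illustrative.
    def find_link_rings(links):
        # links: dict mapping a site to the set of sites it links to
        rings = set()
        for a, a_out in links.items():
            for b in a_out:
                if a != b and a in links.get(b, set()):
                    rings.add(tuple(sorted((a, b))))         # reciprocal pair
                for c in links.get(b, set()):
                    if c not in (a, b) and a in links.get(c, set()):
                        rings.add(tuple(sorted((a, b, c))))  # 3-way ring
        return rings

    links = {
        'site-a': {'site-b'},
        'site-b': {'site-c'},
        'site-c': {'site-a'},   # a -> b -> c -> a: a 3-way ring
        'blog-x': {'blog-y'},
        'blog-y': {'blog-x'},   # reciprocal pair
    }
    print(find_link_rings(links))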

    Sitemaps - For the crawlers mostly, and I devalue sitemaps for the simple reason that if you're arriving at a site through google it is page specific. they don't take you to the sitemap. they take you to the relevant page.
    But would just offering a sitemap show a sincere interest in helping the viewer get what they want? And if so, would that translate to rank? The SERP wouldn't take you to the sitemap, but it may say "Gee, these guys put up a site map and I think that's swell. Have some more trust!"

    "New" Technologies – Most people instantly give any new tech instant credit for boosting rank, turning the site around etc. but it's just new, not neccesarily better. time and experimentation will always hold truth to the fire and see if it burns.

    Reliable Uptime – If you don't have this, stop opting and move to a new host.
    I agree 100% on both points.

    Targeting KWs based on Traffic over Relevance – Traffic and relevance should be intertwined. relevance brings and maintains traffic. they should not be separated, but instead work together.
    Right, but if a KW unrelated to your site draws a ton of traffic and you throw up a page to capture that "market share", would G call you out on that? I would hope so! Going a step further, say there is a KW that gets a ton of traffic and is almost related to your site? Will G call you out on that?

    Going Off-Topic – what is related. baseball - hats - beer. seemingly different things go together well in the real world. but it's google's world we're living in, so be obvious. like baseball - balls - bats.
    I think G's lexicon plays a big part in this along with co-occurrence (see randfish's threads on this).

    Standards - good simply for the sake of good technique.
    I was reading G's webmaster resources today and it does state your code should be "correct" whatever that means...

    Too Many Terms in the Domain Name – let the page do the talking, not the URL. and spread it out over the site.
    Agree.

    robots.txt - Don't trust the robots.txt. Google is looking regardless.
    Yes I've seen posts on this. What does it mean? Does G look to see if you are up to something and slap you if you're hiding something relevant? How would they determine that?

    Google API – this is a common tool not known to hurt (well, that I know of)
    Not so far, see post above.

    G Sitemaps – brownnosing.
    ROFL - well, whatever works, right?

    SERP Tracking - now we're getting somewhere!
    Where?

    Comments on this post

    • longcall911 agrees : tried to give you rep for your work, but can't.
  22. #12
  23. Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Dec 2004
    Location
    Northern NJ, USA
    Posts
    322
    Rep Power
    15
    In addition to the above specifics, I do believe that one day G will be analyzing content to determine whether a page truly delivers usable information or whether it has been built strictly for AdSense revenue.

    In other words, do we trust that this site is what it says it is?

    When that day comes, G will have conquered spam. And that is the goal behind trust.

    /*tom*/

    Comments on this post

    • rmccarley agrees : Yes, the goal is antispam. I should have pointed out G's motivation behind all this. After all, one more sentence couldn't hurt much! ;)
  24. #13
  25. Free the SB
    SEO Chat Skiller (1500 - 1999 posts)

    Join Date
    May 2004
    Location
    DC region
    Posts
    1,832
    Rep Power
    21
    Mccarley:

    This is a very thoughtful piece. It's no wonder you described "trust" as being similar to PageRank. It seems to me Cutts used trust in a context to reflect how "better" sites could rank more highly.

    In that context the positives in your description should reflect elements of the google patent. Similarly, someone else (Surfdude?) published a list of patent-like items to address that reflect positive ranking attributes for Google.

    Too many points for me to comment on...and I would like to give you max rep for this, but it will have to wait because of the rep I recently sent your way.

    Anyways a thoughtful effort.

    Dave

    Comments on this post

    • rmccarley agrees : Thanks!
    255 chars s*ck
  26. #14
  27. SEO Earthquake!
    SEO Chat Scholar (3000 - 3499 posts)

    Join Date
    Jun 2005
    Location
    Sacramento, CA
    Posts
    3,307
    Rep Power
    25
    earlpearl aka "Dave" (whatever that means!),

    Thanks for the constructive criticism. I would like to see the post you reference (Surfdude - you out there?).

    Your post did bring up something for me. Trust is the foundation of G's principles, and the direction they want the Internet to go influences those principles. So whatever they don't like is considered spam. My thing is, commercialism isn't "better" or "worse" than content. Sometimes I search for pizza because I want to order one - not get the history of it.

    In the end, G's algo is only as good and useful as the humans behind it. As long as they push their agenda of relevancy there will be spam.

    The race continues...
  28. #15
  29. Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Dec 2004
    Location
    Northern NJ, USA
    Posts
    322
    Rep Power
    15
    Originally Posted by earlpearl
    It's no wonder you described "trust" as being similar to PageRank. It seems to me Cutts used trust in a context to reflect how "better" sites could rank more highly.
    I think there are two separate but related aspects to 'trust'. One is TrustRank; the other is a trust factor.

    G has in fact published some info on TrustRank, but I don't have it in front of me so I'll wing it. As I recall, some number of people were reviewing some number of 'core' sites and assigning a TrustRank to each. Like PR, TR is supposed to transfer via links. So, with the 'seed' planted, G can then watch TrustRank spread throughout the net. I assume that if you get some, that would be a very good thing.

    If you do get some, what happens? Of course we don't know, but my guess is that it raises the trust factor for your page or perhaps your entire site. If so, does it affect all kws or only those in the anchor text of the trusted link? Who knows?

    The point is, my best guess is that TrustRank will/does affect the trust factor, which is then multiplied by the doc score to arrive at a trusted doc score.

    If this is even close to true, it is also possible that 'DistrustRank' could be seeded just as TrustRank has been, to knock you down some for linking to bad neighbors :-)

    /*tom*/
    Last edited by longcall911; Sep 14th, 2005 at 11:02 AM.
