#1
  1. Contributing User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Jun 2003
    Location
    Prague - Czech
    Posts
    39
    Rep Power
    16

    Solution forum for Florida update


    The patent, as Claus described it, seems to have been activated during the Florida update. About 90% of the new SERP ranking behaviour he described seems to be working now.

    http://www.webmasterworld.com/forum3/15073.htm

    I have found one exception so far:
    not all pages from the same host, or even the same domain, are thrown out of the SERPs.

    All the negative effects mentioned by Claus are showing now:
    "It could be that blogger sites on blogger.com, tripod sites, web rings, and the like will see less impact in serps from crosslinking. Mirror sites, and some other types of affiliate programs, eg. the SearchKing type, will probably also suffer."

    For my SEO I very often use web rings and crosslinking.

    I do not have one top page but many same-level pages for each group of keywords, sometimes sharing parts of similar content. Most of my PageRank was built by crosslinking and by links from high-ranking pages outside my subject area.

    My competitors who use a three-tier structure (one top homepage) are not suffering from the Florida update.

    My friend and competitor uses a typical three-tier structure: one strong homepage and other, less important pages. His links page is very old but very good. His PageRank is generated mostly by links from other competitors' domains in the same subject area. No ring structure, no crosslinking. Now he is first for all the important keywords in all languages.

    Solution
    1. Rebuild your sites into a three-tier structure.
    2. If you own several domains, change the web hosting of each domain so that crosslinked domains sit on different IP addresses (a quick check for this is sketched below).
    3. Try to contact your competitors in the same subject area and exchange links with them.
    4. Pray that it works.
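
    A rough way to check point 2 - whether two of your domains would still count as the "same host" (first three IP octets, as Claus explains further down the thread) - is a few lines of Python. This is only a sketch; the domain names are placeholders:

    import socket

    def same_class_c(domain_a: str, domain_b: str) -> bool:
        """True if both domains resolve to IPs sharing the first three octets."""
        ip_a = socket.gethostbyname(domain_a)
        ip_b = socket.gethostbyname(domain_b)
        return ip_a.split(".")[:3] == ip_b.split(".")[:3]

    # Example (placeholder domains): crosslinks between these two would look
    # like "same host" links if this prints True.
    print(same_class_c("example.com", "example.net"))
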
    Last edited by e-climb; Nov 28th, 2003 at 08:44 PM.
  2. #2
  3. SEO Strategist
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    May 2003
    Location
    Singapore
    Posts
    124
    Rep Power
    17
    Do I have to reduce my keyword density, and cut back on repeating keywords in multiple places, or not? Please tell me about the new formula, or do you have a good example of how to apply it to rank well?
  4. #3
  5. Xia
    Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Nov 2003
    Posts
    206
    Rep Power
    16
    The URL doesn't seem to work (even without the double http://); all I get is an under-construction page.
  6. #4
  7. Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Feb 2003
    Posts
    434
    Rep Power
    17
    Xia,

    You have to be signed into WMW to be able to view that thread.
  8. #5
  9. Contributing User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Jun 2003
    Location
    Prague - Czech
    Posts
    39
    Rep Power
    16
    The original version of Claus's interpretation, part 1:
    "The new Google patent

    In this thread of a similar name zafile pointed me towards a news.com article on a new Google patent. The link in the news.com article turned out to be wrong, but i found the patent anyway. Specifically, it is this one:

    6,526,440 Ranking search results by reranking the results based on local inter-connectivity



    You can find the details at the US Patent & Trademark Office, Patent Full-Text and Image Database - search for 6,526,440. Text from this patent - published to the public domain by USPTO - is also quoted in this post, where necessary to make a point. There are no copyrights on patent texts, but i have tried to keep quotes to a minimum anyway.

    As the title suggests, this is a patent that deals with ranking and re-ranking results.

    The patent was granted on February 25, 2003, and filed January 30, 2001 - so Google researchers have known about it for at least two years already. Still, a patent grant means that the source description is published. This is the reason for (as well as the "Google News" of) this post.

    I have spent a few hours studying it, and it clearly has implications for users of this forum. I'll get to the nitty-gritty of it, but let me point out the major points first.

    It's not an easy read. And there are 7 unknowns as well as some scope for flexibility and judgement (either by trial-and-error or by manual or automated processes). It's really interesting though.



    --------------------------------------------------------------------------------

    What is it?

    It's a patent. Nothing more and nothing less. A description of some procedure for doing something. This does not mean that it will ever be put to use, as lots of patents are granted and never used. Patents don't come with a release date, but some elements of the confusion we are seeing now could be explained by this.

    Chances are, however, that this one will be put to use. Having spent a few hours on it, i must say that it makes some sense. It is intended to provide better and more relevant results for users of the Google SE, and at the same time (i quote the patent text here) :

    ... to prevent any single author of web content from having too much of an impact on the ranking value.



    Sounds serious, especially for the SEO community. And it probably is, too. But don't panic. Notice that it says "too much of an" and not "any". It's still a ranking derived from links, not a random Google rank.



    --------------------------------------------------------------------------------
  10. #6
  11. Contributing User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Jun 2003
    Location
    Prague - Czech
    Posts
    39
    Rep Power
    16
    The original interpretation by Claus, part 2:
    What does it do?

    We know about the Page Rank algorithm. This is the tool that Google uses to make sure that the pages it has indexed are displayed to the user with the most important pages first. Without being too specific it simply means that, for each and every page Google calculates some value that ranks this page relative to the other pages in the index.

    This is something else. Rephrase: This is the same thing plus something else. It is, essentially, a new way to order the top results for any query.



    --------------------------------------------------------------------------------

    The ultra-brief three-step version:

    What the new patent implies is a ranking, then a reranking, then a weighting, and then a display. It goes something like this:

    1) The usual pagerank algo (or another suitable method) finds the top ranking (eg.) 1000 pages. The term for this is: the OldScore.

    2) Each page in this set is then going through a new ranking procedure, resulting in the LocalScore for this page.

    3) Finally, for each page, the LocalScore and the OldScore are normalized, assigned a weight, and then multiplied in order to yield the NewScore for that page.

    In this process there will actually be "two ranks", or rather, there will be three: The initially ranked documents (OldScore ~ Page Rank), and the reranked documents (LocalScore ~ Local Rank). The serps will still show only one set of documents, but this set will be ranked according to the "NewScore ~ New Rank" which is (sort of) a weighted average of PR and LR.



    --------------------------------------------------------------------------------

    Confused?

    Don't be confused by the fancy words. It's more straightforward than it seems. In other words, this is what happens:

    a) you search for a keyword or phrase - as usual
    b) pagerank finds the top 1000 results (or so) - as usual
    c) localrank calculates a new rank for each page - this, and the rest is new

    d) each page now has two ranks (PR and LR)

    e) the two sets of ranks are multiplied using some weights.
    f) the multiplication gives a third rank.

    g) each page now has one rank; the NewRank (sort of equal to PR times LR)

    h) pages are sorted according to the NewRank
    i) and finally displayed with the best "NewRanking" ones on top.

    - better?
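
    If a code sketch helps, here is roughly what steps a) through i) amount to. It is only an illustration of this reading, not Google's code: the Page class, the parameter names and the default weights are placeholders, and local_score stands in for the Local Rank calculation described further down.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Page:
        url: str
        old_score: float   # OldScore ~ the usual PageRank-style value

    def rerank(pages, local_score, a=1.0, b=1.0, top_n=1000):
        """Steps a) through i): rank, rerank, combine, sort."""
        # a)-b) the usual ranking picks the top ~1000 documents (OldScore)
        top = sorted(pages, key=lambda p: p.old_score, reverse=True)[:top_n]
        if not top:
            return []

        # c)-d) every page in that set also gets a LocalScore (LR)
        ls = {p: local_score(p, top) for p in top}

        # e)-g) normalize both scores, weight them, and multiply into one NewScore
        max_os = max(p.old_score for p in top) or 1.0
        max_ls = max(ls.values()) or 1.0
        new = {p: (a + ls[p] / max_ls) * (b + p.old_score / max_os) for p in top}

        # h)-i) the serps show the same documents, now sorted by NewScore
        return sorted(top, key=lambda p: new[p], reverse=True)
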



    --------------------------------------------------------------------------------

    What does it mean to me then?

    Well, if you are one of the few who knows all about chea.. hrm... optimizing for Google, then it means that the world just got a bit tougher. And then again, perhaps not, there are still tricks in the bag for the very, say, experienced. Nuff said, just a feeling, i will not elaborate on that.

    It will become harder, it seems. If not for anything else, then only because you now have to pass not one, but two independent ranking filters. Instead of optimizing for PR you will now have to optimize for both PR and LR.

    Let's assume, as a very simple example only, that 0, 1 and 2 are the only values for both PR and LR: If you get a PR of 2 and an LR of 0, then the NewRank will be 0. If you get a PR of 0 you will not even make it into the top set of 1000 that ever gets an LR calculated. On the other hand, if you get a PR and an LR of 1 then you're better off than the guy having a top PR but no LR.



    --------------------------------------------------------------------------------

    Got it - what's that LR thing then?

    It's a device constructed to yield better results and (repeat quote):

    ... prevent any single author of web content from having too much of an impact on the ranking value.



    I have been looking at the patent for a while and this intention could very well be enforced by it. That is, if "authorship" is equal to "domain ownership", or "some unspecified network of affiliated authors".

    Here goes:

    The LocalScore, or Local Rank, is both a filter and a ranking mechanism. It only considers pages among the 1000 or so selected by the PR.

    a) The first step in calculating Local Rank for a page is to locate all pages that have outbound links to this page. All pages among the top 1000 that is.

    b) Next, all pages that are from the same host as this page, or from "similar or affiliated hosts", get thrown away. Yes. By comparing any two documents within the set, the one having the smallest PR will always be thrown away, until there is only one document left from the (quote) "same host or similar" as the document that is currently being ranked.

    Here, "same host" refers to three octets of the IP. That means the first three quarters of it. In other words, these IPs are the same host:

    111.111.111.0
    111.111.111.255

    "Similar or affiliated hosts" refers to mirrors or other kinds of pages that (quote) "contain the same or nearly the same documents". This could be (quote) "determined through a manual search or by an automated web search that compares the contents at different hosts". Here's another patent number for the curious: 5,913,208 (June 15, 1999)

    That is: your on-site link structure means zero to LR. Linking to and from others on the same IP means zero. Near-duplicate pages mean zero. Only one page from your "neighborhood", the single most relevant page, will be taken into account.

    c) Now, the exact same procedure is repeated for each "host" in the set, until each "host/neighborhood" has only one page left in the set.

    d) After this (tough) filtering comes another. Each of the remaining pages has a PR value, and they are sorted according to it. The top k pages pass; the rest get thrown away. Here "k" is (quote) "a predetermined number (e.g., 20)."

    So, although you positively know that you have 1,234 inbound links, only the top "k" of those that are not from "your neighborhood" (or all part of the same neighborhood) will count.

    e) The remaining pages are called the "BackSet". Only at this stage can the LR be calculated. It's pretty straightforward, but then again, the filtering is tough (quote, although not verbatim, but deviations keep current context):

    LocalRank = SUM(i=1..k) PR(BackSet(i))^m

    Again, the m is one of those annoying unknowns (quote): "the appropriate value at which m should be set varies based on the nature of the OldScore values" (OldScore being PR). It is stated, however, that (quote) "Typical values for m are, for example, one through three".

    That's it. Really, it is. There's nothing more to the Local Rank than this.
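
    As a rough sketch of steps a) through e) only: the page model, the helper names and the defaults (k = 20, m = 2, within the patent's "one through three") are assumptions for illustration, and the "similar or affiliated hosts" test is reduced here to the three-octet IP comparison, since the content-similarity part is not spelled out closely enough to code.

    from dataclasses import dataclass
    from typing import FrozenSet

    @dataclass(frozen=True)
    class Page:
        url: str
        host_ip: str                            # e.g. "111.111.111.42"
        old_score: float                        # OldScore ~ PR
        inbound: FrozenSet[str] = frozenset()   # URLs of pages linking here

    def host_key(ip: str) -> tuple:
        """'Same host' = same first three octets, so 111.111.111.0 and
        111.111.111.255 count as one host."""
        return tuple(ip.split(".")[:3])

    def local_score(page: Page, top_set, k: int = 20, m: int = 2) -> float:
        # a) keep only pages in the top set that link to the page being ranked
        linkers = [p for p in top_set if page.url in p.inbound and p.url != page.url]

        # b)-c) per host/"neighborhood", keep only the single highest-PR linker
        best = {}
        for p in linkers:
            key = host_key(p.host_ip)
            if key not in best or p.old_score > best[key].old_score:
                best[key] = p

        # d) of the survivors, only the top k by OldScore pass the filter
        back_set = sorted(best.values(), key=lambda p: p.old_score, reverse=True)[:k]

        # e) LocalScore = sum over the BackSet of OldScore^m
        return sum(p.old_score ** m for p in back_set)
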



    --------------------------------------------------------------------------------

    What about the New Rank then?

    This is getting a very long post you know... Well, luckily it's simple, the formula is here, it's as public as the rest, you can't have a patent that's also a secret (quote):

    NewScore(x) = (a+LocalScore(x)/MaxLS)(b+OldScore(x)/MaxOS)

    x being your page
    a being some weight *
    b being some weight *
    MaxLS being the maximum of the LocalScore values, or some threshold value if this is too small
    MaxOS being the maximum PR for the original set (the PR set)

    * Isn't this just beautiful (quote): "The a and b values are constants, and, may be, for example, each equal to one"
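
    In code form, with a worked example (the numbers are made up purely to show the arithmetic):

    def new_score(local_score: float, old_score: float,
                  max_ls: float, max_os: float,
                  a: float = 1.0, b: float = 1.0) -> float:
        """NewScore(x) = (a + LocalScore(x)/MaxLS) * (b + OldScore(x)/MaxOS)"""
        return (a + local_score / max_ls) * (b + old_score / max_os)

    # e.g. LocalScore 40 of a maximum 80, OldScore 5 of a maximum 10, a = b = 1:
    print(new_score(40, 5, max_ls=80, max_os=10))   # (1 + 0.5) * (1 + 0.5) = 2.25
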



    --------------------------------------------------------------------------------

    Wrap up

    Inbound links are still important. Very much so. But not just any inbound links, rather: It is important to have inbound links spread across a variety of unrelated sources.

    It could be that blogger sites on blogger.com, tripod sites, web rings, and the like will see less impact in serps from crosslinking. Mirror sites, and some other types of affiliate programs, eg. the SearchKing type, will probably also suffer.

    My primary advice from the algebra right now is to seek incoming links from "quality unrelated sites" that are nevertheless within the same subject area. Unrelated means: sites that do not share the first three quarters of an IP and are not in other ways affiliated or from the "same neighborhood" (at the very least not affiliated in a structured manner). Quality means what it says.

    Links from direct competitors will suddenly have great value, as odd as it sounds.

    Note: Spam. Cloaking. Shadow domains. Need i say more? I'm sure they will fix that anti spam filter sometime, though.

    Candidate for longest post ever... here goes, let's see if the mods like it.

    /claus
  12. #7
  13. Contributing User
    SEO Chat High Scholar (3500 - 3999 posts)

    Join Date
    Mar 2003
    Location
    Louisiana, USA
    Posts
    3,874
    Rep Power
    24
    sounds impressive. and while i dont get all the math involved, it seems as though basically what you are saying is that cross-linking is bad. (i realize there's a lot more to it than that, but i'll concentrate on that part of it for now). if i understand everything you said at least to some degree, i would say my page should still show up at least in the top 1000 (used to be number 1). i have plenty of inbound links from all sorts of sites - all of which are unrelated to each other (unrelated meaning that they are not in the same neighborhood as me or to each other). in addition, there are several links to my site from sites within that top 1000. in fact, there are several links to my site within the top 20. (which irks me). so... either i am missing something in your post (which is certainly possible), or it doesn't seem to apply to me.
  14. #8
  15. Registered User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Nov 2003
    Posts
    12
    Rep Power
    0
    Originally posted by dazzlindonna
    sounds impressive. and while i dont get all the math involved, it seems as though basically what you are saying is that cross-linking is bad. (i realize there's a lot more to it than that, but i'll concentrate on that part of it for now). if i understand everything you said at least to some degree, i would say my page should still show up at least in the top 1000 (used to be number 1). i have plenty of inbound links from all sorts of sites - all of which are unrelated to each other (unrelated meaning that they are not in the same neighborhood as me or to each other). in addition, there are several links to my site from sites within that top 1000. in fact, there are several links to my site within the top 20. (which irks me). so... either i am missing something in your post (which is certainly possible), or it doesn't seem to apply to me.
    I am throwing this out as something I have observed from the one keyword combo that affected me. I had two clients I did SEO for, and three of the other rankings were my own sites; two other sites were also lost, both of which I think were done by another company.

    So the 7 affected sites had one common denominator: backlinks.
    For all my sites I used one of my high-PR sites to boost them, and the clients' sites too. Also, the other two sites from the other company had this stupid backlink from a free classifieds section, which in turn was linked to one of my sites - stupid me, looking for PR a while ago, I put my link up on it. Anyways, this is hard to explain,

    but all the sites were crosslinked in some way. So part of the new Google algo is to detect this, making all link exchanges void and in turn making it next to impossible to gain PageRank. All it takes is one of your competitors to do a backlink search, get on the same links page, and boom, you're both off the Google search.

    Also, it has a phrase filter: longer searches like "las vegas web design company" have to be broken up.

    The problem arises when you have to get your links off these pages - lmao, easy to sign up, hard to get off of.

    Anyways, I am sure that doesn't make sense, but hey, it makes sense to me.

    Also, now you have to wait till the next update after you delink some pages to see your results. Lmao, I am not too impressed with Google. Oh well, should be fun.
  16. #9
  17. Contributing User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Mar 2003
    Posts
    88
    Rep Power
    17
    Thanks for the great post, Claus - it's definitely shed some light on what might be going on.

    Personally, I think Google has really missed the boat on this one. On the most basic level, they will possibly "penalize" websites that use popular hosts. If I host my website on a shared server that is using shared IPs there could theoretically be tens of thousands of websites in my "neighborhood". So even if I have a legitimate link coming from one of these sites Google is going to disregard it.
  18. #10
  19. Professional SEO
    SEO Chat Adventurer (500 - 999 posts)

    Join Date
    May 2003
    Location
    Finland
    Posts
    710
    Rep Power
    17
    Interesting theory.

    However, I'm wondering: if LocalRank has been put into action, why is it applied only to English-language pages? Most SEOs taking part in the "post-Florida" discussions forget that this stuff affects only English-language pages. In my opinion, applying LR would be language-independent; the changes we see aren't. So LocalRank is not here (yet).
  20. #11
  21. web designer
    SEO Chat Super Hero (2500 - 2999 posts)

    Join Date
    Aug 2003
    Location
    designing a web site in columbus ohio
    Posts
    2,997
    Rep Power
    50

    Question


    Originally posted by cab2k5

    Also, it has a phrase filter: longer searches like "las vegas web design company" have to be broken up.
    what do you mean here?

    also...
    speaking of a web design company...
    if a web design company creates web sites....and also handles hosting for web sites [same ip neighborhood]....and also puts a credit link on each web site to the web design company [backlink]....then the web design company will be a natural victim of same ip backlinks....right? [even though this would be a very natural thing to do.]
  22. #12
  23. Registered User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Nov 2003
    Posts
    12
    Rep Power
    0
    Originally posted by relaxzoolander
    what do you mean here?

    also...
    speaking of a web design company...
    if a web design company creates web sites....and also handles hosting for web sites [same ip neighborhood]....and also puts a credit link on each web site to the web design company [backlink]....then the web design company will be a natural victim of same ip backlinks....right? [even though this would be a very natural thing to do.]
    I thought of that also; it would only be a victim of same-IP backlinks if the websites were coming up in the first results for a specific keyword phrase.

    After doing some more research, there were 8 sites that were affected: 7 were linked to a real estate section of my site, and the other 2 were linked to a free classifieds website - a spam-heavy, guestbook-type site - which in turn linked to my real estate section. It's funny because now all of these guestbook-type sites come up on the main page of the search, where the company was spamming for PR.

    They all had this stupid guestbook-type spam link site in common. Also, when I do a search for the web design search term that got affected, the top sites also had a site like this in common - some stupid link exchange PR booster site.

    The phrase filter, if you have a high PR, will affect specific phrase terms like the ones in the Overture search term tool, such as "las vegas web design company". For some reason it has to do with the other top results for that search phrase, which will trigger the filter.

    Anyways, I don't have the answers, just spouting what I have noticed.

    I think you can have clean backlinks and dirty backlinks (like the link pages), and if 3 or 4 sites in the top ten have the dirty backlinks, then they all get penalised for that specific search.

    We won't know for a month, until those sites get delinked in the new index. I put up a clean page with only one PR 5 link, which is clean, to see if I can target that search term with that site. It should give me an idea.


    problem.
