Page 1 of 2 12 Last
    #1
  1. Contributing User
    SEO Chat Adventurer (500 - 999 posts)

    Join Date
    Apr 2011
    Posts
    946
    Rep Power
    1386

    CTR, Bounce, Rankings and Google's Invisible ReCaptcha


    Just thinking. We've often argued here about whether or not user behaviour (CTR, time to bounce, etc.) influences your rankings, and pretty convincing points have been made for both sides.

    But now that Google reckons it can tell bot and human apart (its claim with the roll-out of the Invisible ReCaptcha), is this a game changer? If they can now identify a genuine human, they can assess how humans react to a site in a way that bots can't fake. So, regardless of the historical arguments, is there a good chance user behaviour will become, beyond debate, a ranking factor?

    Comments on this post

    • KernelPanic agrees : Excellent talking point
    • whiterabbit agrees : OoOoo, I didn't even think about how they may be applying this technology elsewhere; great talking point indeed!
  2. #2
  3. No Profile Picture
    Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Mar 2017
    Location
    Sydney, Australia
    Posts
    191
    Rep Power
    366
    Hi Doodled,
    G only finished beta testing Invisible ReCaptcha a few weeks ago. Given G's track record, I suggest it is more unlikely than likely to become part of any algorithm, and it won't happen for a long time. E.g. ten years after the smartphone was introduced, its impact on G's algo is still only minimal.

    Let me summarize the recent SEO Chat posts on whether Google uses any "engagement" signals (CTR, time on page, bounce rates, dwell time, pogo sticking, etc.) in its ranking algorithm.

    G denied it all here in Mar 2016.

    Video: Google Q&A+ #March 2016

    From Google you have Andrey Lipattsev, Search Quality Senior Strategist at Google (Ireland). Andrey has been with Google since August 2010.

    He said Google does NOT use engagement metrics as ranking signals at all. At various times he refers to these different problems with them:

    • They are poor ranking signals because they lack consistency of interpretation
    • Google can't measure them across the web
    • They would be too easy to spam

    Someone thought "dwell time" may be a signal, without telling us which definition of dwell time he believed in. It seems all definitions of dwell time fall at one of these hurdles:

    • Google can't measure and/or does not report them
    • Only 8% of searchers actually bounce back to the SERPs

    Mar 2017: The State of Searcher Behavior Revealed Through 23 Remarkable Statistics
    Author: Rand Fishkin, MOZ.

    Quote: "#22: What percent of Google queries result in pogo-sticking (i.e. the searcher clicks a result, then bounces back to the search results page and chooses a different result)?"

    "...8% of searches that followed this pattern of search > click > back to search > click a different result."

    A SE cannot place any ranking value on a signal that appears in only 8% of searches.

    It seems to me that a lot of the "engagement" signal hypotheses are based on false logic.

    Eg:
    • More time on a site = good
    • Low bounce rate = good
    • High CTR = good

    Why?
    • If I want a phone number and it's on the first page I visit, I click it and am gone in less than 10 secs. That's a good result, not a bad one.
    • If I land on a product page that contains all the info I want, why should I visit another page?
    • If I click a link in a newsletter, an email, a forum or social media page to a particular article I want to read, why should I click to another page on the site?

    CTR in search is primarily determined by a page's ranking in the SERPs. #1 place receives 10-15 times the CTR of the 10th ranked page. What use is this in a ranking algo?
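The position effect above can be shown with a toy calculation. The CTR curve below is invented for illustration (roughly the shape published click-through studies report) and is not Google data; the point is that raw CTR mostly restates rank, so any usable signal would first have to be position-adjusted:

```python
# Illustrative average CTR by SERP position. These numbers are made up
# to roughly match the shape of published curves; they are not Google data.
avg_ctr_by_position = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                       6: 0.04, 7: 0.03, 8: 0.025, 9: 0.022, 10: 0.02}

# Raw CTR mostly restates position: here #1 gets 15x the CTR of #10.
ratio = avg_ctr_by_position[1] / avg_ctr_by_position[10]

def position_adjusted_ctr(observed_ctr, position):
    """Any informative CTR signal would be the residual after removing the
    position effect, e.g. observed CTR relative to the positional average."""
    return observed_ctr / avg_ctr_by_position[position]
```

On this toy curve, a page in position 10 with a 4% CTR is doing twice as well as expected, even though its raw CTR is far below that of position 1.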

    Site CTR as reported in Analytics is an average of all sources of traffic. Eg:

    • online ads
    • social media
    • email campaigns
    • other referrals
    • direct entry
    • entry from other SEs' SERPs
    • entry from G's SERPs

    Every one of these traffic sources except generic SE referrals will tend to produce high bounce rates. What relevance do any of these traffic sources have to any SE's ranking algorithm?
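The blending problem can be sketched with invented figures: a site-wide bounce rate is a visit-weighted average across traffic sources, so the organic-search component cannot be recovered from the reported average alone.

```python
# All figures are invented for illustration.
traffic = {
    "google_organic": {"visits": 5000, "bounce_rate": 0.35},
    "email":          {"visits": 2000, "bounce_rate": 0.80},
    "social":         {"visits": 3000, "bounce_rate": 0.75},
}

total_visits = sum(s["visits"] for s in traffic.values())

# The reported site-wide bounce rate is a visit-weighted blend of sources...
site_bounce = sum(s["visits"] * s["bounce_rate"]
                  for s in traffic.values()) / total_visits

# ...so the 0.35 organic figure, the only component even arguably relevant
# to a search algorithm, is hidden inside the 0.56 blended average.
```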

    What does your definition of a site engagement signal suggest we do to influence a page's ranking in the SERPs?

    How can a site CTR metric have any relevance to a search algorithm when there are so many unrelated variables in play?

    Sorry troops, but I can't see any logic in any discussions based on engagement statistics of these types.

    I'd be delighted and grateful for someone to update me on any info I've missed on this crucial SEO topic.

    Comments on this post

    • KernelPanic : Bing admits it uses CTR, you're wrong
    Last edited by JohnAimit; Jul 21st, 2017 at 05:46 AM.
  4. #3
  5. Digital Marketing
    SEO Chat Mastermind (5000+ posts)

    Join Date
    Aug 2008
    Location
    Florida
    Posts
    7,158
    Rep Power
    4636
    Originally Posted by JohnAimit
    He said Google does NOT use engagement metrics as ranking signals at all. At various times he refers to these different problems with them:

    Mar 2017: The State of Searcher Behavior Revealed Through 23 Remarkable Statistics
    Author: Rand Fishkin, MOZ.
    1) Google lies
    2) You can find more places where someone at MOZ says UB metrics are ranking signals than you can find where they say it isn't.

    Regardless, what does your own data and your own testing tell you?
  6. #4
  7. No Profile Picture
    Contributing User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    May 2017
    Posts
    20
    Rep Power
    0
    • If I want a phone number and it's on the first page I visit, I click it and am gone in less than 10 secs. That's a good result, not a bad one.
    • If I land on a product page that contains all the info I want, why should I visit another page?
    • If I click a link in a newsletter, an email, a forum or social media page to a particular article I want to read, why should I click to another page on the site?
    Wasn't Google's PageRank algorithm based on the assumption that a web surfer would keep following interesting links till he or she got bored/landed on a useless page?

    PageRank can be thought of as a model of user behavior. We assume there is a "random surfer" who is given a web page at random and keeps clicking on links, never hitting "back" but eventually gets bored and starts on another random page. The probability that the random surfer visits a page is its PageRank. And, the d damping factor is the probability at each page the "random surfer" will get bored and request another random page. One important variation is to only add the damping factor d to a single page, or a group of pages. This allows for personalization and can make it nearly impossible to deliberately mislead the system in order to get a higher ranking. We have several other extensions to PageRank, again see [Page 98].
    The Anatomy of a Search Engine
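For what it's worth, the random-surfer model quoted above can be sketched in a few lines. This is a textbook power-iteration toy, not Google's implementation:

```python
def pagerank(links, d=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # With probability (1 - d) the surfer gets bored and jumps to a random page.
        new_rank = {p: (1 - d) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = d * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share  # surfer follows one of the links
            else:
                # Dangling page: treat it as linking to every page.
                for target in pages:
                    new_rank[target] += d * rank[page] / n
        rank = new_rank
    return rank

ranks = pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
```

In this tiny graph, C outranks B because it is "visited" via links from both A and B, which is exactly the user-behaviour modelling the quote describes.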

    Comments on this post

    • Doodled agrees : Yes, this is Google's "Needs Met" approach - i.e. a signal
  8. #5
  9. No Profile Picture
    Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Mar 2017
    Location
    Sydney, Australia
    Posts
    191
    Rep Power
    366
    Hi KP,
    Sorry to hear that Google has been lying to you again. Why do you think it keeps picking on you?

    Confused about your reference to Bing. This is the Google SEO forum. The post is about G and I made no reference to Bing at all. How can I be wrong about something I never mentioned?

    Thanks for the tip about using my many SEO clients' data and testing to discuss the issues. No doubt, I'll get into these shortly...
    Last edited by JohnAimit; Jul 23rd, 2017 at 04:27 AM.
  10. #6
  11. No Profile Picture
    Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Mar 2017
    Location
    Sydney, Australia
    Posts
    191
    Rep Power
    366
    Hi Pacman000,
    Sorry, but I don't see the relevance of your reference to G's PageRank patent to this discussion of "engagement" signals.

    Would you care to clarify?
    Last edited by JohnAimit; Jul 23rd, 2017 at 04:29 AM.
  12. #7
  13. No Profile Picture
    Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Mar 2017
    Location
    Sydney, Australia
    Posts
    191
    Rep Power
    366
    Hi All,
    To those who believe "engagement" signals like CTR, bounce rate, time on site/page, pogo sticking, dwell time, etc. are important in G's algo...

    Can you clarify for us some critical definitions pertinent to this discussion?

    Are you referring to these engagement metrics as they apply to:

    • Averages for a website?
    • Averages for a specific web page?
    • Metrics for all traffic sources or only those for generic search acquisitions?
    • Metrics for specific search queries?

    There are completely different answers that may be relevant to each of the above definitions of engagement metrics.

    We've been down these "engagement" discussion pathways so many times without the benefit of clarity of terminology and got nowhere in the resultant confusion.

    How about we delve down into the weeds with definition clarity so we can discuss/debate and share experiences from each of many clearly defined positions?
    Last edited by JohnAimit; Jul 23rd, 2017 at 04:31 AM.
  14. #8
  15. No Profile Picture
    Contributing User
    SEO Chat Adventurer (500 - 999 posts)

    Join Date
    Sep 2016
    Posts
    998
    Rep Power
    2238
    Hi John
    Originally Posted by JohnAimit
    Sorry, but I don't see the relevance of your reference to G's PageRank patent to this discussion.
    Doodled started the thread with
    Originally Posted by Doodled
    But now that Google reckons it can tell bot and human apart (their claim with the roll out of the Invisible ReCaptcha) is this a game changer. If they can now tell a genuine human they can now assess how humans react to a site in a way that bots can't fake regardless of historical arguments is there a good chance user behaviour will become, beyond debate, a ranking factor?
    Then
    Originally Posted by pacman000
    Wasn't Google's PageRank algorithm based on the assumption that a web surfer would keep following interesting links till he or she got bored/landed on a useless page?
    It's obvious to me that pacman000 is making the comparison that PR was, in a sense, an attempt to mathematically emulate human behavior in calculating the PR value.

    That's my take on it.....
  16. #9
  17. No Profile Picture
    Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Mar 2017
    Location
    Sydney, Australia
    Posts
    191
    Rep Power
    366
    Hi Knowonespecial.
    I can't interpret Doodled's intention with his post.

    It just seems to me that PageRank and engagement signals are completely unrelated discussions.

    I'd be happy to discuss PageRank separately, but merging it with engagement signals will, I believe, get us nowhere.
  18. #10
  19. Contributing User
    SEO Chat Adventurer (500 - 999 posts)

    Join Date
    Apr 2011
    Posts
    946
    Rep Power
    1386
    Right ... yes ... well. I didn't want a rerun of the same old debate. That debate has valid points made by both sides. My experience tells me (as it does KernelPanic) that Google has been using user behaviour signals for years. I too suspect Google lies and throws out false negatives in order to protect its secrets!

    But I wanted to try and avoid discussing the past because with Invisible ReCaptcha we're talking about "from this point on" regardless of what you might have believed (rightly or wrongly) before.

    This thread was to see if there was a broader agreement now (not then). But perhaps not, if many people believe, like JohnAimit, that it will take years for Google to incorporate its ReCaptcha technology into its algo in a reliable way.

    So probably best to just move on.
  20. #11
  21. No Profile Picture
    Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Mar 2017
    Location
    Sydney, Australia
    Posts
    191
    Rep Power
    366
    Hi Doodled,
    This is an important reference that Pacman000 posted. Only just had time to reply in detail to it. (Sorry Pacman000)

    The Anatomy of a Search Engine

    With respect, however, I believe he grabbed an allegorical description rather than the important meat of how G works. The guts of that are in sections 4.1 - 4.5.

    This tells us that G's core was based on:

    • Assessing ranking values for individual words on each page (included on-page and off-page elements)
    • PageRank values were based on the individual words in link text from page to page
    • Page "quality" was implied from the method of determining PageRank
    • The ranking function was designed so that no particular factor could have too much influence

    There was no domain ranking value in the description. Any such value was implied by the expectation that most external links go to a domain's Home page. It seems the only links of ranking significance for a specific search query were those that included one or more of the search query's words in their link text.

    It is easy to check that G's ranking is still based on assessing the values of individual words.

    I think we also know that non-word-based ranking signals like https and mobile pages only have a small, fixed ranking value. Too large a signal and they could cloud G's word-based ranking signals.

    Let's apply some logic based on the above...

    It seems to me that any "engagement" signal that cannot be attributed to individual search words cannot help G's understanding of individual words in a document.

    That would eliminate:

    • Site average engagement metrics
    • Metrics not related to generic search referrals

    Let me pose questions for folk with a deeper knowledge of database management, indexing and storage.

    It seems G's system was based on a relatively static situation, i.e. the frequency with which it re-indexes individual pages. We don't know what this is, but Home pages for most sites seem to be re-indexed weekly, while deep pages can take months.

    Search engagement signals would be a constantly changing parameter, and the changes per page should also vary immensely. E.g. while most external links may go to a site's Home page, it should be relatively common for internal pages to attract more generic SE referrals than the Home page does.

    So, we need to consider the mechanics and frequency of collecting engagement metrics. How would this work?...

    It seems G would need a new and very large d/b and indexing system not described in the 1997 document above.

    Take a 26,000-page website with 88,000 generic SE refs per month. If G is re-indexing all of a site's pages on average every 3 months, it will need new d/b storage and computation for 27 billion word lists, given that 50% of search queries are 4 words long.
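The shape of this kind of back-of-envelope estimate can be sketched as follows. The input figures come from the post; the simple multiplications are my own assumptions about how to frame such an estimate, and make no claim to reproduce the 27 billion figure:

```python
# Inputs taken from the post above; the arithmetic framing is an assumption.
pages = 26_000
organic_referrals_per_month = 88_000
reindex_interval_months = 3
avg_query_words = 4  # the post notes ~50% of queries are 4 words long

# Engagement observations that accumulate between re-indexes of a page,
# each of which would need to be stored and joined back to G's word-level
# indexes to influence word-based ranking.
referrals_per_cycle = organic_referrals_per_month * reindex_interval_months
word_observations = referrals_per_cycle * avg_query_words
avg_observations_per_page = word_observations / pages
```

Even on this single modest site, that is over a million per-word engagement records per re-index cycle to collect, store and reconcile, which is the scale question the post is raising.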

    Engagement signal limitations include:

    • Bounce rate: High or low can each be interpreted as a good quality signal - it depends on the searcher's intent
    • Time to bounce: It seems G can only measure the 8% of searchers who return to the search results
    • Dwell time: Again limited by only 8% of searchers actually returning to the original SERPs.
    • CTR: This can only be applicable to the first 10 results. CTR is primarily determined by a page's ranking, title and description. What relevance do these parameters have to delivering relevant results?

    The bottom line question to me would be, what improvement in search result relevance/quality may ensue from this huge additional expense in storage and computation?

    Any thoughts?

    PS. I'm no Google apologist. It seems to me that if G was caught lying about its algo it would be inundated with class-action lawsuits from tens of thousands of litigants all over the world. Of course it gives lots of vague answers to protect its magic sauce, but an outright lie seems unlikely and a bad business option to me.
    Last edited by JohnAimit; Jul 28th, 2017 at 03:25 AM.
  22. #12
  23. No Profile Picture
    Contributing User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Nov 2014
    Posts
    12
    Rep Power
    0
    Yes KernelPanic, you are right. Google is not showing all its cards, as it needs to keep its ranking algorithm secret. I have read many blogs and articles by leading authors and personalities in the SEO industry saying that User Experience is indeed a good ranking signal. Google itself says that good Usability and User Experience are now very important for SEO, so why would it leave them out as ranking signals?

    Comments on this post

    • KernelPanic agrees
  24. #13
  25. No Profile Picture
    Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Mar 2017
    Location
    Sydney, Australia
    Posts
    191
    Rep Power
    366
    Hi johnp72014,
    G has confirmed that it does use various User Experience signals in its algo. (Eg. Mobile pages, slow load speed, hacked websites.)

    However, in this post, the OP questions G's use of "engagement" signals in its algo. (CTR, time on page, bounce rates, dwell time, pogo sticking, etc)

    G has denied using these in its algo for reasons that include:

    • They are inconsistent signals (The same signal can indicate a good or bad result depending on the search query.)
    • Can't be measured (Eg. The time on page of the last page accessed cannot be measured. Only 8% of searchers return to the SERP page.)
    • Do not improve result quality. (G's primary goal is to interpret the search query words. Only engagement signals for SE referral traffic can help with this.)

    I think it is reasonable to say that most articles written about SEO are incorrect, out of date, or confusing/misleading. Part of the problem is that, these days, many SEO authors are writers, not SEOs: they have done very little, if any, SEO and are employed to churn out link-bait SEO articles. Then there are the many SEO blogs/newsletters where no SEO accuracy is required.

    Great care is needed by the reader to cull the wheat from the chaff.

    I hope that SEO Chat can become a forum where professional SEOs can share knowledge, exchange experience and help cull out the SEO dross with logic and experience.

    When folk believe G lies, please give us a reference to who lied and about what.

    I know that "G. lied" is a popular belief but I've not seen it in my 24 years in the industry. I suspect many of these beliefs may result from terminology misinterpretations and/or misinformed authors.

    As a measure of the SEO inaccuracy out there, you may want to read this 2013 survey of 500 SE Roundtable readers, one year after G implemented its first Penguin change:

    May 2013: One Year Later, Only 9% Claim Google Penguin Recovery
    Author: Barry Schwartz

    Penguin was all about hitting external link manipulation schemes. The survey summary findings:

    • Only 12% were not impacted by Penguin
    • Fully recovered 1 year later- 9%
    • No recovery 1 year later - 61%

    Ever since the year "dot" G has said do NOT try to manipulate external links. Yet, 91% of the folk in this survey employed SEO tactics that hurt their clients to the point where 1 year later 61% could not recover.

    It seems either they did not believe G or they thought temporary link spamming was worth the risk.

    In my book, these types of SEOs are fools to themselves and an expensive danger to their clients.

    There are a lot of SEO blogs with a "quality" image that is undeserved. Many SEO authors trade on the name of the blog site. There is usually no SEO accuracy or quality control on these open forums. IMHO, the quality and accuracy of SEO articles is crashing because there are now so many professional journalists writing on SEO with no real knowledge of the topic.

    IMHO, the worst articles include the SEO algo ranking surveys published every year or so by various entities, and the "Google's Top (number) Ranking Signals" articles. The best way to propagate misinformation about SEO is to conduct a popularity poll that includes input from the same 91% of SEOs who did not know enough to avoid a Penguin ranking penalty, or who believe a "top X ranking factors" article.
    Last edited by JohnAimit; Jul 31st, 2017 at 04:28 AM.
  26. #14
  27. Digital Marketing
    SEO Chat Mastermind (5000+ posts)

    Join Date
    Aug 2008
    Location
    Florida
    Posts
    7,158
    Rep Power
    4636
    Originally Posted by JohnAimit
    ... in this post, the OP questions G's use of "engagement" signals in its algo. (CTR, time on page, bounce rates, dwell time, pogo sticking, etc)

    G has denied using these in its algo...
    MOZ, Larry Kim and Matt Cutts all disagree with you John.
    Last edited by KernelPanic; Jul 31st, 2017 at 06:10 AM.
  28. #15
  29. Digital Marketing
    SEO Chat Mastermind (5000+ posts)

    Join Date
    Aug 2008
    Location
    Florida
    Posts
    7,158
    Rep Power
    4636
    Originally Posted by JohnAimit
    I know that "G. lied" is a popular belief but I've not seen it in my 24 years in the industry.
    Google is 18 years old; how did you know Google didn't lie before it was launched?

