#1
  1. Contributing User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Sep 2016
    Posts
    63
    Rep Power
    4

    affection of visitor, time on site


    Hello all,
    Does anybody know how much visitors per day and time on site affect a website's ranking?
    Are there any articles, tools, or firsthand experiences that show this, or that compare it with backlinks?
    Last edited by elegantuniverse; Apr 30th, 2017 at 10:53 AM.
  2. #2
  3. Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Aug 2015
    Location
    Pennsylvania
    Posts
    136
    Rep Power
    136
    There is no formula for how many visits your site should get per day in order to improve ranking! You can have 10 visits a day and achieve position 1 on Google, or get 500 a day and end up on page 5, buried somewhere deep in Google's search results.

    I would not focus on how many visits your site gets or how many articles it has, but on providing good-quality, content-focused articles for your users, not for search engines. I am not saying you shouldn't think of search engines; you should. But when writing content, focus on providing an answer, a how-to, or more knowledge to your visitors, while still using your targeted, user-focused keywords.

    As far as tools, there are a lot of good tools that will help you check on backlinks (if I understood your question correctly), do keyword research, and much more, like Ahrefs, SEMrush, and Moz!

    Peace!
  4. #3
  5. Contributing User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Sep 2016
    Posts
    63
    Rep Power
    4
    How about time on site? Does it affect search results?
  6. #4
  7. Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Aug 2015
    Location
    Pennsylvania
    Posts
    136
    Rep Power
    136
    In my experience, and based on some testing, yes it does! If search engines see that someone stayed on your site for a long time (two or more minutes), it means they enjoyed your post and found what they were looking for, and the search engine achieved its goal: providing searchers with the best results for their query.
  8. #5
    Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Mar 2017
    Location
    Sydney, Australia
    Posts
    232
    Rep Power
    500
    Originally Posted by roma64
    In my experience, and based on some testing, yes it does! If search engines see that someone stayed on your site for a long time (two or more minutes), it means they enjoyed your post and found what they were looking for, and the search engine achieved its goal: providing searchers with the best results for their query.
    With respect,
    Google has said it does not use these sorts of metrics as ranking signals.

    Identifying a page's relevance to specific search words is the basis of a SE algorithm.

    What would happen in the ranking situation if they landed on one page, spent 5 seconds on it, then clicked through to another page where they spent the 2 minutes? The second page can't be used to assess any search-word relevance.

    A short time spent on a search landing page is probably the norm. Unless the site is one of the small percentage laid out as a blog, the search landing page is probably where time spent is short.

    This is a very common situation with product catalogue sites, shopping sites, directories, classifieds and more. You seldom see the pages of detailed content that people actually spend time reading ranking high in the SERPs.

    What about all the site visitors whose needs are satisfied in 10 seconds or less? People who only want:

    • A phone number
    • An address
    • A map
    • Name of an employee
    • How to spell your name
    • Links to your social media pages
    • Your hours of business
    • Whether you have a product in stock
    • What your prices are

    Time on page is not a signal of relevant content for these info needs.

    Google coined the phrase "micro-moments" to reflect how mobile phones are changing what people do on the web. Ref:

    Apr 2015: How Micro-Moments Are Changing the Rules

    If time on page was a ranking factor then all the mobile users would be dragging down people's rankings.

    IMHO, these sorts of "engagement" signals are too inconsistent to offer any use as a ranking signal.
  10. #6
  11. Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Aug 2015
    Location
    Pennsylvania
    Posts
    136
    Rep Power
    136
    JohnAimit! I agree with your points 100%, but please note that I was referring to someone coming to your site and then bouncing right back out from whatever page they landed on, not just the landing page.

    What I was referring to is the bounce rate of someone coming to your site and not finding what the title and description led them to believe they would find (looking for an SEO tutorial but finding a sales pitch instead), or of a too complicated and busy site. It does not apply when someone looking for quick info finds that info (however small it is) on a page of your site that answers their query and then leaves (the whole process could take less than a minute). Likewise, someone filling out a small 'Contact us' form and clicking the submit button will not be seen as a bounce, so it will not affect your ranking.

    Everything will depend on the site's niche and the query. For example, the bounce rate (or time on site) of a website showing sports scores will not hurt its ranking; its high bounce rate might even be good, because the site served its purpose.

    My comment about bounce rate was referring to bouncing back out of the website right after landing due to bad design, a slow and too busy website, poor-quality, irrelevant, or thin content, or some other technical issue.

    Can Google actually and truthfully know your bounce rate? Not really. But what it can do is tell when users had a bad or good experience with your site, and then adjust your ranking. Again, this will depend on the site's niche.
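The distinction roma64 draws (a form submission or a second pageview is not a bounce, while backing straight out is) matches the common analytics definition: a session with exactly one tracked interaction. A minimal sketch of that rule, with the session structure and event names being my own assumptions for illustration, not how Google or any analytics product actually works:

```python
# Hedged sketch: classify sessions as bounces using the common
# "single-interaction session" definition. Session dicts and event
# names are hypothetical, invented for this example.

def is_bounce(session):
    """A session bounces if it has exactly one pageview and no other
    tracked interactions (e.g. no form submission)."""
    return len(session["pageviews"]) == 1 and not session["events"]

def bounce_rate(sessions):
    """Share of sessions that bounced, as a percentage."""
    if not sessions:
        return 0.0
    bounces = sum(1 for s in sessions if is_bounce(s))
    return 100.0 * bounces / len(sessions)

sessions = [
    # Filling out a contact form is a second interaction: not a bounce.
    {"pageviews": ["/contact"], "events": ["form_submit"]},
    # Quick single-page visit with no other interaction: a bounce.
    {"pageviews": ["/hours"], "events": []},
    # Clicking through to a second page: not a bounce.
    {"pageviews": ["/seo-tutorial", "/about"], "events": []},
]
print(round(bounce_rate(sessions), 1))  # → 33.3
```

Note that by this definition the "quick answer" visitor still counts as a bounce even though they left satisfied, which is exactly why a high bounce rate can be harmless for some niches.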
  12. #7
    ODI
    Contributing User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Apr 2017
    Location
    Delhi
    Posts
    39
    Rep Power
    0
    As far as I know, time plays an important role in SE algorithms: how much time a user stays on a website, how many sessions they create, whether or not they perform any events, how many pages they visit, and on which pages they spend more time all matter.
    These are just my own views; feel free to disagree and correct me if I am wrong.
  14. #8
  15. Contributing User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Sep 2016
    Posts
    63
    Rep Power
    4
    In my opinion, Google compares your site with your competitors' sites on every subject in its index. There is no doubt that when many user IPs spend a lot of time on a specific site, it helps that site's ranking; it means the website's content is useful to users.
  16. #9
    Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Mar 2017
    Location
    Sydney, Australia
    Posts
    232
    Rep Power
    500
    Hi Roma64,
    Somehow I missed your questions in this thread. Sorry for the delayed reply.

    Some of your comments are already covered by G's algorithm without recourse to bounce rates and time on page.

    Eg. A page title and description that does not match the page's content. Such a mismatched page should have trouble ranking high in the SERPs.

    Then, I'm having trouble working out how/why G is going to make other content quality determinations you suggest.

    Your Egs.
    • "looking for an SEO tutorial but found a sales pitch instead"
    • "a too complicated and busy site"

    Why should a sales pitch on an SEO tutorial page be considered bad? I tell all my clients, every page on their site should include multiple calls to action (sales pitches). I've never seen this hurt their rankings.

    How would you define a site that is "too complicated and busy"? Wouldn't these attributes be detrimental to new sites that used lots of "widgets", "plug-ins" and social media inter-action? I don't think G could afford to bias its results against modern CMS and e-commerce systems to favour old-fashioned hand coded sites, do you?

    Let's pause to reflect on what G is trying to do...

    It is trying to analyse words in a search query to then deliver the most relevant answers from the 130 trillion pages it has indexed.

    It still has trouble delivering relevant results when too few search words are used. That is partly why the average search query length has grown to 4+ words.

    If G. starts giving too much weight to non-word signals to boost page rankings, it risks compromising its ability to interpret the word signals accurately. That WOULD cause results to deteriorate. It is also going to result in variable boost signals depending on the numbers of words used in a search query. Eg. They would be at their relative strongest for one word search queries and their impact would deteriorate as the search query became longer. I.e. One and two-word search query results would likely be a total farce.

    (My guess is that resolving these issues is largely behind why G has been delaying the implementation of its mobile index.)

    At present IMHO, the above is why when G. does use non-word signals, the positive ranking values are usually fixed and very small. "Bad" signals can result in demotion of page types, not a boost for sites that don't use them.

    Egs.
    Positive Signals
    • https signal
    • mobile enabled signal

    Negative Signals
    • slow page load speed - probably only kicks in when load speed is over 1 minute
    • culling/demoting certain types of low value content pages in SERPs

    You suggest that G can tell whether users had a good or bad experience on a site. How?

    You also suggest G can assess good or bad design. What parameters would G use to assess them?

    The single most important user-experience attribute people want is load speed, on both desktop and mobile. At present G does not even use this as a positive ranking signal on desktop, and mobile's boost is so small that it is easily outranked by other signals.

    If "complicated" is defined by the number of files needed to load a page, at present this only seems to have an impact if this attribute contributes to an overly long load speed.

    You raise interesting points that give us the opportunity to consider the logic, capabilities and limitations of G.

    When we get so little definitive info from G., I've always found that exploring algorithm rankings with logic based on what we do know can more often lead us closer to the G. algo truths.

    What do folk think?
    Last edited by JohnAimit; May 15th, 2017 at 04:22 AM.
  18. #10
  19. Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Aug 2015
    Location
    Pennsylvania
    Posts
    136
    Rep Power
    136
    JohnAimit! That is a great post and I enjoyed reading it. My "looking for an SEO tutorial but found a sales pitch instead" example was a poor one. What I was referring to here is "deception", where a user clicks on a link in the SERPs expecting product/service 'A' but, when landing on the page, gets '20'.

    For the "a too complicated and busy site" comment, I was referring to bad UX (for mobile). I myself haven't heard from Google, nor read, that UX is a ranking factor, so don't quote me on that. However, I saw UX mentioned at least 12 times, if not more, in Google's Quality Rater Guidelines (I didn't get a chance to fully read them).
  20. #11
    Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Mar 2017
    Location
    Sydney, Australia
    Posts
    232
    Rep Power
    500
    Hi Roma64,
    "Google's Quality Rater Guidelines" have nothing to do with how G's algo works and the way it ranks pages.

    These guidelines exist because there are many factors that G's algo can't assess, for which its capabilities are very limited, or where results can vary dramatically across its 192 country search engines. That is why it needs large numbers of human search quality evaluators to check the outcomes of the 1,600 algo changes G made last year.

    1. Assessing Mobile or Desktop with Time on Site, Click-throughs or UX.
    I can't see how these can ever be used as primary or major ranking factors because they do not assist with interpreting how the words used in the search query can be "valued" on matching web pages.

    If G doesn't focus on evaluating the words first, results quality immediately starts its journey to hell in a hand-basket.

    The other practical problem is that G can't assess many UX factors at present and one wonders if it ever will.

    If G ever gives major ranking boosts for page load speeds then an unwanted consequence is likely to be that 10 year old web pages will jump up dramatically in the results. Will that make the results better? Probably not.

    Example:
    The main UX factor for mobile and desktop is page load speed. This will be influenced by many factors that are unrelated to the production of websites, like the network speed of the mobile user's service provider, all site users' combined demand on the web hosting server, the location of the server, and many more. Perhaps the techies around here can confirm whether interactive sites like e-commerce or forum sites, which may need to handle a bunch of processing requests, would be slowed down vs a catalogue site of the same products with no server CPU demand?

    2. Deceptive Pages in SERPs
    Are you saying that pages that list 20 products/services are deceptive? Eg. Directory pages and shopping cart category pages.

    That sort of result is not necessarily deceptive.

    The guidelines give these examples of deceptive pages (P37):

    • "Pages or websites which misrepresent the website's owner or purpose, such as by impersonating a different site (e.g., copied logo or branding of an unaffiliated site, URL that mimics another site's name).
    • Pages or websites which appear to be deliberate attempts to misinform or deceive users by presenting factually inaccurate content (e.g., fake product reviews, demonstrably inaccurate news, etc.)."


    3. The Raters/Assessors
    From the guidelines (P6): "It is very important for you to represent users in the locale you evaluate. You must be very familiar with the task language and location in order to represent the experience of users in your locale."

    In 2008 G said in its official blog that it had evaluators in over 100 countries.

    I interpret this to also mean the raters will be using the country specific G search engine relevant to their country location.

    4. How Location can Totally Change Results SEO Tactics
    The guide says (P63) "For many or most queries, the user location does not change our understanding of the query and user intent."

    The trouble is it seems "most" search queries aren't relevant to most SEO clients I've worked for. Location has been crucial for all of them.

    Location seems to kick in as G's primary ranking factor with any search for a:

    • product
    • product category
    • type of retailer
    • reseller
    • wholesaler
    • manufacturer
    • service category
    • specific service
    • type of business
    • type of deals
    • events
    • jobs
    • and more...


    Example 1:
    Search term on google.com.au: plumbers

    As far as I checked (the top 300 places), all results were for Au plumber-related pages. Not one page from any other country ranked, no matter what links or SEO activities they may have undertaken.

    Example 2:
    Search term on google.nu: plumbers

    This is the G country specific SE for the Pacific Island of Niue, population 1,600.

    No Niue plumbers are listed in the search results. It seems there is no plumber's website within 2,000 km of Niue.

    If time-on-site, search clicks, content quality or UX signals boosted overseas plumbers' sites from non-local countries into the local country results, they would worsen the search results quality.

    SEO advice for Niue would simply be to publish some .nu TLD web pages that include the words "plumbers" and "Niue" in the body text. You would not even need title or description tags to rank #1.

    5. SERP Results Domination can Vary Dramatically by Country
    If folk are primarily searching in the USA, they may be unfamiliar with the magnitude of differences in International SEs generated by the combination of location ranking boost and lower levels of search competition.

    Example:
    If your client is targeting Au clients in B2B markets, it can be possible to dominate the results.

    Today I checked 22 search queries for the same B2B clients on G.com and G.com.au. (Two identical websites but published on two country TLDs.)

    G.com = Average of 1.4 site pages per search query in the top 10 SERPs
    G.com.au = Average of 3.8 site pages per search query in the top 10 SERPs

    On G.com.au, 36% of search queries resulted in between 4-8 site pages in the top 10.

    This is a great result for the client but perhaps not so good for the searcher.
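The comparison above is straightforward to reproduce. Here is a small sketch of the calculation, using made-up per-query counts (the post reports only the averages and the 36% figure, not the raw counts; these hypothetical values were chosen so they happen to reproduce the reported numbers):

```python
# Hypothetical counts of client pages in the top 10 results for 22
# queries on each engine, invented for illustration only.

def average_pages(counts):
    """Average number of client pages in the top 10, across all queries."""
    return sum(counts) / len(counts)

def share_with_4_to_8(counts):
    """Percentage of queries where 4-8 client pages appeared in the top 10."""
    hits = sum(1 for c in counts if 4 <= c <= 8)
    return 100.0 * hits / len(counts)

gcom   = [1, 2, 1, 0, 2, 1, 1, 2, 1, 1, 2, 1, 2, 1, 1, 2, 2, 1, 2, 1, 2, 2]
gcomau = [8, 8, 6, 6, 5, 5, 4, 4, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 2, 2, 2, 2]

print(round(average_pages(gcom), 1))     # → 1.4
print(round(average_pages(gcomau), 1))   # → 3.8
print(round(share_with_4_to_8(gcomau)))  # → 36
```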

