#1
  1. No Profile Picture
    Registered User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Mar 2005
    Posts
    6
    Rep Power
    0

    Session IDs for paging


    I am utilizing session variables to accomplish paging. My question is: will
    a bot be able to detect all my different pages if the paging is done with session variables?
    Meaning, I have an array as a session variable and I am passing the values within that array
    to the subsequent pages. Thank you.
  2. #2
  3. Web Designer / Web Monkey
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    May 2003
    Location
    London
    Posts
    202
    Rep Power
    15
    As long as the session is not in the URL, for example:

    http://www.domain.com/pageviews.asp?id=23&session=234feksdfweorfkofwspo2342odko2k3opk2dok

    If the session is passed in the URL, the bot will not be able to index the file, as sessions are taken to imply that the page only has temporary or secure content.

    It really depends on the site...

    I hope that answers your question...
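    To make the point above concrete, here is a minimal sketch (in Python rather than ASP, purely for illustration) of checking whether a URL carries a session-style query parameter. The parameter names in the set are just common examples I'm assuming, not an exhaustive or authoritative list.

    ```python
    from urllib.parse import urlparse, parse_qs

    # Hypothetical set of session-style parameter names; real sites vary.
    SESSION_PARAMS = {"session", "sessionid", "sid", "phpsessid"}

    def has_session_in_url(url: str) -> bool:
        # Parse the query string and look for any session-style variable.
        params = parse_qs(urlparse(url).query)
        return any(name.lower() in SESSION_PARAMS for name in params)

    print(has_session_in_url(
        "http://www.domain.com/pageviews.asp?id=23&session=234feksdfweorfkofwspo2342odko2k3opk2dok"
    ))  # True
    print(has_session_in_url("http://www.domain.com/pageviews.asp?id=23"))  # False
    ```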
  4. #3
  5. No Profile Picture
    Registered User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Mar 2005
    Posts
    6
    Rep Power
    0

    No session variable in URL


    So if there is no session ID or variable being passed through the URL, then a
    bot should not have a problem with it... okay. I would really appreciate any more input
    on this topic, but thank you robbielockie for your reply.
  6. #4
  7. Web Designer / Web Monkey
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    May 2003
    Location
    London
    Posts
    202
    Rep Power
    15
    It really is that simple: as long as there is no session ID, the bot will access and cache the page.

    Google can index files with variables, e.g. www.domain.com/viewfile.asp?id=32&prod=23

    It is my understanding that the bot will not index files that have more than two variables. This could be more or less now, but it was around two last time I checked.

    The spider (bot) is getting more and more advanced every day. The reason the bot stays away from files that have too many variables is that it can get stuck in a loop and end up crashing the server; this is why they tend to stay away from sites that rely heavily on dynamic URLs!

    So in summary, as long as your URLs are session-ID free, the bot WILL index them!

    Robbie

    Originally Posted by ssbrooks
    So if there is no session ID or variable being passed through the URL, then a
    bot should not have a problem with it... okay. I would really appreciate any more input
    on this topic, but thank you robbielockie for your reply.
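    Since the rule of thumb above is about how many variables appear in a URL, a quick sketch (again in Python, just for illustration) of counting distinct query-string variables may help when auditing your own URLs. The "around two" cutoff is a forum-era rule of thumb, not a documented limit.

    ```python
    from urllib.parse import urlparse, parse_qs

    def query_variable_count(url: str) -> int:
        # Count distinct query-string variables in a URL.
        return len(parse_qs(urlparse(url).query))

    # The example URL from the post above has two variables: id and prod.
    print(query_variable_count("http://www.domain.com/viewfile.asp?id=32&prod=23"))  # 2
    print(query_variable_count("http://www.domain.com/viewfile.asp"))  # 0
    ```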
  8. #5
  9. No Profile Picture
    Registered User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Mar 2005
    Posts
    6
    Rep Power
    0

    No sessions in URL


    Thank you very much... I cannot believe that all you have to do is keep sessions out of your URL. Thank you for your help.
    Originally Posted by robbielockie
    It really is that simple: as long as there is no session ID, the bot will access and cache the page.

    Google can index files with variables, e.g. www.domain.com/viewfile.asp?id=32&prod=23

    It is my understanding that the bot will not index files that have more than two variables. This could be more or less now, but it was around two last time I checked.

    The spider (bot) is getting more and more advanced every day. The reason the bot stays away from files that have too many variables is that it can get stuck in a loop and end up crashing the server; this is why they tend to stay away from sites that rely heavily on dynamic URLs!

    So in summary, as long as your URLs are session-ID free, the bot WILL index them!

    Robbie
  10. #6
  11. Talking Monkey
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Mar 2003
    Location
    Mountain View, CA / Warren, NJ
    Posts
    465
    Rep Power
    16
    I would also throw out a cautionary note that you should likewise ensure that all necessary data is being passed via GET and not POST. My gut tells me there's more to this than meets the eye, especially if you're using .NET, which is notorious for packing things up into a form post.

    Can you post the page or at least a sample URL?
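    The GET-not-POST point above can be sketched with a small example (in Python, for illustration only; the base URL and parameter name are hypothetical): pagination rendered as plain GET links gives each page its own URL that a spider can queue and fetch, whereas a form POST leaves nothing in the URL to follow.

    ```python
    from urllib.parse import urlencode

    def pagination_links(base_url: str, total_pages: int) -> list:
        # Build one crawlable GET URL per page of results.
        return [f"{base_url}?{urlencode({'page': p})}"
                for p in range(1, total_pages + 1)]

    # Each of these hrefs is a distinct, bookmarkable, spider-followable URL.
    for href in pagination_links("http://www.domain.com/results.asp", 3):
        print(f'<a href="{href}">page</a>')
    ```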
  12. #7
  13. Web Designer / Web Monkey
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    May 2003
    Location
    London
    Posts
    202
    Rep Power
    15
    That's OK!

    Yes, if you want your pages to be indexed, that's all you need to remember.

    Oh, that and you need to have external links pointing to your site and new pages.

    Sometimes, when there is so much information in front of you, things seem complicated, but often with SEO it's the simplest things that work the best! ;)
  14. #8
  15. Web Designer / Web Monkey
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    May 2003
    Location
    London
    Posts
    202
    Rep Power
    15
    Good thinking, nutkeis!
  16. #9
  17. No Profile Picture
    Registered User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Mar 2005
    Posts
    6
    Rep Power
    0

    Unfortunately...


    Unfortunately, I am working for a business and am told I cannot post the page
    to this forum... but if I use GET, doesn't it put everything in the URL? And that is not
    what I want, right? I am a little confused about your advice; could you please elaborate?
  18. #10
  19. Talking Monkey
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Mar 2003
    Location
    Mountain View, CA / Warren, NJ
    Posts
    465
    Rep Power
    16
    Ok, let's explore this a little more.

    When a spider such as Googlebot indexes your site, it isn't actually clicking on your links in the same way a human with a web browser is. Instead, it will retrieve one page (let's say your home page) and add the contents of that page to a database. Then it will parse through the text of that page and find the URLs of all links on that page. It then adds those URLs to the DB as, let's say "URLs to be indexed". Then it repeats the process for each URL in its "URLs to be indexed" list. This process would not be unlike a human viewing a page, finding a link, copying the link target, closing their browser, opening a new one, and pasting the full link URL into the address bar.

    Therefore, in order to ensure that a page is indexable, you must be certain that 1. there is an HTML link coming into that page from another already-indexed page, and 2. the URL used by that link contains all data necessary to display the page.

    You can see, then, why Googlebot cannot follow a form post - it violates both of the above rules.

    So the solution to your problem of pagination is simple, but not necessarily easy. You have to figure out a way of passing all necessary information to the page using only the query string. You can do this by passing the full SQL as one argument and the page as the second, and in your code run the query, then advance a number of records equal to the value of the page argument multiplied by the number of records to display. That's the easiest, but for long SQL statements the value of the argument may be too long for Google to recognize and follow.

    The tertiary rule I should mention here is to keep your vital query-string parameters to three or fewer, and keep the contents of those params as short as possible.

    Sorry for the verbose response, but I hope this helps.
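    The offset scheme described above can be sketched as follows (in Python rather than ASP, purely as an illustration; the page size and parameter name are hypothetical). Note this sketch skips the pass-the-full-SQL part and only shows the page-offset arithmetic, since putting raw SQL in a URL is risky as well as long.

    ```python
    from urllib.parse import urlparse, parse_qs

    PAGE_SIZE = 10  # hypothetical number of records displayed per page

    def records_for_url(url: str, records: list) -> list:
        # Read the page number from the query string (defaulting to 0),
        # then advance page * PAGE_SIZE records into the result set --
        # the "page argument multiplied by records to display" rule above.
        qs = parse_qs(urlparse(url).query)
        page = int(qs.get("page", ["0"])[0])
        offset = page * PAGE_SIZE
        return records[offset:offset + PAGE_SIZE]

    rows = list(range(35))  # stand-in for a query result set
    print(records_for_url("http://www.domain.com/viewfile.asp?id=32&page=2", rows))
    # rows 20 through 29
    ```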
  20. #11
  21. No Profile Picture
    Registered User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Mar 2005
    Posts
    6
    Rep Power
    0

    Very Helpful


    Okay... now I can see that this is not working correctly. I
    copied a URL that utilized paging into a new browser and got nothing, so I can see that
    this doesn't work properly. Thank you.
  22. #12
  23. Talking Monkey
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Mar 2003
    Location
    Mountain View, CA / Warren, NJ
    Posts
    465
    Rep Power
    16
    No problem... let me know if you have any more questions.

    ADDENDUM:
    The problem we're seeing here is very typical of sites developed in .NET, but I'm sure it extends to other platforms as well. The new .NET controls are great things from a programming perspective, and so you can be sure that your developers will use them given half a chance. Make sure you understand the inner workings of everything that your site does, and kill the DataGrids wherever you can.
    Last edited by nutkeis; Mar 1st, 2005 at 11:03 AM.

