#1
  1. Registered User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Aug 2009
    Posts
    15
    Rep Power
    0

    Noscript or not?


    Hi all, this is kind of a duplicate post, but the previous one received no replies, so I will try restating my question.

    First off, my website is heavily JavaScript-dependent, with Ajax calls to retrieve the actual content.

    As such, I thought I could, for the benefit of search engines (which I believe do not execute JavaScript), add a <noscript> element containing 'static' URL links to pages with the content rendered into basic HTML tables (essentially the same content users would see).
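
    For reference, the markup I mean looks roughly like this (the URLs here are just placeholders, not my real ones):

    Code:
    <noscript>
      <a href="/items/page1.html">Browse all items - page 1</a>
      <a href="/items/page2.html">Browse all items - page 2</a>
    </noscript>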

    However, from what I can tell, neither Google nor Bing has indexed the URLs inside my noscript tags. Any thoughts as to why? (Those URLs are also exposed in my sitemap.xml.)

    Could it simply be that I have not waited long enough for Google to index these? (It has been live for just under a month.)

    FYI: The website in question is borrownet.com
  2. #2
  3. Contributing User
    SEO Chat Super Genius (4500 - 4999 posts)

    Join Date
    May 2007
    Posts
    4,515
    Rep Power
    1985
    It's not really a true indicator, but what I would recommend is visiting the cached page and viewing the source code. If your noscript is in there, then it means Google has at least been to your page since you added the code.

    If it is not there, then check your server logs to be sure Googlebot hasn't been back and simply not updated its cache of the page.
  4. #3
  5. Registered User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Aug 2009
    Posts
    15
    Rep Power
    0
    Yeah, the cached version does indeed have the noscript. I'm still not sure why the crawler isn't following the links inside it (at least, I don't think it is, seeing as those pages do not appear to be indexed).
  6. #4
  7. Registered User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Aug 2009
    Posts
    15
    Rep Power
    0
    Hmm, maybe it's as I expected: the search engines (in this case Bing/Google) are ignoring the noscript content. Doing some Googling, I found the passage below (from: http://www.stonetemple.com/blog/?p=80 ). It's an old post, and I'm not sure whether it still holds true, but I think I will have to use a different solution rather than rely on the noscript tag/sitemap.xml:

    But unfortunately, like so many great potential ideas, there are serious problems with this approach. It originates with the fact that this has been used by spammers as a way of stuffing unrelated links and content on a page. The fact that the NOSCRIPT content is not seen by most users would make it a way to hide content.

    The result is that the search engines cannot really trust the content found within a NOSCRIPT tag. Here is a forum thread at WebmasterWorld where Brett Tabke says that NOSCRIPT content links do not pass PageRank. He also indicates that any content within the NOSCRIPT tags may not even be indexed.
  8. #5
  9. Registered User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Aug 2009
    Location
    New York
    Posts
    13
    Rep Power
    0
    Interesting. That's useful to know.

    So have you figured out a workaround?
  10. #6
  11. Registered User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Aug 2009
    Posts
    15
    Rep Power
    0
    Well, I think what I'm going to try is reworking my dynamic links (which currently trigger Ajax calls) so that they are initially loaded as static hrefs pointing to the same pages as in my noscript tags (which yields essentially the same content, just without all the bells and whistles).

    Then, in the JavaScript onload handler, I'll rewrite the links to use the Ajax calls again. This way, if the browser supports JavaScript, the links will magically be fixed to work for users, while the search engines, which won't process/execute the onload routine, will be left with the original static links.

    The only downside I see is that users might be able to click the links before the onload method has rewritten them.
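
    To make it concrete, here is a minimal sketch of what I have in mind (the loadViaAjax() helper and the 'ajaxify' class name are just placeholders for my real code):

    Code:
    window.onload = function () {
        var links = document.getElementsByTagName('a');
        for (var i = 0; i < links.length; i++) {
            // Only rewrite the links marked for Ajax handling.
            if (links[i].className === 'ajaxify') {
                // Capture each link's static URL in a closure.
                links[i].onclick = (function (url) {
                    return function () {
                        loadViaAjax(url); // placeholder: the existing Ajax loader
                        return false;     // cancel the normal page navigation
                    };
                })(links[i].href);
            }
        }
    };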
  12. #7
  13. Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Jan 2007
    Posts
    347
    Rep Power
    27
    Here is a solution for you, if indeed these pages contain information that is valuable for your users to see.
    Load all your links into divs, and then show each div one at a time, cycling through the links. There are many, many ways to do this; Dynamic Drive has a few examples that you might find handy. If what I am saying is not making sense, I'll elaborate further, but I think this should be sufficient to get you going down a better path. I would like to note that some scripts are very lightweight, while others are not so much.

    Also, if you have more than 20 or so links, you may need to use a little PHP to pull random links and have them displayed on each page, or better yet, links that are page-specific.
  14. #8
  15. Registered User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Aug 2009
    Posts
    15
    Rep Power
    0
    Hmm, how can you 'call' a div? A div is an element, not a function. Obviously I don't quite grok what you're saying...

    But my plan, again, is just something like this:

    Code:
    window.onload = function () {
        // Rewrite the 'foobar' element to hold the Ajax-driven link instead.
        document.getElementById('foobar').innerHTML =
            "<a href='#' onclick='doSomething(); return false;'>Foobar</a>";
    };
    This rewrites the element 'foobar' so that it now contains the proper anchor for human consumption.

    I suspect we're saying the same thing... but I'm not sure.
  16. #9
  17. Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Jan 2007
    Posts
    347
    Rep Power
    27
    No, not at all...


    Here is the script. You can copy it right into your header, but I suggest putting it in an external .js file...

    Code:
    <script>

    var delay = 3000       // how long each div stays visible, in milliseconds

    var ie4 = document.all // true in old IE, which lacks getElementById

    var curindex = 0
    var totalcontent = 0

    // Count the divs named content0, content1, content2, ...
    function get_total() {
        if (ie4) {
            while (eval("document.all.content" + totalcontent))
                totalcontent++
        } else {
            while (document.getElementById("content" + totalcontent))
                totalcontent++
        }
    }

    // Hide every content div.
    function contract_all() {
        for (var y = 0; y < totalcontent; y++) {
            if (ie4)
                eval("document.all.content" + y).style.display = "none"
            else
                document.getElementById("content" + y).style.display = "none"
        }
    }

    // Show only the div at the given index.
    function expand_one(which) {
        contract_all()
        if (ie4)
            eval("document.all.content" + which).style.display = ""
        else
            document.getElementById("content" + which).style.display = ""
    }

    // Show the next div, wrapping around, then schedule the next rotation.
    function rep_content() {
        get_total()
        contract_all()
        expand_one(curindex)
        curindex = (curindex < totalcontent - 1) ? curindex + 1 : 0
        setTimeout("rep_content()", delay)
    }

    window.onload = rep_content
    </script>

    And here are your divs...

    Code:
    <div id="content0">here is your first content..</div>
    <div id="content1">here is your 2nd content..</div>
    <div id="content2">here is your 3rd content..</div>
    <div id="content3">here is your 4th content..</div>
    Of course, they do not have to be called "content"; call them whatever you like, as long as you make the corresponding changes to the JS.
    Last edited by 1fast72nova; Aug 28th, 2009 at 07:23 PM.
  18. #10
  19. Registered User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Aug 2009
    Location
    New York
    Posts
    13
    Rep Power
    0
    Awesome! I like this place. Wish I had some mojo to give.
  20. #11
  21. Registered User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Aug 2009
    Location
    delhi
    Posts
    6
    Rep Power
    0
    Originally Posted by foobarballo
    Hi all, this is kind of a duplicate post, but the previous one received no replies, so I will try restating my question. [...]

    Hi friend,

    As far as I know, you should move that JavaScript code into another file. That will help: if a function is not working properly, you can just exclude it and check whether each piece works on its own.
  22. #12
  23. Registered User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Aug 2009
    Posts
    15
    Rep Power
    0
    Well, I have made the changes; now I just need to wait and see if they will index the content properly.

    (I'm using my approach, where I just load the page with static links initially (it's only about 4 links), then rewrite the links via JavaScript.)

    I think you guys went off on a tangent with respect to divs. I have no issue loading and populating content via JavaScript/divs; the issue was that the SEs had no way to get to the content, since they do not execute JS. As such, I needed a solution that presents links to SEs in a way that does not hamper users.

    Now my main navigation links present static links to any system that does not support JavaScript, without the use of noscript. This should solve the problem... crossing fingers... I guess I will likely need to wait a few weeks before I find out.

    Cheers
  24. #13
  25. Contributing User
    SEO Chat Discoverer (100 - 499 posts)

    Join Date
    Jan 2007
    Posts
    347
    Rep Power
    27
    Oh, I get what you're saying, but that is kind of showing the SE different content than what the user sees... at least, if I'm understanding correctly. Instead of setting your link to display:none, you are displaying one link until the page loads, and then displaying a different link for the client? I'm not sure I'd go that route, but that's just me. It may not matter anyway, as long as the link gets spidered.
  26. #14
  27. Registered User
    SEO Chat Explorer (0 - 99 posts)

    Join Date
    Aug 2009
    Posts
    15
    Rep Power
    0
    Exactly. The content for the SE is pulled via standard HTTP, where the entire page is pulled (minus the headers and navigation) in an 'ugly' table, whereas when a user comes, he pulls essentially the same content, but via Ajax.

    One of the main differences is that users can filter the content they see with subcategories, etc., but the SE currently sees the entirety of the content, one page at a time.

    I may eventually add the filter links for the SE as well, but I will wait and see first.
