#1
  1. Design for your VISITORS
    SEO Chat Adventurer (500 - 999 posts)

    Join Date
    Mar 2007
    Location
    Somewhere Else Lately
    Posts
    583
    Rep Power
    53

    Too Much Content, How to Combine Safely


    Hey all, been a while since I've posted! I've never been so busy!

    Anyways, I've got a blog website I review products on. It started with only 5 or 6 products, but now I have 16.

    I've created a comparison table at the top of the home page that summarizes each of the products. Below the table is the extended version of each review. Many pages of the site rank very well, with the home page ranking highest for the toughest terms.

    The problem is the page has become way too long. I don't want to lose any rankings (including long tail stuff) by eliminating some of the reviews (content) from the home page and moving them to a second page.

    I was considering creating a summary, or taking the first couple of lines of each review and adding a "continue reading" or "read more" button. However, I'm stuck on the best way to handle this. Load time has also been affected by the length of the page. I was considering using JavaScript to load the remainder of each review on click, so the content appears on the page without a full reload. My concern is that this would require wrapping the text in a tag that makes it count as "hidden," and I don't want to pick up any filters or penalties either.

    I'm looking for any other possible solutions that will keep all of the text currently on the homepage index-able, but not necessarily visible unless requested, so I can increase load speed while avoiding any loss of rankings or any G filters or penalties.


    Thoughts?
    Fair Enough.
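    To make the summary idea concrete, here's a minimal sketch (not from the thread; the function name and word counts are my own) of splitting each review into a visible summary and a remainder that a "continue reading" control could reveal on request:

    ```javascript
    // Hypothetical helper: split a review's text into a short summary
    // (the first `summaryWords` words) and the remainder that a
    // "continue reading" control would reveal on request.
    function splitReview(text, summaryWords) {
      const words = text.trim().split(/\s+/);
      return {
        summary: words.slice(0, summaryWords).join(" "),
        rest: words.slice(summaryWords).join(" "),
      };
    }

    const { summary, rest } = splitReview(
      "This widget is sturdy and cheap and ships fast", 3
    );
    // summary → "This widget is"; rest → "sturdy and cheap and ships fast"
    ```

    Whether the remainder then lives in the page (hidden) or is fetched on demand is exactly the SEO question the replies below debate.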
  2. #2
  3. Contributing User
    SEO Chat Good Citizen (1000 - 1499 posts)

    Join Date
    Apr 2009
    Location
    83% (Not Provided WTF)
    Posts
    1,314
    Rep Power
    381
    Why do you say it's too long? Is it hard to navigate?

    Why not use page anchors?
  4. #3
  5. Twitter: @barlow1984
    SEO Chat Adventurer (500 - 999 posts)

    Join Date
    Sep 2009
    Location
    Leek, UK
    Posts
    791
    Rep Power
    103
    There are loads of show-and-hide techniques around, using either AJAX or JavaScript; that could be a good way of doing it. Just keep the content accessible to the spiders, though.
    Life is like riding a bicycle - in order to keep your balance, you must keep moving.
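    For illustration, the usual "spider-safe" show/hide pattern keeps the full text in the initial HTML (so crawlers receive it) and uses JavaScript only to toggle visibility. A hand-rolled sketch, with made-up ids and copy:

    ```html
    <!-- Full review is present in the served HTML; script only toggles display -->
    <div class="review">
      <p>Summary of the product…</p>
      <div id="review-1-rest" style="display: none">
        <p>Rest of the review…</p>
      </div>
      <a href="#review-1-rest"
         onclick="document.getElementById('review-1-rest').style.display = 'block'; return false;">
        Continue reading
      </a>
    </div>
    ```

    Note this does not reduce page weight at all; it only shortens what the reader has to scroll past.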
  6. #4
  7. Design for your VISITORS
    SEO Chat Adventurer (500 - 999 posts)

    Join Date
    Mar 2007
    Location
    Somewhere Else Lately
    Posts
    583
    Rep Power
    53
    Originally Posted by barlow1984
    There are loads of show and hide techniques around, that could be a good way of doing it. Using either AJAX or Javascript. Just keep the content of it accessible to the spiders though.
    Will the spiders look at that as me being manipulative? There are plenty of BH SEOs who hide text this way all the time. I don't want to be classified as one of them... will I, or could I be?
  8. #5
  9. Twitter: @barlow1984
    SEO Chat Adventurer (500 - 999 posts)

    Join Date
    Sep 2009
    Location
    Leek, UK
    Posts
    791
    Rep Power
    103
    As long as the user is able to view the content then I don't see a problem.

    A brilliant example (that I want to steal for myself) can be found at www.nationwide.co.uk

    Look at their links in the footer; it's like a mini sitemap. Really good for the user and juicy for the spider. IMO it's not black hat because it's not there only for the spider.

    EDIT - they moved it from the homepage recently, go to the saving section page, it's used on there.
  10. #6
  11. from the horses mouth
    SEO Chat Hero (2000 - 2499 posts)

    Join Date
    Nov 2004
    Location
    UK
    Posts
    2,105
    Rep Power
    556
    If you have any text that is pulled in by JavaScript after page load (AJAX, basically), then that text isn't going to be seen by the spider.

    It's not black hat, but it does mean the text won't be indexed.
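    To illustrate the case channel5 is warning about: here the remainder lives only on the server and is fetched after page load, so it never appears in the HTML a spider downloads (jQuery-era sketch; the URL and ids are invented):

    ```html
    <div id="review-1-rest"></div>
    <script>
      // The remainder is NOT in the initial HTML; it's requested on click,
      // so a crawler that doesn't execute JavaScript never sees it.
      $("#read-more-1").click(function () {
        $("#review-1-rest").load("/reviews/product-1/rest.html");
      });
    </script>
    ```

    This variant genuinely reduces the initial payload, at the cost of the fetched text not being indexed with the page.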
  12. #7
  13. Design for your VISITORS
    SEO Chat Adventurer (500 - 999 posts)

    Join Date
    Mar 2007
    Location
    Somewhere Else Lately
    Posts
    583
    Rep Power
    53
    Originally Posted by barlow1984
    As long as the user is able to view the content then I don't see a problem.

    A brilliant example (that I want to steal for myself) can be found at www.nationwide.co.uk

    Look at their links in the footer; it's like a mini sitemap. Really good for the user and juicy for the spider. IMO it's not black hat because it's not there only for the spider.

    EDIT - they moved it from the homepage recently, go to the saving section page, it's used on there.
    Can you explain what this used to do a bit more in detail for me?

    Nothing was hidden that I could tell. I saw a footer with all of the links on the initial page load >> http://www.nationwide.co.uk/savings/default.htm - why do you think they moved it from the home page? It's probably doubtful, but maybe they were experiencing a filter or a drop in rankings...
  14. #8
  15. Twitter: @barlow1984
    SEO Chat Adventurer (500 - 999 posts)

    Join Date
    Sep 2009
    Location
    Leek, UK
    Posts
    791
    Rep Power
    103
    I doubt it. It's a site I keep an eye on (as I work in the same arena); they have revamped the homepage completely, but the footer links do have a show/hide function. It appears to be shown by default rather than hidden, though.

    As channel 5 says, if the content is within the script then that's not ideal, but I'm pretty certain it can be put within a DIV instead, which is very SEO friendly.
  16. #9
  17. Design for your VISITORS
    SEO Chat Adventurer (500 - 999 posts)

    Join Date
    Mar 2007
    Location
    Somewhere Else Lately
    Posts
    583
    Rep Power
    53
    What do you guys think about me using the <noscript> tag to include the text I will be removing, so users only see it on request?

    See this plugin http://wordpress.org/extend/plugins/...re-right-here/ - it does what the default WP "Read more" function does, but does not redirect to the post page, it brings the text right to the current page in sync with the natural flow of the text.

    I could use a <noscript> to add the removed text right in the same place where it used to be in the source, so it gets spidered.

    Is this manipulative? I've spoken with Fathom before about using the <noscript> tag to influence the meta description Google generates for your URL, to ensure you always get a solid Google-generated snippet across thousands of potential keyword combinations, since it's impossible to write a meta description tag that fits more than a hundred keywords and still makes sense. However, we never tested that idea, and we never researched its chances of harming your rankings instead of helping your conversions.

    Can anyone provide some insight on the <noscript> tag, or suggest another solution? The plugin I mentioned above shows readers exactly what I want them to see... If I didn't already have all the positions I want, I'd have done things this way the first time, but my site has now picked up some trust from G, and I don't want to lose it...
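    As a sketch of what the <noscript> idea would look like in markup (the link path and class are invented), the same text the script fetches on demand would also sit in the static source:

    ```html
    <a href="/reviews/product-1/" class="read-more">Continue reading</a>
    <!-- Served to (and crawled by) agents that don't run JavaScript;
         duplicates the text the script would otherwise fetch on click. -->
    <noscript>
      <p>Rest of the review text…</p>
    </noscript>
    ```

    Note the duplicated text still ships in the initial HTML, which is why, as jsteele823 points out below in the thread's own terms, this approach doesn't reduce the page weight.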
  18. #10
  19. Contributing User
    SEO Chat Super Genius (4500 - 4999 posts)

    Join Date
    May 2007
    Posts
    4,515
    Rep Power
    1986
    Noscript would be a viable option to include the text called by javascript or ajax in what the spiders see, but the problem there is that it wouldn't help your load times. If that's truly a concern, then you'll need to find another way to go about it.

    Is it mainly that you don't want to add any new reviews, or that you want to downsize what you currently have?

    Some good link building and you wouldn't have to worry about it anyway.....
  20. #11
  21. Design for your VISITORS
    SEO Chat Adventurer (500 - 999 posts)

    Join Date
    Mar 2007
    Location
    Somewhere Else Lately
    Posts
    583
    Rep Power
    53
    Originally Posted by jsteele823
    Noscript would be a viable option to include the text called by javascript or ajax in what the spiders see, but the problem there is that it wouldn't help your load times. If that's truly a concern, then you'll need to find another way to go about it.

    Is it mainly that you don't want to add any new reviews, or that you want to downsize what you currently have?

    Some good link building and you wouldn't have to worry about it anyway.....

    I've got the links, I've got the ranks, and I don't want to break something that doesn't need fixing "ranking wise" - but I feel I could be losing visitors and conversions due to the extended loading time (I also have a couple of other things I'm implementing to speed that up), and more importantly, visitors may not be getting an intro to each product like they should, because they're tired of reading after the first 8 reviews and/or their finger is starting to cramp/fall asleep from scrolling too much!

    Why wouldn't using a JS/AJAX plugin that loads the extra content without reloading the whole page help my loading times? (If you're saying it's because the <noscript> tag's text will be about the same amount of data... I get into that below.)

    If the text for each review is 600 words now, and I cut the posts into thirds, leave one third on the home page (full time), and show the other two thirds only when "asked for" (stored in the individual post's page, not in the <noscript> tag), this would help load time, would it not? Even if it doesn't help load time, it would reduce the length of the page, so readers get a chance to see all of the products without getting tired, and only have to see the full review on those that pique their interest in the first few hundred words. I suppose this is quite a bit more important to me than the load time; looking back, I may not have been very clear about it.

    Bottom line: are you in fact saying that the <noscript> tag is SEO/Google friendly in your opinion? Are you also saying that the text within the <noscript> tags, visible on the actual page or not, will keep load time about the same, because it's roughly the same amount of data being transferred? And are you also clear that the script will not be pulling the data from the <noscript> content, and that the text in the <noscript> tags is only there for Google, since I can't show it all to visitors anymore due to an overload of information?

    THANKS ALL!
  22. #12
  23. Twitter: @barlow1984
    SEO Chat Adventurer (500 - 999 posts)

    Join Date
    Sep 2009
    Location
    Leek, UK
    Posts
    791
    Rep Power
    103
    Not sure if this is a viable option:

    http://www.webdeveloper.com/forum/showthread.php?t=168061

    Noscript tags should work fine if you want to have the data deep within your JavaScript or AJAX. It's standard practice, as it's also used as an accessibility function.

    As for your page load times, hiding content (unless you change your page to load the hidden content last) won't help this. Without having a look at the page, it's difficult to suggest anything further other than reducing the amount of content on the page, mainly images etc.
  24. #13
  25. Design for your VISITORS
    SEO Chat Adventurer (500 - 999 posts)

    Join Date
    Mar 2007
    Location
    Somewhere Else Lately
    Posts
    583
    Rep Power
    53
    Originally Posted by barlow1984
    Not sure if this is a viable option:

    http://www.webdeveloper.com/forum/showthread.php?t=168061

    Noscript tags should work fine if you want to have the data deep within your JavaScript or AJAX. It's standard practice, as it's also used as an accessibility function.

    As for your page load times, hiding content (unless you change your page to load the hidden content last) won't help this. Without having a look at the page, it's difficult to suggest anything further other than reducing the amount of content on the page, mainly images etc.
    I've found a solution that won't require the <noscript> tags; I'll post it when I'm done implementing it. Also, the extra content would load last, since it'd only load by request.
  26. #14
  27. Twitter: @barlow1984
    SEO Chat Adventurer (500 - 999 posts)

    Join Date
    Sep 2009
    Location
    Leek, UK
    Posts
    791
    Rep Power
    103
    Much better way of doing it. We have one that keeps the content in divs, but I don't know the ins and outs of how it works.
  28. #15
  29. No Profile Picture
    Contributing User
    SEO Chat Skiller (1500 - 1999 posts)

    Join Date
    Jan 2008
    Posts
    1,834
    Rep Power
    427
    Dunno if I am reading this right because I just drank some whiskey. What about the WordPress more tag?
