    #1
  1. No Profile Picture
    Registered User

    Join Date
    Apr 2017
    Posts
    23
    Rep Power
    0

Deliver a different version of the website to Google robots


    Hello community.

    I am working on an enterprise website project as an SEO manager.

They had been using Angular.js until now, but I convinced them to switch to SSR (server-side rendering), since Google's robots could not render the pages or follow the links, and there were other serious indexing/SEO issues.

They are determined to deliver the SSR version of the website only to Google bots, Bing bots, and other major search engine robots, identifying them by user-agent.

The reason they want only the robots to receive the SSR version is limited server resources. As you know, SSR requires a huge amount of server resources, since the content of the application is rendered on the server, not on the client.
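The plan they describe (serving SSR output only to crawlers identified by user-agent) could be sketched roughly as below. This is a hedged illustration, not their actual implementation: the bot list is illustrative and not exhaustive, and the function names are mine.

```javascript
// Rough sketch of user-agent-based dynamic rendering (hypothetical).
// The pattern below is illustrative, NOT a complete crawler inventory.
const BOT_PATTERN = /googlebot|bingbot|slurp|duckduckbot|baiduspider|yandex/i;

function isSearchBot(userAgent) {
  return BOT_PATTERN.test(userAgent || '');
}

// Express-style middleware: flag crawler requests so a later handler can
// return server-rendered HTML instead of the client-side Angular bundle.
function dynamicRendering(req, res, next) {
  req.useSsr = isSearchBot(req.headers['user-agent']);
  next();
}
```

Note the obvious fragility: any crawler whose user-agent is missing from the pattern falls through to the client-rendered version.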

They won't agree to switch to the SSR version for both search engine robots and real users anytime soon.

I want to put this up for debate, and I kindly ask whether you have any similar experience, and what the outcome was.

    My concerns:
    • Google's robots receive a different version of the website from what users receive.
    • Since the website is based on JS, some pages are not accessible by directly inputting the URL.



    Feel free to ask for more information.

    Regards,
    Shayan Davoodi
  2. #2
  3. No Profile Picture
    Moderator
    SEO Chat Skiller (1500 - 1999 posts)

    Join Date
    Sep 2016
    Location
    USA
    Posts
    1,832
    Rep Power
    2791
The biggest potential problem I see here is if the two different rendering methods deliver different content to Google and to the public. If they do, there could be repercussions.

Notice I said "there could be repercussions". As long as the intent is to provide a better UX (user experience) and the end result of the rendered page is not deceptive, meaning Googlebot and the public see the same information, there should be no problem.

White-hat cloaking, which is what you are describing, is OK as long as your site behaves as per the above statement.

    Pros
Fixes the issue of Google not being able to render and properly index your Angular.js pages.

    Cons
Extra code is needed to detect Google's many bots, and they have lots of them. The possibility for errors increases, and if you miss one of Google's bots, Google will see both pages.


My suggestion would be to convert everything to SSR. Put the entire site on a CDN (content delivery network); Cloudflare comes immediately to mind. Doing this prevents any possibility of Google thinking you are doing deceptive cloaking: Google and everyone else sees exactly the same page. Your site will load at CDN speeds because it is cached on the web and delivered from cache. Other alternatives include a dedicated server, which is expensive.
    Last edited by KnowOneSpecial; May 9th, 2018 at 07:24 AM.
  4. #3
  5. No Profile Picture
    Registered User

    Join Date
    Apr 2017
    Posts
    23
    Rep Power
    0
    Thank you Don.

    As I explained above, we are delivering exactly the same content to both search engine robots and real people. That's why we are using SSR.

But SSR requires far more processing resources than CDNs like Cloudflare provide.
    They offer storage for your static content and some solutions for a faster, more secure connection between the origin server and the client.

Actually, Google's robots are receiving the exact same content as other users do.
    The only difference is that users render some of the page content inside their own browsers (that's what CSR, client-side rendering, does for us), while Googlebot receives content that has already been rendered by the server.

The number of users we are serving is huge, and the company currently can't secure the budget.

    Do you believe that if we deliver the same content to both search engines and people, there would be no problem?
  6. #4
  7. No Profile Picture
    Moderator
    SEO Chat Skiller (1500 - 1999 posts)

    Join Date
    Sep 2016
    Location
    USA
    Posts
    1,832
    Rep Power
    2791
    Originally Posted by shayandavoodi
    Do you believe that if we deliver the same content to both search engines and people, there would be no problem?
    As long as both your users and Google see the same content there will be no problems.

    Originally Posted by shayandavoodi
The number of users we are serving is huge, and the company currently can't secure the budget.
If that is the situation, that you have more users than your current server can service, then you need to seriously consider upgrading your servers. That would be the best solution, in my opinion.
  8. #5
  9. SeoRaptor
    SEO Chat Good Citizen (1000 - 1499 posts)

    Join Date
    Mar 2016
    Location
    France, Civens
    Posts
    1,206
    Rep Power
    1907
By choosing the cloaking option you are taking a very dangerous path; even if your intentions are good, a simple error could bring trouble on you and your site. Since it seems you have traffic, it's definitely something I would think about twice before making any decision. Please make sure you understand the risks: cloaking is far more dangerous than building "wrong" links or over-optimizing pages.

    you have more users than your current server can service, then you need to seriously consider upgrading your servers.
I stand by KOS: your issue is linked to your server, and you should upgrade it. If you are on a VPS, try looking at dedicated servers; some can be very cheap, but they have the downside of being provided "naked", so you would have to configure one to meet your needs (plus set up backups, etc.). But past a certain need for computing power they are cheaper than the alternatives. Check out previous-generation servers such as soyoustart if you can't afford the latest tech; in many cases they are more than sufficient.
    Last edited by Pierre Benneton; May 9th, 2018 at 03:34 PM.
    Owner of Bennetonable | AdWords certified - Analytics qualified
    On-Page SEO Cheat Sheet | "My opinions are my own. Feel free to disagree & think above the fold."
  10. #6
  11. No Profile Picture
    Registered User

    Join Date
    Apr 2017
    Posts
    23
    Rep Power
    0
    As long as both your users and Google see the same content there will be no problems.
    That would be perfect.

I stand by KOS: your issue is linked to your server, and you should upgrade it. If you are on a VPS, try looking at dedicated servers; some can be very cheap, but they have the downside of being provided "naked", so you would have to configure one to meet your needs (plus set up backups, etc.). But past a certain need for computing power they are cheaper than the alternatives.
    Hello Pierre.
    Thanks for your comment.
Actually, I'm not talking about upgrading from shared hosting to a dedicated VPS.
    I'm talking about tens of thousands of dollars per month.

I expressed the need for a CDN, and they are considering it.
    If they agree with me on using a CDN, we can shift the budget we dedicated to storage resources over to processing resources (CPU and RAM), but I still don't think it will be enough.

By choosing the cloaking option you are taking a very dangerous path; even if your intentions are good, a simple error could bring trouble on you and your site. Since it seems you have traffic, it's definitely something I would think about twice before making any decision. Please make sure you understand the risks: cloaking is far more dangerous than building "wrong" links or over-optimizing pages.
I don't think this solution counts as "cloaking", since the output that a regular user with a JavaScript-enabled browser sees is the same as the output Googlebot sees.

The point is that I'm not sure whether Google considers it cloaking or not.

We have no option but to give it a test.

I just hope we don't end up facing the might of Google.

I will write up the results as an article and share it here.

    Thanks a lot.
  12. #7
  13. No Profile Picture
    Moderator
    SEO Chat Skiller (1500 - 1999 posts)

    Join Date
    Sep 2016
    Location
    USA
    Posts
    1,832
    Rep Power
    2791
Originally Posted by shayandavoodi
    Actually I'm not speaking about upgrading from a shared hosting to a dedicated VPS.
    I'm talking about tens of thousands of Dollars per month.
Sounds like you're trying to run a search engine to compete with Google, Bing, or Yahoo.

OK... you are obviously leaving out a few things. The missing information will probably, and dramatically, change the answers we can provide for you. So let's ask some questions... (I really hate playing the 20-questions game.)

How many simultaneous users do you need to service at one time on your site: 50, 100, 1,000, 10,000, or more?

    How many MS SQL or MySQL databases will you need? (forgot to ask this question the first time)

    How much storage does each user require?

    How much bandwidth do you need for the average user?

    What content are you serving to your users: text, lots of pictures, or videos? One surefire way to answer this is to just provide the URL of the site. If providing that info in a public forum is a problem, then PM us with the domain name.
    Last edited by KnowOneSpecial; May 10th, 2018 at 12:50 PM.
  14. #8
  15. SeoRaptor
    SEO Chat Good Citizen (1000 - 1499 posts)

    Join Date
    Mar 2016
    Location
    France, Civens
    Posts
    1,206
    Rep Power
    1907
    I'm talking about tens of thousands of Dollars per month.
Wow, that's not something we come across every day, even outside SEO Chat. If this is really your budget, I won't even ask if the server is optimized, because I imagine you have one or several people specialized in server administration and optimization.

Giving us some insight into KOS's questions is really important, because the advice we provide can be very different depending on the situation.
  16. #9
  17. No Profile Picture
    Registered User

    Join Date
    Apr 2017
    Posts
    23
    Rep Power
    0
Oh, I really didn't mean to be offensive or to upset you.
    I may have failed to describe the situation perfectly.

How many simultaneous users do you need to service at one time on your site: 50, 100, 1,000, 10,000, or more?
    Around 20k concurrent users, and it will probably grow as more organic traffic comes in.

How many MS SQL or MySQL databases will you need? (forgot to ask this question the first time)
    How much storage does each user require?
    How much bandwidth do you need for the average user?
    I don't have the information.

What content are you serving to your users: text, lots of pictures, or videos? One surefire way to answer this is to just provide the URL of the site. If providing that info in a public forum is a problem, then PM us with the domain name.
It's a mix of text, images, videos, and location-based personalized results built on the Google Maps API.


Although I don't think these questions are related to my issue, please let me know if you need more information.

All I need to know is whether delivering the SSR version of the website to Google's robots and the JS version to real users causes any problem.
    Note that both produce the exact same visual result, but the source code each receives is different.

    For example,

    Actual user will see
    Code:
    <title ng-bind="PageHeader"></title>
in the source code, while seeing "Blue cafe" in the browser tab as the <title>.
    Google will see
    Code:
    <title>Blue cafe</title>
in the source code, and it will see the completely rendered (SSR) version.

As you can see, the final output is the same, just arrived at in a slightly different way.
  18. #10
  19. No Profile Picture
    Moderator
    SEO Chat Skiller (1500 - 1999 posts)

    Join Date
    Sep 2016
    Location
    USA
    Posts
    1,832
    Rep Power
    2791
    You're not being offensive...don't take it that way.

    We are just trying to figure out what resources you need to serve your user base.

For instance... see the picture below. I really don't think the owner of SEO Chat is spending $10k a month on a server solution, but I may be wrong.

  20. #11
  21. No Profile Picture
    Moderator
    SEO Chat Skiller (1500 - 1999 posts)

    Join Date
    Sep 2016
    Location
    USA
    Posts
    1,832
    Rep Power
    2791
I just want to mention a few things that dawned on me... I see your misunderstanding.

The code snippets below are Nginx, ASP, and, last, PHP.

    Code:
    <title> ng-bind="PageHeader"></title>
    
    <title> <% Response.Write(PageHeader) %></title>
    
    <title> <?php echo $PageHeader;  ?> </title>
    
    
All Google or anyone else will ever see is the following, assuming that the value of your variable is "My custom Page Title":

    Code:
    <title>My custom Page Title </title>
I am going to explain this using PHP, but the process (the logic flow) is the same for the others.

The way things work is as follows: the server receives a request for a page. If the page is written in PHP, the file will have the .php extension. The server recognizes that this is a PHP file and knows it has to process the script into HTML before sending it to the user's browser. At no time is the PHP script viewable by the user or by Google, period. So Google does not see any different content, because the script is rendered into HTML before it is sent to the visitor.

    Now, this is a very simplistic explanation, and only that. You are worried about something that cannot happen, period. There is no duplicate content, period, ever. If folks could see the actual script, then your server has problems and/or there is a coding error!
    Last edited by KnowOneSpecial; May 12th, 2018 at 07:09 PM.
  22. #12
  23. No Profile Picture
    Registered User

    Join Date
    Apr 2017
    Posts
    23
    Rep Power
    0



    Code:
    <title> ng-bind="PageHeader"></title>
Sir, this has nothing to do with Nginx. It's just an Angular.js client-side-rendered title tag, which fills in the title based on variables passed from the client's browser to the application.
    I am aware of how server-side scripting languages render web pages.

If a website is built with Angular.js and you view the source of its web pages, you will see something like this:

    Code:
<title ng-bind="something"></title>
But if you check the browser tab's title, you will see an actual title.
    That's what your browser renders using the data that Angular.js feeds it.

The main point is that in my case, to prevent Google's robots from seeing a blank title like the one mentioned above, an application on the server renders a special version for Googlebot and delivers it only to Googlebot (and other major search engine crawlers).

The possible risk of treating search engine robots this way is that:
    They may think we are using some black-hat SEO method, delivering different content to search engine crawlers than what we deliver to actual people.

But in fact, we are trying to deliver the same content to both agents, just in different words, so each will understand it by its own logic.
    Googlebot is not known to render Angular.js very well, so we are trying to translate the REAL content of our web pages into a form it can understand.

Actually, I think this is the first time a large-scale website has done such a thing; that's why experience around the web is so scarce.
    I guess I'm going to have to give it a shot.

Any other suggestions/experiences/thoughts are most welcome.

    Regards,
  24. #13
  25. No Profile Picture
    Registered User

    Join Date
    Apr 2017
    Posts
    23
    Rep Power
    0
For instance... see the picture below. I really don't think the owner of SEO Chat is spending $10k a month on a server solution, but I may be wrong.

Ah, and about this part of your quote:

1. The SEO Chat forums run on vBulletin, a PHP (server-side) script. It requires fewer resources than a server-side-RENDERING stack, since it just takes what the user sends to the server and responds in real time. Server-side languages like PHP are less resource-hungry than SSR applications like Node.js for this kind of usage: you can do the same thing with PHP in 1 second that Node.js does in 0.1 second. The amount of processing is what matters here.

2. 500 concurrent users can't be compared to 20k!
  26. #14
  27. SeoRaptor
    SEO Chat Good Citizen (1000 - 1499 posts)

    Join Date
    Mar 2016
    Location
    France, Civens
    Posts
    1,206
    Rep Power
    1907
I'm no rendering specialist, but I remember reading something about this some time ago, so I went and looked for info on the official Google forums and found a couple of threads that might interest you. But first: what does the Fetch and Render tool spit out for your pages? If it works, then you should be fine; see the two following threads.

https://productforums.google.com/for...rs/B9L2whd5PR8 (Question on rendering an AngularJS site for Googlebot)
I'm out of my comfort zone here, but you must be far more educated than me on subjects such as Angular JS, Angular 2, PhantomJS, etc.
    The thread also links to an official Google blog post about Ajax and Angular issues; see the following link: https://webmasters.googleblog.com/20...-web-apps.html

If the Fetch and Render tool does not return your content, or you are afraid the bot doesn't work exactly the same way (it most surely doesn't), then you could prerender and store (cache or proxy) the pages and serve them to Google directly instead of rendering them on the fly. I'm not sure exactly how this would work; I'd personally test your solution on a spare domain before putting such changes live. You never know; as you said, this is not something we encounter every day, and I'm reaching my limits in terms of coding to help you more. In this particular case (sorry, I'm going to repeat myself in case someone else reads this thread: IN THAT PARTICULAR CASE), you might consider prerendering your pages.
    In the end, serving a fully cached HTML page instead of querying a DB on page load is not fundamentally different.
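The prerender-and-cache idea could look roughly like this: a minimal in-memory sketch, where the cache shape, TTL, and the `renderFn` callback are illustrative assumptions (a real deployment would likely use a shared cache and a headless renderer):

```javascript
// Hypothetical prerender cache: render a page once, reuse the HTML until it
// expires, so crawlers never trigger a fresh SSR pass on every request.
const cache = new Map(); // url -> { html, expires }
const TTL_MS = 60 * 60 * 1000; // re-render at most once per hour per URL

function getPrerendered(url, renderFn, now = Date.now()) {
  const hit = cache.get(url);
  if (hit && hit.expires > now) return hit.html; // serve from cache
  const html = renderFn(url); // expensive SSR happens only on a cache miss
  cache.set(url, { html, expires: now + TTL_MS });
  return html;
}
```

This is what makes Pierre's point concrete: once the HTML is cached, serving it is as cheap as serving any static page.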

By the way, your pages need to be accessible through their own URLs to be indexed by Google, and Google does not index URLs containing a # (I don't know about /#/), so do your best to avoid them if you can.

There are online services/companies that specialize in this; Brombone is one of them. I don't know what they're worth, as I've never tested them, but external solutions exist.

If you do test Googlebot's rendering capabilities, please come back and let us know your results; this situation has raised my curiosity. Sorry I couldn't help you more.

By the way, when KOS was talking about SEO Chat's traffic: if you look at the maximum users online, it's 26k. But in the end, who cares; that's not the question here.
    Last edited by Pierre Benneton; May 14th, 2018 at 07:32 AM.
  28. #15
  29. SeoRaptor
    SEO Chat Good Citizen (1000 - 1499 posts)

    Join Date
    Mar 2016
    Location
    France, Civens
    Posts
    1,206
    Rep Power
    1907
As I said, I'm no code expert, but are the following options something you have already taken into account (especially the second one)?

    https://codereview.stackexchange.com...bot-experiment
    and
    https://blog.thecodecampus.de/angula...-the-hashbang/


As always, do your testing on separate domains and servers.
    Last edited by Pierre Benneton; May 14th, 2018 at 07:55 AM.