Will content loaded by AJAX affect SEO/Search Engines?

I wonder: does content loaded dynamically by AJAX affect SEO / the ability of search engines to index the page?

I am thinking of building a continuously loading page, something like the Tumblr dashboard, where content is automatically loaded as the user scrolls down.

Plenty answered 21/6, 2010 at 10:58 Comment(0)

Short answer: It depends.

Here's why: say you have some content that you want indexed. Loading it with AJAX will ensure that it is not, so that content should be loaded normally, in the initial HTML response.

On the other hand, say you have some content that you want indexed but, for one reason or another, do not wish to show (I know this is not recommended and not very nice to the end user anyway, but there are valid use cases). You can load this content normally and then hide or even replace it using JavaScript.

As for your case of "constantly loading" content: you can make sure it gets indexed by providing links for search engines and non-JS-enabled user agents. For example, with Twitter-like content you can put a "more" button at the end that links to a page starting from the last item you displayed. You can hide the button using JavaScript so that normal users never know it's there, but crawlers will follow the link and index that content anyway. A sketch of this pattern follows below.
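
A minimal sketch, assuming the server renders the first page of items and can return an HTML fragment for subsequent pages when fetched via XHR; the /posts?page=2 URL, the element ids, and the X-Next-Page response header are all made up for illustration:

    <ul id="feed">
      <li>…server-rendered items…</li>
    </ul>
    <a id="next" href="/posts?page=2">More</a>

    <script>
    var next = document.getElementById('next');
    var nextUrl = next ? next.href : null;
    // Hide the link from JS-enabled users; crawlers still see the href.
    if (next) next.style.display = 'none';

    var loading = false;
    window.addEventListener('scroll', function () {
      var nearBottom = window.innerHeight + window.pageYOffset
                       >= document.body.offsetHeight - 200;
      if (!nearBottom || loading || !nextUrl) return;
      loading = true;

      var xhr = new XMLHttpRequest();
      xhr.open('GET', nextUrl);
      xhr.onload = function () {
        document.getElementById('feed')
                .insertAdjacentHTML('beforeend', xhr.responseText);
        // Assumed: the server exposes the next page's URL in a response
        // header; a missing header means there are no more pages.
        nextUrl = xhr.getResponseHeader('X-Next-Page');
        loading = false;
      };
      xhr.send();
    });
    </script>

The same trick covers the hide-or-replace case above: the pagination link is real HTML that crawlers follow, and JavaScript merely hides it for interactive users.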

Ceremony answered 21/6, 2010 at 11:12 Comment(3)
Will the search engine point users to the other page then? Hmm, maybe what I can do is normal pagination, then hide it using JS while loading more content. - Plenty
Yup, normal pagination that is hidden from the user with JS and replaced by lazy-loading the content is exactly what you need. - Ceremony
Is it still relevant? - Diplopia

A year later...

A while back Google came out with specifications for how to create XHR content that may be indexed by search engines. It involves pairing content in your asynchronous requests with synchronous requests that can be followed by the crawler.

http://code.google.com/web/ajaxcrawling/
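
Roughly, the scheme maps "hash bang" URLs onto an _escaped_fragment_ query parameter that the crawler requests instead, and the server answers that rewritten URL with a pre-rendered HTML snapshot. A minimal Node.js sketch of the server side, where renderSnapshot() and appShellHtml are hypothetical placeholders:

    // The crawler rewrites  http://example.com/#!photos  into
    // http://example.com/?_escaped_fragment_=photos  and expects static HTML back.
    var http = require('http');
    var url = require('url');

    http.createServer(function (req, res) {
      var query = url.parse(req.url, true).query;
      res.writeHead(200, { 'Content-Type': 'text/html' });
      if ('_escaped_fragment_' in query) {
        // Crawler: serve an HTML snapshot rendered from the fragment state.
        res.end(renderSnapshot(query._escaped_fragment_));
      } else {
        // Browser: serve the normal JavaScript-driven page.
        res.end(appShellHtml);
      }
    }).listen(8080);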

No idea whether other search giants support this spec, or whether Google even does. If anybody has any knowledge about the practicality of this method, I'd love to hear about their experience.

Edit: As of today, October 14, 2015, Google has deprecated their AJAX crawling scheme:

In 2009, we made a proposal to make AJAX pages crawlable. Back then, our systems were not able to render and understand pages that use JavaScript to present content to users. ... Times have changed. Today, as long as you're not blocking Googlebot from crawling your JavaScript or CSS files, we are generally able to render and understand your web pages like modern browsers.

H/T: @mark-bembnowski

Dogfight answered 27/7, 2011 at 16:7 Comment(0)

Five years later...

The latest update on AJAX and SEO:

As of October 14, 2015

Google is now able to crawl and parse AJAX-loaded content. SPAs and other AJAX-rendered pages no longer need to provide two versions of the website for SEO.

Softball answered 8/12, 2015 at 2:57 Comment(0)

If some content is loaded by an AJAX request, it is only loaded by user agents that run JavaScript code.

Search-engine robots generally don't support JavaScript (or don't support it well).

So chances are that content loaded by an AJAX request will not be seen by search-engine crawlers, which means it will not be indexed, which is not good for your website.

Almetaalmighty answered 21/6, 2010 at 11:2 Comment(0)

Crawlers don't run JavaScript, so no, your content will not be visible to them. You must provide an alternative method of reaching that content if you want it to be indexed.

You should stick to what's called "graceful degradation" and "progressive enhancement". Basically, this means your website should still function and its content should remain reachable when certain technologies are disabled.

Build your website with classic navigation first, and then "ajaxify" it, as sketched below. This way it is not only indexed correctly by search engines, it is also friendly to users who browse it on mobile devices, with JS disabled, and so on.
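
For instance, a minimal sketch of the "ajaxify" step, assuming a data-ajax attribute marks enhanced links and a #content container receives the loaded markup (both are inventions of this sketch):

    // Links work as plain <a href> navigation without JavaScript; with
    // JavaScript, clicks are intercepted and the target page is loaded
    // into a container instead of triggering a full page reload.
    document.addEventListener('click', function (e) {
      var link = e.target.closest ? e.target.closest('a[data-ajax]') : null;
      if (!link) return;
      e.preventDefault();

      var xhr = new XMLHttpRequest();
      xhr.open('GET', link.href);
      xhr.onload = function () {
        document.getElementById('content').innerHTML = xhr.responseText;
        history.pushState(null, '', link.href); // keep a real, shareable URL
      };
      xhr.onerror = function () {
        window.location = link.href; // fall back to a normal full page load
      };
      xhr.send();
    });

Crawlers and no-JS users follow the plain links; everyone else gets the AJAX behaviour layered on top.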

Wentzel answered 21/6, 2010 at 11:1 Comment(0)

Two years later, the Bing and Yahoo search engines also support Google's AJAX Crawling Standard. Information about the standard can be found here: https://developers.google.com/webmasters/ajax-crawling/docs/getting-started.

Mugwump answered 24/7, 2012 at 14:50 Comment(0)

The accepted answer on this question is no longer accurate. Since this post still shows in search results, I'll summarize the latest facts:

Sometime in 2009, Google released their AJAX crawling proposal. Other search engines added support for this scheme shortly thereafter. As of today, October 14, 2015, Google has deprecated their AJAX crawling scheme:

In 2009, we made a proposal to make AJAX pages crawlable. Back then, our systems were not able to render and understand pages that use JavaScript to present content to users. ... Times have changed. Today, as long as you're not blocking Googlebot from crawling your JavaScript or CSS files, we are generally able to render and understand your web pages like modern browsers.

Rozamond answered 15/10, 2015 at 1:28 Comment(0)
