Google indexing of my AngularJS application
There was a blog post from Google saying that it now indexes JavaScript applications. However, my AngularJS application www.fore-cite.com does not seem to be indexed at all. Is there anything special I have to do to enable indexing and make the content searchable?

Downtime answered 8/1, 2015 at 21:35 Comment(6)
Have you used the Webmaster Tools that Google provides to see indexing information and test-crawl pages? Are you using HTML5 mode in Angular? Is your webserver correctly returning index.html when specific routes are requested directly?Chez
I've asked Google Webmaster Tools to render the page, but it didn't help. What is HTML5 mode in Angular? What do you mean by directly requesting specific routes? AngularJS also kicks in when index.html is requested. What shall I do there?Downtime
For example, if you have an Angular route pointing to example.com/about, your webserver should return index.html, and then Angular will render the /about route.Chez
HTML5 mode is a setting in Angular that you can enable to allow viewing your pages at example.com/about rather than example.com/#/aboutChez
I get the HTML5 mode part. But for the index.html: do I have to do something special for this to happen?Downtime
Yes, it's a setting in your webserver. For example, with Apache you would use .htaccess to rewrite all requests for paths that don't exist to index.html; otherwise only the homepage would be indexable.Chez

The Google crawler does execute JavaScript on the pages it crawls. With AngularJS, there are a few steps you have to take to make sure your application is crawled and indexed properly.

HTML5 Mode

You must enable HTML5 mode, so that your routes use real URLs (example.com/about) rather than hashbang URLs (example.com/#/about).
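A minimal sketch of enabling HTML5 mode in an AngularJS module config (the module name `myApp` is an assumption, not from the question):

```javascript
// Sketch: enable HTML5 mode so routes use real URLs
// (example.com/about) instead of hashbang URLs (example.com/#/about).
// The module name 'myApp' is illustrative.
angular.module('myApp', ['ngRoute'])
  .config(['$locationProvider', function ($locationProvider) {
    $locationProvider.html5Mode(true);
  }]);
```

HTML5 mode also expects a `<base href="/">` tag in the page's `<head>` so that relative links resolve correctly.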

Webserver Setup

For HTML5 mode to work properly, you must configure your webserver so that requests for paths that don't exist are rewritten to index.html.
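For Apache, a sketch of the .htaccess rules mentioned in the comments might look like this (your setup may differ):

```apache
# Rewrite requests for files/directories that don't exist to index.html,
# so AngularJS can render the requested route on the client.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^ index.html [L]
```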

Sitemap

Google does not yet properly follow links in AngularJS apps, so you must create a sitemap for all of your routes. This sounds like a pain, but with a proper build process (Gulp, Grunt, etc.) it can be largely automated.
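As a sketch of that automation (not the author's actual build setup; the base URL and route list are assumptions), a sitemap can be generated from the route list in a few lines of Node.js:

```javascript
// Sketch: build a sitemap.xml string from a list of AngularJS routes.
// The base URL and routes below are illustrative assumptions.
function buildSitemap(baseUrl, routes) {
  const urls = routes
    .map(route => '  <url><loc>' + baseUrl + route + '</loc></url>')
    .join('\n');
  return '<?xml version="1.0" encoding="UTF-8"?>\n' +
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
    urls + '\n</urlset>';
}

const xml = buildSitemap('https://example.com', ['/', '/about', '/table/page/1']);
console.log(xml);
```

In a Gulp or Grunt task you would write this string to sitemap.xml and submit it in Webmaster Tools.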

Cons

This of course only applies to the Google crawler. Other search crawlers such as Bing's may not support JavaScript applications yet, though I wouldn't be surprised if this changes over the next year or two (if it hasn't already).

Other considerations

One commonly missed problem when indexing Angular apps is content that only appears after user interaction, such as pagination or clicking a button. If these actions do not change the URL, Google will not crawl the extra content. For example, say you have a page with a paginated table of 3 pages. Google will only crawl the first page unless each page has its own URL route, such as /table/page/1, /table/page/2, /table/page/3.
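A sketch of what such a per-page route might look like with ngRoute (module, template, and controller names are assumptions):

```javascript
// Sketch: give each table page its own crawlable URL via a route parameter.
// 'myApp', 'table.html', and 'TableCtrl' are illustrative names.
angular.module('myApp')
  .config(['$routeProvider', function ($routeProvider) {
    $routeProvider.when('/table/page/:page', {
      templateUrl: 'table.html',
      controller: 'TableCtrl'
    });
  }]);
```

The controller can then read `$routeParams.page` to load the right slice of data, and each page URL can be listed in the sitemap.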

Chez answered 8/1, 2015 at 23:31 Comment(20)
With this approach the deep links don't work anymore. If the user enters a deep link as the URL, the server is asked for it, but we need AngularJS to answer the request. Is there a solution for this problem?Downtime
Here is a nice blog article describing how to do this rewriting and server configuration directly in django: blog.kevinzhang.me/posts/…Downtime
Regarding your first comment: deep linking should work just fine. Can you elaborate?Chez
I got everything to work now. I started with HTML5 mode and then began to understand why the webserver setup is important. Finally I found a solution with a setting directly in Django rather than in the webserver. Thanks a lot for your answer; it helped a lot. Btw, the project where this was necessary is: www.fore-cite.comDowntime
Could you explain why html5mode is a must? Why doesn't using the /#! and ?_escaped_fragment_= work?Pigmy
The point was to not have to provide alternative content for crawlers.Chez
@Downtime Your site is not indexed though. Didn't the solution work?Wisnicki
I implemented the solution as suggested, but it doesn't seem to work. Do you have an idea what the problem might be? @WisnickiDowntime
did you add a sitemap and upload it to webmaster tools?Chez
@Downtime did you add the site to webmaster tools? When you do that try "render as google bot" it will show you if google understands the JS or not. Im very curious as I have a similar project now.Wisnicki
I did exactly that. Render as Google worked fine, i.e. the page was displayed correctly. In spite of that, the crawling doesn't seem to work. Currently I have no idea how to analyse this or where to look for a solution... If you're interested, let's do a Skype call or exchange mail to brainstorm about this problem. My email is: [email protected]Downtime
A SO chatroom would likely be better, so that others can participate.Chez
This wouldn't work on a 'Github Pages' setup would it?Saber
I have exactly the same problem. Fetch as Google seems to work as expected, with the page rendered properly, but the cached page in Google is not JavaScript-rendered, the page's content is not properly indexed, and none of the HTML5 routes/URLs appear in the index. I have all the pages listed in sitemap.xml, but it isn't helping.Izettaizhevsk
How does this apply to angular2?Vomiturition
I don't expect it to be any different.Chez
@zyxue With this setup it worked in one simple case, but I didn't manage to make it work in a more complex application. My conclusion: use Angular 2 and isomorphic JavaScriptDowntime
@paweloque, so you mean that your setup can now make Google crawl and index multiple pages in your AngularJS app? I thought that even in a simple case there are still multiple pages (or URLs) in the app, right?Lawabiding
No, with Angular 2 and isomorphic JS, the server renders the page and returns it to the crawler. In a normal browser, the app then comes to life, but the first view is already rendered.Downtime
@Downtime i think he was referring to "With this setup it worked in one simple case," not the angular 2 partChez

You might want to use Angular Universal. It loses the benefit of rendering the Angular application on the client, but it allows proper search engine indexing (not only by Google).

You will have to decide if the trade-off is worth it for your requirements.

Angular Universal Official Website

Lissalissak answered 14/12, 2018 at 15:39 Comment(0)

Nowadays you won't get good SEO with AngularJS.

The first thing is to get your pages crawled. See this simple JS indexing experiment.

To diagnose your site, you can use Chrome 41, the version used by Googlebot to index JS pages. See this.

Singhal answered 8/1, 2018 at 21:22 Comment(2)
This contradicts the other, accepted answer, so do you have a source for "not getting good SEO with AngularJS"?Complicity
There is no contradiction: just because Google can index a page doesn't mean it will index it. One of the sources is in the first link; you can google more. Read it and you will know what to look for, e.g. this AngularJS SEO case.Singhal
