Does onload animation affect SEO?

Let's suppose I use some onload animation for my pages, for example:

$(document).ready(function() {
    $('html.myhtml').css('overflow', 'auto').fadeTo(0, 0, function() {
        $(this).css('visibility', 'visible').animate({
            opacity: 1
        }, 200);
    });
});

and start with an inline style to make it hidden in the first place:

<html class="myhtml" style="visibility:hidden; overflow:hidden">

Initially the page would be served blank and then faded in. I want to know:

  • Does this affect the SEO in any way?
  • Is this practice fine or are there some weighty arguments not to do so?
Ditmore answered 26/9, 2012 at 17:57 Comment(1)
What are you fading in to? Are we talking about an awesomely written article, OR maybe eye-candy graphics like cute kittens riding on unicorns, OR what? A search engine ultimately wants the best results for the user, and so should you. So if you're fading in to the information I was trying to find in the first place, whether it was an article or kittens, then you should be fine.Systemic

Does it affect SEO?

If I had to answer this with a simple yes or no, then I'd say: NO

Is this practice fine or are there some weighty arguments not to do so?

We could argue about the animation all day and still not have a definitive answer. What purpose does a fade animation have for a search engine? None. So it's supposedly there for the user's enjoyment? What purpose does a fade animation have for a user? None. So if we go with the 'Design for users, not for search engines' model, I would probably remove the animation. This is my opinion.

Back to the SEO question: does it affect SEO? Not really, no, but that depends on the search engine and your audience. If I am a person who uses a screen reader, I may not benefit from your page, as my screen reader will fail. If I have JavaScript disabled, it will hurt my user experience (I personally browse with the FF NoScript plugin).

I know you said users without JavaScript have no business on your site, but nonetheless you should take this into account and handle it somehow. Also, Googlebot does not have JavaScript or session cookies enabled while it crawls. Secondly, if one of your scripts fails, you may want the page to fall back gracefully to something usable for the user, or at least show some instructions letting them know, like 'Welcome! We have fancypants animations going on here that your browser doesn't support! Please enable JavaScript'.
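
For example, a minimal sketch of such a fallback, assuming you keep the inline-style hiding from the question (class name and wording are just placeholders): a <noscript> block in the head undoes the hiding for non-JS visitors, and another in the body shows a short notice.

<html class="myhtml" style="visibility:hidden; overflow:hidden">
<head>
<noscript>
<!-- Without JavaScript the fade-in never runs, so undo the hiding.
     The !important is needed because the inline style otherwise wins. -->
<style>html.myhtml { visibility: visible !important; overflow: auto !important; }</style>
</noscript>
</head>
<body>
<noscript>
<p>JavaScript is disabled, so the intro animation is skipped and the page is shown as-is.</p>
</noscript>
<!-- page content -->
</body>
</html>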

Forced animations in general are annoying to a user, especially when they repeat on every page load. Adding to page load time is also bad for Google SEO, since speed is now a ranking factor.

Like I mentioned, the main Googlebot does not crawl with JavaScript enabled or session cookies. They have different crawlers for different purposes: some just for mobile, some for JS, and some for Flash. It is worth noting that an animation/popup/anything on load will be captured by 'Google Instant Previews' and shown to the user in the results (in your case it might look like a blank page). And like WDever mentioned, in general it is safer to use text-indents or negative margins as your initial state rather than visibility/display/overflow for this sort of thing.

This is how I would do it (here's a live preview with a 4-second animation delay to test with and without JS enabled):

<html>
<head>
<style>
.myhtml {visibility:hidden; overflow:hidden;}
</style>
<script>document.documentElement.className='myhtml'</script>
</head>
<body>

 1. the html element is not hidden initially and has no class
 2. the CSS registers the .myhtml class with the hiding styles you want
 3. the script tag just before the BODY tag fires and adds the class to html, so things are hidden only for those with JavaScript enabled; everyone with JS disabled sees the page normally
 4. at the bottom of the page your jQuery fires, animating the page in

<script>
$(document).ready(function() {
    $('html.myhtml').css('overflow', 'auto').fadeTo(0, 0, function() {
        $(this).css('visibility', 'visible').animate({
            opacity: 1
        }, 200);
    });
});
</script>
</body>
</html>
Systemic answered 5/10, 2012 at 20:31 Comment(2)
Well, actually I think you are right in saying that it doesn't matter... and if so, why indeed should one bother implementing it? After doing some retests, I noticed those hiding-and-showing tricks often look choppy and do delay the page load, UNLESS we are talking about very simple layout/text pages like your posted example, where we can be sure performance won't suffer.Ditmore
Why not use your stylesheet to set .myhtml to display:none and then add <noscript><style>.myhtml {display:block}</style></noscript>? Seems easier.Chromate

It won't affect it. I have personally tested Googlebot's readings via microdata due to an identical concern. Google now actually has visibility into JavaScript interactions to some degree, and even SWF files. So you should be in the clear.
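
For reference, the kind of test described here might look roughly like this sketch (the schema.org type and the text are purely illustrative); the CSS-hidden block is still plain text in the HTML that Googlebot downloads:

<!-- Hidden with CSS, but still present as text in the served HTML -->
<div style="display:none" itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example product</span>
  <span itemprop="description">A sentence used to check whether CSS-hidden markup gets indexed.</span>
</div>

<!-- Visible control element for comparison in the result snippet -->
<p>A visible sentence used for comparison.</p>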

Sacrosanct answered 26/9, 2012 at 18:0 Comment(9)
Thanks, interesting to know. If you have one, a link would be helpful to understand the types of JS interactions that google sees/doesn't see.Respirable
I don't have such a link, which is part of why I had to manually test this myself. But as far as I can tell, Google has access to whatever is there on page load, and then some. The only thing Google seems to really not see is content loaded with AJAX.Sacrosanct
IMO, Google's bots won't care much about CSS styles (visibility:hidden)... they will still crawl over your page. On the other hand, if your content is loaded via AJAX, then the bot will see nothing, and you'll be left in the cold...Backslide
@Harsh well, I can corroborate your opinion with my testing; you are 100% right.Sacrosanct
@Sacrosanct: yup... that is how it should be. Sounds logical too... The bot is basically a text parser, so it will crawl over anything on that web page. It does not care about the CSS or "presentation" layer (actually, it DOES care, but let's save that for another day...)Backslide
This is another day, and here is a related article: vineetgupta22.wordpress.com/2012/03/04/does-css-effects-seoBackslide
I agree that it won't affect the SEO. But what does "Googlebot's readings via microdata" actually mean? Are you talking about rich snippets and meta microdata? That stuff was built to be hidden data in the first place, so why is that valid as proof in this case? support.google.com/webmasters/bin/…Systemic
Because the entire region containing the microdata was hidden with CSS (including the parent), Googlebot read and indexed content hidden with CSS. I have since also tested this without microdata, by posting a stripped-down page with two elements, a hidden one and a visible one. Each contained a single sentence, and both sentences appeared in the description on Google search. So that is a second piece of evidence that Google indexes content hidden by CSS.Sacrosanct
The microdata used for testing was schema.org markup producing a rich snippet.Sacrosanct

I think you should register with Google Webmaster Tools. Then find the feature called "Fetch as Googlebot", let Google go and fetch your desired page, and see whether it finds any errors or unusual behavior, or fails to show what you anticipated. If that is the case, you can be sure there is something wrong with your page, and Google will tell you what problem it faced while crawling it. Then it's a matter of rectifying the problem.

Edit: The main issue search engines have with JavaScript is that JS can create barriers to getting and reading content from pages. To be specific, this problem arises mostly when there is no content in the actual page and you use JS to fetch it from somewhere else (hence the AJAX SEO problems). So one should be concerned with putting content on pages instead of fetching it from somewhere else.
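
To make the distinction concrete, a rough sketch, assuming jQuery as in the question (the URL and IDs are made up for illustration):

<!-- Crawler-friendly: the content is already in the HTML the server sends -->
<div id="article">
  <h1>Article title</h1>
  <p>The text you want search engines to index.</p>
</div>

<!-- Riskier (at the time): the page ships empty and the content arrives via AJAX -->
<div id="article-ajax"></div>
<script>
// A crawler that does not run JavaScript never sees what this request returns.
$.get('/api/article/123', function (html) {
    $('#article-ajax').html(html);
});
</script>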

One should also test their pages with JS and CSS turned off to see how they look when Google and other search engines see them. If, after all that fancy animation, fetching, and other stuff, Google is still able to read, crawl, and index your pages, then I would not worry for a second, and neither should you. After all, if Google is fine, we are more than fine.

Chevalier answered 4/10, 2012 at 11:30 Comment(3)
It fetches it exactly how it should look, and no errors are reported. Maybe it's really all fine - well, this practice surely makes sense, thanks.Ditmore
One more thing: you should also look at the page cached by Google in the SERPs (Google search results). It tells you what Google found and stored when it crawled your website. If everything is exactly as you expected and nothing is wrong, there is no need to worry. And remember, most of your display-related processing is going to happen in users' browsers, so if Google is OK with your content, you are good to go.Chevalier
I have edited my answer and included a little clarification; maybe it will help you some more.Chevalier

As far as I am aware, Google only recognizes the initial state of the page, including CSS rendering. For example, if you add display:none; or visibility:hidden;, I don't think Google will index the hidden content.

To be safe, I'd hide the content on load (with JavaScript) and then fade it in. I have not really tested it, but I have never seen Google's bots interact too well with JavaScript. An exception seems to be when using the hashbang method.

Another bonus of this method is that users with JavaScript deactivated (I know, duh) will still be able to see your content, as it won't be hidden in the first place.
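
A minimal sketch of that idea, assuming jQuery is loaded as in the question (the element id is just a placeholder): the content is visible by default and is only hidden once JavaScript actually runs.

<div id="content">
  Visible by default, so crawlers and visitors without JavaScript still see it.
</div>
<script>
// Runs only when JavaScript is available: hide immediately, then fade back in.
$('#content').hide();
$(function () {
    $('#content').fadeIn(200);
});
</script>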

Blackwood answered 26/9, 2012 at 18:12 Comment(7)
If the user has JavaScript deactivated then they have nothing to do on my site. Maybe Google used to do it the way you say, but this question is time-sensitive, and it's 2012, you know. You didn't bring any proof either, so I'm not convinced.Ditmore
I can give you an example where I used this to my advantage SEO-wise. I have a redirection page that displays some very simple HTML markup with no layout whatsoever. This page is optimized for SEO, and every link points towards it. Then I use JavaScript to redirect the user to a page made dynamic by JavaScript and the URL hash. While this is not best practice whatsoever, the circumstances required me to work around it that way. Google couldn't index my hashed URL. It couldn't read the JavaScript redirect either, and thus stayed on the SEO-optimized page. This page still works.Blackwood
But no, I cannot give an example of a site that isn't indexed at all because of JavaScript. As I said, there seems to be a special scheme regarding hashbang URLs. Anyway, I consider it bad practice. If you need proof, I am sorry, I haven't got any. But when you are asking about best practice, you will get opinions, not proven facts exclusively.Blackwood
I understand your point. I believe, though, that redirects are handled differently and involve many factors, so they are hardly related here. I know my case is rather unlikely to cause SEO problems, but either way we would need some more backing up.Ditmore
Well, try it out and see if the crawlers index your page. You might also want to consider compatibility with lesser search engines. Hiding the content at page load just seems like an overall better semantic solution, as the content is actually meant to be visible to the user when he enters the page; it is not shown dynamically after pressing a link. If you want to use JavaScript to make a fancy load effect, you shouldn't do half of it in inline CSS. Well, that's just my opinion. display and visibility have semantic value, as they define what is visible to the client in the view.Blackwood
On a completely unrelated note, I would also say that animating your entire html element seems like a very bad idea. If you don't have a wrapper, at least animate body instead. The <html> tag is pure markup and should not affect your view whatsoever, other than defining your HTML document.Blackwood
What about "I actually tested this" is being ignored? Google absolutely DOES index content hidden with CSS.Sacrosanct

I'm not sure those answers are right any longer. Google will still be able to crawl your content; it doesn't matter if it uses an animation to fade in.

However, in my opinion, long animations can still delay interaction and readability, which could lead to bad ratings with regard to INP (Interaction to Next Paint) and FCP (First Contentful Paint).
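
As a rough way to see what an intro animation costs, here is a small sketch that logs FCP with the browser's PerformanceObserver API (INP needs more involved measurement, e.g. the web-vitals library):

// Log First Contentful Paint so you can compare the page with and without the animation.
new PerformanceObserver(function (list) {
    list.getEntries().forEach(function (entry) {
        if (entry.name === 'first-contentful-paint') {
            console.log('FCP:', Math.round(entry.startTime), 'ms');
        }
    });
}).observe({ type: 'paint', buffered: true });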

Soerabaja answered 13/7, 2023 at 8:9 Comment(1)
Your answer could be improved with additional supporting information. Please edit to add further details, such as citations or documentation, so that others can confirm that your answer is correct. You can find more information on how to write good answers in the help center.Unable

I sadly do not have the SEO answer to your question, but a solution would be to use a negative margin to hide the element off screen. Then, when JavaScript kicks in, you set the correct position, hide it, and fade it in or do whatever you want to do.
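
A minimal sketch of that approach, assuming jQuery as in the question (the class name and values are placeholders): the element is parked off screen with a negative margin, so its text is still in the served HTML, and JavaScript moves it back into place and fades it in.

<style>
/* Off screen instead of visibility:hidden, so the text is still there for crawlers. */
.intro { margin-left: -9999px; }
</style>

<div class="intro">Content to fade in goes here.</div>

<script>
// When JavaScript kicks in: restore the position, start transparent, then fade in.
$(function () {
    $('.intro').css({ marginLeft: 0, opacity: 0 }).animate({ opacity: 1 }, 200);
});
</script>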

Somatist answered 3/10, 2012 at 18:35 Comment(0)
