Google's crawler doesn't understand my maps. How can I work around this?

I found strange words (have, here, imagery, sorry) that are not supposed to be on my site being picked up as keywords by Google's crawler:

[Screenshot: Google Webmaster Tools keyword list for the first site]

It seems like Google hits errors when crawling pages that use Google Maps, and it then takes the error strings as great keywords!

I am using OpenLayers to show the maps on both sites. The code is like this:

<script src="http://openlayers.org/api/OpenLayers.js"></script>
<script src="http://maps.google.com/maps/api/js?v=3&amp;sensor=false"></script>
<script type="text/javascript">
$(function() {
  $("#mapOuter").html('<div class="thumbnail"><div id="map" style="height:250px"></div></div>')
  map = new OpenLayers.Map("map")
  //map.addLayer( new OpenLayers.Layer.OSM   ("OpenStreeetMap") )
  map.addLayer( new OpenLayers.Layer.Google("Google v3"     ) )
  vectors = new OpenLayers.Layer.Vector("vector")
  map.addLayer( vectors )

  map.addControl( new OpenLayers.Control.LayerSwitcher() );
  map.addControl( new OpenLayers.Control.Navigation({documentDrag:true}) );
  map.addControl( new OpenLayers.Control.PanZoom() );
  var in_options = {
      'internalProjection': map.baseLayer.projection,
      'externalProjection': new OpenLayers.Projection("EPSG:4326")
  };

    var lon=-57.954900
    var lat=-34.917000

  map.setCenter(new OpenLayers.LonLat(lon, lat) // Center of the map
    .transform(
      new OpenLayers.Projection("EPSG:4326"), // transform from WGS 1984
      new OpenLayers.Projection("EPSG:900913") // to Spherical Mercator Projection
    ), 15 // Zoom level
  )

});
</script>

What can I do to fix this "error" so that the Google crawler takes good content from my site?

Bonus: a Google search showing that the error strings are indexed:

[Screenshot: Google search results containing the error strings]

UPDATE, "Solution" applied:

I had one different map per each page in my site, I ended up converting all maps to images and only keep one interactive map where I really needed user interaction with coordinates and mapping stuff. The solution I used led me to create and opensource osm-static-maps. Hope it helps somebody!

The site got several improvements:

  • Got rid of these awkward words in Google Webmasters.
  • More relevant SEO: static images with the "alt" HTML img attribute instead of an "unindexable" JS map.
  • Faster page loading (got rid of all mapping libraries and tile loading).
  • Faster JS performance (less JS for the client to process).
  • Improved user experience: scrolling the page used to zoom the map instead of actually scrolling (you might think this could be solved by disabling scroll-to-zoom on the map, but then users who expect scrolling to zoom the map are surprised; both ways are OK and wrong at the same time).

On the downside, I found:

  • Less user interactivity (boring page).
  • Less context on the map (less informative map).

These two things could be "fixed" by loading the map when the user clicks the map image (see the sketch below). The downside is that if the user clicks the map image unintentionally, the map load can be seen as unexpected behaviour.
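For reference, here is a minimal sketch of the click-to-load idea. The preview image path is a placeholder, and initMap() stands for the map-building code from the question refactored into a function; neither is part of the actual osm-static-maps API.

<!-- Static preview shown by default; the alt text gives the crawler indexable content -->
<div id="mapOuter">
  <img id="mapPreview" src="/img/map-preview.png"
       alt="Map of La Plata, Argentina, centered on -34.917, -57.9549"
       style="cursor: pointer;">
</div>

<script type="text/javascript">
// Swap the static image for the real OpenLayers map only when the user clicks it
$("#mapPreview").on("click", function () {
  $("#mapOuter").html('<div class="thumbnail"><div id="map" style="height:250px"></div></div>');
  initMap("map"); // hypothetical wrapper around the OpenLayers setup shown earlier
});
</script>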

Edit2

I made an open-source project out of this. Check it out: https://github.com/jperelli/osm-static-maps

Drakensberg answered 2/7, 2013 at 23:38 Comment(0)

Unfortunately I have seen this a lot too...

My assumption is that Googlebot won't fully evaluate all the JS on a page, but also uses heuristics, and thus ends up with no imagery (and the error text gets indexed). Based on this assumption I did the following:

  1. Create a div with a "random" ID (for the map) and style="display: none;".

  2. Create a noscript tag containing an img tag with the SAME "random" ID (I used a static map image as the fallback here, which also works as a no-JS fallback).

  3. Create a (custom) JavaScript function that takes the unique ID, initializes your map AND toggles the display style of the map element to block (see the sketch below).
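A rough sketch of those three steps, for illustration only (the "map-x7f3q2" ID, the fallback image path, and the initHiddenMap() function are made-up names, not from the original answer):

<!-- Map container hidden by default; the ID is generated per page -->
<div id="map-x7f3q2" style="display: none; height: 250px;"></div>

<!-- Static fallback image with the SAME ID, only rendered for no-JS clients -->
<noscript>
  <img id="map-x7f3q2" src="/img/map-fallback.png" alt="Map of the area">
</noscript>

<script type="text/javascript">
// Reveal the container, then build the OpenLayers map inside it
function initHiddenMap(elementId) {
  var el = document.getElementById(elementId);
  el.style.display = "block";

  var map = new OpenLayers.Map(elementId);
  map.addLayer(new OpenLayers.Layer.OSM("OpenStreetMap"));
  map.setCenter(
    new OpenLayers.LonLat(-57.9549, -34.917).transform(
      new OpenLayers.Projection("EPSG:4326"),
      new OpenLayers.Projection("EPSG:900913")
    ),
    15
  );
}

initHiddenMap("map-x7f3q2");
</script>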

So far, none of the maps' "sorry we have no imagery" text has been indexed.

Hope it helps

Twana answered 24/10, 2013 at 14:26 Comment(2)
Great answer. What I don't understand is the "same ID" part. Isn't that going to cause problems with HTML validation? Or is it supposed to have the same ID precisely because it is inside a noscript tag, so the correspondence is made through the ID? The other thing is: why should it be random?Drakensberg
My guess is that since Google indexes noscript tags, it will see a "new" element with an ID; if it tried to build a map on an img tag, it would fail. Since Googlebot (presumably) still doesn't do full JavaScript evaluation (only partial), using random IDs makes that partial evaluation harder. You could try with a non-random ID as wellTwana

Perhaps you can add some more specific meta tags, such as:

<meta name="geo.region" content="US-WA" />



<meta name="geo.placename" content="Snohomish" />



<meta name="geo.position" content="-57.954900;-34.917000" />

Also try adding the meta description Matt Rowles suggested, and have a look at the word filters in Google Webmasters.

Fortis answered 16/7, 2013 at 20:42 Comment(2)
I can't find the "word filters" option in Google Webmasters, but it sounds like a possible solution!Drakensberg
Perhaps have a look at this page from Google; it covers the webmaster quality guidelines: support.google.com/webmasters/answer/35769?hl=enFortis

This answer won't help you remove the words from the pages that were already crawled, but it might prevent them from being added after the next crawl.

Your problem might be related to the crawler not being able to load a valid map. It's not exactly clear why it can't; the map provider might be blocking Google's bots.

Anyway, if it's not too hard, I'd have a look here:

https://support.google.com/webmasters/answer/1061943?hl=en

Build a list of the user agents described there. I'll use 'Googlebot' as an example, but you should check against every user agent you want to block:

if (navigator.userAgent.indexOf('Googlebot') === -1) {
   // not a crawler: load the map and other stuff
} else {
   // show a picture where the map should be, or do nothing
}

Googlebot executes JS, so this should prevent the errors in case Googlebot can't load the map.

One thing you could do is change your browser's user agent to 'Googlebot' and load your page. If the map provider blocks any browser with this user agent, you should see exactly what Googlebot sees. The other possibility is that Googlebot has a timeout to avoid loading too much data, so it never loads the images.

Adding guards might help prevent Googlebot from actually loading the map, if the problem really is in the map.

Androw answered 17/7, 2013 at 2:23 Comment(2)
Could this lower the PageRank, because it amounts to hiding (cheating) content based on the userAgent?Drakensberg
Well, it's hard to say. I'd expect Googlebot to rank a page by its content, not by how the page gets loaded. If the content that ends up loaded is relevant, it shouldn't change the PageRank. An image or a map shouldn't have much impact. I don't expect Googlebot to run images through OCR to index their content, but who knows. Anyway, what really matters in my opinion is the content once the page is loaded; how it gets loaded is irrelevant.Program

1) Perhaps setting your meta description inside your <head> tags will supersede this:

<meta name="description" content="This is an example of a meta description. This will often show up in search results.">

2) If the meta tag doesn't work, this is possibly because the very first thing rendered in the <body> (or rather, attempted, by the looks of your screenshot) is a Maps display, before any other content is loaded.

For example, if you place a <div> or <p> tag with some introductory content about your website before the map in your <body>, you may avoid this. However, I am not 100% sure about it; you will have to test and see the results (keep us posted).

If you plan on doing this and want a) the Google crawler to still pick it up and b) to hide the actual block of words from viewers (style="display: none;" or style="position: absolute; left: -9999px;"), do so at your own discretion (more info here).
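For illustration, this is roughly what that ordering could look like; the description and introduction wording are invented examples, not taken from the actual site:

<head>
  <meta name="description"
        content="Interactive and static maps of La Plata, Argentina.">
</head>
<body>
  <!-- Real, indexable content comes before the map -->
  <p>Welcome! This page shows points of interest around La Plata.</p>

  <!-- The map container is rendered only after the introduction -->
  <div id="mapOuter"></div>
</body>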

Minutely answered 10/7, 2013 at 4:26 Comment(2)
1) I tried this to "override" the keywords, but it doesn't seem to work. I'm thinking the problem arises when the JavaScript gets executed (because the badly generated maps are somehow parsed by the crawler). Please note that the strange words are not part of the written content of the site.Drakensberg
2) I realised that I have another site with this very same problem, and it has more content before the map rendering. It's as if these words are shown in place of all the map tiles, so they get repeated so many times that they end up more relevant than the actual content of the page, no matter whether the real content comes firstDrakensberg

Did you try adding spider meta tags? They really help a lot; try this out in the head section:

<meta name="robots" content="index, follow">

The spider will then index your whole website: not only the first page of your website but all your other pages as well.

Also try to make your description more unique; that is much more powerful, but don't overdo the keywords.

thanks

Fortis answered 17/7, 2013 at 17:38 Comment(0)

"Solution" applied:

I had a different map on each page of my site. I ended up converting all the maps to images and keeping only one interactive map where I really needed user interaction with coordinates and mapping features. The solution I used led me to create and open-source osm-static-maps. Hope it helps somebody!

The site got several improvements:

  • Got rid of these awkward words in Google Webmasters.
  • More relevant SEO: static images with the "alt" HTML img attribute instead of an "unindexable" JS map.
  • Faster page loading (got rid of all mapping libraries and tile loading).
  • Faster JS performance (less JS for the client to process).
  • Improved user experience: scrolling the page used to zoom the map instead of actually scrolling (you might think this could be solved by disabling scroll-to-zoom on the map, but then users who expect scrolling to zoom the map are surprised; both ways are OK and wrong at the same time).

On the downside, I found:

  • Less user interactivity (boring page).
  • Less context on the map (less informative map).

These two things could be "fixed" by loading the map when the user clicks the map image. The downside is that if the user clicks the map image unintentionally, the map load can be seen as unexpected behaviour.

Edit2

I made an open-source project out of this. Check it out: https://github.com/jperelli/osm-static-maps

Drakensberg answered 8/7, 2015 at 19:20 Comment(0)
