I’m porting an iOS app to Android, and using Google Maps Android API v2.
The application needs to draw a heatmap overlay onto the map.
So far, it looks like the best option is to use a TileOverlay and implement a custom TileProvider. In the getTile method, I am given x, y, and zoom, and need to return a bitmap in the form of a Tile. So far, so good.
I have an array of heatmap items, each with a lat/long, that I will use to draw radial gradients onto the bitmap. I am having trouble with the following two tasks:
- How do I determine if the tile represented by x, y, and zoom contains the lat/long of the heatmap item?
- How do I translate the lat/long of a heatmap item to x/y pixel coordinates within the tile's bitmap?
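For context, both tasks come down to the Web Mercator projection that Google's tiles use: project the lat/long to fractional "world" coordinates, then scale by the number of tiles at the zoom level. Below is a minimal sketch of that idea; the class and method names are mine (not from the Maps API), and I'm assuming the standard 256-pixel tile size:

```java
public final class HeatmapTileMath {
    static final int TILE_SIZE = 256; // standard Google tile size (assumption)

    // Fractional world coordinate in [0, 1) for a longitude (Web Mercator).
    static double worldX(double lng) {
        return (lng + 180.0) / 360.0;
    }

    // Fractional world coordinate in [0, 1) for a latitude (Web Mercator).
    static double worldY(double lat) {
        double sin = Math.sin(Math.toRadians(lat));
        return 0.5 - Math.log((1.0 + sin) / (1.0 - sin)) / (4.0 * Math.PI);
    }

    // Task 1: does the tile (x, y, zoom) contain this lat/long?
    public static boolean tileContains(int x, int y, int zoom, double lat, double lng) {
        int tiles = 1 << zoom;
        return (int) Math.floor(worldX(lng) * tiles) == x
            && (int) Math.floor(worldY(lat) * tiles) == y;
    }

    // Task 2: pixel position of the lat/long inside that tile's bitmap.
    public static int[] pixelInTile(int x, int y, int zoom, double lat, double lng) {
        double worldPx = (double) TILE_SIZE * (1 << zoom);
        int px = (int) (worldX(lng) * worldPx) - x * TILE_SIZE;
        int py = (int) (worldY(lat) * worldPx) - y * TILE_SIZE;
        return new int[] { px, py }; // on-device you'd likely use android.graphics.Point
    }
}
```

One caveat: a radial gradient whose radius crosses a tile edge also needs to be drawn on the neighbouring tiles, so a containment test padded by the gradient radius (in pixels) would avoid visible seams.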
Thank you for your help!
UPDATE
Thanks to MaciejGórski's answer below and marcin's implementation, I was able to get the first half of my question answered, but I still need help with the second part. To clarify, I need a function that returns the x/y coordinates of the tile for a specified lat/long. I've tried reversing the calculations from MaciejGórski's and marcin's answers with no luck:
public static Point fromLatLng(LatLng latlng, int zoom) {
    int noTiles = (1 << zoom);
    double longitudeSpan = 360.0 / noTiles;
    double mercator = fromLatitude(latlng.latitude);
    int y = ((int)(mercator / 360 * noTiles)) + 180;
    int x = (int)(latlng.longitude / longitudeSpan) + 180;
    return new Point(x, y);
}
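For comparison, my understanding is that the standard forward Web Mercator formulas give the tile indices directly, something like the sketch below (an int array stands in for Point so it runs outside Android, and the usual 256-pixel tile grid is assumed). If this matches marcin's fromLatitude convention, then the offset belongs on the longitude before dividing, not on the resulting tile index:

```java
public final class TileFromLatLng {
    // Tile indices {x, y} containing a lat/long at the given zoom (Web Mercator).
    public static int[] fromLatLng(double lat, double lng, int zoom) {
        int noTiles = 1 << zoom;
        // Shift longitude into [0, 360) first, then scale to tile units.
        int x = (int) Math.floor((lng + 180.0) / 360.0 * noTiles);
        // Mercator y as a fraction of world height in [0, 1), then scale.
        double sin = Math.sin(Math.toRadians(lat));
        double mercator = 0.5 - Math.log((1.0 + sin) / (1.0 - sin)) / (4.0 * Math.PI);
        int y = (int) Math.floor(mercator * noTiles);
        return new int[] { x, y };
    }
}
```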
Any help is appreciated!