I have a problem that has been holding me back on a project for some time now. I'm basically trying to trace a polygon using x, y points generated by a script I've written: lat, lon are the GPS coordinates of the polygon's center, and I'm looking for the vertices of the surrounding polygon.
Here is the relevant part of my Python code:

    from math import asin, atan2, sin, cos

    def getcords(lat, lon, dr, bearing):
        lat2 = asin(sin(lat)*cos(dr) + cos(lat)*sin(dr)*cos(bearing))
        lon2 = lon + atan2(sin(bearing)*sin(dr)*cos(lat), cos(dr) - sin(lat)*sin(lat2))
        return [lat2, lon2]
My input is as follows:
- lat, lon - given in decimal degrees.
- dr - the angular distance, computed by dividing the distance in miles by the Earth's radius (3958.82 miles).
- bearing - between 0 and 360 degrees.
However, for the input:

    getcords(42.189275, -76.85823, 0.5/3958.82, 30)

I get the output [-1.3485899508698462, -76.8576637627568], whereas [42.2516666666667, -76.8097222222222] is the correct answer.
Can anybody see what I'm doing wrong?
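One thing I noticed while debugging, in case it helps: Python's math trig functions work in radians, but I'm passing lat, lon, and bearing in decimal degrees. Here is a sketch of the same formula with a degree-to-radian conversion on the way in and a radian-to-degree conversion on the way out (the conversion is my guess at the fix, not something I've confirmed against the reference answer above):

```python
from math import asin, atan2, sin, cos, radians, degrees

def getcords_rad(lat, lon, dr, bearing):
    """Same formula as getcords, but converts the degree inputs to
    radians first and converts the result back to decimal degrees.
    dr is already an angular distance (distance / Earth radius),
    so it is left as-is."""
    lat, lon, bearing = radians(lat), radians(lon), radians(bearing)
    lat2 = asin(sin(lat) * cos(dr) + cos(lat) * sin(dr) * cos(bearing))
    lon2 = lon + atan2(sin(bearing) * sin(dr) * cos(lat),
                       cos(dr) - sin(lat) * sin(lat2))
    return [degrees(lat2), degrees(lon2)]

result = getcords_rad(42.189275, -76.85823, 0.5 / 3958.82, 30)
print(result)
```

With the conversion in place the destination at least lands in the right neighborhood of the starting point for a half-mile step (latitude just above 42.19, longitude near -76.85), instead of the wildly wrong -1.34 latitude I currently get.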