How would you detect touches only on non-transparent pixels of a UIImageView, efficiently?
Consider an image like the one below, displayed with a UIImageView. The goal is to make the gesture recognisers respond only when the touch happens in the non-transparent (black, in this case) area of the image.
Ideas
- Override `hitTest:withEvent:` or `pointInside:withEvent:`, although this approach might be terribly inefficient, as these methods get called many times during a touch event.
- Checking whether a single pixel is transparent might produce unexpected results, since fingers are bigger than one pixel. Checking a circular area of pixels around the hit point, or trying to find a transparent path towards an edge, might work better. (A sketch combining both ideas follows this list.)
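A minimal sketch of the first two ideas in Swift. The class name `AlphaHitTestImageView`, the `alphaThreshold`, and the sampling radius are made up for the example, and it assumes the default `.scaleToFill` content mode so that view coordinates map to image pixels with a plain scale; it overrides `point(inside:with:)` and reads the alpha of the touched pixel by redrawing just that pixel into a 1×1 bitmap context, which is one common way to sample a single pixel cheaply:

```swift
import UIKit

/// Sketch: a UIImageView subclass that ignores touches on transparent pixels.
/// Assumes .scaleToFill (view coordinates map to image pixels by a plain scale).
class AlphaHitTestImageView: UIImageView {

    /// Pixels with alpha below this value are treated as "empty" (tuning value).
    var alphaThreshold: CGFloat = 0.1

    override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        guard super.point(inside: point, with: event) else { return false }
        // Fingers are bigger than a pixel, so accept the touch if any sample
        // on a small circle around it is opaque enough.
        return isOpaque(around: point)
    }

    /// Samples the touch point plus a few points on a small surrounding circle.
    private func isOpaque(around point: CGPoint, radius: CGFloat = 8, samples: Int = 8) -> Bool {
        if alpha(atViewPoint: point) >= alphaThreshold { return true }
        for i in 0..<samples {
            let angle = Double(i) / Double(samples) * 2 * Double.pi
            let candidate = CGPoint(x: point.x + radius * CGFloat(cos(angle)),
                                    y: point.y + radius * CGFloat(sin(angle)))
            if alpha(atViewPoint: candidate) >= alphaThreshold { return true }
        }
        return false
    }

    /// Renders only the pixel under `point` into a 1x1 RGBA bitmap and returns
    /// its alpha (0...1). Drawing a single pixel keeps each call cheap, which
    /// matters because point(inside:with:) fires many times per gesture.
    private func alpha(atViewPoint point: CGPoint) -> CGFloat {
        guard let cgImage = image?.cgImage,
              bounds.width > 0, bounds.height > 0 else { return 0 }

        let width = CGFloat(cgImage.width)
        let height = CGFloat(cgImage.height)

        // Map the view point to image pixel coordinates (.scaleToFill assumed).
        let x = point.x * width / bounds.width
        let y = point.y * height / bounds.height

        var pixel: [UInt8] = [0, 0, 0, 0]
        pixel.withUnsafeMutableBytes { buffer in
            guard let context = CGContext(data: buffer.baseAddress,
                                          width: 1, height: 1,
                                          bitsPerComponent: 8, bytesPerRow: 4,
                                          space: CGColorSpaceCreateDeviceRGB(),
                                          bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
            else { return }

            // Shift the image so the pixel of interest lands on the 1x1 context,
            // converting from UIKit's top-left origin to Core Graphics' bottom-left.
            context.setBlendMode(.copy)
            context.translateBy(x: -x, y: y - height)
            context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
        }
        return CGFloat(pixel[3]) / 255.0
    }
}
```

Note that points sampled outside the image simply read back as alpha 0, so the circular check degrades gracefully near the edges of the view.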
Bonus
- It'd be nice to differentiate between outer and inner transparent pixels of an image. In the example, the transparent pixels inside the zero should also be considered valid.
- What happens if the image has a transform applied? (The usage sketch after this list touches on that.)
- Can the image processing be hardware accelerated?
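On the transform bonus question: both `point(inside:with:)` and `UIGestureRecognizer.location(in:)` work in the image view's own bounds coordinate space, so UIKit has already undone the view's transform by the time the alpha lookup runs. A short usage sketch under the same assumptions as above; the view controller, frame values, and the "zero" asset name are placeholders:

```swift
import UIKit

final class DemoViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        // "zero" is a placeholder asset name for the example image.
        let imageView = AlphaHitTestImageView(image: UIImage(named: "zero"))
        imageView.isUserInteractionEnabled = true   // UIImageView ignores touches by default
        imageView.frame = CGRect(x: 40, y: 80, width: 240, height: 240)
        imageView.transform = CGAffineTransform(rotationAngle: .pi / 8)
        view.addSubview(imageView)

        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap))
        imageView.addGestureRecognizer(tap)
    }

    @objc private func handleTap(_ recognizer: UITapGestureRecognizer) {
        // Fires only on sufficiently opaque pixels: transparent areas never pass
        // point(inside:with:), so the recognizer never sees those touches.
        // location(in:) is expressed in the image view's own bounds space and is
        // therefore unaffected by the rotation applied above.
        print("Tapped the shape at \(recognizer.location(in: recognizer.view!))")
    }
}
```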