I have placed an image (UIImageView) on the navigation bar. Now I want to detect touch events on it and handle them. How can I do that?
In practical terms, don't do that.
Instead, add a button with the Custom style (no button graphics unless you specify images) over the UIImageView, then attach whatever action methods you want called to that button.
You can use that technique in many cases where you really want some area of the screen to act as a button, instead of messing with the touch-handling machinery yourself.
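A minimal sketch of that overlay-button approach, assuming a coverImageView that is already on screen and an imageTapped: action method (both names are placeholders):

// e.g. in viewDidLoad, after coverImageView has been added to the view hierarchy
UIButton *overlayButton = [UIButton buttonWithType:UIButtonTypeCustom];
overlayButton.frame = coverImageView.frame;            // same area as the image
overlayButton.backgroundColor = [UIColor clearColor];  // invisible; the image shows through
[overlayButton addTarget:self action:@selector(imageTapped:) forControlEvents:UIControlEventTouchUpInside];
[coverImageView.superview addSubview:overlayButton];

- (void)imageTapped:(UIButton *)sender {
    // Handle the tap here
}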
A UIImageView is derived from UIView, which is derived from UIResponder, so it's ready to handle touch events. You'll want to provide the touchesBegan, touchesMoved, and touchesEnded methods, and they'll get called if the user taps the image. If all you want is a tap event, it's easier to just use a custom button with the image set as the button image. But if you want finer-grained control over taps, moves, etc., this is the way to go.
You'll also want to look at a few more things:
- Override canBecomeFirstResponder and return YES to indicate that the view can become the focus of touch events (the default is NO).
- Set the userInteractionEnabled property to YES. The default for UIViews is YES, but for UIImageViews it is NO, so you have to turn it on explicitly.
- If you want to respond to multi-touch events (i.e. pinch, zoom, etc.), set multipleTouchEnabled to YES.

A minimal subclass putting these pieces together is sketched below.
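This is only a sketch, assuming you just want to log the touch phases; the class name TouchableImageView and the NSLog calls are placeholders for your own handling.

// TouchableImageView.m -- a hypothetical UIImageView subclass that handles raw touches
@interface TouchableImageView : UIImageView
@end

@implementation TouchableImageView

- (instancetype)initWithFrame:(CGRect)frame {
    if ((self = [super initWithFrame:frame])) {
        self.userInteractionEnabled = YES;   // UIImageView defaults to NO
        self.multipleTouchEnabled = YES;     // only needed for pinch/zoom etc.
    }
    return self;
}

- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"touch began at %@", NSStringFromCGPoint([[touches anyObject] locationInView:self]));
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"touch moved");
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"touch ended");
}

@end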
(From a comment: I also had to set userInteractionEnabled to YES on my parent view. See #5887805 – Evslin)

To add a touch event to a UIImageView, use the following in your .m file:
UITapGestureRecognizer *newTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(myTapMethod)];
[myImageView setUserInteractionEnabled:YES];
[myImageView addGestureRecognizer:newTap];
- (void)myTapMethod {
    // Handle the image tap here
}
You can also add a UIGestureRecognizer. It does not require you to add an additional element to your view hierarchy, but still provides you with all the nicely written code for handling touch events with a fairly simple interface:
UISwipeGestureRecognizer *swipeRight = [[UISwipeGestureRecognizer alloc]
initWithTarget:self action:@selector(handleSwipe:)];
swipeRight.direction = UISwipeGestureRecognizerDirectionRight;
[imgView_ addGestureRecognizer:swipeRight];
[swipeRight release];
UISwipeGestureRecognizer *swipeLeft = [[UISwipeGestureRecognizer alloc]
initWithTarget:self action:@selector(handleSwipe:)];
swipeLeft.direction = UISwipeGestureRecognizerDirectionLeft;
[imgView_ addGestureRecognizer:swipeLeft];
[swipeLeft release];
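The handleSwipe: method isn't shown above; a minimal handler might look like the sketch below (the logging is just a placeholder). Note that the release calls in the snippet imply manual reference counting; under ARC you would simply omit them.

- (void)handleSwipe:(UISwipeGestureRecognizer *)recognizer {
    if (recognizer.direction == UISwipeGestureRecognizerDirectionRight) {
        NSLog(@"Swiped right on %@", recognizer.view);
    } else {
        NSLog(@"Swiped left on %@", recognizer.view);
    }
}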
I've been reading different threads for the past few hours trying to find a solution to my problem, to no avail. I see that many developers share this problem, and I think people here know about it. I have multiple images inside a UIScrollView and am trying to get tap events on them.
I am not getting any events from a UIImageView, but I do get an event from a similar UILabel with very similar parameters set on it. This is under iOS 5.1.
I have already done the following:
- Set setUserInteractionEnabled to YES for both the UIImageView and its parent view.
- Set setMultipleTouchEnabled to YES for the UIImageView.
- Tried subclassing UIImageView, which didn't help.
Attaching some code below. In it I initialize both a UIImageView and a UILabel; the label works fine in terms of firing events. I tried to keep irrelevant code out.
UIImageView *single_view = [[UIImageView alloc]initWithFrame:CGRectMake(200, 200, 100, 100)];
single_view.image = img;
single_view.layer.zPosition = 4;
UITapGestureRecognizer *singleTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(singleTapGestureCaptured:)];
[single_view addGestureRecognizer:singleTap];
[single_view setMultipleTouchEnabled:YES];
[single_view setUserInteractionEnabled:YES];
[self.myScrollView addSubview:single_view];
self.myScrollView.userInteractionEnabled = YES;
UILabel *testLabel = [[UILabel alloc] initWithFrame:CGRectMake(100, 100, 100, 100)];
testLabel.backgroundColor = [UIColor redColor];
[self.myScrollView addSubview:testLabel];
[testLabel addGestureRecognizer:singleTap];
[testLabel setMultipleTouchEnabled:YES];
[testLabel setUserInteractionEnabled:YES];
testLabel.layer.zPosition = 4;
And the method which handles the event:
- (void)singleTapGestureCaptured:(UITapGestureRecognizer *)gesture
{
UIView *tappedView = [gesture.view hitTest:[gesture locationInView:gesture.view] withEvent:nil];
NSLog(@"Touch event on view: %@", [tappedView class]);
}
As mentioned, the tap on the label is received.
Instead of making a touchable UIImageView and then placing it on the navbar, you should just create a UIBarButtonItem, which you can make out of a UIImageView.
First make the image view:
UIImageView *yourImageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"nameOfYourImage.png"]];
Then make the bar button item out of your image view:
UIBarButtonItem *yourBarButtonItem = [[UIBarButtonItem alloc] initWithCustomView:yourImageView];
Then add the bar button item to your navigation bar:
self.navigationItem.rightBarButtonItem = yourBarButtonItem;
Remember that this code goes into the view controller that sits in the navigation controller's view-controller array. So basically, this "touchable image-looking bar button item" will only appear in the navigation bar while this view controller is being shown. When you push another view controller, this navigation bar button item will disappear.
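One caveat, and this is an assumption rather than something the answer spells out: a plain UIImageView has userInteractionEnabled set to NO, so the custom view may not react to taps by itself. A minimal sketch of wiring it up with a tap recognizer (the barImageTapped: selector is a placeholder):

yourImageView.userInteractionEnabled = YES; // UIImageView defaults to NO
UITapGestureRecognizer *barTap = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(barImageTapped:)];
[yourImageView addGestureRecognizer:barTap];

- (void)barImageTapped:(UITapGestureRecognizer *)gesture {
    // Handle the tap on the bar button item's custom view
}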
You might want to override the touchesBegan:withEvent: method of the UIView (or subclass) that contains your UIImageView subview.
Within this method, test whether any of the UITouch touches fall inside the bounds of the UIImageView instance (let's say it is called imageView).
That is, does the CGPoint returned by [touch locationInView:imageView] fall within the CGRect returned by [imageView bounds]? Look into the function CGRectContainsPoint to run this test.
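A minimal sketch of that approach, assuming the containing view subclass exposes the image view through an imageView property (that name is a placeholder):

// In the UIView subclass that has the image view as a subview
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    for (UITouch *touch in touches) {
        CGPoint point = [touch locationInView:self.imageView];
        if (CGRectContainsPoint(self.imageView.bounds, point)) {
            NSLog(@"Image view was touched");
            return;
        }
    }
    [super touchesBegan:touches withEvent:event];
}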
First, you can place a UIButton and then either set a background image for the button, or place a UIImageView over the button.
Or:
You can add a tap gesture recognizer to the UIImageView so that you get the tap action when the UIImageView is tapped.
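A sketch of the button-with-image variant mentioned first, where the image name and the imageButtonPressed: selector are placeholders:

UIButton *imageButton = [UIButton buttonWithType:UIButtonTypeCustom];
imageButton.frame = CGRectMake(0, 0, 100, 100);
[imageButton setImage:[UIImage imageNamed:@"yourImage.png"] forState:UIControlStateNormal];
[imageButton addTarget:self action:@selector(imageButtonPressed:) forControlEvents:UIControlEventTouchUpInside];
[self.view addSubview:imageButton];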
For those of you looking for a Swift 4 version of this answer, you can use the following to detect a touch event on a UIImageView.
let gestureRecognizer: UITapGestureRecognizer = UITapGestureRecognizer(target: self, action: #selector(imageViewTapped))
imageView.addGestureRecognizer(gestureRecognizer)
imageView.isUserInteractionEnabled = true
You will then need to define your selector as follows:
@objc func imageViewTapped() {
// Image has been tapped
}
Add a gesture recognizer to that view. If you then add an image view into that view, the gesture will be detected on the image as well. You could also try the touch-event methods (touchesBegan and friends); in that case the touch should be detected too.