I'm developing an augmented-reality iOS app. The user sets their distance from a wall through a UISlider, then selects a picture from the gallery and previews how it would look on the wall. The app should scale the UIImage according to the user's distance from the wall, and the user can drag the image around to see how it looks.
I want to attach a UILongPressGestureRecognizer to the added UIImageView so that the image can be deleted (i.e. tap, hold, and delete).
This is the code I use to load an image that is already bundled with the app (I'll add the gallery import later):
self.myImage = [UIImage imageNamed:@"myimage.png"];
self.myImageView = [[UIImageView alloc] initWithImage:self.myImage];
self.myImageView.userInteractionEnabled = YES;
self.myImageView.contentMode = UIViewContentModeScaleAspectFit;
CGRect cellRectangle = CGRectMake(0, 0, self.myImage.size.width/5, self.myImage.size.height/5);
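For completeness, this is roughly how the image view ends up on screen (a sketch, assuming customCam is its superview and cellRectangle is the intended frame):

```objc
// Sketch (assumption): give the image view the scaled frame and
// place it on top of the camera view so it can be dragged around.
self.myImageView.frame = cellRectangle;
[self.customCam addSubview:self.myImageView];
```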
And this is the code for the UILongPressGestureRecognizer:
self.lpgr = [[UILongPressGestureRecognizer alloc] initWithTarget:self
                                                          action:@selector(handleLongPressGestures:)];
self.lpgr.minimumPressDuration = 2.0; // seconds
self.lpgr.accessibilityFrame = cellRectangle;
[self.customCam addGestureRecognizer:self.lpgr];
Where customCam is the view on which the camera is shown for AR.
- (void)handleLongPressGestures:(UILongPressGestureRecognizer *)sender
{
    if ([sender isEqual:self.lpgr]) {
        if (sender.state == UIGestureRecognizerStateBegan) {
            CGPoint p = [self.lpgr locationInView:self.myImageView];
            NSLog(@"TapLong Run on points %@", NSStringFromCGPoint(p));
        }
    }
}
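One workaround I'm considering is to keep the recognizer on customCam but hit-test the press location against the image view myself (untested sketch, using the names from the code above):

```objc
- (void)handleLongPressGestures:(UILongPressGestureRecognizer *)sender
{
    if (sender.state == UIGestureRecognizerStateBegan) {
        // Convert the press location into the image view's coordinate space
        // and check whether it actually falls inside the image view.
        CGPoint p = [sender locationInView:self.myImageView];
        if ([self.myImageView pointInside:p withEvent:nil]) {
            // Long press began inside myImageView: delete it here.
            [self.myImageView removeFromSuperview];
        }
    }
}
```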
The problem with this code is that the UILongPressGestureRecognizer responds anywhere in the customCam view. How can I bound it so that it only fires within myImageView?
I've also tried attaching the recognizer to the image view directly:
[self.myImageView addGestureRecognizer:self.lpgr];
but this didn't work, even though I've also set self.myImageView.userInteractionEnabled = YES;
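From what I understand, a gesture recognizer attached to a view only receives touches that land inside that view's frame, so I'd expect the following order to restrict the long press to the image (sketch based on the code above, untested):

```objc
// Sketch: the recognizer fires only inside myImageView if the view
// is in the hierarchy, has a nonzero frame, and accepts touches.
self.myImageView.frame = cellRectangle;            // nonzero frame
self.myImageView.userInteractionEnabled = YES;     // NO by default on UIImageView
[self.customCam addSubview:self.myImageView];      // must be in the view hierarchy
[self.myImageView addGestureRecognizer:self.lpgr]; // attach to the image view itself
```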