Ever since the beginning of iOS, handling touch events has been an essential skill for every iOS developer. Any published app has to handle various touch events (taps, swipes, pinches, …), and that still hasn't changed. Although this may seem like advanced programming, Apple has made it fairly easy to deal with all of these different types of touches.
Before talking about handling touches, I'll give a brief summary of how iOS knows which touch event to pass to which view.
A responder is an object that can respond to events and handle them. All responder objects are instances of classes that ultimately inherit from UIResponder, which declares the interface for event handling. Most visible elements of an app are responders: views, controls, and so on.
Now that we know what responders are, we can take a look at how iOS uses these responder objects for handling user interaction.
When a user interacts with the touch screen of an iOS device, the operating system generates an event according to the type of touch (touch began, touch moved, touch ended, …). The view that was touched is asked whether it can handle this kind of touch. If it can't, the event is forwarded to the next responder. For example, when a button is tapped and the button can't handle the event, it is forwarded to the button's next responder, usually its superview. If that view can't handle it either, the event typically moves on to the view controller, and so on up the responder chain.
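The forwarding described above follows each responder's `next` property. As a minimal sketch, the hypothetical helper below walks that chain from any responder and prints each link, making the path an unhandled event would travel visible (`printResponderChain` is my own name, not a UIKit API):

```swift
import UIKit

extension UIResponder {
    // Hypothetical debugging helper: walk the responder chain starting
    // from this responder and print every link in order.
    func printResponderChain() {
        var responder: UIResponder? = self
        while let current = responder {
            print(type(of: current))
            responder = current.next  // nil at the end of the chain
        }
    }
}
```

Calling `someButton.printResponderChain()` on a button inside a view controller would typically print the button, its superview(s), the view controller, the window, and finally the application object.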
Recognizing gestures through UIGestureRecognizer
UIGestureRecognizer is the abstract base class that all concrete gesture recognizers inherit from.
The UIKit framework predefines six gesture recognizers for the most common gestures:
- UITapGestureRecognizer: detects one or more taps
- UIPinchGestureRecognizer: pinching, typically for zooming a view
- UIPanGestureRecognizer: panning or dragging
- UISwipeGestureRecognizer: swiping in any direction
- UIRotationGestureRecognizer: rotating
- UILongPressGestureRecognizer: long press
Each of these recognizers also exposes properties that can be tweaked to a developer's needs, for example requiring a double tap on a view instead of a single tap.
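As a small sketch of that tweak, the view controller below (class and method names are my own, for illustration) attaches a UITapGestureRecognizer configured to fire on a double tap via its `numberOfTapsRequired` property:

```swift
import UIKit

class DemoViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // Tweak the recognizer so it fires on a double tap instead of a single tap.
        let doubleTap = UITapGestureRecognizer(target: self,
                                               action: #selector(handleDoubleTap(_:)))
        doubleTap.numberOfTapsRequired = 2
        view.addGestureRecognizer(doubleTap)
    }

    @objc func handleDoubleTap(_ recognizer: UITapGestureRecognizer) {
        print("Double tap at \(recognizer.location(in: view))")
    }
}
```

The other recognizers are attached the same way; only the subclass and its configuration properties change.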
Creating a custom gesture recognizer is also possible but I won’t be showing it in this post.
Touch notification methods
When you're not using gesture recognizers, the touch notification methods are used instead. By subclassing a view, you can override these four methods and start handling touches yourself.
Touch events call these methods on the first responder object. Each method is invoked at a specific moment in the life of a touch:
- touchesBegan(_:with:): called when the user first touches the screen with one or more fingers.
- touchesMoved(_:with:): called when the user moves fingers across the screen.
- touchesEnded(_:with:): called when the user ends a touch by lifting a finger from the screen.
- touchesCancelled(_:with:): called when a system event cancels a touch (for example a low memory warning).
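A minimal sketch of this approach, assuming a custom view (the class name is my own) that simply logs each phase of a touch by overriding the four methods:

```swift
import UIKit

// A UIView subclass that overrides the four touch notification methods
// and logs the location of the first touch in each phase.
class TouchLoggingView: UIView {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        if let point = touches.first?.location(in: self) {
            print("Touch began at \(point)")
        }
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        if let point = touches.first?.location(in: self) {
            print("Touch moved to \(point)")
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        print("Touch ended")
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        print("Touch cancelled by the system")
    }
}
```

Because the view itself handles the touches, nothing is forwarded up the responder chain unless you call the superclass implementations.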
As an example, I added a demo project demonstrating all of these event-handling techniques.