Touch Events

We now know how to display text and images using the UILabel and UIImageView components respectively. User experience, however, is not limited to displaying content; it also includes interacting with the UI components.
Every system has its own way of handling interaction with UI components; in iOS this is governed by the "Responder Chain" design pattern. When you touch the screen, a UITouch object is created. This UITouch object registers the first responsive view. The registration of a view follows these four golden rules:
1. The view must have user interaction enabled.
2. The view must be visible.
3. The parent view must have user interaction enabled. (The window is made active first, as it is the parent view of the entire app, and we get the callback from applicationDidBecomeActive.)
4. The view must be within the boundary of its parent view. (Thus the window is always full screen.)
The responder chain gives preference first to the topmost view in the view hierarchy stack; the last view to get preference is the window.
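The four rules above mirror what UIKit's hit-testing does internally. As a rough sketch only (a custom UIView subclass, written in the same pre-Swift-2 syntax the rest of this chapter uses; UIKit's real implementation also accounts for transforms and other details), the logic looks like this:

```swift
import UIKit

class HitTestingView: UIView {

    override func hitTest(point: CGPoint, withEvent event: UIEvent!) -> UIView! {
        // Rules 1 and 2: the view must be interaction-enabled and visible.
        if !userInteractionEnabled || hidden || alpha < 0.01 {
            return nil
        }
        // Rule 4: the point must fall within this view's bounds.
        if !pointInside(point, withEvent: event) {
            return nil
        }
        // Topmost views get preference, so walk the subviews back to front.
        for subview in reverse(self.subviews as [UIView]) {
            let converted = subview.convertPoint(point, fromView: self)
            if let hit = subview.hitTest(converted, withEvent: event) {
                return hit
            }
        }
        // No subview claimed the touch, so this view is the hit.
        return self
    }
}
```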
As we have seen, AppDelegate, UIViewController, and UIView all inherit from the UIResponder class, which handles touch events. Your ViewController class, a custom UIViewController containing one UILabel object and two UIImageView objects, therefore also inherits from UIResponder. Thus you can override the touch event handlers in the ViewController.swift file.
We need to override the following three methods to handle interactivity:
func touchesBegan(touches: NSSet!, withEvent event: UIEvent!)
func touchesMoved(touches: NSSet!, withEvent event: UIEvent!)
func touchesEnded(touches: NSSet!, withEvent event: UIEvent!)

When you run the program and press the mouse button on the iOS simulator (in simple words, when you touch the screen), you will see touchesBegan in the console output. When you release the mouse button (that is, lift your finger), you will see touchesEnded in the console. From this we can conclude that touchesBegan is called when you touch the screen and touchesEnded is called when you lift your finger from the screen.
But when you touch the screen, drag your finger across it, and finally lift your finger, the sequence of method calls is:
touchesBegan
touchesMoved (called n times)
touchesEnded
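To observe this lifecycle yourself, you can override all three handlers in the view controller and log each call (a minimal sketch using the same pre-Swift-2 signatures shown above):

```swift
override func touchesBegan(touches: NSSet!, withEvent event: UIEvent!) {
    println("touchesBegan")
}

override func touchesMoved(touches: NSSet!, withEvent event: UIEvent!) {
    // May fire many times during a drag, or not at all for a simple tap.
    println("touchesMoved")
}

override func touchesEnded(touches: NSSet!, withEvent event: UIEvent!) {
    println("touchesEnded")
}
```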

Thus for every touchesBegan there will be a matching touchesEnded; touchesMoved, however, is not necessarily called. Let's simulate a toggle effect: when we touch the pumpkin on the screen, the label should disappear, and when we lift our finger, the label should become visible again. Every UI component has a hidden property, which is false by default; thus when we add a view to its parent view, the view is shown on the screen.
So when the touch lands on the pumpkin object, the label's hidden property should be set to true in touchesBegan, and back to false in touchesEnded.
override func touchesBegan(touches: NSSet!, withEvent event: UIEvent!) {
    println("touchesBegan")
}

Notice that these callbacks pass back an NSSet of touches. Because iOS devices are multi-touch enabled, the user may have touched the screen with more than one finger: iPhones and iPods can detect a maximum of 5 fingers, while iPads can detect a maximum of 10. NSSet is an unordered collection type that contains only distinct objects.
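As a sketch, you can inspect every touch in the set by iterating over its allObjects array; locationInView(_:) gives the touch's position in a given view's coordinate system (the casts reflect the pre-Swift-2 style used throughout this chapter):

```swift
override func touchesBegan(touches: NSSet!, withEvent event: UIEvent!) {
    for obj in touches.allObjects {
        let touch = obj as UITouch
        let point = touch.locationInView(self.view)
        println("finger at (\(point.x), \(point.y)) on view: \(touch.view)")
    }
}
```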

First we extract a single touch object from the collection, then we compare the view property of the touch object with the pumpkin object. If they are the same, we set the hidden property of the first label to true. The same code is written in touchesEnded, except that the hidden property is set to false.
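Putting the pieces together, the toggle described above can be sketched as follows (this assumes the view controller has pumpkin: UIImageView and firstLabel: UILabel properties; those names are illustrative, so substitute your own):

```swift
override func touchesBegan(touches: NSSet!, withEvent event: UIEvent!) {
    // Extract a single touch from the set of active touches.
    let touch = touches.anyObject() as UITouch
    if touch.view == pumpkin {
        firstLabel.hidden = true  // hide the label while the pumpkin is pressed
    }
}

override func touchesEnded(touches: NSSet!, withEvent event: UIEvent!) {
    let touch = touches.anyObject() as UITouch
    if touch.view == pumpkin {
        firstLabel.hidden = false  // show the label again on release
    }
}
```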
When you run the app, the toggle effect will not work. This is because UIImageView has user interaction disabled by default. Thus we modify the loadPumpkin method as follows:
func loadPumpkin()
{
var tempImg = UIImage(named: "1.png")