How to capture touches over a UIWebView

I spent the last few weeks polishing the What They Speak When They Speak to Me (WTS) iOS application I developed with Obx Labs, which I mentioned in a previous post. Developing a working prototype for an iOS application can be done fairly quickly, thanks to the tools provided by Apple and a growing list of libraries and engines, but the more time-consuming part of development takes place later, working with, and often against, the features embedded in the iOS SDK to polish your product. This how-to is about one specific issue that arose during the development of WTS: capturing touches over a UIWebView without losing its functionality.

This short tutorial assumes that you know how to place a UIWebView (or other touch swallowing views) in your app, and that you are at the frustrating point of trying to catch touch events over the Web View to implement some interactive behaviour. In my case, I wanted to swipe the Web View left and right to show or hide it, similarly to how the Twitter for iPad app manages its browser tab.

We’ll start from scratch and go through the following four steps. You can jump to step 4 if you’re looking for the meat of this tutorial.

1. Create a new Xcode project using the “View-based application” template.
2. Add the standard methods to manage touch events.
3. Add a UIWebView covering the main view.
4. Add a custom class that extends UIWindow to capture touch events.

1. Create a new Xcode project using the “View-based application” template.

This step is self-explanatory. From Xcode, select File > New Project from the main menu, and then choose “View-based Application”, which is under the “Application” template folder. This is here mainly to have a common code base from which to start the tutorial; I named the project “CaptureTouch”.

2. Add the standard methods to manage touch events.

Before we get to the problematic UIWebView, we want to make sure that touch events get to the application’s main standard view. The new project you created in step 1 should contain a view controller named CaptureTouchViewController. In the implementation file of this view controller, add the four standard touch management methods:
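The code listing is missing from this copy; a minimal version of those four methods would look like this (only touchesBegan: needs a body for this step, and its log message matches the one referenced below):

```objc
// CaptureTouchViewController.m — the four standard touch handlers.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    NSLog(@"Touch began");
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // Empty for now; useful later if you track finger movement.
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
}
```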

At this point, if you build and run the application, you should see a gray background (the view), and tapping anywhere on the background will output “Touch began” to the console.

3. Add a UIWebView that covers the main view.

As mentioned before, we assume you know how the UIWebView works, so we won’t get into too much detail. All we need is the simplest UIWebView covering the application’s main view. We will add the UIWebView in the .xib file created by Xcode, but first we need to add an attribute and outlet to the CaptureTouchViewController. Your CaptureTouchViewController.h file should then look like the following:
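The original header isn’t shown in this copy; a plausible version (pre-ARC, matching the era of the tutorial) is:

```objc
// CaptureTouchViewController.h
#import <UIKit/UIKit.h>

@interface CaptureTouchViewController : UIViewController {
    UIWebView *webView;
}

// Outlet to be linked to the Web View in Interface Builder.
@property (nonatomic, retain) IBOutlet UIWebView *webView;

@end
```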

With the outlet created, you can open the CaptureTouchViewController.xib file in Interface Builder. Open the “View” object, and then drag-and-drop a new Web View into it; the Web View should automatically expand to cover the whole view. To link the outlet, right-click the Web View, drag from its “New Referencing Outlet” to the “File’s Owner” object, and then select “webView” from the list that pops up.

At this point, you have linked the interface to the “webView” attribute, but it is not loading any HTML. We need one last bit of code before we get to the core of this tutorial, which will load a URL to make sure the Web View is working correctly. After the application’s main view has finished loading, the CaptureTouchViewController’s viewDidLoad method is called. This is where we add the few lines that load a URL into the web view. Your viewDidLoad method should look like the following after the changes:
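The snippet itself is missing from this copy; a minimal sketch (the URL is an arbitrary example, any page will do) would be:

```objc
- (void)viewDidLoad {
    [super viewDidLoad];

    // Load a test page so we can confirm the Web View is working.
    NSURL *url = [NSURL URLWithString:@"http://www.apple.com/"];
    [webView loadRequest:[NSURLRequest requestWithURL:url]];
}
```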

If you build and run the application, you should see the web page. Everything looks fine, but if you touch anywhere on the screen, you’ll notice that the “Touch began” message from step 2 no longer appears in the console. The Web View swallows the touches to manage scrolling and to display the magnifying glass when you hold down a touch over text, and in doing so it blocks touch events from reaching the view. The next step shows how to capture those touch events.

4. Add a custom class that extends UIWindow to capture touch events.

There are different ways to capture touches over a Web View. One would be to subclass UIWebView, but Apple’s documentation says you should not, so we will stay away from that solution in case it causes problems later. Instead, we are going to extend the UIWindow class and capture touch events before they get propagated to the correct view(s). The first thing you’ll need is a new class, let’s call it TouchCapturingWindow, with the following header and implementation files:
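The original listing is not reproduced in this copy; the sketch below is an approximation pieced together from the description that follows (an array of priority views, add/remove methods, and an overridden sendEvent: that forwards touches falling inside a registered view’s frame), not the article’s exact code:

```objc
// TouchCapturingWindow.h
#import <UIKit/UIKit.h>

@interface TouchCapturingWindow : UIWindow {
    NSMutableArray *views;  // views that get touch priority
}

- (void)addViewForTouchPriority:(UIView *)view;
- (void)removeViewForTouchPriority:(UIView *)view;

@end

// TouchCapturingWindow.m
#import "TouchCapturingWindow.h"

@implementation TouchCapturingWindow

- (void)addViewForTouchPriority:(UIView *)view {
    if (!views) views = [[NSMutableArray alloc] init];
    [views addObject:view];
}

- (void)removeViewForTouchPriority:(UIView *)view {
    [views removeObject:view];
}

- (void)sendEvent:(UIEvent *)event {
    // Forward touches to every registered view whose frame contains them.
    for (UIView *view in views) {
        NSMutableSet *began = nil, *moved = nil, *ended = nil, *cancelled = nil;
        for (UITouch *touch in [event allTouches]) {
            // Only touches landing inside the view's frame are forwarded.
            if (!CGRectContainsPoint(view.frame,
                                     [touch locationInView:view.superview]))
                continue;
            switch (touch.phase) {
                case UITouchPhaseBegan:
                    if (!began) began = [NSMutableSet set];
                    [began addObject:touch];
                    break;
                case UITouchPhaseMoved:
                    if (!moved) moved = [NSMutableSet set];
                    [moved addObject:touch];
                    break;
                case UITouchPhaseEnded:
                    if (!ended) ended = [NSMutableSet set];
                    [ended addObject:touch];
                    break;
                case UITouchPhaseCancelled:
                    if (!cancelled) cancelled = [NSMutableSet set];
                    [cancelled addObject:touch];
                    break;
                default:
                    break;
            }
        }
        if (began)     [view touchesBegan:began withEvent:event];
        if (moved)     [view touchesMoved:moved withEvent:event];
        if (ended)     [view touchesEnded:ended withEvent:event];
        if (cancelled) [view touchesCancelled:cancelled withEvent:event];
    }

    // Deliver the event normally so the Web View keeps its own features.
    // As of iOS 5, this call must come at the END of the method.
    [super sendEvent:event];
}

- (void)dealloc {
    [views release];
    [super dealloc];
}

@end
```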

This class is heavily inspired by Michael Tyson’s tutorial, with a few changes and some added notes about the implementation. Here’s how it works. The TouchCapturingWindow overrides the sendEvent: method of UIWindow to check if touch events should be sent to certain views instead of only the top view, which is more or less the default behaviour. If you intend to have multiple views in your application, you probably don’t want to have all of them capture touch events, so the TouchCapturingWindow provides methods (i.e. addViewForTouchPriority: and removeViewForTouchPriority:) to add and remove the specific view(s) you want to receive touches. Once you’ve replaced the standard UIWindow with an instance of this custom class, touch events will go through the sendEvent: method, allowing you to redirect them to the correct view(s) based on any criteria. In the above case, the only criterion is whether the touch falls inside the frame of any of the views that were added to the TouchCapturingWindow.

First, you need to change the default “window” of your application’s delegate to use the new custom class. Open the CaptureTouchAppDelegate.h file, and replace the UIWindow class with TouchCapturingWindow; don’t forget to import the header, which should give you something like this:
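The listing is missing from this copy; the delegate header would end up looking roughly like this (based on the stock View-based template):

```objc
// CaptureTouchAppDelegate.h
#import <UIKit/UIKit.h>
#import "TouchCapturingWindow.h"

@class CaptureTouchViewController;

@interface CaptureTouchAppDelegate : NSObject <UIApplicationDelegate> {
    TouchCapturingWindow *window;
    CaptureTouchViewController *viewController;
}

// The window's class is changed from UIWindow to TouchCapturingWindow.
@property (nonatomic, retain) IBOutlet TouchCapturingWindow *window;
@property (nonatomic, retain) IBOutlet CaptureTouchViewController *viewController;

@end
```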

After you have changed the window in the code, you’ll need to adjust MainWindow.xib to reflect this change as well. Open MainWindow.xib, and change the class of its window object from UIWindow to the new TouchCapturingWindow.

Now that the window can propagate touch events the way we want, we need to tell it which view to prioritize. In this case, we want the application’s main view to receive the touch events that would normally be blocked by the Web View covering it. To add the view to the priority list, you’ll need to modify the application:didFinishLaunchingWithOptions: method of the CaptureTouchAppDelegate. Just after the window is made visible, add the view using the new addViewForTouchPriority: method, which should give you the following:
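Assuming the stock delegate code generated by the View-based template, the change amounts to a single added line (a sketch, not the article’s exact listing):

```objc
- (BOOL)application:(UIApplication *)application
    didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    [self.window addSubview:viewController.view];
    [self.window makeKeyAndVisible];

    // Give the main view touch priority so it receives the events
    // that the Web View covering it would otherwise swallow.
    [window addViewForTouchPriority:viewController.view];

    return YES;
}
```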

At this point, if you run the application, you can see that touching anywhere on the Web View outputs the “Touch began” message of step 2. You can now use the touch events to add some touch-based interaction to your application, but be careful not to conflict with the Web View’s features such as scrolling and text selection.

At the beginning I mentioned that I wanted to keep all the Web View’s features. This is accomplished by one short but important line of code. In the sendEvent: method of the new TouchCapturingWindow class, the line [super sendEvent:event] ensures that the Web View receives the event before we propagate it to the main view. As of iOS 5, placing that line at the beginning of the method stopped working, and it now needs to be at the end of the method. Placing it at the end keeps all the Web View’s features for iOS 5, but does not show them for devices with earlier versions of iOS.

[super sendEvent:event];

On a final note, if you look at the sendEvent: method, you’ll notice that touch events are propagated to a view only if the location of the touch is inside the frame of the view. This is a common behaviour, but there is no reason why you should always stick to it. You might want to check the state of a view to decide if it should receive touches, control the view by touching outside its visual frame, or send events to a specific view only after the user has tapped up, up, down, down, left, right, left, right…

In step 2. “Add the standard methods to manage touch events.”
Before we get to the problematic UIWebView, we want to make sure that touch events get to the application’s main standard view. The new project you created in step 1 should contain a view controller named CaptureTouchViewController. In the header file of this view controller, add the four standard touch management methods:

Just wondering… did you misprint “In the header file…” when you meant the implementation file (.m), “In the implementation file…”, for adding the touch methods?

Otherwise, it’s a great tutorial on handling touch events in UIWebView. Thanks a lot.

I have tried implementing this solution for a UIWebView which is present in a view other than the main one, and have come across some problems.

The main view, which defines the UIWindow, does not contain a UIWebView, but through the click of a button takes the user to a different view. If I implement this as above, I run into all sorts of issues. Would you expect your solution to work for this case, or do you think that there need to be some changes?

The web view is placed inside a UIScrollView, and I have tried to adapt your ideas to subclass the UIScrollView, but have also not been successful in this. Rather than getting an error message in the console, though, I am getting the following: warning: Unable to restore previously selected frame.

Can you throw some light on how this would work when Storyboard is used and there is no XIB. Do we only have to replace the UIWindow with TouchCapturingWindow in the AppDelegate.h, and everything else remains the same?

I haven’t had the need to use Storyboard yet, so I’m only guessing here. It looks like Storyboard groups many XIB files under one roof. In theory, you should be able to follow this tutorial, make sure you use the CaptureTouchViewController when you build your storyboard, use the TouchCapturingWindow when you define your window attribute in the delegate (the property type can/should stay as UIWindow*), and then add the view(s) you want to receive touches with the addViewForTouchPriority. As long as your storyboard is linked with a UIViewController object that you can access, you should be able to pass that controller’s ‘view’ to the TouchCapturingWindow.

Very nice and clean solution. But unfortunately it doesn’t work if you want to put the UIWebView into a UIScrollView. Even a scroll view subclass doesn’t work as expected. The touchesBegan and company are all called, but for some weird reason the scroll view ignores it all and doesn’t scroll.

If someone has managed to make the UIScrollView actually scroll, please let me know! I’ve really tried everything I could imagine.

I wasn’t exactly sure what you are trying to do, but I was curious. I took a shot at it. Here’s an updated Xcode project: CaptureTouch Xcode project (with UIScrollView). The Xib file is only for iPhone/iPod, but the concept is the same for the iPad. Let me know if that fixes your problem.

Hey! Thanks for the reply, but I can’t download the file from the link you posted. Could you please send it directly to my email?

What I’m trying to do is have a horizontal paginated UIScrollView with various elements, where some of the “pages” are UIWebViews. I actually have managed to pass the touches to a UIScrollView subclass, but couldn’t make the scroll view behave like it should and scroll normally.
The touchesBegan is called and from there I’m changing the contentOffset manually, but it doesn’t seem right and my scrolling implementation is very buggy.

Thanks for this article, I’m eager to try it out! I’m thinking about trying this method to duplicate mobile Safari’s touch-and-hold to download images on a UIWebView. Could this method be used with a UILongPressGestureRecognizer? Otherwise, is there any other way to get the touch-and-hold functionality? Thank you!

That’s a good question. You could probably tweak the code of the TouchCapturingWindow to pass events to a UIGestureRecognizer, as (I think) they use the same set of touchesBegan to touchesEnded methods, or you might just want to implement it from scratch using a timer: touch, start timer, and when the timer fires, check whether the touch has ended or moved too much; if not, you have your long press. To implement the download-image feature, you would also need to find the position of the image in the UIWebView with a JavaScript function, adjusting for the scroll position; I can’t think of another way to get an element’s position in a UIWebView. If you tweak the code of this example to make it work for what you need, send it my way, and I could add it to this post if you want.
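For what it’s worth, the timer approach described above might be sketched like this; the longPressTimer ivar, the handleLongPress: selector, and the 0.5 s delay are all made-up illustrations, and cancelling on any movement is a simplification:

```objc
// Assumes an NSTimer *longPressTimer instance variable in the class.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Start the countdown; if the touch survives 0.5 s, it's a long press.
    longPressTimer = [NSTimer scheduledTimerWithTimeInterval:0.5
                                                      target:self
                                                    selector:@selector(handleLongPress:)
                                                    userInfo:nil
                                                     repeats:NO];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // Simplification: cancel on any movement. A real implementation
    // would tolerate a few points of drift before cancelling.
    [longPressTimer invalidate];
    longPressTimer = nil;
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [longPressTimer invalidate];
    longPressTimer = nil;
}

- (void)handleLongPress:(NSTimer *)timer {
    NSLog(@"Long press detected");
}
```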

What I have managed to do so far is pass the touches to a subclass of UIScrollView and force the scroll using the contentOffset property. But of course it is buggy because it doesn’t have the kinetic effect UIScrollViews normally have.

Maybe I’m overlooking something; maybe there is a way to invoke the scroll view’s default scroll behaviour from its touchesBegan handler…

Thanks for pointing this out. I never thought of testing pages that use JavaScript frameworks like Sencha Touch. I must admit the original feature I used this code for was fairly simple: a static internal UIWebView without scrolling. That snippet of code might be starting to show the extent of what it can do without significant tweaks.

For controlling the scroll view, I’ve always had trouble passing touch events forward through the standard touch methods. One way I can think of, although it’s a bit like reinventing the wheel, is to control the scroll through the setContentOffset: method, but that probably means implementing the bouncing too. Unless setting the offset outside the bounds automatically bounces the scroll view; I never tried, but I feel that’s unlikely.

I wanted to add this here for you all. I’ve been struggling with exactly this for a couple of days. I found the tutorial on mithin, which is nicely written, but doesn’t match my needs, I needed all touch events to be passed, not just where the user touched. Thankfully, this tutorial was exactly what I wanted, so thank you!

My trouble with this tutorial is that I’m new to iOS programming, only been doing it a couple of months! Unfortunately, that means the projects I’ve been working on are all iOS SDK 5.1, using storyboards and ARC (I know I should learn about memory management in more depth, but I haven’t the time right now).

I couldn’t get this code working, as I didn’t know how to change the class of UIWindow without access to the .xib file. I’ve solved that now, and can now pass touch events!

Here’s what I did:

The first thing to do is change a line in CaptureTouchAppDelegate.h from:

@property (retain, nonatomic) IBOutlet UIWindow *window;

to use the custom TouchCapturingWindow subclass instead.

I got no error having my custom UIWindow subclass here, although it did complain about retain; strong fixed that. Also in CaptureTouchAppDelegate.h I had to add #import “CaptureTouchViewController.h”.

Next, in the AppDelegate.m file (it’s worth noting that you need to add the @synthesize statement for viewController, as the code above doesn’t show this), I had to add the following method as a replacement for changing the window’s subclass via Interface Builder:
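(The commenter’s listing is missing from this copy. A common way to achieve this, and only a guess at what was meant, is to override the window getter so the delegate lazily creates the custom class; _window here is the ivar backing the synthesized property:)

```objc
// Hypothetical reconstruction — creates the TouchCapturingWindow in code,
// since with storyboards there is no XIB in which to change the window's class.
- (TouchCapturingWindow *)window {
    if (!_window) {
        _window = [[TouchCapturingWindow alloc]
                   initWithFrame:[[UIScreen mainScreen] bounds]];
    }
    return _window;
}
```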

Nearly there! The last thing I had to do was figure out how to get the view controller! Without this last line of code, the app would crash out, complaining it couldn’t add a nil object to the views NSMutableArray.

In -(BOOL)application:didFinishLaunchingWithOptions: add the following line before
[self.window addSubview:viewController.view];

Hi All, I want to post a quick update to this for you. I was having no end of trouble in that I could not get an image to display on the UIWebView where I tapped. It turns out the code at the end of my previous comment is wrong! You do NOT want to instantiate a new view controller as I did there. Instead you want to use this code:

viewController = self.window.rootViewController;

And there we go, that solved my day of banging my head against the wall. I hope someone finds this useful!