
Touch screen input

Almost every app that you develop needs to provide support for user
interaction. This interaction might be through keyboard input or sensors on the device,
but most commonly, users interact with your app by touching the screen of the device.
Users can touch different UI controls that you display on the screen, and your app can
handle each touch event in any way you want.

Interactive controls

The Cascades framework provides basic touch handling that you can use in your apps right
away, with very little code. Most Cascades UI controls are interactive and handle basic
touch events automatically. For example, the Button control handles touch events to
determine whether it has been pressed or released, and changes its appearance accordingly.

Many UI controls also emit signals when users touch or otherwise interact with the
control. For example, a Button emits the clicked() signal when it's clicked, and a
CheckBox emits the checkedChanged() signal when its checked state changes.

The signals that core controls emit typically correspond to the most common ways that
users interact with those controls, which makes responding to these signals a great way
to handle common interactions. The Cascades framework detects the touch events
automatically; all you need to do is respond when the appropriate signal is emitted.

Here's how to handle signals from a Button and a CheckBox by using signal handlers in
QML. Cascades provides these signal handlers for you, and you can access them by using
the prefix "on" followed by the signal name with its first letter capitalized (for
example, onClicked). In this code sample, the checked status of the check box is
displayed beside it, and clicking the button resets the checked status to its default
value (not checked).
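The original QML sample isn't reproduced here; a minimal sketch that matches the
description above might look like the following (the Label, the ids, and the layout are
assumptions, not part of the original sample):

```qml
import bb.cascades 1.0

Page {
    Container {
        Container {
            layout: StackLayout {
                orientation: LayoutOrientation.LeftToRight
            }

            CheckBox {
                id: myCheckBox
                // Update the label beside the check box whenever
                // the checked state changes
                onCheckedChanged: {
                    statusLabel.text = checked ? "Checked" : "Unchecked";
                }
            }

            Label {
                id: statusLabel
                text: "Unchecked"
            }
        }

        Button {
            text: "Reset"
            // Clicking the button resets the check box to its
            // default (unchecked) state
            onClicked: {
                myCheckBox.checked = false;
            }
        }
    }
}
```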

In C++, you connect the signals from the Button and CheckBox to slot
functions that you create yourself.

// Create the root page and top-level container
Page* root = new Page;
Container* topContainer = new Container;

// Create the check box and button, and add them to the
// top-level container
myCheckBox = CheckBox::create("Unchecked");
resetButton = Button::create("Reset");
topContainer->add(myCheckBox);
topContainer->add(resetButton);

// Connect the signals of the check box and button to slot functions.
// If any Q_ASSERT statement(s) indicate that the slot failed to
// connect to the signal, make sure you know exactly why this
// happened. A failed connection is not normal, and will cause your
// app to stop working.
bool res = QObject::connect(myCheckBox, SIGNAL(checkedChanged(bool)),
                            this, SLOT(handleCheckedChanged(bool)));

// Q_ASSERT is only evaluated in Debug builds
Q_ASSERT(res);

res = QObject::connect(resetButton, SIGNAL(clicked()),
                       this, SLOT(handleButtonClicked()));

// Q_ASSERT is only evaluated in Debug builds
Q_ASSERT(res);

// Since the variable is not used elsewhere in the app, this is added
// to avoid a compiler warning
Q_UNUSED(res);

// Set the content of the page and display it
root->setContent(topContainer);
app->setScene(root);

For more information about handling events using signals and slots, see Signals and slots.

Touch events

As you write more complex apps, you may need to handle touch events with
more precision or flexibility. The Cascades controls that you use might not emit signals for the types of user interaction that
you want to respond to. Or, you might want your app to react to gestures other than the
default ones that Cascades supports. In these cases, you can capture touch events directly and respond to
them.

Touch events in Cascades

In Cascades, every object that inherits from the Control class includes a signal called
touch(), which is emitted when the object receives a touch event. Core UI controls (such
as Button and Container) and any custom controls that you create by extending the
CustomControl class also emit this signal. You can respond to the touch event by using
the onTouch signal handler in QML, or by connecting the touch() signal to a slot in C++.

When you connect the touch() signal to a slot, the signal includes a TouchEvent
parameter. This parameter contains additional information about the touch event, such as
the position at which the touch occurred (in either screen coordinates or local
coordinates) and the type of touch interaction (up, down, or move). You can use the
enumeration values in the TouchType class (such as TouchType::Up and TouchType::Down) to
determine the type of interaction that a touch event represents.
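In QML, for example, the event parameter of the onTouch handler exposes these values.
A minimal sketch (the control and the handler body are illustrative, not from the
original document) might be:

```qml
import bb.cascades 1.0

Container {
    // Log where each touch begins; the event parameter is the
    // TouchEvent that's delivered with the touch() signal
    onTouch: {
        if (event.touchType == TouchType.Down) {
            console.log("Touch down at " + event.localX +
                        ", " + event.localY);
        }
    }
}
```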

Touch events in libscreen

To handle touch events using C, you must first create a screen context for the device's
screen by calling screen_create_context(). This context is associated with the libscreen
library and allows your application to receive screen events. Next, initialize the BPS
library by calling bps_initialize(). You can then call screen_request_events(), passing
in the screen context that you created, to start receiving screen events.

After your application is registered to receive screen events, you can call
bps_get_event() to retrieve the next available event from the library. You can use an
event loop with a simple exit condition to process events continuously while your
application is running. After an event is received, you should determine whether it's a
screen event by comparing the event's domain, returned by bps_event_get_domain(), with
the value returned by screen_get_domain().

When you know you have a screen event, extract the specific event from the generic BPS
event by calling screen_event_get_event(). This step is important because it allows you
to query the libscreen event for various properties. The property that you need for
detecting a touch event is SCREEN_PROPERTY_TYPE, and you can call
screen_get_event_property_iv() to get the value of that property. After you store the
value, you can test whether it represents the start of a touch event
(SCREEN_EVENT_MTOUCH_TOUCH) and handle the touch accordingly.
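The steps above can be sketched as follows. Error handling is omitted, and the exit
condition is a placeholder; this is a sketch of the flow described above, not a complete
application:

```c
#include <bps/bps.h>
#include <bps/event.h>
#include <bps/screen.h>
#include <screen/screen.h>

int main(void)
{
    // Create a screen context and start receiving screen events
    screen_context_t screen_ctx;
    screen_create_context(&screen_ctx, SCREEN_APPLICATION_CONTEXT);
    bps_initialize();
    screen_request_events(screen_ctx);

    int exit_requested = 0;
    while (!exit_requested) {
        // Block until the next event arrives (-1 means no timeout)
        bps_event_t *event = NULL;
        bps_get_event(&event, -1);

        // Only process events that belong to the screen domain
        if (event && bps_event_get_domain(event) == screen_get_domain()) {
            screen_event_t screen_event = screen_event_get_event(event);
            int type;
            screen_get_event_property_iv(screen_event,
                    SCREEN_PROPERTY_TYPE, &type);
            if (type == SCREEN_EVENT_MTOUCH_TOUCH) {
                // The start of a touch: handle it here
            }
        }
    }

    // Clean up before exiting
    screen_stop_events(screen_ctx);
    bps_shutdown();
    screen_destroy_context(screen_ctx);
    return 0;
}
```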

Handling touch events

Here's how to use some of the properties that are included
in TouchEvent to move a blue square
around the screen. Touching an area of the screen moves the square to that
position, and touching and holding the screen lets you drag the square
around the screen. The color of the square also changes to green as long as
a finger is held on the screen (that is, until an "up" touch event is
received, which indicates that the finger has been released).
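The QML version of this sample isn't shown here; a sketch consistent with the C++ code
that follows might look like this (the ids are assumptions, and Color.create() is used
with illustrative hex values for blue and green):

```qml
import bb.cascades 1.0

Page {
    Container {
        id: topContainer
        // Move the square to wherever the screen is touched, and
        // turn it green while a finger is held on the screen
        onTouch: {
            if (event.touchType == TouchType.Move) {
                movingContainer.background = Color.create("#00ff00");
            } else if (event.touchType == TouchType.Up) {
                movingContainer.background = Color.create("#0000ff");
            }
            movingContainer.translationX = event.localX -
                    movingContainer.preferredWidth / 2;
            movingContainer.translationY = event.localY -
                    movingContainer.preferredHeight / 2;
        }

        // The blue square that moves around the screen
        Container {
            id: movingContainer
            preferredWidth: 150
            preferredHeight: 150
            background: Color.create("#0000ff")
        }
    }
}
```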

Here's how to achieve the same result in C++. Again, for simplicity, it's assumed that
the Container that moves and the slot that handles the touch() signal are both declared
in a corresponding header file.

// In your application's source file.
// Create the root page and top-level container.
Page* root = new Page;
Container* topContainer = new Container;

// Create the container that represents a blue square and add it to
// the top-level container.
movingContainer = Container::create()
    .preferredWidth(150)
    .preferredHeight(150)
    .background(Color::Blue);
topContainer->add(movingContainer);

// Connect the top-level container's touch() signal to a slot.
// If any Q_ASSERT statement(s) indicate that the slot failed to
// connect to the signal, make sure you know exactly why this
// happened. A failed connection is not normal, and will cause your
// app to stop working.
bool connectResult = QObject::connect(topContainer,
        SIGNAL(touch(bb::cascades::TouchEvent*)),
        this,
        SLOT(handleTouch(bb::cascades::TouchEvent*)));

// Q_ASSERT is only evaluated in Debug builds.
Q_ASSERT(connectResult);

// Since the variable is not used elsewhere in the app, this is added
// to avoid a compiler warning.
Q_UNUSED(connectResult);

// Set the content of the page and display it.
root->setContent(topContainer);
app->setScene(root);
...
// Define the slot function for the touch() signal.
void App::handleTouch(bb::cascades::TouchEvent* event)
{
    // If the touch event is a move event, change the color of the
    // square to green. If the touch event is an up event, change
    // the color back to blue.
    if (event->touchType() == TouchType::Move) {
        movingContainer->setBackground(Color::Green);
    } else if (event->touchType() == TouchType::Up) {
        movingContainer->setBackground(Color::Blue);
    }

    // Determine the location inside the container that was touched,
    // and move the blue square to that location.
    movingContainer->setTranslationX(event->localX() -
            (movingContainer->preferredWidth() / 2));
    movingContainer->setTranslationY(event->localY() -
            (movingContainer->preferredHeight() / 2));
}
