Abstract:

A method includes: displaying a first 2-D array of a plurality of user
interface components on the display in a portrait orientation; detecting
rotation of the display from the portrait orientation to a landscape
orientation with one or more accelerometers; and, in response to
detecting the rotation: rotating the first 2-D array of the plurality of
user interface components on the display about an axis that is normal to
a front surface of the display; replacing the first 2-D array with a
second 2-D array of the plurality of user interface components on the
display after the rotation of the first 2-D array exceeds a predefined
condition; and rotating the second 2-D array of the plurality of user
interface components on the display until the second 2-D array of the
plurality of user interface components is in the landscape orientation.

Claims:

1. A multifunction device, comprising: a display; one or more
accelerometers; one or more processors; memory; and one or more programs,
wherein the one or more programs are stored in the memory and configured
to be executed by the one or more processors, the one or more programs
including instructions for: displaying a first 2-D array of a plurality
of user interface components on the display in a portrait orientation;
detecting rotation of the display from the portrait orientation to a
landscape orientation with the one or more accelerometers; in response to
detecting the rotation: rotating the first 2-D array of the plurality of
user interface components on the display about an axis that is normal to
a front surface of the display; replacing the first 2-D array with a
second 2-D array of the plurality of user interface components on the
display after the rotation of the first 2-D array exceeds a predefined
condition; and rotating the second 2-D array of the plurality of user
interface components on the display until the second 2-D array of the
plurality of user interface components is in the landscape orientation.

2. The device of claim 1, wherein: the first 2-D array has M rows and N
columns and each user interface component is located at a respective
location in the first 2-D array that has a unique pair of (row index,
column index) determined in accordance with a predefined sequencing
algorithm; and the second 2-D array has N rows and M columns and each
user interface component is located at a respective location in the
second 2-D array that has a unique pair of (row index, column index)
determined in accordance with the predefined sequencing algorithm.

3. The device of claim 2, wherein M is different from N and, for at least
one respective user interface component in the plurality of user
interface components, its pair of (row index, column index) in the first
2-D array is distinct from its pair of (row index, column index) in the
second 2-D array.

4. The device of claim 2, wherein the predefined sequencing algorithm is
used to combine the M rows of user interface components in the first 2-D
array into a 1-D array and divide the 1-D array into the N rows of user
interface components in the second 2-D array.

5. The device of claim 2, wherein the predefined sequencing algorithm is
used to combine the N columns of user interface components in the first
2-D array into a 1-D array and divide the 1-D array into the M columns of
user interface components in the second 2-D array.

6. The device of claim 1, wherein the rotation direction of the first and
second 2-D arrays relative to the axis that is normal to the front
surface of the display is opposite the rotation direction of the rotation
of the display.

7. The device of claim 1, including instructions for: fading out the
first 2-D array from the display while rotating the first 2-D array; and
fading in the second 2-D array on the display following the fade-out of
the first 2-D array.

8. The device of claim 1, including instructions for: fading out the
first 2-D array from the display while rotating the first 2-D array; and
fading in the second 2-D array on the display while fading out the first
2-D array.

9. The device of claim 1, wherein replacing the first 2-D array with the
second 2-D array includes cross-fading individual user interface
components between the first 2-D array and the second 2-D array.

10. A method, comprising: at a multifunction device with a display and
one or more accelerometers: displaying a first 2-D array of a plurality
of user interface components on the display in a portrait orientation;
detecting rotation of the display from the portrait orientation to a
landscape orientation with the one or more accelerometers; in response to
detecting the rotation: rotating the first 2-D array of the plurality of
user interface components on the display about an axis that is normal to
a front surface of the display; replacing the first 2-D array with a
second 2-D array of the plurality of user interface components on the
display after the rotation of the first 2-D array exceeds a predefined
condition; and rotating the second 2-D array of the plurality of user
interface components on the display until the second 2-D array of the
plurality of user interface components is in the landscape orientation.

11. The method of claim 10, wherein: the first 2-D array has M rows and N
columns and each user interface component is located at a respective
location in the first 2-D array that has a unique pair of (row index,
column index) determined in accordance with a predefined sequencing
algorithm; and the second 2-D array has N rows and M columns and each user
interface component is located at a respective location in the second 2-D
array that has a unique pair of (row index, column index) determined in
accordance with the predefined sequencing algorithm.

12. The method of claim 11, wherein M is different from N and, for at
least one respective user interface component in the plurality of user
interface components, its pair of (row index, column index) in the first
2-D array is distinct from its pair of (row index, column index) in the
second 2-D array.

13. The method of claim 11, wherein the predefined sequencing algorithm
is used to combine the M rows of user interface components in the first
2-D array into a 1-D array and divide the 1-D array into the N rows of
user interface components in the second 2-D array.

14. The method of claim 11, wherein the predefined sequencing algorithm
is used to combine the N columns of user interface components in the
first 2-D array into a 1-D array and divide the 1-D array into the M
columns of user interface components in the second 2-D array.

15. The method of claim 10, wherein the rotation direction of the first
and second 2-D arrays relative to the axis that is normal to the front
surface of the display is opposite the rotation direction of the rotation
of the display.

16. The method of claim 10, wherein replacing the first 2-D array with
the second 2-D array further includes: fading out the first 2-D array
from the display while rotating the first 2-D array; and fading in the
second 2-D array on the display following the fade-out of the first 2-D
array.

17. The method of claim 10, wherein replacing the first 2-D array with
the second 2-D array further includes: fading out the first 2-D array
from the display while rotating the first 2-D array; and fading in the
second 2-D array on the display while fading out the first 2-D array.

18. The method of claim 10, wherein replacing the first 2-D array with
the second 2-D array includes cross-fading individual user interface
components between the first 2-D array and the second 2-D array.

19. A graphical user interface on a multifunction device with a display,
one or more accelerometers, a memory, and one or more processors to
execute one or more programs stored in the memory, the graphical user
interface comprising: a first 2-D array of a plurality of user interface
components on the display in a portrait orientation; wherein: rotation of
the display from the portrait orientation to a landscape orientation is
detected with the one or more accelerometers; in response to detecting
the rotation: the first 2-D array of the plurality of user interface
components is rotated on the display about an axis that is normal to a
front surface of the display; the first 2-D array is replaced with a
second 2-D array of the plurality of user interface components on the
display after the rotation of the first 2-D array exceeds a predefined
condition; and the second 2-D array of the plurality of user interface
components is rotated on the display until the second 2-D array of the
plurality of user interface components is in the landscape orientation.

20. A computer readable storage medium storing one or more programs, the
one or more programs comprising instructions, which when executed by a
multifunction device with a display and one or more accelerometers, cause
the device to: display a first 2-D array of a plurality of user interface
components on the display in a portrait orientation; detect rotation of
the display from the portrait orientation to a landscape orientation with
the one or more accelerometers; in response to detecting the rotation:
rotate the first 2-D array of the plurality of user interface components
on the display about an axis that is normal to a front surface of the
display; replace the first 2-D array with a second 2-D array of the
plurality of user interface components on the display after the rotation
of the first 2-D array exceeds a predefined condition; and rotate the
second 2-D array of the plurality of user interface components on the
display until the second 2-D array of the plurality of user interface
components is in the landscape orientation.

Description:

RELATED APPLICATIONS

[0001] This application claims priority to U.S. Provisional Application
Ser. No. 61/335,516, filed Jan. 6, 2010, entitled "Device, Method, and
Graphical User Interface with Grid Transformations During Device
Rotation," which is incorporated herein by reference in its entirety.

TECHNICAL FIELD

[0002] This relates generally to electronic devices with displays,
including but not limited to electronic devices with displays and
accelerometers that perform grid transformations during device rotation.

BACKGROUND

[0003] Existing methods for rotating a user interface during device
rotation are cumbersome and inefficient. For example, grid
transformations during device rotation are jarring and create a
significant cognitive burden on a user, thereby causing the user to take
longer to complete a task and wasting the user's time and the device's
power reserve, which is a particularly important consideration for
battery-operated devices.

SUMMARY

[0004] Accordingly, there is a need for computing devices with more
efficient and intuitive methods and interfaces for grid transformations
during device rotation. Such methods and interfaces may complement or
replace conventional methods for grid transformations during device
rotation. Such methods and interfaces reduce the cognitive burden on a
user and produce a more efficient human-machine interface. For
battery-operated computing devices, such methods and interfaces conserve
power and increase the time between battery charges.

[0005] The above deficiencies and other problems associated with user
interfaces for computing devices are reduced or eliminated by the
disclosed devices. In some embodiments, the device is a desktop computer.
In some embodiments, the device is portable (e.g., a notebook computer,
tablet computer, or handheld device). In some embodiments, the device has
a touchpad. In some embodiments, the device has a touch-sensitive display
(also known as a "touch screen" or "touch screen display"). In some
embodiments, the device has a graphical user interface (GUI), one or more
processors, memory and one or more modules, programs or sets of
instructions stored in the memory for performing multiple functions. In
some embodiments, the user interacts with the GUI primarily through
finger contacts and gestures on the touch-sensitive surface. In some
embodiments, the functions may include image editing, drawing,
presenting, word processing, website creating, disk authoring,
spreadsheet making, game playing, telephoning, video conferencing,
e-mailing, instant messaging, workout support, digital photographing,
digital videoing, web browsing, digital music playing, and/or digital
video playing. Executable instructions for performing these functions may
be included in a computer readable storage medium or other computer
program product configured for execution by one or more processors.

[0006] In accordance with some embodiments, a method is performed at a
multifunction device with a display and one or more accelerometers. The
method includes: displaying a first 2-D array of a plurality of user
interface components on the display in a portrait orientation; detecting
rotation of the display from the portrait orientation to a landscape
orientation with the one or more accelerometers; in response to detecting
the rotation: rotating the first 2-D array of the plurality of user
interface components on the display about an axis that is normal to a
front surface of the display; replacing the first 2-D array with a second
2-D array of the plurality of user interface components on the display
after the rotation of the first 2-D array exceeds a predefined condition;
and rotating the second 2-D array of the plurality of user interface
components on the display until the second 2-D array of the plurality of
user interface components is in the landscape orientation.
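
By way of illustration only, the following sketch (in Swift, with hypothetical names not taken from the specification) shows one possible reading of this behavior together with the predefined sequencing algorithm recited in claims 4 and 13: the portrait grid's rows are combined into a 1-D array, the 1-D array is divided into the rows of the landscape grid, and the replacement occurs once the animated rotation exceeds a predefined angle.

    import Foundation

    // A user interface component identified by name (a hypothetical
    // stand-in for an application icon or other grid element).
    struct UIComponent {
        let name: String
    }

    // One possible "predefined sequencing algorithm": combine the portrait
    // grid's rows into a 1-D array, then divide that array into the rows
    // of the landscape grid. Illustrative only.
    func reflow(_ portrait: [[UIComponent]], toColumns newColumns: Int) -> [[UIComponent]] {
        let sequence = portrait.flatMap { $0 }                    // M rows -> 1-D array
        return stride(from: 0, to: sequence.count, by: newColumns).map {
            Array(sequence[$0 ..< min($0 + newColumns, sequence.count)])  // 1-D array -> new rows
        }
    }

    // Hypothetical animation driver: rotate the first grid, replace it
    // with the reflowed grid once the rotation exceeds a predefined
    // threshold, and keep rotating the second grid into landscape.
    func animateRotation(portraitGrid: [[UIComponent]],
                         landscapeColumns: Int,
                         thresholdDegrees: Double = 45) {
        var currentGrid = portraitGrid
        var replaced = false
        for angle in stride(from: 0.0, through: 90.0, by: 15.0) {
            if !replaced && angle > thresholdDegrees {
                currentGrid = reflow(portraitGrid, toColumns: landscapeColumns)
                replaced = true
            }
            print("angle \(angle) deg:", currentGrid.map { row in row.map { $0.name } })
        }
    }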

[0007] In accordance with some embodiments, a multifunction device
includes a display, one or more accelerometers, one or more processors,
memory, and one or more programs; the one or more programs are stored in
the memory and configured to be executed by the one or more processors
and the one or more programs include instructions for performing the
operations of the method described above. In accordance with some
embodiments, a graphical user interface on a multifunction device with a
display, one or more accelerometers, a memory, and one or more processors
to execute one or more programs stored in the memory includes one or more
of the elements displayed in the method described above, which are
updated in response to inputs, as described in the method above. In
accordance with some embodiments, a computer readable storage medium has
stored therein instructions which when executed by a multifunction device
with a display, cause the device to perform the operations of the method
described above. In accordance with some embodiments, a multifunction
device includes: a display, one or more accelerometers, and means for
performing the operations of the method described above. In accordance
with some embodiments, an information processing apparatus, for use in a
multifunction device with a display and one or more accelerometers,
includes means for performing the operations of the method described
above.

[0008] Thus, multifunction devices with displays and one or more
accelerometers are provided with faster, more efficient methods and
interfaces for grid transformations during device rotation, thereby
increasing the effectiveness, efficiency, and user satisfaction with such
devices. Such methods and interfaces may complement or replace
conventional methods for grid transformations during device rotation.

BRIEF DESCRIPTION OF THE DRAWINGS

[0009] For a better understanding of the aforementioned embodiments of the
invention as well as additional embodiments thereof, reference should be
made to the Description of Embodiments below, in conjunction with the
following drawings in which like reference numerals refer to
corresponding parts throughout the figures.

[0010] FIGS. 1A and 1B are block diagrams illustrating portable
multifunction devices with touch-sensitive displays in accordance with
some embodiments.

[0011] FIG. 1C is a block diagram illustrating exemplary components for
event handling in accordance with some embodiments.

[0012] FIG. 2 illustrates a portable multifunction device having a touch
screen in accordance with some embodiments.

[0013] FIG. 3 is a block diagram of an exemplary multifunction device with
a display and a touch-sensitive surface in accordance with some
embodiments.

[0014] FIGS. 4A and 4B illustrate exemplary user interfaces for a menu of
applications on a portable multifunction device in accordance with some
embodiments.

[0016] FIGS. 6A-6C are flow diagrams illustrating a method of grid
transformation during device rotation in accordance with some
embodiments.

DESCRIPTION OF EMBODIMENTS

[0017] Reference will now be made in detail to embodiments, examples of
which are illustrated in the accompanying drawings. In the following
detailed description, numerous specific details are set forth in order to
provide a thorough understanding of the present invention. However, it
will be apparent to one of ordinary skill in the art that the present
invention may be practiced without these specific details. In other
instances, well-known methods, procedures, components, circuits, and
networks have not been described in detail so as not to unnecessarily
obscure aspects of the embodiments.

[0018] It will also be understood that, although the terms first, second,
etc. may be used herein to describe various elements, these elements
should not be limited by these terms. These terms are only used to
distinguish one element from another. For example, a first contact could
be termed a second contact, and, similarly, a second contact could be
termed a first contact, without departing from the scope of the present
invention. The first contact and the second contact are both contacts,
but they are not the same contact.

[0019] The terminology used in the description of the invention herein is
for the purpose of describing particular embodiments only and is not
intended to be limiting of the invention. As used in the description of
the invention and the appended claims, the singular forms "a", "an" and
"the" are intended to include the plural forms as well, unless the
context clearly indicates otherwise. It will also be understood that the
term "and/or" as used herein refers to and encompasses any and all
possible combinations of one or more of the associated listed items. It
will be further understood that the terms "includes," "including,"
"comprises," and/or "comprising," when used in this specification,
specify the presence of stated features, integers, steps, operations,
elements, and/or components, but do not preclude the presence or addition
of one or more other features, integers, steps, operations, elements,
components, and/or groups thereof.

[0020] As used herein, the term "if" may be construed to mean "when" or
"upon" or "in response to determining" or "in response to detecting,"
depending on the context. Similarly, the phrase "if it is determined" or
"if [a stated condition or event] is detected" may be construed to mean
"upon determining" or "in response to determining" or "upon detecting
[the stated condition or event]" or "in response to detecting [the stated
condition or event]," depending on the context.

[0021] As used herein, the term "resolution" of a display refers to the
number of pixels (also called "pixel counts" or "pixel resolution") along
each axis or in each dimension of the display. For example, a display may
have a resolution of 320×480 pixels. Furthermore, as used herein,
the term "resolution" of a multifunction device refers to the resolution
of a display in the multifunction device. The term "resolution" does not
imply any limitations on the size of each pixel or the spacing of pixels.
For example, compared to a first display with a 1024×768-pixel
resolution, a second display with a 320×480-pixel resolution has a
lower resolution. However, it should be noted that the physical size of a
display depends not only on the pixel resolution, but also on many other
factors, including the pixel size and the spacing of pixels. Therefore,
the first display may have the same, smaller, or larger physical size,
compared to the second display.

[0022] As used herein, the term "video resolution" of a display refers to
the density of pixels along each axis or in each dimension of the
display. The video resolution is often measured in a dots-per-inch (DPI)
unit, which counts the number of pixels that can be placed in a line
within the span of one inch along a respective dimension of the display.
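
As a hedged illustration of the computation implied by this definition (not text from the specification), in Swift:

    // Video resolution (dots per inch) along one dimension: the pixel
    // count divided by the physical span of that dimension in inches.
    func dotsPerInch(pixelCount: Int, physicalInches: Double) -> Double {
        return Double(pixelCount) / physicalInches
    }

    // Example: 480 pixels spanning 3 inches gives a video resolution of 160 dpi.
    // dotsPerInch(pixelCount: 480, physicalInches: 3.0) == 160.0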

[0023] Embodiments of computing devices, user interfaces for such devices,
and associated processes for using such devices are described. In some
embodiments, the computing device is a portable communications device,
such as a mobile telephone, that also contains other functions, such as
PDA and/or music player functions. Exemplary embodiments of portable
multifunction devices include, without limitation, the iPhone® and
iPod Touch® devices from Apple Inc. of Cupertino, Calif. Other
portable devices, such as laptops or tablet computers with
touch-sensitive surfaces (e.g., touch screen displays and/or touch pads),
may also be used. It should also be understood that, in some embodiments,
the device is not a portable communications device, but is a desktop
computer with a touch-sensitive surface (e.g., a touch screen display
and/or a touch pad).

[0024] In the discussion that follows, a computing device that includes a
display and a touch-sensitive surface is described. It should be
understood, however, that the computing device may include one or more
other physical user-interface devices, such as a physical keyboard, a
mouse and/or a joystick.

[0025] The device supports a variety of applications, such as one or more
of the following: a drawing application, a presentation application, a
word processing application, a website creation application, a disk
authoring application, a spreadsheet application, a gaming application, a
telephone application, a video conferencing application, an e-mail
application, an instant messaging application, a workout support
application, a photo management application, a digital camera
application, a digital video camera application, a web browsing
application, a digital music player application, and/or a digital video
player application.

[0026] The various applications that may be executed on the device may use
at least one common physical user-interface device, such as the
touch-sensitive surface. One or more functions of the touch-sensitive
surface as well as corresponding information displayed on the device may
be adjusted and/or varied from one application to the next and/or within
a respective application. In this way, a common physical architecture
(such as the touch-sensitive surface) of the device may support the
variety of applications with user interfaces that are intuitive and
transparent to the user.

[0027] The user interfaces may include one or more soft keyboard
embodiments. The soft keyboard embodiments may include standard (QWERTY)
and/or non-standard configurations of symbols on the displayed icons of
the keyboard, such as those described in U.S. patent application Ser.
Nos. 11/459,606, "Keyboards For Portable Electronic Devices," filed Jul.
24, 2006, and 11/459,615, "Touch Screen Keyboards For Portable Electronic
Devices," filed Jul. 24, 2006, the contents of which are hereby
incorporated by reference in their entireties. The keyboard embodiments
may include a reduced number of icons (or soft keys) relative to the
number of keys in existing physical keyboards, such as that for a
typewriter. This may make it easier for users to select one or more icons
in the keyboard, and thus, one or more corresponding symbols. The
keyboard embodiments may be adaptive. For example, displayed icons may be
modified in accordance with user actions, such as selecting one or more
icons and/or one or more corresponding symbols. One or more applications
on the device may utilize common and/or different keyboard embodiments.
Thus, the keyboard embodiment used may be tailored to at least some of
the applications. In some embodiments, one or more keyboard embodiments
may be tailored to a respective user. For example, one or more keyboard
embodiments may be tailored to a respective user based on a word usage
history (lexicography, slang, individual usage) of the respective user.
Some of the keyboard embodiments may be adjusted to reduce a probability
of a user error when selecting one or more icons, and thus one or more
symbols, when using the soft keyboard embodiments.

[0028] Attention is now directed toward embodiments of portable devices
with touch-sensitive displays. FIGS. 1A and 1B are block diagrams
illustrating portable multifunction devices 100 with touch-sensitive
displays 112 in accordance with some embodiments. Touch-sensitive display
112 is sometimes called a "touch screen" for convenience, and may also be
known as or called a touch-sensitive display system. Device 100 may
include memory 102 (which may include one or more computer readable
storage mediums), memory controller 122, one or more processing units
(CPU's) 120, peripherals interface 118, RF circuitry 108, audio circuitry
110, speaker 111, microphone 113, input/output (I/O) subsystem 106, other
input or control devices 116, and external port 124. Device 100 may
include one or more optical sensors 164. These components may communicate
over one or more communication buses or signal lines 103.

[0029] It should be appreciated that device 100 is only one example of a
portable multifunction device, and that device 100 may have more or fewer
components than shown, may combine two or more components, or may have a
different configuration or arrangement of the components. The various
components shown in FIGS. 1A and 1B may be implemented in hardware,
software, or a combination of both hardware and software, including one
or more signal processing and/or application specific integrated
circuits.

[0030] Memory 102 may include high-speed random access memory and may also
include non-volatile memory, such as one or more magnetic disk storage
devices, flash memory devices, or other non-volatile solid-state memory
devices. Access to memory 102 by other components of device 100, such as
CPU 120 and the peripherals interface 118, may be controlled by memory
controller 122.

[0031] Peripherals interface 118 can be used to couple input and output
peripherals of the device to CPU 120 and memory 102. The one or more
processors 120 run or execute various software programs and/or sets of
instructions stored in memory 102 to perform various functions for device
100 and to process data.

[0032] In some embodiments, peripherals interface 118, CPU 120, and memory
controller 122 may be implemented on a single chip, such as chip 104. In
some other embodiments, they may be implemented on separate chips.

[0033] RF (radio frequency) circuitry 108 receives and sends RF signals,
also called electromagnetic signals. RF circuitry 108 converts electrical
signals to/from electromagnetic signals and communicates with
communications networks and other communications devices via the
electromagnetic signals. RF circuitry 108 may include well-known
circuitry for performing these functions, including but not limited to an
antenna system, an RF transceiver, one or more amplifiers, a tuner, one
or more oscillators, a digital signal processor, a CODEC chipset, a
subscriber identity module (SIM) card, memory, and so forth. RF circuitry
108 may communicate with networks, such as the Internet, also referred to
as the World Wide Web (WWW), an intranet and/or a wireless network, such
as a cellular telephone network, a wireless local area network (LAN)
and/or a metropolitan area network (MAN), and other devices by wireless
communication. The wireless communication may use any of a plurality of
communications standards, protocols and technologies, including but not
limited to Global System for Mobile Communications (GSM), Enhanced Data
GSM Environment (EDGE), high-speed downlink packet access (HSDPA),
wideband code division multiple access (W-CDMA), code division multiple
access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless
Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or
IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol
for e-mail (e.g., Internet message access protocol (IMAP) and/or post
office protocol (POP)), instant messaging (e.g., extensible messaging and
presence protocol (XMPP), Session Initiation Protocol for Instant
Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging
and Presence Service (IMPS)), and/or Short Message Service (SMS), or any
other suitable communication protocol, including communication protocols
not yet developed as of the filing date of this document.

[0034] Audio circuitry 110, speaker 111, and microphone 113 provide an
audio interface between a user and device 100. Audio circuitry 110
receives audio data from peripherals interface 118, converts the audio
data to an electrical signal, and transmits the electrical signal to
speaker 111. Speaker 111 converts the electrical signal to human-audible
sound waves. Audio circuitry 110 also receives electrical signals
converted by microphone 113 from sound waves. Audio circuitry 110
converts the electrical signal to audio data and transmits the audio data
to peripherals interface 118 for processing. Audio data may be retrieved
from and/or transmitted to memory 102 and/or RF circuitry 108 by
peripherals interface 118. In some embodiments, audio circuitry 110 also
includes a headset jack (e.g., 212, FIG. 2). The headset jack provides an
interface between audio circuitry 110 and removable audio input/output
peripherals, such as output-only headphones or a headset with both output
(e.g., a headphone for one or both ears) and input (e.g., a microphone).

[0035] I/O subsystem 106 couples input/output peripherals on device 100,
such as touch screen 112 and other input control devices 116, to
peripherals interface 118. I/O subsystem 106 may include display
controller 156 and one or more input controllers 160 for other input or
control devices. The one or more input controllers 160 receive/send
electrical signals from/to other input or control devices 116. The other
input control devices 116 may include physical buttons (e.g., push
buttons, rocker buttons, etc.), dials, slider switches, joysticks, click
wheels, and so forth. In some alternate embodiments, input controller(s)
160 may be coupled to any (or none) of the following: a keyboard,
infrared port, USB port, and a pointer device such as a mouse. The one or
more buttons (e.g., 208, FIG. 2) may include an up/down button for volume
control of speaker 111 and/or microphone 113. The one or more buttons may
include a push button (e.g., 206, FIG. 2). A quick press of the push
button may disengage a lock of touch screen 112 or begin a process that
uses gestures on the touch screen to unlock the device, as described in
U.S. patent application Ser. No. 11/322,549, "Unlocking a Device by
Performing Gestures on an Unlock Image," filed Dec. 23, 2005, which is
hereby incorporated by reference in its entirety. A longer press of the
push button (e.g., 206) may turn power to device 100 on or off. The user
may be able to customize a functionality of one or more of the buttons.
Touch screen 112 is used to implement virtual or soft buttons and one or
more soft keyboards.

[0036] Touch-sensitive display 112 provides an input interface and an
output interface between the device and a user. Display controller 156
receives and/or sends electrical signals from/to touch screen 112. Touch
screen 112 displays visual output to the user. The visual output may
include graphics, text, icons, video, and any combination thereof
(collectively termed "graphics"). In some embodiments, some or all of the
visual output may correspond to user-interface objects.

[0037] Touch screen 112 has a touch-sensitive surface, sensor or set of
sensors that accepts input from the user based on haptic and/or tactile
contact. Touch screen 112 and display controller 156 (along with any
associated modules and/or sets of instructions in memory 102) detect
contact (and any movement or breaking of the contact) on touch screen 112
and convert the detected contact into interaction with user-interface
objects (e.g., one or more soft keys, icons, web pages or images) that
are displayed on touch screen 112. In an exemplary embodiment, a point of
contact between touch screen 112 and the user corresponds to a finger of
the user.

[0038] Touch screen 112 may use LCD (liquid crystal display) technology,
LPD (light emitting polymer display) technology, or LED (light emitting
diode) technology, although other display technologies may be used in
other embodiments. Touch screen 112 and display controller 156 may detect
contact and any movement or breaking thereof using any of a plurality of
touch sensing technologies now known or later developed, including but
not limited to capacitive, resistive, infrared, and surface acoustic wave
technologies, as well as other proximity sensor arrays or other elements
for determining one or more points of contact with touch screen 112. In
an exemplary embodiment, projected mutual capacitance sensing technology
is used, such as that found in the iPhone® and iPod Touch® from
Apple Inc. of Cupertino, Calif.

[0039] A touch-sensitive display in some embodiments of touch screen 112
may be analogous to the multi-touch sensitive touchpads described in the
following U.S. Pat. Nos. 6,323,846 (Westerman et al.), 6,570,557
(Westerman et al.), and/or 6,677,932 (Westerman), and/or U.S. Patent
Publication 2002/0015024A1, each of which is hereby incorporated by
reference in its entirety. However, touch screen 112 displays visual
output from portable device 100, whereas touch sensitive touchpads do not
provide visual output.

[0041] Touch screen 112 may have a video resolution in excess of 100 dpi.
In some embodiments, the touch screen has a video resolution of
approximately 160 dpi. The user may make contact with touch screen 112
using any suitable object or appendage, such as a stylus, a finger, and
so forth. In some embodiments, the user interface is designed to work
primarily with finger-based contacts and gestures, which can be less
precise than stylus-based input due to the larger area of contact of a
finger on the touch screen. In some embodiments, the device translates
the rough finger-based input into a precise pointer/cursor position or
command for performing the actions desired by the user.

[0042] In some embodiments, in addition to the touch screen, device 100
may include a touchpad (not shown) for activating or deactivating
particular functions. In some embodiments, the touchpad is a
touch-sensitive area of the device that, unlike the touch screen, does
not display visual output. The touchpad may be a touch-sensitive surface
that is separate from touch screen 112 or an extension of the
touch-sensitive surface formed by the touch screen.

[0043] In some embodiments, device 100 may include a physical or virtual
wheel (e.g., a click wheel) as input control device 116. A user may
navigate among and interact with one or more graphical objects (e.g.,
icons) displayed in touch screen 112 by rotating the click wheel or by
moving a point of contact with the click wheel (e.g., where the amount of
movement of the point of contact is measured by its angular displacement
with respect to a center point of the click wheel). The click wheel may
also be used to select one or more of the displayed icons. For example,
the user may press down on at least a portion of the click wheel or an
associated button. User commands and navigation commands provided by the
user via the click wheel may be processed by input controller 160 as well
as one or more of the modules and/or sets of instructions in memory 102.
For a virtual click wheel, the click wheel and click wheel controller may
be part of touch screen 112 and display controller 156, respectively. For
a virtual click wheel, the click wheel may be either an opaque or
semitransparent object that appears and disappears on the touch screen
display in response to user interaction with the device. In some
embodiments, a virtual click wheel is displayed on the touch screen of a
portable multifunction device and operated by user contact with the touch
screen.
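
A minimal sketch, assuming a simple point type and a shared coordinate space (the names are hypothetical, not the patent's), of measuring such angular displacement about the click wheel's center point:

    import Foundation

    // A 2-D point on the touch surface (hypothetical minimal type).
    struct Point { var x: Double; var y: Double }

    // Angular displacement (radians) of a contact point that moves from
    // `previous` to `current`, measured about the click wheel's center.
    func angularDisplacement(previous: Point, current: Point, center: Point) -> Double {
        let previousAngle = atan2(previous.y - center.y, previous.x - center.x)
        let currentAngle = atan2(current.y - center.y, current.x - center.x)
        var delta = currentAngle - previousAngle
        // Normalize into (-pi, pi] so a small movement is never read as an
        // almost-full revolution in the opposite direction.
        if delta > Double.pi { delta -= 2 * Double.pi }
        if delta <= -Double.pi { delta += 2 * Double.pi }
        return delta
    }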

[0044] Device 100 also includes power system 162 for powering the various
components. Power system 162 may include a power management system, one
or more power sources (e.g., battery, alternating current (AC)), a
recharging system, a power failure detection circuit, a power converter
or inverter, a power status indicator (e.g., a light-emitting diode
(LED)) and any other components associated with the generation,
management and distribution of power in portable devices.

[0045] Device 100 may also include one or more optical sensors 164. FIGS.
1A and 1B show an optical sensor coupled to optical sensor controller 158
in I/O subsystem 106. Optical sensor 164 may include charge-coupled
device (CCD) or complementary metal-oxide semiconductor (CMOS)
phototransistors. Optical sensor 164 receives light from the environment,
projected through one or more lenses, and converts the light to data
representing an image. In conjunction with imaging module 143 (also
called a camera module), optical sensor 164 may capture still images or
video. In some embodiments, an optical sensor is located on the back of
device 100, opposite touch screen display 112 on the front of the device,
so that the touch screen display may be used as a viewfinder for still
and/or video image acquisition. In some embodiments, an optical sensor is
located on the front of the device so that the user's image may be
obtained for videoconferencing while the user views the other video
conference participants on the touch screen display. In some embodiments,
the position of optical sensor 164 can be changed by the user (e.g., by
rotating the lens and the sensor in the device housing) so that a single
optical sensor 164 may be used along with the touch screen display for
both video conferencing and still and/or video image acquisition.

[0046] Device 100 may also include one or more proximity sensors 166.
FIGS. 1A and 1B show proximity sensor 166 coupled to peripherals
interface 118. Alternately, proximity sensor 166 may be coupled to input
controller 160 in I/O subsystem 106. Proximity sensor 166 may perform as
described in U.S. patent application Ser. Nos. 11/241,839, "Proximity
Detector In Handheld Device"; 11/240,788, "Proximity Detector In Handheld
Device"; 11/620,702, "Using Ambient Light Sensor To Augment Proximity
Sensor Output"; 11/586,862, "Automated Response To And Sensing Of User
Activity In Portable Devices"; and 11/638,251, "Methods And Systems For
Automatic Configuration Of Peripherals," which are hereby incorporated by
reference in their entirety. In some embodiments, the proximity sensor
turns off and disables touch screen 112 when the multifunction device is
placed near the user's ear (e.g., when the user is making a phone call).

[0047] Device 100 may also include one or more accelerometers 168. FIGS.
1A and 1B show accelerometer 168 coupled to peripherals interface 118.
Alternately, accelerometer 168 may be coupled to an input controller 160
in I/O subsystem 106. Accelerometer 168 may perform as described in U.S.
Patent Publication No. 20050190059, "Acceleration-based Theft Detection
System for Portable Electronic Devices," and U.S. Patent Publication No.
20060017692, "Methods And Apparatuses For Operating A Portable Device
Based On An Accelerometer," both of which are incorporated by
reference herein in their entirety. In some embodiments, information is
displayed on the touch screen display in a portrait view or a landscape
view based on an analysis of data received from the one or more
accelerometers. Device 100 optionally includes, in addition to
accelerometer(s) 168, a magnetometer (not shown) and a GPS (or GLONASS or
other global navigation system) receiver (not shown) for obtaining
information concerning the location and orientation (e.g., portrait or
landscape) of device 100.
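
The referenced publications describe the actual analysis; purely as a hedged sketch of one simple approach (not the patent's method), the dominant gravity axis reported by the accelerometer can be used to pick a view orientation:

    enum InterfaceOrientation { case portrait, landscape }

    // Gravity acting mostly along the device's long (y) axis suggests the
    // device is upright (portrait); gravity acting mostly along the short
    // (x) axis suggests it is held sideways (landscape). Illustrative only.
    func orientation(fromGravityX x: Double, gravityY y: Double) -> InterfaceOrientation {
        return abs(y) >= abs(x) ? .portrait : .landscape
    }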

[0048] In some embodiments, the software components stored in memory 102
include operating system 126, communication module (or set of
instructions) 128, contact/motion module (or set of instructions) 130,
graphics module (or set of instructions) 132, text input module (or set
of instructions) 134, Global Positioning System (GPS) module (or set of
instructions) 135, and applications (or sets of instructions) 136.
Furthermore, in some embodiments memory 102 stores device/global internal
state 157, as shown in FIGS. 1A, 1B and 3. Device/global internal state
157 includes one or more of: active application state, indicating which
applications, if any, are currently active; display state, indicating
what applications, views or other information occupy various regions of
touch screen display 112; sensor state, including information obtained
from the device's various sensors and input control devices 116; and
location information concerning the device's location and/or attitude.

[0049] Operating system 126 (e.g., Darwin, RTXC, LINUX, UNIX, OS X,
WINDOWS, or an embedded operating system such as VxWorks) includes
various software components and/or drivers for controlling and managing
general system tasks (e.g., memory management, storage device control,
power management, etc.) and facilitates communication between various
hardware and software components.

[0050] Communication module 128 facilitates communication with other
devices over one or more external ports 124 and also includes various
software components for handling data received by RF circuitry 108 and/or
external port 124. External port 124 (e.g., Universal Serial Bus (USB),
FIREWIRE, etc.) is adapted for coupling directly to other devices or
indirectly over a network (e.g., the Internet, wireless LAN, etc.). In
some embodiments, the external port is a multi-pin (e.g., 30-pin)
connector that is the same as, or similar to and/or compatible with the
30-pin connector used on iPod (trademark of Apple Inc.) devices.

[0051] Contact/motion module 130 may detect contact with touch screen 112
(in conjunction with display controller 156) and other touch sensitive
devices (e.g., a touchpad or physical click wheel). Contact/motion module
130 includes various software components for performing various
operations related to detection of contact, such as determining if
contact has occurred (e.g., detecting a finger-down event), determining
if there is movement of the contact and tracking the movement across the
touch-sensitive surface (e.g., detecting one or more finger-dragging
events), and determining if the contact has ceased (e.g., detecting a
finger-up event or a break in contact). Contact/motion module 130
receives contact data from the touch-sensitive surface. Determining
movement of the point of contact, which is represented by a series of
contact data, may include determining speed (magnitude), velocity
(magnitude and direction), and/or an acceleration (a change in magnitude
and/or direction) of the point of contact. These operations may be
applied to single contacts (e.g., one finger contacts) or to multiple
simultaneous contacts (e.g., "multitouch"/multiple finger contacts). In
some embodiments, contact/motion module 130 and display controller 156
detect contact on a touchpad. In some embodiments, contact/motion module
130 and controller 160 detect contact on a click wheel.
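
A minimal sketch (with hypothetical types, not the module's actual data structures) of estimating velocity and speed from a series of contact data samples:

    // One sample of contact data: a position and the time (in seconds) at
    // which it was observed.
    struct ContactSample { var x: Double; var y: Double; var time: Double }

    // Velocity (units per second along x and y) of the point of contact,
    // estimated from the two most recent samples; a real implementation
    // may use more samples or filtering.
    func velocity(of samples: [ContactSample]) -> (dx: Double, dy: Double)? {
        guard samples.count >= 2 else { return nil }
        let a = samples[samples.count - 2]
        let b = samples[samples.count - 1]
        let dt = b.time - a.time
        guard dt > 0 else { return nil }
        return ((b.x - a.x) / dt, (b.y - a.y) / dt)
    }

    // Speed is the magnitude of the velocity vector.
    func speed(of samples: [ContactSample]) -> Double? {
        guard let v = velocity(of: samples) else { return nil }
        return (v.dx * v.dx + v.dy * v.dy).squareRoot()
    }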

[0052] Contact/motion module 130 may detect a gesture input by a user.
Different gestures on the touch-sensitive surface have different contact
patterns. Thus, a gesture may be detected by detecting a particular
contact pattern. For example, detecting a finger tap gesture includes
detecting a finger-down event followed by detecting a finger-up (lift
off) event at the same position (or substantially the same position) as
the finger-down event (e.g., at the position of an icon). As another
example, detecting a finger swipe gesture on the touch-sensitive surface
includes detecting a finger-down event followed by detecting one or more
finger-dragging events, and subsequently followed by detecting a
finger-up (lift off) event.
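
A sketch of this pattern-based detection, assuming a minimal sub-event type (positions and timing are omitted for brevity, so this is illustrative rather than the module's actual logic):

    // The low-level touch events produced by the contact/motion module
    // (hypothetical minimal form).
    enum TouchEvent { case fingerDown, fingerDrag, fingerUp }

    enum Gesture { case tap, swipe }

    // Finger-down followed directly by finger-up is a tap; finger-down
    // followed by one or more finger-dragging events and then finger-up
    // is a swipe.
    func recognizeGesture(from events: [TouchEvent]) -> Gesture? {
        guard events.first == .fingerDown, events.last == .fingerUp else { return nil }
        let dragCount = events.filter { $0 == .fingerDrag }.count
        return dragCount == 0 ? .tap : .swipe
    }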

[0053] Graphics module 132 includes various known software components for
rendering and displaying graphics on touch screen 112 or other display,
including components for changing the intensity of graphics that are
displayed. As used herein, the term "graphics" includes any object that
can be displayed to a user, including without limitation text, web pages,
icons (such as user-interface objects including soft keys), digital
images, videos, animations and the like.

[0054] In some embodiments, graphics module 132 stores data representing
graphics to be used. Each graphic may be assigned a corresponding code.
Graphics module 132 receives, from applications etc., one or more codes
specifying graphics to be displayed along with, if necessary, coordinate
data and other graphic property data, and then generates screen image
data to output to display controller 156.

[0055] Text input module 134, which may be a component of graphics module
132, provides soft keyboards for entering text in various applications
(e.g., contacts 137, e-mail 140, IM 141, browser 147, and any other
application that needs text input).

[0056] GPS module 135 determines the location of the device and provides
this information for use in various applications (e.g., to telephone 138
for use in location-based dialing, to camera 143 as picture/video
metadata, and to applications that provide location-based services such
as weather widgets, local yellow page widgets, and map/navigation
widgets).

[0079] In conjunction with RF circuitry 108, audio circuitry 110, speaker
111, microphone 113, touch screen 112, display controller 156, contact
module 130, graphics module 132, and text input module 134, telephone
module 138 may be used to enter a sequence of characters corresponding to
a telephone number, access one or more telephone numbers in address book
137, modify a telephone number that has been entered, dial a respective
telephone number, conduct a conversation and disconnect or hang up when
the conversation is completed. As noted above, the wireless communication
may use any of a plurality of communications standards, protocols and
technologies.

[0082] In conjunction with RF circuitry 108, touch screen 112, display
controller 156, contact module 130, graphics module 132, and text input
module 134, the instant messaging module 141 includes executable
instructions to enter a sequence of characters corresponding to an
instant message, to modify previously entered characters, to transmit a
respective instant message (for example, using a Short Message Service
(SMS) or Multimedia Message Service (MMS) protocol for telephony-based
instant messages or using XMPP, SIMPLE, or IMPS for Internet-based
instant messages), to receive instant messages and to view received
instant messages. In some embodiments, transmitted and/or received
instant messages may include graphics, photos, audio files, video files
and/or other attachments as are supported in a MMS and/or an Enhanced
Messaging Service (EMS). As used herein, "instant messaging" refers to
both telephony-based messages (e.g., messages sent using SMS or MMS) and
Internet-based messages (e.g., messages sent using XMPP, SIMPLE, or
IMPS).

[0087] In conjunction with touch screen 112, display system controller
156, contact module 130, graphics module 132, audio circuitry 110,
speaker 111, RF circuitry 108, and browser module 147, music player
module 146 includes executable instructions that allow the user to
download and play back recorded music and other sound files stored in one
or more file formats, such as MP3 or AAC files. In some embodiments,
device 100 may include the functionality of an MP3 player, such as an
iPod (trademark of Apple Inc.).

[0093] In conjunction with touch screen 112, display controller 156,
contact module 130, graphics module 132, and text input module 134, notes
module 153 includes executable instructions to create and manage notes,
to do lists, and the like in accordance with user instructions.

[0094] In conjunction with RF circuitry 108, touch screen 112, display
system controller 156, contact module 130, graphics module 132, text
input module 134, GPS module 135, and browser module 147, map module 154
may be used to receive, display, modify, and store maps and data
associated with maps (e.g., driving directions; data on stores and other
points of interest at or near a particular location; and other
location-based data) in accordance with user instructions.

[0095] In conjunction with touch screen 112, display system controller
156, contact module 130, graphics module 132, audio circuitry 110,
speaker 111, RF circuitry 108, text input module 134, e-mail client
module 140, and browser module 147, online video module 155 includes
instructions that allow the user to access, browse, receive (e.g., by
streaming and/or download), play back (e.g., on the touch screen or on an
external, connected display via external port 124), send an e-mail with a
link to a particular online video, and otherwise manage online videos in
one or more file formats, such as H.264. In some embodiments, instant
messaging module 141, rather than e-mail client module 140, is used to
send a link to a particular online video. Additional description of the
online video application can be found in U.S. Provisional Patent
Application No. 60/936,562, "Portable Multifunction Device, Method, and
Graphical User Interface for Playing Online Videos," filed Jun. 20, 2007,
and U.S. patent application Ser. No. 11/968,067, "Portable Multifunction
Device, Method, and Graphical User Interface for Playing Online Videos,"
filed Dec. 31, 2007, the contents of which are hereby incorporated by
reference in their entirety.

[0096] Each of the above identified modules and applications corresponds
to a set of executable instructions for performing one or more functions
described above and the methods described in this application (e.g., the
computer-implemented methods and other information processing methods
described herein). These modules (i.e., sets of instructions) need not be
implemented as separate software programs, procedures or modules, and
thus various subsets of these modules may be combined or otherwise
re-arranged in various embodiments. For example, video player module 145
may be combined with music player module 146 into a single module (e.g.,
video and music player module 152, FIG. 1B). In some embodiments, memory
102 may store a subset of the modules and data structures identified
above. Furthermore, memory 102 may store additional modules and data
structures not described above.

[0097] In some embodiments, device 100 is a device where operation of a
predefined set of functions on the device is performed exclusively
through a touch screen and/or a touchpad. By using a touch screen and/or
a touchpad as the primary input control device for operation of device
100, the number of physical input control devices (such as push buttons,
dials, and the like) on device 100 may be reduced.

[0098] The predefined set of functions that may be performed exclusively
through a touch screen and/or a touchpad include navigation between user
interfaces. In some embodiments, the touchpad, when touched by the user,
navigates device 100 to a main, home, or root menu from any user
interface that may be displayed on device 100. In such embodiments, the
touchpad may be referred to as a "menu button." In some other
embodiments, the menu button may be a physical push button or other
physical input control device instead of a touchpad.

[0099] FIG. 1C is a block diagram illustrating exemplary components for
event handling in accordance with some embodiments. In some embodiments,
memory 102 (in FIGS. 1A and 1B) or 370 (FIG. 3) includes event sorter 170
(e.g., in operating system 126) and a respective application 136-1 (e.g.,
any of the aforementioned applications 137-151, 155, 380-390).

[0100] Event sorter 170 receives event information and determines the
application 136-1 and application view 191 of application 136-1 to which
to deliver the event information. Event sorter 170 includes event monitor
171 and event dispatcher module 174. In some embodiments, application
136-1 includes application internal state 192, which indicates the
current application view(s) displayed on touch sensitive display 112 when
the application is active or executing. In some embodiments,
device/global internal state 157 is used by event sorter 170 to determine
which application(s) is(are) currently active, and application internal
state 192 is used by event sorter 170 to determine application views 191
to which to deliver event information.

[0101] In some embodiments, application internal state 192 includes
additional information, such as one or more of: resume information to be
used when application 136-1 resumes execution, user interface state
information that indicates information being displayed or that is ready
for display by application 136-1, a state queue for enabling the user to
go back to a prior state or view of application 136-1, and a redo/undo
queue of previous actions taken by the user.

[0103] In some embodiments, event monitor 171 sends requests to the
peripherals interface 118 at predetermined intervals. In response,
peripherals interface 118 transmits event information. In other
embodiments, peripherals interface 118 transmits event information only
when there is a significant event (e.g., receiving an input above a
predetermined noise threshold and/or for more than a predetermined
duration).

[0105] Hit view determination module 172 provides software procedures for
determining where a sub-event has taken place within one or more views,
when touch sensitive display 112 displays more than one view. Views are
made up of controls and other elements that a user can see on the
display.

[0106] Another aspect of the user interface associated with an application
is a set of views, sometimes herein called application views or user
interface windows, in which information is displayed and touch-based
gestures occur. The application views (of a respective application) in
which a touch is detected may correspond to programmatic levels within a
programmatic or view hierarchy of the application. For example, the
lowest level view in which a touch is detected may be called the hit
view, and the set of events that are recognized as proper inputs may be
determined based, at least in part, on the hit view of the initial touch
that begins a touch-based gesture.

[0107] Hit view determination module 172 receives information related to
sub-events of a touch-based gesture. When an application has multiple
views organized in a hierarchy, hit view determination module 172
identifies a hit view as the lowest view in the hierarchy which should
handle the sub-event. In most circumstances, the hit view is the lowest
level view in which an initiating sub-event occurs (i.e., the first
sub-event in the sequence of sub-events that form an event or potential
event). Once the hit view is identified by the hit view determination
module, the hit view typically receives all sub-events related to the
same touch or input source for which it was identified as the hit view.
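
A hedged sketch of this determination, using a hypothetical minimal view type with frames in one shared coordinate space (not the patent's actual data structures):

    // Hypothetical minimal geometry and view node types.
    struct Rect {
        var x, y, width, height: Double
        func contains(x px: Double, y py: Double) -> Bool {
            return px >= x && px < x + width && py >= y && py < y + height
        }
    }

    final class View {
        let frame: Rect
        let subviews: [View]
        init(frame: Rect, subviews: [View] = []) {
            self.frame = frame
            self.subviews = subviews
        }
    }

    // Hit-view determination: descend the hierarchy and return the lowest
    // (deepest) view whose frame contains the location of the initiating
    // sub-event; if no subview contains it, the current view is the hit view.
    func hitView(in root: View, x: Double, y: Double) -> View? {
        guard root.frame.contains(x: x, y: y) else { return nil }
        for subview in root.subviews {
            if let hit = hitView(in: subview, x: x, y: y) {
                return hit   // a deeper view takes precedence over its ancestors
            }
        }
        return root
    }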

[0108] Active event recognizer determination module 173 determines which
view or views within a view hierarchy should receive a particular
sequence of sub-events. In some embodiments, active event recognizer
determination module 173 determines that only the hit view should receive
a particular sequence of sub-events. In other embodiments, active event
recognizer determination module 173 determines that all views that
include the physical location of a sub-event are actively involved views,
and therefore determines that all actively involved views should receive
a particular sequence of sub-events. In other embodiments, even if touch
sub-events were entirely confined to the area associated with one
particular view, views higher in the hierarchy would still remain as
actively involved views.

[0110] In some embodiments, operating system 126 includes event sorter
170. Alternatively, application 136-1 includes event sorter 170. In yet
other embodiments, event sorter 170 is a stand-alone module, or a part of
another module stored in memory 102, such as contact/motion module 130.

[0111] In some embodiments, application 136-1 includes a plurality of
event handlers 190 and one or more application views 191, each of which
includes instructions for handling touch events that occur within a
respective view of the application's user interface. Each application
view 191 of the application 136-1 includes one or more event recognizers
180. Typically, a respective application view 191 includes a plurality of
event recognizers 180. In other embodiments, one or more of event
recognizers 180 are part of a separate module, such as a user interface
kit (not shown) or a higher level object from which application 136-1
inherits methods and other properties. In some embodiments, a respective
event handler 190 includes one or more of: data updater 176, object
updater 177, GUI updater 178, and/or event data 179 received from event
sorter 170. Event handler 190 may utilize or call data updater 176,
object updater 177 or GUI updater 178 to update the application internal
state 192. Alternatively, one or more of the application views 191
includes one or more respective event handlers 190. Also, in some
embodiments, one or more of data updater 176, object updater 177, and GUI
updater 178 are included in a respective application view 191.
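
As a hedged sketch of the composition described above (not the actual
event handler 190), the following Swift example shows an event handler
built from separate data, object, and GUI updaters that together update a
hypothetical application internal state. All type and property names are
illustrative assumptions.

    // Hypothetical stand-ins for application internal state and the three updaters.
    struct AppState {
        var records: [String: String] = [:]                    // data managed by the data updater
        var objectPositions: [String: (Double, Double)] = [:]  // objects managed by the object updater
        var needsRedraw = false                                 // flag consumed by the GUI update path
    }

    struct EventHandler {
        // The handler is composed of updaters rather than doing the work itself.
        let dataUpdater: (inout AppState, String, String) -> Void
        let objectUpdater: (inout AppState, String, (Double, Double)) -> Void
        let guiUpdater: (inout AppState) -> Void

        func handleDrag(of object: String, to position: (Double, Double), state: inout AppState) {
            objectUpdater(&state, object, position)    // move the dragged user-interface object
            dataUpdater(&state, "lastDragged", object) // record the change in application data
            guiUpdater(&state)                         // ask the GUI layer to refresh
        }
    }

    var state = AppState()
    let handler = EventHandler(
        dataUpdater: { s, key, value in s.records[key] = value },
        objectUpdater: { s, name, pos in s.objectPositions[name] = pos },
        guiUpdater: { s in s.needsRedraw = true }
    )
    handler.handleDrag(of: "icon-7", to: (120, 40), state: &state)
    print(state.records, state.objectPositions, state.needsRedraw)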

[0113] Event receiver 182 receives event information from event sorter
170. The event information includes information about a sub-event, for
example, a touch or a touch movement. Depending on the sub-event, the
event information also includes additional information, such as location
of the sub-event. When the sub-event concerns motion of a touch the event
information may also include speed and direction of the sub-event. In
some embodiments, events include rotation of the device from one
orientation to another (e.g., from a portrait orientation to a landscape
orientation, or vice versa), and the event information includes
corresponding information about the current orientation (also called
device attitude) of the device.

[0114] Event comparator 184 compares the event information to predefined
event or sub-event definitions and, based on the comparison, determines
an event or sub-event, or determines or updates the state of an event or
sub-event. In some embodiments, event comparator 184 includes event
definitions 186. Event definitions 186 contain definitions of events
(e.g., predefined sequences of sub-events), for example, event 1 (187-1),
event 2 (187-2), and others. In some embodiments, sub-events in an event
187 include, for example, touch begin, touch end, touch movement, touch
cancellation, and multiple touching. In one example, the definition for
event 1 (187-1) is a double tap on a displayed object. The double tap,
for example, comprises a first touch (touch begin) on the displayed
object for a predetermined phase, a first lift-off (touch end) for a
predetermined phase, a second touch (touch begin) on the displayed object
for a predetermined phase, and a second lift-off (touch end) for a
predetermined phase. In another example, the definition for event 2
(187-2) is a dragging on a displayed object. The dragging, for example,
comprises a touch (or contact) on the displayed object for a
predetermined phase, a movement of the touch across touch-sensitive
display 112, and lift-off of the touch (touch end). In some embodiments,
the event also includes information for one or more associated event
handlers 190.
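
A minimal Swift sketch of the comparison described above follows,
assuming that event definitions are modeled as sequences of sub-events
and that the phase and timing constraints mentioned in the text are
omitted. The SubEvent and EventDefinition names and the collapsing rule
for repeated movements are assumptions made only for this example.

    // Hypothetical sub-event vocabulary and event definitions expressed as
    // sequences, in the spirit of event definitions 186.
    enum SubEvent { case touchBegin, touchEnd, touchMove, touchCancel }

    struct EventDefinition {
        let name: String
        let sequence: [SubEvent]
    }

    let definitions = [
        // Double tap: begin, end, begin, end (phase and timing constraints omitted).
        EventDefinition(name: "event 1: double tap",
                        sequence: [.touchBegin, .touchEnd, .touchBegin, .touchEnd]),
        // Drag: begin, movement, end; this sketch checks a representative shape only.
        EventDefinition(name: "event 2: drag",
                        sequence: [.touchBegin, .touchMove, .touchEnd]),
    ]

    // A toy comparator: collapse runs of movement sub-events, then look for an exact match.
    func recognize(_ observed: [SubEvent]) -> String {
        var collapsed: [SubEvent] = []
        for e in observed {
            if e == .touchMove && collapsed.last == .touchMove { continue }
            collapsed.append(e)
        }
        return definitions.first { $0.sequence == collapsed }?.name ?? "no match (event failed state)"
    }

    print(recognize([.touchBegin, .touchMove, .touchMove, .touchMove, .touchEnd]))  // drag
    print(recognize([.touchBegin, .touchEnd, .touchBegin, .touchEnd]))              // double tap
    print(recognize([.touchBegin, .touchCancel]))                                   // no match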

[0115] In some embodiments, event definition 187 includes a definition of
an event for a respective user-interface object. In some embodiments,
event comparator 184 performs a hit test to determine which
user-interface object is associated with a sub-event. For example, in an
application view in which three user-interface objects are displayed on
touch-sensitive display 112, when a touch is detected on touch-sensitive
display 112, event comparator 184 performs a hit test to determine which
of the three user-interface objects is associated with the touch
(sub-event). If each displayed object is associated with a respective
event handler 190, the event comparator uses the result of the hit test
to determine which event handler 190 should be activated. For example,
event comparator 184 selects an event handler associated with the
sub-event and the object triggering the hit test.

[0116] In some embodiments, the definition for a respective event 187 also
includes delayed actions that delay delivery of the event information
until after it has been determined whether the sequence of sub-events
does or does not correspond to the event recognizer's event type.

[0117] When a respective event recognizer 180 determines that the series
of sub-events do not match any of the events in event definitions 186,
the respective event recognizer 180 enters an event impossible, event
failed, or event ended state, after which it disregards subsequent
sub-events of the touch-based gesture. In this situation, other event
recognizers, if any, that remain active for the hit view continue to
track and process sub-events of an ongoing touch-based gesture.

[0118] In some embodiments, a respective event recognizer 180 includes
metadata 183 with configurable properties, flags, and/or lists that
indicate how the event delivery system should perform sub-event delivery
to actively involved event recognizers. In some embodiments, metadata 183
includes configurable properties, flags, and/or lists that indicate how
event recognizers may interact with one another. In some embodiments,
metadata 183 includes configurable properties, flags, and/or lists that
indicate whether sub-events are delivered to varying levels in the view
or programmatic hierarchy.

[0119] In some embodiments, a respective event recognizer 180 activates
event handler 190 associated with an event when one or more particular
sub-events of an event are recognized. In some embodiments, a respective
event recognizer 180 delivers event information associated with the event
to event handler 190. Activating an event handler 190 is distinct from
sending (and deferred sending) sub-events to a respective hit view. In
some embodiments, event recognizer 180 throws a flag associated with the
recognized event, and event handler 190 associated with the flag catches
the flag and performs a predefined process.

[0120] In some embodiments, event delivery instructions 188 include
sub-event delivery instructions that deliver event information about a
sub-event without activating an event handler. Instead, the sub-event
delivery instructions deliver event information to event handlers
associated with the series of sub-events or to actively involved views.
Event handlers associated with the series of sub-events or with actively
involved views receive the event information and perform a predetermined
process.

[0121] In some embodiments, data updater 176 creates and updates data used
in application 136-1. For example, data updater 176 updates the telephone
number used in contacts module 137, or stores a video file used in video
player module 145. In some embodiments, object updater 177 creates and
updates objects used in application 136-1. For example, object updater
177 creates a new user-interface object or updates the position of a
user-interface object. GUI updater 178 updates the GUI. For example, GUI
updater 178 prepares display information and sends it to graphics module
132 for display on a touch-sensitive display.

[0122] In some embodiments, event handler(s) 190 includes or has access to
data updater 176, object updater 177, and GUI updater 178. In some
embodiments, data updater 176, object updater 177, and GUI updater 178
are included in a single module of a respective application 136-1 or
application view 191. In other embodiments, they are included in two or
more software modules.

[0123] It shall be understood that the foregoing discussion regarding
event handling of user touches on touch-sensitive displays also applies
to other forms of user inputs to operate multifunction devices 100 with
input devices, not all of which are initiated on touch screens, e.g.,
coordinating mouse movement and mouse button presses with or without
single or multiple keyboard presses or holds, user movements, taps, drags,
scrolls, etc., on touch-pads, pen stylus inputs, movement of the device,
oral instructions, detected eye movements, biometric inputs, and/or any
combination thereof, which may be utilized as inputs corresponding to
sub-events which define an event to be recognized.

[0124] FIG. 2 illustrates a portable multifunction device 100 having a
touch screen 112 in accordance with some embodiments. The touch screen
may display one or more graphics within user interface (UI) 200. In this
embodiment, as well as others described below, a user may select one or
more of the graphics by making contact or touching the graphics, for
example, with one or more fingers 202 (not drawn to scale in the figure)
or one or more styluses 203 (not drawn to scale in the figure). In some
embodiments, selection of one or more graphics occurs when the user
breaks contact with the one or more graphics. In some embodiments, the
contact may include a gesture, such as one or more taps, one or more
swipes (from left to right, right to left, upward and/or downward) and/or
a rolling of a finger (from right to left, left to right, upward and/or
downward) that has made contact with device 100. In some embodiments,
inadvertent contact with a graphic may not select the graphic. For
example, a swipe gesture that sweeps over an application icon may not
select the corresponding application when the gesture corresponding to
selection is a tap.
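
By way of a hedged illustration of the tap-versus-swipe behavior just
described, the following Swift sketch selects a graphic only when the
contact that ends (lift-off) has moved less than a small threshold; the
10-point limit and the ContactSample type are illustrative assumptions.

    // Hypothetical classification of a finished contact: select only if the
    // finger barely moved (a tap), so a swipe that merely sweeps over a graphic
    // selects nothing.
    struct ContactSample { let x: Double; let y: Double }

    func selectionOnLiftOff(samples: [ContactSample], tapMovementLimit: Double = 10.0) -> Bool {
        guard let first = samples.first, let last = samples.last else { return false }
        let dx = last.x - first.x
        let dy = last.y - first.y
        let travel = (dx * dx + dy * dy).squareRoot()
        // Selection occurs when contact is broken, and only for tap-like contacts.
        return travel <= tapMovementLimit
    }

    let tap = [ContactSample(x: 100, y: 200), ContactSample(x: 102, y: 201)]
    let swipe = [ContactSample(x: 40, y: 200), ContactSample(x: 180, y: 205)]
    print(selectionOnLiftOff(samples: tap))    // true: the touched graphic is selected
    print(selectionOnLiftOff(samples: swipe))  // false: the swipe does not select the icon it crossed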

[0125] Device 100 may also include one or more physical buttons, such as
"home" or menu button 204. As described previously, menu button 204 may
be used to navigate to any application 136 in a set of applications that
may be executed on device 100. Alternatively, in some embodiments, the
menu button is implemented as a soft key in a GUI displayed on touch
screen 112.

[0126] In one embodiment, device 100 includes touch screen 112, menu
button 204, push button 206 for powering the device on/off and locking
the device, volume adjustment button(s) 208, Subscriber Identity Module
(SIM) card slot 210, head set jack 212, and docking/charging external
port 124. Push button 206 may be used to turn the power on/off on the
device by depressing the button and holding the button in the depressed
state for a predefined time interval; to lock the device by depressing
the button and releasing the button before the predefined time interval
has elapsed; and/or to unlock the device or initiate an unlock process.
In an alternative embodiment, device 100 also may accept verbal input for
activation or deactivation of some functions through microphone 113.
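
A minimal sketch of the press-duration behavior described above follows;
the 3-second threshold and the function names are assumptions made for
illustration, not values taken from the text.

    // Hypothetical mapping from how long push button 206 is held (in seconds)
    // to the resulting action: hold past a predefined interval to toggle power,
    // release earlier to lock.
    enum ButtonAction { case togglePower, lockDevice }

    func actionForPress(heldForSeconds duration: Double, threshold: Double = 3.0) -> ButtonAction {
        duration >= threshold ? .togglePower : .lockDevice
    }

    print(actionForPress(heldForSeconds: 4.2))  // togglePower: button held past the predefined interval
    print(actionForPress(heldForSeconds: 0.4))  // lockDevice: button released before the interval elapsed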

[0127] FIG. 3 is a block diagram of an exemplary multifunction device with
a display and a touch-sensitive surface in accordance with some
embodiments. Device 300 need not be portable. In some embodiments, device
300 is a laptop computer, a desktop computer, a tablet computer, a
multimedia player device, a navigation device, an educational device
(such as a child's learning toy), a gaming system, or a control device
(e.g., a home or industrial controller). Device 300 typically includes
one or more processing units (CPU's) 310, one or more network or other
communications interfaces 360, memory 370, and one or more communication
buses 320 for interconnecting these components. Communication buses 320
may include circuitry (sometimes called a chipset) that interconnects and
controls communications between system components. Device 300 includes
input/output (I/O) interface 330 comprising display 340, which is
typically a touch screen display. I/O interface 330 also may include a
keyboard and/or mouse (or other pointing device) 350 and touchpad 355.
Memory 370 includes high-speed random access memory, such as DRAM, SRAM,
DDR RAM or other random access solid state memory devices; and may
include non-volatile memory, such as one or more magnetic disk storage
devices, optical disk storage devices, flash memory devices, or other
non-volatile solid state storage devices. Memory 370 may optionally
include one or more storage devices remotely located from CPU(s) 310. In
some embodiments, memory 370 stores programs, modules, and data
structures analogous to the programs, modules, and data structures stored
in memory 102 of portable multifunction device 100 (FIG. 1), or a subset
thereof. Furthermore, memory 370 may store additional programs, modules,
and data structures not present in memory 102 of portable multifunction
device 100. For example, memory 370 of device 300 may store drawing
module 380, presentation module 382, word processing module 384, website
creation module 386, disk authoring module 388, and/or spreadsheet module
390, while memory 102 of portable multifunction device 100 (FIG. 1) may
not store these modules.

[0128] Each of the above identified elements in FIG. 3 may be stored in
one or more of the previously mentioned memory devices. Each of the above
identified modules corresponds to a set of instructions for performing a
function described above. The above identified modules or programs (i.e.,
sets of instructions) need not be implemented as separate software
programs, procedures or modules, and thus various subsets of these
modules may be combined or otherwise re-arranged in various embodiments.
In some embodiments, memory 370 may store a subset of the modules and
data structures identified above. Furthermore, memory 370 may store
additional modules and data structures not described above.

[0129] Attention is now directed towards embodiments of user interfaces
("UI") that may be implemented on portable multifunction device 100.

[0130] FIGS. 4A and 4B illustrate exemplary user interfaces for a menu of
applications on portable multifunction device 100 in accordance with some
embodiments. Similar user interfaces may be implemented on device 300. In
some embodiments, user interface 400A includes the following elements, or
a subset or superset thereof:
[0131] Signal strength indicator(s) 402 for wireless communication(s), such as cellular and [0132] Wi-Fi signals;
[0133] Time 404;
[0134] Bluetooth indicator 405;
[0135] Battery status indicator 406;
[0136] Tray 408 with icons for frequently used applications, such as:
[0137] Phone 138, which may include an indicator 414 of the number of missed calls or voicemail messages;
[0138] E-mail client 140, which may include an indicator 410 of the number of unread e-mails;
[0139] Browser 147; and
[0140] Music player 146; and
[0141] Icons for other applications, such as:
[0142] IM 141;
[0143] Image management 144;
[0144] Camera 143;
[0145] Video player 145;
[0146] Weather 149-1;
[0147] Stocks 149-2;
[0148] Workout support 142;
[0149] Calendar 148;
[0150] Calculator 149-3;
[0151] Alarm clock 149-4;
[0152] Dictionary 149-5; and
[0153] User-created widget 149-6.

[0154] In some embodiments, user interface 400B includes the following elements, or a subset or superset thereof:
[0155] 402, 404, 405, 406, 141, 148, 144, 143, 149-3, 149-2, 149-1, 149-4, 410, 414, 138, 140, and 147, as described above;
[0156] Map 154;
[0157] Notes 153;
[0158] Settings 412, which provides access to settings for device 100 and its various applications 136, as described further below;
[0159] Video and music player module 152, also referred to as iPod (trademark of Apple Inc.) module 152; and
[0160] Online video module 155, also referred to as YouTube (trademark of Google Inc.) module 155.

[0161] Device 100 or 300 can be rotated about an axis normal
(perpendicular) to the plane of the display (e.g., touch screen 112) from
landscape orientation to portrait orientation, or vice versa. When the
device is rotated, a user interface displayed on the display of the
device can be transformed to accommodate the change in orientation. The
transformation can include various transitions, such as fading in or out,
and adjustments to the user interface, such as stretching, scaling, and
re-wrapping, for example. Examples of user interface transitions and
transformations during device rotation are disclosed in U.S. patent
application Ser. No. 12/473,846, titled "Rotation Smoothing of a User
Interface," filed May 28, 2009, which is incorporated by reference herein
in its entirety.

[0162] Attention is now directed towards embodiments of user interfaces
("UI") and associated processes that may be implemented on a
multifunction device with a display and a touch-sensitive surface, such
as device 300 or portable multifunction device 100.

[0163] FIGS. 5A-5I illustrate exemplary user interfaces for performing
grid transformations during device rotation in accordance with some
embodiments. The user interfaces in these figures are used to illustrate
the processes described below, including the processes in FIGS. 6A-6C.

[0164] UI 500A (FIG. 5A) depicts a user interface displayed on a display
112 of device 100 in landscape orientation. UI 500A includes a status bar
502, which can include one or more elements (e.g., signal strength indicator
402, time 404, Bluetooth indicator 405, and battery indicator 406) that
indicate the status of device 100. UI 500A also includes a user interface
for an application, such as an online video application (e.g., online
video module 155). The application user interface includes a navigation
bar 504, content area 506, and a toolbar/tab bar 508 (hereinafter
"toolbar"). The navigation bar 504, among other things, shows a title of
the application or the particular screen within the application, and can
include user interface components for navigating to other screens in the
application. The toolbar 508 includes buttons or other user interface
components for performing particular functions within the application. In
some situations, the toolbar 508 can become a tab bar, where the buttons
in the tab bar can be used for navigating to particular "tabbed" screens
within the application.

[0165] The content area 506 includes one or more content items 510. In an
online video application, the content items 510 can represent online
videos, including thumbnail previews and other information associated
with the online videos.

[0166] The content items 510 are listed within the content area 506 in an
ordering based on, for example, a specified criterion (e.g., user rating,
date, alphabetical order, etc.). The content items 510 are arranged in a
two-dimensional (2-D) array or grid within the content area 506 according
to the list ordering. For example, in UI 500A, the content items 510 have
an ordering beginning from 510-A, 510-B, 510-C, and so on thru 510-L. In
the 2-D array, 510-A thru 510-D are in the first (topmost) row, then
510-E thru 510-H in the second row, and 510-I thru 510-L in the third
row. The content items 510 are listed horizontally and wrap to the next
row as needed to accommodate the width of the content area 506.
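
The wrapping just described can be sketched as follows: items stay in
their list order and are wrapped into rows, with the number of columns
derived from the width of the content area and an assumed item width.
The function name and the widths are illustrative only.

    // Items are kept in their sequenced order and wrapped into rows; the column
    // count follows from how many items fit across the content area.
    func wrapIntoGrid(items: [String], contentAreaWidth: Double, itemWidth: Double) -> [[String]] {
        let columns = max(1, Int(contentAreaWidth / itemWidth))
        var rows: [[String]] = []
        var index = 0
        while index < items.count {
            let end = min(index + columns, items.count)
            rows.append(Array(items[index..<end]))
            index = end
        }
        return rows
    }

    let items = ["510-A", "510-B", "510-C", "510-D", "510-E", "510-F",
                 "510-G", "510-H", "510-I", "510-J", "510-K", "510-L"]
    // Landscape: a wider content area fits four columns, giving three rows (as in FIG. 5A).
    print(wrapIntoGrid(items: items, contentAreaWidth: 480, itemWidth: 110))
    // Portrait: a narrower content area fits three columns, giving four rows (as in FIG. 5E).
    print(wrapIntoGrid(items: items, contentAreaWidth: 320, itemWidth: 100))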

[0167] UI 500B (FIG. 5B) depicts device 100 as rotated clockwise (arrows
512, FIG. 5A) from a landscape orientation to a portrait orientation. The
status bar 502 is rotating counterclockwise (arrow 514) and being
repositioned from the right side of the touch screen 112 (where the
status bar 502 would otherwise be without the repositioning) to the top
side of the touch screen 112 in the portrait orientation.

[0168] UI 500C (FIG. 5C) depicts the application user interface
(navigation bar 504, content area 506, and toolbar 508) in landscape
orientation rotating counterclockwise (arrow 516) and fading out of view
as the application user interface is transformed to accommodate the
portrait orientation. As the application user interface rotates, it
eventually fades out completely. The status bar 502 is repositioned at
the top side of the touch screen 112.

[0169] UI 500D (FIG. 5D) depicts the application user interface
(navigation bar 504, content area 506, and toolbar 508) in portrait
orientation rotating counterclockwise (arrow 518) and fading into view.
The content area 506 in portrait orientation includes content items 510
that have been rearranged to accommodate the width of touch screen 112 in
portrait orientation.

[0170] UI 500E (FIG. 5E) depicts the user interface transformed for the
portrait orientation of the device 100. Status bar 502 is on the top side
of the touch screen 112. The application user interface (navigation bar
504, content area 506, and toolbar 508) is in portrait orientation and
has been stretched and scaled to accommodate the portrait orientation.
The 2-D array of content items 510 in content area 506 was transformed to
accommodate the portrait orientation while maintaining the original
ordering. Thus, for example, the first row includes content items 510-A
thru 510-C, the second row content items 510-D thru 510-F, the third row
content items 510-G thru 510-I, and the fourth row content items 510-J
thru 510-L. The wrapping of the content items to the next row changed to
accommodate the changed width of the user interface.

[0171] UI 500F (FIG. 5F) depicts the device 100 as depicted in FIG. 5E
rotated clockwise (arrow 520, FIG. 5E) to landscape orientation. The
status bar 502 is rotating counterclockwise (arrow 522) and being
repositioned from the right side of the touch screen 112 to the top side
of the touch screen 112 in landscape orientation.

[0172] UI 500G (FIG. 5G) depicts the application user interface
(navigation bar 504, content area 506, and toolbar 508) in portrait
orientation rotating counterclockwise (arrow 524) and fading out of view
as the application user interface is transformed to accommodate the
landscape orientation. As the application user interface rotates, it
eventually fades out completely. The status bar 502 is repositioned at
the top side of the touch screen 112.

[0173] UI 500H (FIG. 5H) depicts the application user interface
(navigation bar 504, content area 506, and toolbar 508) in landscape
orientation rotating counterclockwise (arrow 526) and fading into view.
The content area 506 in landscape orientation includes content items 510
that have been rearranged to accommodate the width of touch screen 112 in
landscape orientation.

[0174] UI 500I (FIG. 5I) depicts the user interface transformed for the
landscape orientation of the device 100. Status bar 502 is on the top
side of the touch screen 112. The application user interface (navigation
bar 504, content area 506, and toolbar 508) is in landscape orientation
and has been stretched and scaled to accommodate the landscape
orientation. The 2-D array of content items 510 in content area 506 was
transformed to accommodate the landscape orientation while maintaining
the original ordering. Thus, for example, the first row includes content
items 510-A thru 510-D, the second row content items 510-E thru 510-H,
and the third row content items 510-I thru 510-L; the wrapping of the
content items to the next row changed to accommodate the changed width of
the user interface.

[0175] UI 500J and UI 500K (FIGS. 5J and 5K) illustrate an array of
application launch icons 136-1 to 136-17 in a landscape orientation and a
portrait orientation, respectively. UI 500J and UI 500K also include a
strip of frequently used application launch icons 136-18 to 136-21. The
grid transformation illustrated in FIGS. 5A-5I with respect to content
items 510 may be applied in an analogous manner to other arrays of user
interface components, such as application launch icons 136-1 to 136-17.
In some embodiments, while a first array is rotating, individual user
interface components (e.g., content items 510 in FIGS. 5A-5I or
application launch icons 136 in FIGS. 5J-5K) within the first array
cross-fade to respective user interface components in the second array.
For example, content item 510-E in FIG. 5A cross-fades to content item
510-D in FIG. 5E. Similarly, application launch icon 136-6 in FIG. 5J
cross-fades to application launch icon 136-5 in FIG. 5K.

[0176] In some embodiments, the rotations depicted in FIGS. 5B thru 5E (or
FIGS. 5F thru 5I) occur substantially simultaneously. In some other
embodiments, the rotations depicted in FIGS. 5B thru 5E (or FIGS. 5F thru
5I) occur in sequence. For example, the status bar 502 begins rotating
first, then the application user interface (navigation bar 504, content
area 506, and toolbar 508). However, even in the case where the rotations
occur in sequence, the rotations can occur at a speed such that users
perceive the rotations as occurring substantially simultaneously.

[0177] FIGS. 6A-6C are flow diagrams illustrating a method 600 of grid
transformation during device rotation in accordance with some
embodiments. The method 600 is performed at a multifunction device (e.g.,
device 300, FIG. 3, or portable multifunction device 100, FIG. 1) with a
display and a touch-sensitive surface. In some embodiments, the display
is a touch screen display and the touch-sensitive surface is on the
display. In some embodiments, the display is separate from the
touch-sensitive surface. Some operations in method 600 may be combined
and/or the order of some operations may be changed.

[0178] As described below, the method 600 provides an intuitive way to
transform a grid in a user interface during device rotation. The method
reduces the cognitive burden on a user when transforming a grid in a user
interface during device rotation, thereby creating a more efficient
human-machine interface. For battery-operated computing devices, enabling
a user to work with the applications faster and more efficiently
conserves power and increases the time between battery charges.

[0179] The device displays (602) a first 2-D array of a plurality of user
interface components on the display in a portrait orientation. For
example, in FIG. 5E, a plurality of content items 510 are displayed in
portrait orientation on the touch screen 112 of a device 100. The content
items 510 are arranged and displayed in a 2-D array having rows and
columns.

[0180] In some embodiments, respective user interface components are
associated with respective applications supported by the multifunction
device (604). For example, the user interface components can be icons for
applications displayed in a 2-D array, such as the application icons
shown in FIGS. 4A-4B and 5J-5K.

[0181] In some embodiments, the plurality of user interface components is
associated with content items in a single application supported by the
multifunction device (606). For example, in FIG. 5E, an online video
application (e.g., online video module 155) is displayed on the touch
screen 112 of device 100, and content items 510 are displayed in a 2-D
array in the content area 506 of the online video application.

[0182] The device detects (608) rotation of the display from the portrait
orientation to a landscape orientation with the one or more
accelerometers. For example, in FIG. 5F, device 100 is in landscape
orientation after being rotated about an axis perpendicular to the touch
screen 112 from portrait orientation as indicated by arrows 520 (FIG.
5E). The rotation of the device can be detected by the one or more
accelerometers 168 of device 100.

[0183] In some embodiments, the rotation is detected when the display is
rotated by more than a predefined number of degrees (610). For example,
in FIGS. 5E-5F, the rotation of device 100 from portrait orientation to
landscape orientation can be detected when the device 100 is rotated by
more than a predefined number of degrees (e.g., 60 degrees). As another
example, in FIGS. 5A-5B, the rotation of device 100 from landscape
orientation to portrait orientation can be detected when the device 100
is rotated by more than the predefined number of degrees.
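
As a hedged sketch (not the claimed detection logic), the following Swift
example derives a tilt angle about the axis normal to the screen from a
single accelerometer reading and only changes orientation once that angle
passes a threshold such as 60 degrees. The axis convention and the
hysteresis policy are assumptions made for this example.

    import Foundation

    enum Orientation { case portrait, landscape }

    // Gravity along the device's x and y axes gives a tilt angle about the axis
    // normal to the screen; the orientation flips only when the angle passes the
    // threshold, with a simple hysteresis band for the reverse direction.
    func orientation(gravityX: Double, gravityY: Double,
                     current: Orientation, thresholdDegrees: Double = 60) -> Orientation {
        // Angle of the gravity vector measured from the device's "down in portrait" axis.
        let angle = abs(atan2(gravityX, gravityY)) * 180 / .pi
        switch current {
        case .portrait:  return angle > thresholdDegrees ? .landscape : .portrait
        case .landscape: return angle < (90 - thresholdDegrees) ? .portrait : .landscape
        }
    }

    print(orientation(gravityX: 0.2, gravityY: 0.98, current: .portrait))  // portrait: tilt of about 12 degrees
    print(orientation(gravityX: 0.9, gravityY: 0.3, current: .portrait))   // landscape: tilt of about 72 degrees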

[0184] In response to detecting the rotation (612), the device rotates
(614) the first 2-D array of the plurality of user interface components
on the display about an axis that is normal to a front surface of the
display. For example, in FIG. 5G, in response to device 100 being rotated
from portrait to landscape orientation, the array of content items 510 is
rotated about an axis that is perpendicular to the front surface of the
touch screen 112.

[0185] Further in response to detecting the rotation (612), the device 100
replaces (616) the first 2-D array with a second 2-D array of the
plurality of user interface components on the display after the rotation
of the first 2-D array exceeds a predefined condition. For example, in
FIG. 5G, the 2-D array of content items 510 in portrait orientation is
rotated counterclockwise (arrow 524). When the rotation of the 2-D array
of content items 510 in portrait orientation exceeds a predefined
condition (e.g., the 2-D array has rotated at least a predefined amount,
such as 40, 45, or 50 degrees), the 2-D array of content items 510 in
portrait orientation is replaced with a 2-D array of content items 510 in
landscape orientation, as shown in FIG. 5H.
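
A minimal sketch of the mid-rotation swap described above: while the grid
animates through the turn, the portrait array is shown until the animated
angle passes a predefined amount (45 degrees here), after which the
landscape array is shown. The function name and the 90-degree sweep of
the example loop are illustrative assumptions.

    // The displayed array is chosen from the progress of the rotation animation;
    // past the threshold, the first (portrait) array is replaced by the second.
    func arrayToDisplay(animatedDegrees: Double, swapThreshold: Double = 45) -> String {
        animatedDegrees < swapThreshold ? "first 2-D array (portrait wrapping)"
                                        : "second 2-D array (landscape wrapping)"
    }

    for angle in stride(from: 0.0, through: 90.0, by: 30.0) {
        print("rotated \(angle) degrees -> \(arrayToDisplay(animatedDegrees: angle))")
    }
    // 0 and 30 degrees show the first array; 60 and 90 degrees show the second.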

[0186] In some embodiments, the first 2-D array has M rows and N columns
and each user interface component is located at a respective location in
the first 2-D array that has a unique pair of (row index, column index)
determined in accordance with a predefined sequencing algorithm; and the
second 2-D array has N rows and M columns and each user interface
component is located at a respective location in the second 2-D array
that has a unique pair of (row index, column index) determined in
accordance with the predefined sequencing algorithm (618). For example,
in FIG. 5E, the first 2-D array of content items 510 has four rows and
three columns. In FIG. 5I, the second 2-D array of content items 510 has
three rows and four columns. More generally, the numbers of rows and
columns in the 2-D arrays are based on the size of the content area 506,
the size of the content items 510 to be displayed, and the size of touch
screen 112. For example, in FIG. 5E, the content area 506 can display up
to four rows and three columns of content items 510 at a time. In FIG.
5I, the content area 506 can display up to three rows and four columns of
content items 510 at a time.

[0187] In some embodiments, within the first or second 2-D array, each
user interface component is located at a respective location in the
respective array with a unique (row index, column index) pair determined
in accordance with a predefined sequencing algorithm. For example, in
FIG. 5E or 5I, each content item 510 is located at a respective location
in the respective 2-D array, with each respective location having a
unique (row index, column index). As an example, content item 510-A has
index ([row] 1, [column] 1) in both FIGS. 5E and 5I. As another example,
content item 510-D has index (2, 1) in FIG. 5E and index (1, 4) in FIG.
5I. Examples of sequencing algorithms include sorting by one or more
specified criteria (e.g., user rating, date, alphabetical, etc.), and
user-specified sequencing (e.g., in FIG. 4A or 4B, the user can manually
rearrange and reposition the application icons within a 2-D array to
their preference). The (row index, column index) for the content items
are determined by a sequencing of the content items in accordance with a
sequencing algorithm. For example, in FIG. 5E, content item 510-A is
first in the sequence of content items 510 (e.g., based on user rating,
with one or more other criteria (e.g., date, alphabetical) as tiebreaker)
and is located at location (1, 1) in the array, content item 510-B is
second in the sequence and is located at location (1, 2), content item
510-C is third in the sequence and is located at location (1, 3), content
item 510-D is fourth in the sequence and is located at location (2, 1),
and so forth. In FIG. 5I, content item 510-A is first in the sequence of
content items 510 and is located at location (1, 1), content item 510-B
is second in the sequence and is located at location (1, 2), content item
510-C is third in the sequence and is located at location (1, 3), content
item 510-D is fourth in the sequence and is located at location (1, 4),
and so forth.
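
The worked example above amounts to a fixed 1-D sequence re-wrapped at a
different column count. Assuming simple row-major placement (one possible
sequencing, not necessarily the claimed algorithm), the (row index, column
index) of a component follows directly from its position in the sequence,
as the following Swift sketch shows.

    // Row-major placement: the k-th component (0-based) of the sequence lands at
    // row k / columns and column k % columns, reported 1-based to match the text.
    func position(ofSequenceIndex k: Int, columns: Int) -> (row: Int, column: Int) {
        (row: k / columns + 1, column: k % columns + 1)
    }

    let ordered = ["510-A", "510-B", "510-C", "510-D", "510-E", "510-F",
                   "510-G", "510-H", "510-I", "510-J", "510-K", "510-L"]

    let indexOfD = ordered.firstIndex(of: "510-D")!        // fourth in the sequence
    print(position(ofSequenceIndex: indexOfD, columns: 3)) // (row: 2, column: 1) in the 4x3 portrait array
    print(position(ofSequenceIndex: indexOfD, columns: 4)) // (row: 1, column: 4) in the 3x4 landscape array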

[0188] In some embodiments, M is different from N and, for at least one
respective user interface component in the plurality of user interface
components, its pair of (row index, column index) in the first 2-D array
is distinct from its pair of (row index, column index) in the second 2-D
array (620). For example, in FIG. 5E, the 2-D array in portrait
orientation has four rows and three columns, and in FIG. 5I the 2-D array
in landscape orientation has three rows and four columns. In FIG. 5E,
content item 510-D has the index (2, 1) in the 2-D array in portrait
orientation, and in FIG. 5I content item 510-D has the index (1, 4) in
the 2-D array in landscape orientation.

[0189] In some embodiments, the predefined sequencing algorithm is used to
combine the M rows of user interface components in the first 2-D array
into a 1-D array and divide the 1-D array into the N rows of user
interface components in the second 2-D array (622). For example, in FIG.
5E, the four rows of content items 510 can be combined into one 1-D
array, with the content items 510 in the 1-D array maintaining the
sequence as determined in accordance with the sequencing algorithm. The
1-D array can be divided into the three rows of content items 510 in FIG.
5I.

[0190] In some embodiments, the predefined sequencing algorithm is used to
combine the N columns of user interface components in the first 2-D array
into a 1-D array and divide the 1-D array into the M columns of user
interface components in the second 2-D array (624). For example, in FIG.
5E, the three columns of content items 510 can be combined into one 1-D
array, with the content items 510 in the 1-D array maintaining the
sequence as determined in accordance with the sequencing algorithm. The
1-D array can be divided into the four columns of content items 510 in
FIG. 5I.

[0191] In some embodiments, replacing the first 2-D array with the second
2-D array includes fading out the first 2-D array from the display while
rotating the first 2-D array and fading in the second 2-D array on the
display following the fade-out of the first 2-D array (626). For example,
in FIG. 5G, the 2-D array of content items 510 in portrait orientation
fades out as it is rotated counterclockwise (arrow 524). In FIG. 5H, the
2-D array of content items 510 in landscape orientation fades in as it is
rotated counterclockwise (arrow 526). The fade-in of the content items
510 in landscape orientation begins when the fade-out of the content
items 510 in portrait orientation is completed.

[0192] In some embodiments, replacing the first 2-D array with the second
2-D array includes fading out the first 2-D array from the display while
rotating the first 2-D array and fading in the second 2-D array on the
display while fading out the first 2-D array (628). For example, in FIG.
5G, the 2-D array of content items 510 in portrait orientation fades out
as it is rotated counterclockwise (arrow 524). In FIG. 5H, the 2-D array
of content items 510 in landscape orientation fades in as it is rotated
counterclockwise (arrow 526). The fade-in of the content items 510 in
landscape orientation begins while the fade-out of the content items 510
in portrait orientation is in progress.
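
The two fade schedules described in the preceding paragraphs can be
contrasted with a small sketch in which each array's opacity is a simple
ramp over the rotation's progress t in [0, 1]; the breakpoints used below
are illustrative assumptions, not values from the text.

    // Sequential schedule: the first array fades out over the first half of the
    // rotation, and the second array fades in only after that fade-out completes.
    func sequentialFade(_ t: Double) -> (firstArrayOpacity: Double, secondArrayOpacity: Double) {
        let out = max(0, 1 - t / 0.5)
        let fadeIn = t <= 0.5 ? 0 : min(1, (t - 0.5) / 0.5)
        return (out, fadeIn)
    }

    // Overlapping schedule: the fade-in begins while the fade-out is still in progress.
    func overlappingFade(_ t: Double) -> (firstArrayOpacity: Double, secondArrayOpacity: Double) {
        (max(0, 1 - t / 0.7), min(1, max(0, (t - 0.3) / 0.7)))
    }

    for t in [0.0, 0.25, 0.5, 0.75, 1.0] {
        print("t=\(t) sequential=\(sequentialFade(t)) overlapping=\(overlappingFade(t))")
    }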

[0194] Examples of fading the user interface and components thereof in or
out are described in U.S. patent application Ser. No. 12/473,846, titled
"Rotation Smoothing of a User Interface," which is incorporated by
reference in its entirety.

[0195] Further in response to detecting the rotation (612), the device
rotates (632) the second 2-D array of the plurality of user interface
components on the display until the second 2-D array of the plurality of
user interface components is in the landscape orientation. For example,
in FIG. 5H, the 2-D array of content items 510 in landscape orientation
rotates counterclockwise (arrow 526) as it fades in until the 2-D array
is upright in the landscape orientation, as depicted in FIG. 5I.

[0196] In some embodiments, the rotation direction of the first and second
2-D arrays relative to the axis that is normal to the front surface of
the display is opposite the rotation direction of the rotation of the
display (634). For example, in FIGS. 5E, 5G, and 5H, the device 100 is
rotated clockwise (arrows 520) and the 2-D arrays are rotated
counterclockwise (arrows 524, 526).

[0197] It should be appreciated that while method 600 describes a
portrait-to-landscape rotation, a method analogous to method 600 is
applicable to a landscape-to-portrait rotation (e.g., the rotation
depicted in FIGS. 5A thru 5E).

[0198] The operations in the information processing methods described
above may be implemented by running one or more functional modules in
information processing apparatus such as general purpose processors or
application specific chips. These modules, combinations of these modules,
and/or their combination with general hardware (e.g., as described above
with respect to FIGS. 1A, 1B and 3) are all included within the scope of
protection of the invention.

[0199] The operations described above with reference to FIGS. 6A-6C may be
implemented by components depicted in FIGS. 1A-1C. For example, detection
operation 608, rotation operations 614 and 632, and replacement operation
616 may be implemented by event sorter 170, event recognizer 180, and
event handler 190. Event monitor 171 in event sorter 170 detects a
contact on touch-sensitive display 112, and event dispatcher module 174
delivers the event information to application 136-1. A respective event
recognizer 180 of application 136-1 compares the event information to
respective event definitions 186, and determines whether a first contact
at a first location on the touch-sensitive surface (or whether rotation
of the device) corresponds to a predefined event or sub-event, such as
selection of an object on a user interface, or rotation of the device
from one orientation to another. When a respective predefined event or
sub-event is detected, event recognizer 180 activates an event handler
190 associated with the detection of the event or sub-event. Event
handler 190 may utilize or call data updater 176 or object updater 177 to
update the internal state of application 136-1 data. In some embodiments,
event handler 190 accesses a respective GUI updater 178 to update what is
displayed by the application. Similarly, it would be clear to a person
having ordinary skill in the art how other processes can be implemented
based on the components depicted in FIGS. 1A-1C.

[0200] The foregoing description, for purpose of explanation, has been
described with reference to specific embodiments. However, the
illustrative discussions above are not intended to be exhaustive or to
limit the invention to the precise forms disclosed. Many modifications
and variations are possible in view of the above teachings. The
embodiments were chosen and described in order to best explain the
principles of the invention and its practical applications, to thereby
enable others skilled in the art to best utilize the invention and
various embodiments with various modifications as are suited to the
particular use contemplated.