
Abstract:

Orientation in an external reference frame is determined. An external-frame
acceleration for a device is determined, the external-frame acceleration
being in an external reference frame relative to the device. An
internal-frame acceleration for the device is determined, the
internal-frame acceleration being in an internal reference frame relative
to the device. An orientation of the device is determined based on a
comparison between a direction of the external-frame acceleration and a
direction of the internal-frame acceleration.

Claims:

1. A game system, comprising: a controller; a controller monitor; and an
orientation inferring subsystem configured to: determine an external-frame
acceleration of the controller from time-elapsed position information
received from the controller monitor, the external-frame acceleration
being in an external reference frame relative to the controller; determine
an internal-frame acceleration for the controller from acceleration
information received from the controller, the internal-frame acceleration
being in an internal reference frame relative to the controller;
and determine a coarse orientation of the controller based on a comparison
between a direction of the external-frame acceleration and a direction of
the internal-frame acceleration.

2. The game system of claim 1, where the controller includes an
acceleration-measuring subsystem configured to report acceleration
information to the orientation inferring subsystem.

3. The game system of claim 1, where the controller includes an
angular-motion measuring subsystem configured to report angular motion
information to the orientation inferring subsystem.

4. The game system of claim 3, where the angular-motion measuring
subsystem includes spaced-apart three-axis accelerometers.

5. The game system of claim 3, where the angular-motion measuring
subsystem includes a three-axis gyroscope.

6. The game system of claim 3, where the orientation inferring subsystem
is configured to update the coarse orientation based on the angular
motion information.

7. The game system of claim 1, where the controller monitor includes
stereo cameras.

8. The game system of claim 7, where the controller includes an infrared
light and the stereo cameras are configured to view the infrared light.

9. The game system of claim 1, where the orientation inferring subsystem
determines the external-frame acceleration as:

a = 2(X_0 - X_0')/(t_0 - t_-1)^2 + g

where: X_0 is a current position of the controller as observed by the
controller monitor at a time t_0; g is a gravitational acceleration;
X_0' is X_-1 + V(t_0 - t_-1), where: X_-1 is a previous position of the
controller as observed by the controller monitor at a previous time t_-1;
and V is (X_-1 - X_-2)/(t_-1 - t_-2), where: X_-2 is a more previous
position of the controller as observed by the controller monitor at a
more previous time t_-2.

10. The game system of claim 1, where the orientation inferring subsystem
uses an unscented Kalman filter to determine a unified estimate of
position and an absolute orientation of the controller.

11. A method of tracking an orientation of a game controller, the method
comprising: inferring a coarse orientation of the game controller by:
determining an external-frame acceleration for the game controller,
the external-frame acceleration being in an external reference frame
relative to the game controller; determining an internal-frame
acceleration for the game controller, the internal-frame acceleration
being in an internal reference frame relative to the game controller;
and determining an orientation of the game controller based on a
comparison between a direction of the external-frame acceleration and a
direction of the internal-frame acceleration; and updating the coarse
orientation of the game controller based on angular motion information
observed by the game controller.

12. The method of claim 11, where determining an external-frame
acceleration for the game controller includes translating motion
information for the game controller that is visually observed by a stereo
camera.

13. A method of inferring device orientation in an external reference
frame, the method comprising: determining an external-frame acceleration
for the device, the external-frame acceleration being in an external
reference frame relative to the device; determining an internal-frame
acceleration for the device, the internal-frame acceleration being in an
internal reference frame relative to the device; and determining an
orientation of the device based on a comparison between a direction of
the external-frame acceleration and a direction of the internal-frame
acceleration.

14. The method of claim 13, where determining an external-frame
acceleration for the device includes translating visually-observed motion
of the device.

15. The method of claim 14, where a stereo camera is used to visually
observe motion of the device.

16. The method of claim 13, where determining the internal-frame
acceleration for the device includes receiving internal-frame
acceleration information observed by the device.

17. The method of claim 13, further comprising updating the orientation of
the device based on angular motion information observed by the device.

18. The method of claim 13, where determining an external-frame
acceleration for the device includes receiving initial position
information for the device, the initial position information being in the
external reference frame relative to the device; and receiving subsequent
position information for the device, the subsequent position information
being in the external reference frame relative to the device.

19. The method of claim 13, where determining the external-frame
acceleration for the device includes calculating:

a = 2(X_0 - X_0')/(t_0 - t_-1)^2 + g

where: X_0 is a current position of the device in the external reference
frame at a time t_0; g is a gravitational acceleration; X_0' is
X_-1 + V(t_0 - t_-1), where: X_-1 is a previous position of the device in
the external reference frame at a previous time t_-1; and V is
(X_-1 - X_-2)/(t_-1 - t_-2), where: X_-2 is a more previous position of
the device in the external reference frame at a more previous time t_-2.

20. The method of claim 13, further comprising using an unscented Kalman
filter to determine a unified estimate of position and an absolute
orientation of the device.

Description:

BACKGROUND

[0001]A gyroscope can use angular momentum to assess a relative
orientation of a device in a frame of reference that is internal to that
device. However, even the most accurate gyroscopes available may
accumulate small orientation errors over time.

SUMMARY

[0002]This Summary is provided to introduce a selection of concepts in a
simplified form that are further described below in the Detailed
Description. This Summary is not intended to identify key features or
essential features of the claimed subject matter, nor is it intended to
be used to limit the scope of the claimed subject matter. Furthermore,
the claimed subject matter is not limited to implementations that solve
any or all disadvantages noted in any part of this disclosure.

[0003]Determining orientation in an external reference frame is disclosed
herein. An external-frame acceleration for a device is determined, the
external-frame acceleration being in an external reference frame relative
to the device. An internal-frame acceleration for the device is also
determined, the internal-frame acceleration being in an internal
reference frame relative to the device. An orientation of the device is
determined based on a comparison between a direction of the
external-frame acceleration and a direction of the internal-frame
acceleration.

BRIEF DESCRIPTION OF THE DRAWINGS

[0004]FIG. 1A schematically shows an orientation-determining computing
system in accordance with an embodiment of the present disclosure.

[0005]FIG. 1B schematically shows a position-determining computing system
in accordance with another embodiment of the present disclosure.

[0006]FIG. 2 shows an exemplary configuration of the
orientation-determining computing system of FIG. 1A.

[0007]FIG. 3 shows a comparison of an external-frame acceleration vector
and an internal-frame acceleration vector corresponding to the controller
orientation of FIG. 2.

[0008]FIG. 4 shows a process flow of an example method of tracking an
orientation of a game controller.

DETAILED DESCRIPTION

[0009]FIG. 1A shows an orientation-determining computing system 10
including a wand 12, a wand monitor 14 and an orientation inferring
subsystem 16. Orientation inferring subsystem 16 is configured to
determine an orientation of wand 12 in a frame of reference that is
external to the wand 12. In particular, the orientation inferring
subsystem 16 may infer a coarse orientation of the wand 12 in the
external reference frame by comparing acceleration information of the
wand 12 in the external reference frame with acceleration information of
the wand 12 in an internal reference frame.

[0010]The acceleration information in the external reference frame may be
assessed by wand monitor 14. The wand monitor 14 may be configured to
observe the wand 12 as the wand 12 moves relative to the wand monitor 14.
Such observations may be translated into an external-frame acceleration
for the wand. Any suitable technique may be used by the wand monitor 14
for observing the wand 12. As a nonlimiting example, the wand monitor 14
may be configured to visually observe the wand 12 with stereo cameras. In
some embodiments, the wand 12 may include a target 18 that facilitates
observation by the wand monitor 14.

[0011]The acceleration information in the internal reference frame may be
assessed by the wand 12. The wand 12 may be configured to sense wand
accelerations and report such sensed accelerations to orientation
inferring subsystem 16. In some embodiments, the wand may include an
acceleration-measuring subsystem 20 for measuring wand accelerations in a
frame of reference that is internal to the wand 12.

[0012]In addition to determining a coarse orientation of the wand 12 by
comparing wand accelerations in internal and external reference frames,
the orientation inferring subsystem 16 may update the coarse orientation
of the wand 12 based on angular motion information observed by the wand
12 itself. As such, the wand 12 may include an angular-motion measuring
subsystem 22 for measuring angular motion of the wand 12 in a frame of
reference that is internal to the wand. Even when such an angular-motion
measuring subsystem 22 is included, the coarse orientation inferred using
internal and external-frame accelerations may be used to limit errors
that may accumulate if only the angular-motion measuring subsystem 22 is
used.

[0013]The wand may be configured to serve a variety of different functions
in different embodiments without departing from the scope of this
disclosure. As a nonlimiting example, in some embodiments, computing
system 10 may be a game system in which wand 12 is a game controller
device for controlling various game functions. It is to be understood
that the orientation inferring methods described herein may additionally
and/or alternatively be applied to an orientation-determining computing
system other than a game system, and the wand need not be a game
controller in all embodiments.

[0014]Furthermore, it is to be understood that the arrangement shown in
FIG. 1A is exemplary, and other arrangements are within the scope of this
disclosure. As a nonlimiting example, FIG. 1B shows a
position-determining computing system 10' in accordance with another
embodiment of the present disclosure. Position-determining computing
system 10' includes a wand 12', a target monitor 14' and a position
inferring subsystem 16'. Position inferring subsystem 16' is configured
to determine a position of wand 12' in a frame of reference that is
external to the wand 12'. In particular, the position inferring subsystem
16' may infer a coarse position of the wand 12' in the external reference
frame by comparing orientation information of the wand 12' in the
external reference frame with acceleration information of the wand 12' in
an internal reference frame.

[0015]In some embodiments, target 18' may include one or more LEDs (e.g.,
infrared LEDs) positioned in a fixed location, such as near a television
or any other suitable location. In such embodiments, the wand 12' may
include a target monitor 14' configured to view the target 18' and deduce
an orientation of the wand based upon a relative position of the target
18' within the target monitor's field of view. Such information may be
used in cooperation with acceleration information measured by an
acceleration-measuring subsystem 20' and/or angular motion information
measured by an angular-motion measuring subsystem 22' to infer a coarse
position of the wand as discussed below with reference to inferring
coarse orientation.

[0016]In yet other embodiments, a wand may include both a target and a
target monitor, and/or both a target and a target monitor may be
positioned at one or more locations external to the wand. In other words,
the arrangements shown in FIGS. 1A and 1B may be at least partially
combined, thus enabling direct deduction of both wand position and wand
orientation, which may optionally be confirmed/verified with inferred
position and inferred orientation, as described herein. Further, it
should be understood that the relative positioning of targets, target
monitors, wand monitors, and other components described herein may be
varied from the specific examples provided herein without departing from
the scope of the present disclosure.

[0017]FIG. 2 shows an example game system 30 including a controller 32, a
controller monitor 34 including stereo cameras 36, and a gaming console
38 including an orientation inferring subsystem 40.

[0018]In such a game system 30, orientation inferring subsystem 40 is
configured to infer a coarse orientation of controller 32 in an external
reference frame relative to controller 32. In particular, the coarse
orientation of the controller 32 in a television's, or other display's,
reference frame may be inferred. The orientation inferring subsystem 40
infers the coarse orientation of the controller 32 by comparing
acceleration information from an external reference frame relative to the
controller 32 with acceleration information from an internal reference
frame relative to the controller 32.

[0019]In the illustrated embodiment, orientation inferring subsystem 40 is
configured to determine an external-frame acceleration of controller 32
using time-elapsed position information received from stereo cameras 36.
While shown placed near a television, it should be understood that stereo
cameras 36, or another wand/target monitor, may be placed in numerous
different positions without departing from the scope of this disclosure.

[0020]The stereo cameras may observe a target 41 in the form of an
infrared light on controller 32. The individual position of the target 41
in each camera's field of view may be cooperatively used to determine a
three-dimensional position of the target 41, and thus the controller 32,
at various times. Visually-observed initial position information and
subsequent position information may be used to calculate the
external-frame acceleration of the controller 32 using any suitable
technique.

[0021]The following technique is a nonlimiting example for using initial
position information and subsequent position information to determine an
external-frame acceleration of the controller. Taking X_0 to be a current
position of controller 32 as observed by controller monitor 34 at a time
t_0, and X_-1 to be a previous position of controller 32 as observed by
controller monitor 34 at a previous time t_-1, an expected position X_0'
for controller 32 at the current time t_0 can be calculated according to
the following equation,

X_0' = X_-1 + V(t_0 - t_-1).

[0022]Here, the velocity V is calculated from prior position information
as follows,

V = (X_-1 - X_-2)/(t_-1 - t_-2),

where X_-2 is a more previous position of the controller as observed by
the controller monitor at a more previous time t_-2.

[0023]If it is determined that the expected position X_0' is not equal to
the current position X_0, then the difference may be a result of
acceleration of controller 32. In such a case, the orientation inferring
subsystem 40 determines the external-frame acceleration of controller 32
at the current time t_0 to be given by the following,

a = 2(X_0 - X_0')/(t_0 - t_-1)^2 + g,

where g is a gravitational acceleration.
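For illustration only (this sketch is not part of the original disclosure), the calculation above can be expressed in code as follows; the function name, the NumPy dependency, and the assumed gravity vector are illustrative choices.

```python
import numpy as np

# Assumed gravity vector in the camera (external) frame, in m/s^2.
GRAVITY = np.array([0.0, -9.8, 0.0])

def external_frame_acceleration(x0, x1, x2, t0, t1, t2):
    """Estimate the external-frame acceleration from three observed positions.

    x0, x1, x2: positions observed at the current time t0, the previous
    time t1, and the more previous time t2, respectively.
    """
    v = (x1 - x2) / (t1 - t2)           # velocity V from the two prior samples
    x0_expected = x1 + v * (t0 - t1)    # expected position X_0' absent acceleration
    # Deviation from the expected position implies acceleration; gravity is
    # added so the result is comparable with raw accelerometer readings.
    return 2.0 * (x0 - x0_expected) / (t0 - t1) ** 2 + GRAVITY
```

For a controller moving at constant velocity, the observed and expected positions coincide and the estimate reduces to the gravity vector alone.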

[0024]Orientation inferring subsystem 40 is configured to determine an
internal-frame acceleration of controller 32 from acceleration
information received from controller 32. The controller 32 may obtain the
internal-frame acceleration in any suitable manner. For example, the
controller may include an acceleration-measuring subsystem configured to
report acceleration information to the orientation inferring subsystem
40. In some embodiments, the acceleration-measuring subsystem may be a
three-axis accelerometer 42 located proximate to the target 41, as
schematically shown in FIG. 2.

[0025]The orientation inferring subsystem 40 can determine a coarse
orientation of controller 32 based on a comparison between a direction of
the external-frame acceleration and a direction of the internal-frame
acceleration. FIG. 3 shows an example of such a comparison 50
corresponding to the controller movement shown in FIG. 2. Vector 52
represents the direction of the external-frame acceleration and vector 54
represents the direction of the internal-frame acceleration. The
misalignment between the external-frame acceleration and the
internal-frame acceleration can be resolved to find any difference
between the external reference frame and the internal reference frame.
Accordingly, an orientation of the controller 32 can be inferred in the
external frame of reference.
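One way to resolve the misalignment between the two acceleration directions is Rodrigues' rotation formula, sketched below; this is an illustrative approach, not the method specified by the disclosure. Note that a single vector pair leaves rotation about the acceleration axis unconstrained, which is consistent with the orientation being "coarse."

```python
import numpy as np

def rotation_between(internal_dir, external_dir):
    """Rotation matrix taking the internal-frame acceleration direction
    onto the external-frame acceleration direction (Rodrigues' formula)."""
    a = internal_dir / np.linalg.norm(internal_dir)
    b = external_dir / np.linalg.norm(external_dir)
    v = np.cross(a, b)        # rotation axis (unnormalized)
    c = np.dot(a, b)          # cosine of the angle between the two frames
    if np.isclose(c, 1.0):    # already aligned
        return np.eye(3)
    # Antiparallel vectors (c near -1) would need a dedicated
    # 180-degree case, omitted in this sketch.
    K = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])
    return np.eye(3) + K + K @ K / (1.0 + c)
```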

[0026]As a nonlimiting example, if stereo cameras 36 observe controller 32
accelerating due east without changing elevation or moving north/south;
and if acceleration-measuring subsystem 20 reports that controller 32
accelerates to the right, without moving up/down or front/back; then
orientation inferring subsystem 40 can infer that controller 32 is
pointing toward the north. The above is a simplified and somewhat
exaggerated scenario. In many usage scenarios, controller 32 will be
pointed substantially toward a television or other display, and any
relative misalignments between internal and external reference frames
will be less severe. Nonetheless, the orientation inferring methods
described herein may be used to assess a coarse orientation.

[0027]The assessed external-frame acceleration of controller 32 may differ
from the actual controller acceleration due to one or more of the
following factors: noise and error in the data visually-observed by
stereo cameras 36, noise and error in the accelerometer data, and/or
misalignment between the internal reference frame and the external
reference frame. However, an inferred coarse orientation of controller
32, which is found as described herein, is absolute, rather than
relative, and therefore does not accumulate error over time.

[0028]In some embodiments, orientation inferring subsystem 40 may be
further configured to update the coarse orientation of controller 32
based on angular motion information observed by controller 32. The
controller 32 may obtain the angular motion information in any suitable
manner. One such suitable manner includes obtaining the angular motion
information by means of an angular-motion measuring subsystem 44
configured to report angular motion information to the orientation
inferring subsystem 40. In some embodiments, the angular-motion measuring
subsystem may include spaced-apart three-axis accelerometers configured
to be used in combination to determine the angular motion of controller
32. As shown in FIG. 2, in such embodiments, one three-axis accelerometer
42 may be located at a head end of controller 32 and another three-axis
accelerometer 46 may be located at a tail end of controller 32, such that
subtracting a head acceleration direction obtained by the head
accelerometer 42 from a tail acceleration direction obtained by the tail
accelerometer 46 yields an orientation change of controller 32 in the
internal reference frame relative to controller 32. In other embodiments,
such an angular-motion measuring subsystem 44 may include a three-axis
gyroscope 48, which measures the angular velocity of controller 32; this
angular velocity can then be integrated over time to determine an angular
position.
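The gyroscope integration step can be sketched as below (an illustrative axis-angle update, not taken from the disclosure). Because each step compounds measurement noise, the integrated orientation drifts over time, which is why the inferred coarse orientation is useful as an absolute correction.

```python
import numpy as np

def integrate_gyro(orientation, omega, dt):
    """Advance a rotation-matrix orientation by one gyroscope sample.

    omega: angular velocity (rad/s) in the controller's internal frame.
    Small per-step errors accumulate, so an absolute coarse orientation
    is needed periodically to bound the drift.
    """
    angle = np.linalg.norm(omega) * dt
    if np.isclose(angle, 0.0):
        return orientation
    axis = omega / np.linalg.norm(omega)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    # Rodrigues' formula for the incremental rotation over dt.
    dR = np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * K @ K
    return orientation @ dR
```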

[0029]In between frames where a coarse orientation is available (e.g., if
target 41 does not move sufficient distance for detection by stereo
cameras 36), measurements from the angular-motion measuring subsystem 44
may accumulate error. A long period of very slow motion, as might well
happen when drawing, is the worst-case scenario. However, such a
situation is the best-case scenario for smoothing and filtering the
accelerometer data, because it is expected that a user will attempt to
draw smooth lines and curves.

[0031]FIG. 4 shows a process flow diagram of an example method 60 of
tracking an orientation of a game controller. Method 60 begins at 62 by
inferring a coarse orientation of the game controller. At 64, method 60
includes determining an external-frame acceleration for the game
controller, the external-frame acceleration being in an external
reference frame relative to the game controller. At 66, method 60
includes determining an internal-frame acceleration for the game
controller, the internal-frame acceleration being in an internal
reference frame relative to the game controller. At 68, method 60
includes determining an orientation of the game controller based on a
comparison between a direction of the external-frame acceleration and a
direction of the internal-frame acceleration, as explained above. Upon
inferring a coarse orientation of the game controller, method 60 may
optionally include, at 70, updating the coarse orientation of the game
controller based on angular motion information observed by the game
controller.

[0032]In some embodiments, an unscented Kalman filter may be used to
combine three-dimensional position tracking from stereo cameras, angular
velocity information from gyroscopes, and acceleration information from
accelerometers into a unified estimate of position and absolute
orientation of the device. An unscented Kalman filter may be appropriate
because of nonlinearities that may be introduced in the observation part
of the process model (i.e., using the orientation to correct
accelerometers). An extended Kalman filter may alternatively be used.

[0033]The Kalman filter approach combines the information provided from
all sensors and allows the introduction of (Gaussian) noise models for
each of the sensors. For example, any noise associated with position
estimates from the cameras can be incorporated directly into the model.
Similarly, the noise of the gyroscopes and accelerometers may be
represented by the model. By tuning each of these separately, the system
may favor the more reliable sensors without neglecting less reliable
sensors.

[0034]The Kalman state, state transition, and observation model are
described as follows, and the standard Kalman filter equations are used
thereafter. At each frame, the state is updated with the state transition
model, and predicted sensor values are computed from state estimates
given the observation model. After the filter is updated, an updated
position and orientation information is "read" from the updated state
vector.

[0035]The Kalman state {x, ẋ, ẍ, q, ω} includes information to be
represented and carried from frame to frame, and is described as follows:
[0036]x is a 3D position of the device (3-vector); [0037]ẋ is a velocity
of the device (3-vector); [0038]ẍ is an acceleration of the device
(3-vector); [0039]q is a device orientation (quaternion); and [0040]ω is
an angular velocity: change in yaw, pitch, and roll in the device
coordinate frame (3-vector).

[0041]Next, a state transition is used to advance the state to the next
time step based on process dynamics (velocity, acceleration, etc.). The
state transition is described mathematically as follows:

x' = x + ẋ

ẋ' = ẋ + ẍ

ẍ' = ẍ

q' = q·q(ω)

[0042]where: [0043]q(ω) is a quaternion formed from a change in
yaw, pitch, and roll.
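The state transition above can be sketched as follows, assuming a unit time step as in the equations; the function names and the (w, x, y, z) quaternion convention are illustrative assumptions, not part of the disclosure.

```python
import numpy as np

def quat_mul(q, r):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    w0, x0, y0, z0 = q
    w1, x1, y1, z1 = r
    return np.array([
        w0*w1 - x0*x1 - y0*y1 - z0*z1,
        w0*x1 + x0*w1 + y0*z1 - z0*y1,
        w0*y1 - x0*z1 + y0*w1 + z0*x1,
        w0*z1 + x0*y1 - y0*x1 + z0*w1,
    ])

def quat_from_omega(omega):
    """Quaternion q(omega) for the rotation by angular-velocity vector
    omega over one unit time step (axis-angle form)."""
    angle = np.linalg.norm(omega)
    if np.isclose(angle, 0.0):
        return np.array([1.0, 0.0, 0.0, 0.0])
    axis = omega / angle
    return np.concatenate([[np.cos(angle / 2.0)],
                           np.sin(angle / 2.0) * axis])

def state_transition(x, xd, xdd, q, omega):
    """One step of the constant-acceleration process model (unit dt):
    x' = x + xd, xd' = xd + xdd, xdd' = xdd, q' = q * q(omega)."""
    return x + xd, xd + xdd, xdd, quat_mul(q, quat_from_omega(omega)), omega
```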

[0044]Next the sensed values are "observed" from the state, as follows:
[0045]z is a 3D position from a stereo camera system (3-vector);
[0046]gyro are gyroscope values including change in yaw, pitch and roll
(3-vector); [0047]a is accelerometer values (3-vector); [0048]g is a
direction of gravity (3-vector);

[0053]where: [0054]R(q) is a rotation matrix formed from the quaternion
q.

[0055]The last equation is the focus, where the accelerometer values are
predicted by combining the effects of acceleration due to motion of the
device, the effect of gravity, and the absolute orientation of the
device. Discrepancies in the predicted values are then propagated back to
the state by way of the standard Kalman update equations.

[0056]It should be understood that the configurations and/or approaches
described herein are exemplary in nature, and that these specific
embodiments or examples are not to be considered in a limiting sense,
because numerous variations are possible. The specific routines or
methods described herein may represent one or more of any number of
processing strategies. As such, various acts illustrated may be performed
in the sequence illustrated, in other sequences, in parallel, or in some
cases omitted. Likewise, the order of the above-described processes may
be changed.

[0057]The subject matter of the present disclosure includes all novel and
nonobvious combinations and subcombinations of the various processes,
systems and configurations, and other features, functions, acts, and/or
properties disclosed herein, as well as any and all equivalents thereof.
Furthermore, U.S. Pat. No. 6,982,697 is hereby incorporated herein by
reference for all purposes.