Virtual reality interface patents

Virtual reality methods and systems

Some aspects include a virtual reality device configured to present a virtual environment to a user. The virtual reality device comprises a tracking device including at least one camera to acquire image data, the tracking device, when worn by the user, being configured to determine a position associated with the user; and a stereoscopic display device configured to display at least a portion of a representation of the virtual environment, wherein the representation of the virtual environment is based, at least in part, on the determined position associated with the user, and wherein the display device and the tracking device are configured to be worn by the user.
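The stereoscopic display described above renders each eye from a slightly different position derived from the tracked head pose. A minimal sketch of that offset, assuming a yaw-only head orientation and a 64 mm interpupillary distance (both illustrative assumptions, not the patent's implementation):

```python
import numpy as np

# Sketch: offset each eye half the interpupillary distance (IPD) from the
# tracked head position, along the head's local right axis. The yaw-only
# orientation and 0.064 m default IPD are illustrative assumptions.
def eye_positions(head_pos, head_yaw, ipd=0.064):
    right = np.array([np.cos(head_yaw), 0.0, -np.sin(head_yaw)])
    half = ipd / 2.0
    head = np.asarray(head_pos, dtype=float)
    return head - right * half, head + right * half
```

Each returned position would seed one eye's view matrix, so the displayed portion of the virtual environment follows the determined position of the user.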

Phone control and presence in virtual reality

In one general aspect, a method can include receiving, by a first computing device from a virtual reality (VR) headset, data indicative of a position of a second computing device, rendering, by the first computing device, an aspect of the second computing device for inclusion in a VR space based on the position of the second computing device, and integrating the rendered aspect of the second computing device with content for display as integrated content in the VR space. The method can further include providing the integrated content to the VR headset for display on a screen included in the VR headset, receiving data indicative of an interaction of a user with the second computing device, and based on the received data indicative of the interaction of the user with the second computing device, altering the content for display as integrated content in the VR space.
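The method steps above can be sketched as two small functions: one that integrates a rendered aspect of the phone into the VR content at its reported position, and one that alters the content when an interaction arrives. The function names and the dict-based "VR space" are assumptions for illustration only:

```python
# Illustrative sketch of the described flow; names and data shapes are
# assumptions, not the patent's actual implementation.
def integrate(vr_content, phone_position, phone_aspect):
    """Place a rendered aspect of the phone into the VR content at the
    reported position, returning the integrated content."""
    frame = dict(vr_content)
    frame["overlays"] = vr_content.get("overlays", []) + [
        {"aspect": phone_aspect, "position": phone_position}
    ]
    return frame

def on_interaction(frame, event):
    """Alter the integrated content based on a reported user interaction
    with the phone (e.g. so the renderer can update the phone's screen)."""
    updated = dict(frame)
    updated["last_event"] = event
    return updated
```

The integrated frame would then be provided to the VR headset for display.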

Tactile interface apparatus for providing physical feedback to a user

A tactile interface system provides a method for entering a user's position into a virtual reality and for providing tactile feedback from that virtual reality. The system allows the representation of the user's limb position and enables the user to feel interactions with objects represented in the virtual reality. The system also provides for the recording of tactile information to accompany audio and visual recordings. This application is a continuation of application Ser. No. 08/371,497, filed Jan. 11, 1995, now abandoned.

Depth and chroma information based coalescence of real world and virtual world images

Methods and systems for selectively merging real-world objects into a virtual environment are disclosed. The method may include: receiving a first input for rendering of a virtual environment, a second input for rendering of a real-world environment, and depth information regarding the rendering of the real-world environment; identifying at least one portion of the rendering of the real-world environment that is within a depth range and differentiable from a predetermined background; generating a merged rendering including the at least one portion of the rendering of the real-world environment into the rendering of the virtual environment; and displaying the merged rendering to a user.
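The identification step above combines two per-pixel tests: inside a depth range, and sufficiently different from a predetermined background color. A minimal sketch, assuming aligned RGB and depth frames and a simple per-pixel color distance (the function name and tolerance are illustrative):

```python
import numpy as np

def merge_real_into_virtual(virtual_rgb, real_rgb, real_depth,
                            depth_min, depth_max, bg_color, tol=30):
    """Composite real-world pixels into the virtual frame when they fall
    inside [depth_min, depth_max] AND differ from the predetermined
    background color by more than `tol` (sum of absolute RGB differences)."""
    in_range = (real_depth >= depth_min) & (real_depth <= depth_max)
    color_diff = np.abs(real_rgb.astype(int) - np.array(bg_color)).sum(axis=-1)
    mask = in_range & (color_diff > tol)
    out = virtual_rgb.copy()
    out[mask] = real_rgb[mask]
    return out
```

Pixels failing either test keep the virtual rendering, so only foreground real-world objects in the chosen depth band appear in the merged view.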

Virtual reality keyboard system and method

A system and method implement a virtual reality (VR) keyboard. The VR keyboard system and method receive a VR glove position, generate a corresponding key code from the VR glove position using a predetermined mapping, and send the key code to an application program as a key input corresponding to a keyboard and/or keypad entry of data and/or a command. The system and method also generate a display representing the key input based on the VR glove position. The display of the key input may include, but is not limited to, a depressed key, displayed in a VR headset, within a VR representation of a VR keyboard indicating the key input. The system and method implementing a virtual reality keyboard address and solve numerous difficulties of physical and/or hardware-based input devices and provide many diverse advantages in use and applications.
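The predetermined mapping from glove position to key code could be as simple as quantizing the fingertip coordinates into key-sized grid cells over a virtual key layout. A hypothetical sketch; the layout, cell size, and function name are assumptions, not the patent's mapping:

```python
# Hypothetical mapping from a glove fingertip position (metres) to a key
# code via a grid over a virtual key layout. Layout and cell size assumed.
VR_KEY_LAYOUT = [
    ["q", "w", "e"],
    ["a", "s", "d"],
]
CELL = 0.02  # metres per virtual key cell (assumed)

def glove_to_key(x, y, origin=(0.0, 0.0)):
    """Quantize the glove position into a grid cell and look up its key
    code; return None when the position falls outside the keyboard."""
    col = int((x - origin[0]) // CELL)
    row = int((y - origin[1]) // CELL)
    if 0 <= row < len(VR_KEY_LAYOUT) and 0 <= col < len(VR_KEY_LAYOUT[0]):
        return VR_KEY_LAYOUT[row][col]
    return None
```

The returned key code would be sent to the application program as the key input, and the matching key drawn depressed in the VR representation of the keyboard.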

Engine, system and method for providing three dimensional content and viewing experience for same

Methods and apparatus to provide user a somatosensory experience for thrill seeking jumping like activities

A method of providing a user somatosensory experience for “thrill seeking jumping like activities” (for purposes such as, but not limited to, training, games, or entertainment) involves: providing a means for changing the elevation of the user rapidly enough for the user to feel a difference in gravity, or “G force,” compared with a stationary situation, wherein the user's facing direction relative to the direction of movement and/or the user's pose can be changed, and the speed of elevation change provided by the means can be controlled or adjusted; and, while the user uses the elevation-changing means, using a computer-implemented virtual reality system to present a virtual reality environment to the user. The viewpoint and/or direction changes in the virtual reality environment are consistent with the position changes and/or self-motion of the user on the elevation-changing means, so that the user experiences the virtual reality environment under a “variable gravity” condition or “G-force” provided by the elevation-changing means in a synchronized way, such “variable gravity” condition enhancing the experience of the virtual reality environment. Users may also interact with the system by means such as a game controller or gestures, so that the output of the elevation means can be affected or changed by user input.
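The “variable gravity” feeling comes from vertical acceleration: accelerating downward makes the rider feel lighter, and accelerating downward at g gives momentary weightlessness. A small worked sketch of that physical relationship (the function name is illustrative; the physics is standard):

```python
G = 9.81  # standard gravity, m/s^2

def apparent_weight_factor(accel_up):
    """Fraction of normal body weight the rider feels when the platform
    accelerates upward at accel_up m/s^2. Zero acceleration gives normal
    weight (1.0); accelerating downward at g (-9.81) gives weightlessness."""
    return (G + accel_up) / G
```

Synchronizing the VR viewpoint's vertical motion with the elevation profile that produces these accelerations is what makes the visual and felt G-force experiences consistent.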

Sensory stimulus management in head mounted display

Methods, systems, and computer programs are presented for managing the sensory stimulus generated by a head mounted display (HMD). One method includes an operation for identifying the game state of a game being executed for display on a head mounted display (HMD). Further, the method includes an operation for determining the game intensity value for the game state based on user intensity ratings obtained from user play of the game. The user intensity ratings represent the level of sensory stimulus presented to users during the user play of the game via the HMDs of the respective users. Further, the method includes an operation for presenting the game intensity value to the HMD for rendering during execution of the game.
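Determining a game intensity value from user intensity ratings could be as simple as averaging the ratings collected per game state. A minimal sketch, assuming ratings arrive as (state, rating) pairs (the aggregation by plain mean is an assumption; the patent does not specify the formula):

```python
from collections import defaultdict

def game_intensity(ratings):
    """Average the user intensity ratings collected for each game state.
    `ratings` is an iterable of (state_id, rating) pairs; returns a dict
    mapping each state to its mean intensity value."""
    sums = defaultdict(lambda: [0.0, 0])
    for state, rating in ratings:
        sums[state][0] += rating
        sums[state][1] += 1
    return {state: total / count for state, (total, count) in sums.items()}
```

The resulting per-state value would be presented to the HMD for rendering during execution of the game.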

Systems and methods improve virtual reality and augmented reality functionality for mobile devices using radio frequency (RF) signals transmitted by a tracked device and received at four or more spatially separated antennae. These antennae are connected, wirelessly or through wired connections, to a base station. Through RF signal time of arrival information acquired at the antennae, the base station can continuously determine accurate position information of the tracked device, without lighting or line of sight limitations experienced by camera and other optical systems. As the position of the RF-transmitting tracked device is registered within a virtual environment produced by an interactive software program in communication with (or part of) the base station, the virtual viewpoint of the tracked device is controlled to reflect the relative position and orientation of the tracked device with respect to the virtual environment produced by the software program and displayed on a view screen.
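Position from time-of-arrival ranges at spatially separated antennae is a multilateration problem. A minimal 2-D sketch using the standard linearization (subtract the first range equation from the others, then least-squares); the function name is an assumption, and a real system would also solve for clock offset and work in 3-D:

```python
import numpy as np

def toa_position(anchors, dists):
    """Least-squares 2-D position of a transmitter from known antenna
    coordinates `anchors` and measured ranges `dists` (>= 3 antennas).
    Subtracting the first range equation removes the quadratic terms,
    leaving a linear system solved by least squares."""
    anchors = np.asarray(anchors, dtype=float)
    dists = np.asarray(dists, dtype=float)
    x1, y1 = anchors[0]
    d1 = dists[0]
    A = 2 * (anchors[1:] - anchors[0])
    b = (d1**2 - dists[1:]**2
         + (anchors[1:]**2).sum(axis=1) - x1**2 - y1**2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos
```

With four or more antennae the system is overdetermined, which is what lets the base station keep an accurate fix without line-of-sight constraints.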

Interactive virtual reality performance theater entertainment system

An interactive virtual reality (VR) performance theater and environment system (100) in which plural participants enjoy an entertainment and/or educational experience. The performance is viewed by the participants through VR display devices such as a head mounted display (HMD), wherein both an immersive graphical environment and live and/or pre-recorded performers are viewed. The participants exert limited control over the content and outcome of the performance by hand held input devices (16) having a plurality of buttons (80), and/or voice communications, while viewing the performers who are mixed within the immersive VR environment. Interconnected computers (102) and video and audio processing devices (106, 108) are used. A network (104) interconnects the computers, participants, and live/pre-recorded performers for video, audio, and graphics transmission.

Omni-directional treadmill

A treadmill having a track assembly that allows a user to walk or run in any arbitrary direction. A movable user support has a plurality of rotatable members that rotate about axes normal to the direction of movement of the user support. Separate power-driven mechanisms concurrently move the user support and rotate the members to provide omni-directional user movement. A control for the power-driven mechanisms is responsive to the directional orientation of the user on the user support, causing the user support to operate in the direction of the orientation of the user.
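The control described above drives the track against the user's walking direction so the user stays centred. A minimal sketch of that mapping from measured user heading and speed to belt velocity components (the coordinate convention and function name are assumptions):

```python
import math

def belt_drive(user_heading_deg, walk_speed):
    """Drive the omni-directional track opposite the user's heading so the
    user remains centred: returns (vx, vy) belt velocity components, with
    heading measured counter-clockwise from the +x axis in degrees."""
    theta = math.radians(user_heading_deg)
    return (-walk_speed * math.cos(theta), -walk_speed * math.sin(theta))
```

A real controller would low-pass the heading estimate and add a position term to re-centre the user gradually rather than cancel every step instantaneously.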

Weld training system and method

Systems and methods for a weld training system are provided. In particular, components of the weld training system may be removably disposed within an interior volume of a portable enclosure. The portable enclosure may be easy for a welding operator to transport to various training and/or recruiting locations. In some embodiments, the components of the weld training system include a weld training device, a work surface, a sensing device, a virtual reality interface, and processing circuitry.

System for creating images, videos and sounds in an omnidimensional virtual environment from real scenes using a set of cameras and depth sensors, and playback of images, videos and sounds in three-dimensional virtual environments using a head-mounted display and a movement sensor

The invention relates to a system for creating images, videos and sounds in an omnidimensional virtual environment from real scenes using a set of cameras and depth sensors, and to the playback of images, videos and sounds in three-dimensional virtual environments using a head-mounted display and a movement sensor. The system enables the creation and playback of images, videos, sounds and films in a virtual omnidirectional environment from real scenes, allowing the spectator to control autonomously the position and direction of viewing during playback of the digital file. The system comprises a set of image positioning cameras (2) including image cameras (8) parallel to depth cameras (9), sound capture devices (13) and an inner computer (18). The images of the inner environment formed by the image positioning cameras can thus be captured and positioned in a virtual environment by a central computer (35). An example is the playback of a virtual environment file by a head-mounted display (14) coupled to a movement sensor (15) and a computer program for decoding the data received from the movement sensor (15) and transmitting the data to the sound emitter and to the user display screen, with a remote depth control (19) that allows the spectator to determine autonomously the desired virtual position during playback of the files, giving the user the sensation of total immersion in the virtual environment.