
WO2020112499 - SYSTEMS AND METHODS FOR DETERMINING MOVEMENT OF A CONTROLLER WITH RESPECT TO AN HMD

Note: Text based on automatic Optical Character Recognition processes. Please use the PDF version for legal matters


CLAIMS

1. A system for tracking a hand-held controller relative to a head mounted display (HMD), comprising:

the hand-held controller;

the HMD configured to be worn on a head of a user, wherein the HMD is configured to display a plurality of image scenes having content generated by execution of an application, wherein the HMD has a lower region having a camera, wherein the camera has a capture view directed downward from the HMD, wherein the camera is configured to capture a plurality of images of the hand-held controller; and

a processing device associated with the HMD for receiving a plurality of images of a real-world environment surrounding the HMD and the plurality of images of the hand-held controller, wherein the processing device is configured to determine a position and movement of the HMD within the real-world environment from the plurality of images of the real-world environment, wherein the processing device is configured to determine a position of the hand-held controller relative to the HMD from the plurality of images of the hand-held controller, wherein the processing device is configured to modify a state output by execution of the application according to the position and movement of the HMD and changes in the position of the hand-held controller,

wherein the HMD is configured to update the plurality of image scenes displayed in the HMD according to the modification of the state.
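
The per-frame loop recited in claim 1 — estimate the HMD's pose, estimate the controller's position relative to the HMD, modify the application state from both, then re-render — can be sketched as below. This is an illustrative simplification, not the claimed implementation: `Pose`, `frame_update`, and the `cursor` state entry are hypothetical names, and orientation is reduced to a single yaw angle.

```python
from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class Pose:
    position: Vec3
    yaw: float  # orientation simplified to one heading angle (assumption)

def frame_update(state: dict,
                 hmd_pose: Pose,
                 controller_pos_rel_hmd: Vec3,
                 prev_controller_pos: Vec3) -> dict:
    """One display-frame update: modify the application state from the
    HMD's pose in the real-world environment and the change in the
    controller's position relative to the HMD, then return the state
    used to render the next image scene (claim 1, sketched)."""
    delta = tuple(c - p for c, p in
                  zip(controller_pos_rel_hmd, prev_controller_pos))
    new_state = dict(state)
    new_state["view_pose"] = hmd_pose        # drives the rendered scene
    cx, cy, cz = state["cursor"]
    dx, dy, dz = delta
    new_state["cursor"] = (cx + dx, cy + dy, cz + dz)  # controller-driven object
    return new_state
```

A caller would invoke `frame_update` once per captured frame, feeding it the pose estimates derived from the environment images and controller images; the state it returns is what the HMD's display is updated from.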

2. The system of claim 1, wherein the HMD includes a plurality of inertial sensors, wherein the plurality of inertial sensors of the HMD are configured to measure the position and an orientation of the HMD in the real-world environment, wherein the processing device is associated with the HMD for receiving a plurality of measurements of the orientation from the HMD to determine the orientation of the HMD, wherein the processing device is associated with the HMD for receiving one or more measurements of the position measured by the plurality of inertial sensors to determine the position of the HMD.

3. The system of claim 2, wherein the hand-held controller includes a plurality of inertial sensors, wherein the plurality of inertial sensors of the hand-held controller are configured to measure a position and an orientation of the hand-held controller in the real-world environment, wherein the processing device is associated with the hand-held controller for receiving a plurality of measurements of the orientation from the hand-held controller to determine the orientation of the hand-held controller, wherein the processing device is associated with the hand-held controller for receiving one or more measurements of the position measured by the plurality of inertial sensors to determine the position of the hand-held controller relative to the HMD.

4. The system of claim 1, wherein the camera has the capture view that is directed downward to capture a movement of the hand-held controller in an interaction zone, wherein the interaction zone is located below the camera.

5. The system of claim 4, wherein the interaction zone is a polygonal volume that extends from the camera towards a floor of the real-world environment.
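
The interaction zone of claims 4-5 — a polygonal volume extending from the downward-facing camera toward the floor — can be approximated as a point-in-pyramid test. This is a sketch under stated assumptions: the apex angle and depth values are illustrative, the vertical axis is y, and `in_interaction_zone` is a hypothetical helper, not a function of the claimed system.

```python
import math

def in_interaction_zone(point, camera,
                        half_angle_deg=35.0, max_depth=1.2):
    """Test whether a 3-D point lies inside a pyramidal volume whose
    apex is at the downward-facing camera and which opens toward the
    floor (claims 4-5, sketched). Axes: y is up; the zone extends in
    the -y direction below the camera."""
    dx = point[0] - camera[0]
    dy = point[1] - camera[1]
    dz = point[2] - camera[2]
    depth = -dy                      # how far below the camera the point is
    if depth <= 0 or depth > max_depth:
        return False                 # above the camera, or past the floor limit
    lateral = math.hypot(dx, dz)     # horizontal offset from the camera axis
    return lateral <= depth * math.tan(math.radians(half_angle_deg))
```

A tracker could use such a predicate to decide when the controller has left the downward camera's view and a fallback (inertial sensing, or a side camera as in later claims) should take over.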

6. The system of claim 1, wherein the HMD includes an eye mask, wherein the lower region is a bottom surface of the eye mask of the HMD, wherein the bottom surface is not connected to a top headband strap and is not configured to cover eyes of a user, wherein the eye mask has a front region, wherein the front region is configured to cover the eyes of the user, wherein the front region is located adjacent to an upper region of the HMD, wherein the upper region is connected to the top headband strap.

7. The system of claim 1, wherein the capture view is a field-of-view of a lens of the camera, wherein the field-of-view includes a portion of the real-world environment that is seen by the camera, wherein the portion has a horizontal range, a vertical range, and a diagonal range.

8. The system of claim 1, wherein the capture view is a field-of-view that includes a reference point within the real-world environment, wherein the reference point falls at an intersection of a first wall within the real-world environment, a second wall within the real-world environment, and a floor within the real-world environment.

9. The system of claim 1, wherein the capture view is a field-of-view that includes a reference point within the real-world environment, wherein the reference point falls at an intersection of a vertical surface of an object within the real-world environment and a horizontal surface of the object.

10. The system of claim 1, wherein the processing device is a game console that is coupled to the HMD via a wired or a wireless connection.

11. The system of claim 1, wherein the processing device is a server that is coupled to the HMD via a computer network.

12. The system of claim 1, wherein the capture view is a field-of-view that includes a reference point within the real-world environment, wherein the position and movement of the HMD with respect to the reference point is determined from the plurality of images of the real-world environment.

13. The system of claim 12, wherein the camera provides another reference point within the real-world environment, wherein the position of the hand-held controller is determined with reference to the other reference point of the HMD.
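
Claims 12-13 recite two chained reference frames: the HMD's pose is measured against a fixed real-world reference point, and the controller's position is measured against the camera on the HMD. Composing them yields the controller's position in world coordinates. The sketch below is a hedged illustration with orientation reduced to a yaw rotation about the vertical axis; a real system would use a full rotation matrix or quaternion.

```python
import math

def controller_world_position(hmd_position, hmd_yaw, controller_in_hmd):
    """Carry a controller position expressed in the HMD camera's frame
    into world coordinates, given the HMD's world position and heading
    (claims 12-13, sketched with yaw-only orientation)."""
    cx, cy, cz = controller_in_hmd
    c, s = math.cos(hmd_yaw), math.sin(hmd_yaw)
    # rotate about the vertical (y) axis, then translate by the HMD position
    wx = hmd_position[0] + c * cx + s * cz
    wy = hmd_position[1] + cy
    wz = hmd_position[2] - s * cx + c * cz
    return (wx, wy, wz)
```

With yaw zero the transform is a pure translation; turning the HMD by 90 degrees swings the camera-relative offset around the vertical axis accordingly.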

14. The system of claim 1, wherein the hand-held controller includes a plurality of inertial sensors, wherein the plurality of inertial sensors of the hand-held controller are configured to measure an acceleration of the hand-held controller in the real-world environment to output a measurement, wherein the processing device is associated with the hand-held controller for receiving the measurement of the acceleration from the hand-held controller, wherein the processing device is configured to determine whether the hand-held controller is outside the capture view of the camera, wherein the processing device is configured to apply the measurement of the acceleration to determine a change in the position of the hand-held controller upon determining that the hand-held controller is outside the capture view of the camera.
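
Applying an acceleration measurement to advance the controller's position, as claim 14 recites for the out-of-view case, amounts to dead reckoning: integrate acceleration into velocity, then velocity into position, each frame. The sketch below uses semi-implicit Euler integration and deliberately omits gravity compensation and drift correction, which any practical inertial tracker would need.

```python
def dead_reckon(position, velocity, accel, dt):
    """Advance a controller position from one inertial acceleration
    measurement over a frame interval `dt` (claim 14 fallback,
    sketched). Returns the updated (position, velocity) pair."""
    # integrate acceleration into velocity first (semi-implicit Euler),
    # then velocity into position, component by component
    velocity = tuple(v + a * dt for v, a in zip(velocity, accel))
    position = tuple(p + v * dt for p, v in zip(position, velocity))
    return position, velocity
```

Because integration error grows quadratically with time, such a fallback is only trustworthy for the brief intervals the controller spends outside the capture view before camera tracking resumes and re-anchors the estimate.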

15. The system of claim 1, wherein the HMD has an additional camera on a side region of the HMD, wherein the additional camera has a capture view different from the capture view of the camera of the lower region, wherein the additional camera is configured to capture an additional plurality of images of a position and movement of the hand-held controller when the hand-held controller is outside the capture view of the camera of the lower region, wherein the processing device is configured to receive the additional plurality of images from the additional camera and stitch one of the additional plurality of images with one of the plurality of images received from the camera of the lower region to change the state.

16. A method for tracking a hand-held controller relative to a head mounted display (HMD), comprising:

displaying a plurality of image scenes having content generated by execution of an application, wherein the HMD has a lower region having a camera, wherein the camera has a capture view directed downward from the HMD;

capturing, by the camera, a plurality of images of the hand-held controller with respect to the HMD;

receiving a plurality of images of a real-world environment surrounding the HMD and the plurality of images of the hand-held controller;

determining a position and movement of the HMD within the real-world environment from the plurality of images of the real-world environment;

determining a position of the hand-held controller relative to the HMD from the plurality of images of the hand-held controller;

modifying a state output by execution of the application according to the position and movement of the HMD and changes in the position of the hand-held controller; and

updating the plurality of image scenes displayed in the HMD according to the modification of the state.

17. The method of claim 16, further comprising:

measuring an orientation of the HMD in the real-world environment to output a plurality of measurements of the orientation of the HMD;

receiving the plurality of measurements of the orientation from the HMD to identify the orientation of the HMD.

18. The method of claim 17, further comprising:

measuring an orientation of the hand-held controller relative to the HMD to output another plurality of measurements of the hand-held controller;

receiving the other plurality of measurements of the orientation from the hand-held controller to identify the orientation of the hand-held controller.

19. A computer-readable medium containing program instructions for tracking a hand-held controller relative to a head mounted display (HMD), wherein execution of the program instructions by one or more processors of a computer system causes the one or more processors to carry out a plurality of operations of:

displaying a plurality of image scenes having content generated by execution of an application, wherein the HMD has a lower region having a camera, wherein the camera has a capture view directed downward from the HMD;

capturing, by the camera, a plurality of images of the hand-held controller with respect to the HMD;

receiving a plurality of images of a real-world environment surrounding the HMD and the plurality of images of the hand-held controller;

determining a position and movement of the HMD within the real-world environment from the plurality of images of the real-world environment;

determining a position of the hand-held controller relative to the HMD from the plurality of images of the hand-held controller;

modifying a state output by execution of the application according to the position and movement of the HMD and changes in the position of the hand-held controller; and

updating the plurality of image scenes displayed in the HMD according to the modification of the state.

20. The computer-readable medium of claim 19, wherein the operations further include:

measuring an orientation of the HMD in the real-world environment to output a plurality of measurements of the orientation of the HMD;

identifying the orientation of the HMD from the plurality of measurements.

21. The computer-readable medium of claim 20, wherein the operations further include:

measuring an orientation of the hand-held controller relative to the HMD to output another plurality of measurements of the hand-held controller;

identifying the orientation of the hand-held controller from the other plurality of measurements.