US 11,386,629 C1 (12,934th)
Cross reality system
Jeremy Dwayne Miranda, Miramar, FL (US); Rafael Domingos Torres, Boca Raton, FL (US); Daniel Olshansky, Mountain View, CA (US); Anush Mohan, Mountain View, CA (US); Robert Blake Taylor, Porter Ranch, CA (US); Samuel A. Miller, Hollywood, FL (US); Jehangir Tajik, Fort Lauderdale, FL (US); Ashwin Swaminathan, Dublin, CA (US); Lomesh Agarwal, Fremont, CA (US); Ali Shahrokni, San Jose, CA (US); Prateek Singhal, Mountain View, CA (US); Joel David Holder, Austin, TX (US); Xuan Zhao, San Jose, CA (US); Siddharth Choudhary, San Jose, CA (US); Helder Toshiro Suzuki, Mountain View, CA (US); Hirai Honar Barot, Plantation, FL (US); Eran Guendelman, Tel Aviv (IL); Michael Harold Liebenow, Loxahatchee, FL (US); and Christian Ivan Robert Moore, Cupertino, CA (US)
Filed by Magic Leap, Inc., Plantation, FL (US)
Assigned to MAGIC LEAP, INC., Plantation, FL (US)
Reexamination Request No. 90/019,485, Apr. 16, 2024.
Reexamination Certificate for Patent 11,386,629, issued Jul. 12, 2022, Appl. No. 17/208,844, Mar. 22, 2021.
Application 17/208,844 is a continuation of application No. 16/538,759, filed on Aug. 12, 2019, granted, now Pat. No. 10,957,112.
Claims priority of provisional application 62/884,109, filed on Aug. 7, 2019.
Claims priority of provisional application 62/870,954, filed on Jul. 5, 2019.
Claims priority of provisional application 62/868,786, filed on Jun. 28, 2019.
Claims priority of provisional application 62/815,955, filed on Mar. 8, 2019.
Claims priority of provisional application 62/812,935, filed on Mar. 1, 2019.
Claims priority of provisional application 62/742,237, filed on Oct. 5, 2018.
Claims priority of provisional application 62/718,357, filed on Aug. 13, 2018.
Ex Parte Reexamination Certificate issued on Jun. 6, 2025.
Int. Cl. G06T 19/00 (2011.01); G02B 27/00 (2006.01); G02B 27/01 (2006.01); G06F 3/01 (2006.01); G06T 19/20 (2011.01); G06V 20/20 (2022.01)
CPC G06T 19/006 (2013.01) [G02B 27/0093 (2013.01); G02B 27/017 (2013.01); G06F 3/011 (2013.01); G06T 19/20 (2013.01); G06V 20/20 (2022.01)]
OG exemplary drawing
AS A RESULT OF REEXAMINATION, IT HAS BEEN DETERMINED THAT:
Claims 1, 5 and 7-14 are determined to be patentable as amended.
Claims 2-4, 6 and 15-20, dependent on an amended claim, are determined to be patentable.
New claims 21-28 are added and determined to be patentable.
1. An electronic system comprising:
an electronic device that includes:
a processor;
a computer-readable medium connected to the processor, the computer-readable medium comprising a first coordinate frame and a second coordinate frame different from the first coordinate frame [ , wherein the first coordinate frame and the second coordinate frame are stored on the computer-readable medium of the electronic device] ;
a data channel to receive data representing virtual content, wherein the processor is configured to execute a coordinate frame transformer to transform a positioning of the virtual content from the first coordinate frame to the second coordinate frame; and
a display system adapted to display the virtual content based, at least in part, on the positioning of the virtual content in the second coordinate frame.
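For illustration only, a minimal sketch of the kind of repositioning a coordinate frame transformer of claim 1 might perform, assuming 4x4 homogeneous transforms and hypothetical names (make_transform, T_second_from_first, p_first); the claim does not prescribe any particular representation:

import numpy as np

def make_transform(rotation, translation):
    # Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation.
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def transform_point(T_second_from_first, p_first):
    # Re-express a point given in the first coordinate frame in the second coordinate frame.
    return (T_second_from_first @ np.append(p_first, 1.0))[:3]

# Virtual content at (1, 0, 2) in the first frame, with a transform that shifts
# first-frame coordinates by +0.5 m along x into the second frame.
T = make_transform(np.eye(3), np.array([0.5, 0.0, 0.0]))
print(transform_point(T, np.array([1.0, 0.0, 2.0])))  # -> [1.5 0.  2. ]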
5. The electronic system of claim 3, wherein [ :]
the electronic device further includes : an inertial measurement unit secured to the head-mountable frame that detects movement of the head-mountable frame, wherein [ and ]
the head frame determining routine computes the head coordinate frame based on a measurement of the inertial measurement unit.
7. The electronic system of claim 1 [ 3] , wherein:
the computer-readable medium further includes a camera coordinate frame including a plurality of eye positions of an eye that moves relative to the head-mountable frame, wherein the camera coordinate frame is the second coordinate frame, and
the coordinate frame transformer comprises transforming the head coordinate frame to the camera coordinate frame.
8. An electronic system [ device ] comprising:
a device comprising one or more sensors configured to capture data about one or more objects in a scene, the data being in a first coordinate frame; and
a computer-readable medium comprising computer executable instructions for specifying a location of virtual content in the scene based at least in part on information derived from the data in the first coordinate frame, wherein [ :]
the location of the virtual content is specified in a second coordinate frame different from the first coordinate frame [ ; and
the first coordinate frame and the second coordinate frame are stored on the computer-readable medium] .
9. The electronic system [ device ] of claim 8, wherein:
the first coordinate frame is a first pose of the electronic system [ device ] when the electronic system [ device ] is powered on for capturing the data.
10. The electronic system [ device ] of claim 8, wherein:
the first coordinate frame has an origin determined based, at least in part, on dimensions of the electronic system [ device ] and one or more poses of the one or more sensors of the electronic system [ device ] when capturing the data.
11. The electronic system [ device ] of claim 8, comprising: at least one processor configured to execute additional computer executable instructions to provide the virtual content, wherein the additional computer executable instructions comprise instructions for:
determining the first coordinate frame based, at least in part, on the one or more objects in the scene; and
transforming the specified location of the virtual content in the second coordinate frame to the first coordinate frame.
12. The electronic system [ device ] of claim 11, wherein the first [ second ] coordinate frame is determined based, at least in part, on one or more nodes on an outer surface of a bounding box that encloses the virtual content.
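As a toy sketch of the idea in claim 12, a content-local frame could be derived from a node on the outer surface of the virtual content's axis-aligned bounding box; bounding_box_frame and the sample vertices below are hypothetical, not drawn from the patent:

import numpy as np

def bounding_box_frame(vertices):
    # Place the frame origin at one corner node of the content's axis-aligned bounding box,
    # keeping the axes parallel to the content's own axes.
    T = np.eye(4)
    T[:3, 3] = vertices.min(axis=0)
    return T

content_vertices = np.array([[0.2, 0.1, 0.0], [1.0, 0.4, 0.3], [0.5, 0.9, 0.2]])
print(bounding_box_frame(content_vertices))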
13. The electronic system [ device ] of claim 11, wherein specifying a location of the virtual content in the scene based at least in part on information derived from the data in the first coordinate frame comprises determining a location of the device in a coordinate frame used by an application.
14. An electronic system comprising:
a portable device comprising a display and one or more sensors configured to capture data, with respect to a device coordinate frame, about a 3D environment; and
a computer-readable medium comprising computer executable instructions for:
[ generate a persistent coordinate frame (PCF) based on data captured with the one or more sensors;]
obtaining a location of a virtual object in a stored map of the 3D environment with respect to a stored coordinate frame of the 3D environment,
computing a transformation between the device coordinate frame and the stored coordinate frame,
[ determining, based at least in part on the transformation, the location of the virtual object with respect to the PCF, ] and
rendering [ the virtual object with respect to the PCF such that ] the virtual object [ is rendered ] on the display at a location determined, at least in part, based on the transformation and the obtained location of the virtual object [ , wherein:
the PCF comprises a plurality of anchors; and
the location of the virtual object with respect to the PCF is determined relative to an anchor of the plurality of anchors] .
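A minimal sketch, under assumed names (invert, T_stored_from_device, T_stored_from_anchor, p_object_stored), of how a virtual object placed in a stored map might be re-expressed relative to a PCF anchor and then in the device frame for rendering, in the spirit of claim 14 as amended:

import numpy as np

def invert(T):
    # Invert a 4x4 rigid transform (rotation plus translation).
    R, t = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ t
    return Ti

def object_in_device_frame(T_stored_from_device, T_stored_from_anchor, p_object_stored):
    # Express the object relative to the anchor, then chain anchor -> stored map -> device.
    p_object_anchor = (invert(T_stored_from_anchor) @ np.append(p_object_stored, 1.0))[:3]
    T_device_from_anchor = invert(T_stored_from_device) @ T_stored_from_anchor
    return (T_device_from_anchor @ np.append(p_object_anchor, 1.0))[:3]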
[ 21. The electronic system of claim 2, wherein executing, by the processor, the world frame determining routine to compute the world coordinate frame based on the at least one point comprises:
determining a pose of the electronic device in response to the electronic device being powered on; and
computing the world coordinate frame using the pose as a world origin of the world coordinate frame.]
[ 22. The electronic system of claim 3, wherein executing, by the processor, the head frame determining routine to compute the head coordinate frame that changes upon movement of the head-mountable frame comprises:
determining a pose of the head-mountable frame in response to the head-mountable frame capturing an image; and
computing the head coordinate frame using the pose as a head origin of the head coordinate frame.]
[ 23. The electronic system of claim 7, wherein the processor is further configured to execute:
a local frame determining routine to compute the first coordinate frame based on the received data representing the virtual content; and
a local frame storing instruction to store the first coordinate frame on the computer-readable medium.]
[ 24. The electronic system of claim 23, wherein:
the coordinate frame transformer is configured to transform the first coordinate frame to the world coordinate frame, and the world coordinate frame to the head coordinate frame.]
[ 25. The electronic device of claim 11, wherein transforming the specified location of the virtual content in the second coordinate frame to the first coordinate frame comprises:
transforming the specified location of the virtual content in the second coordinate frame to a world coordinate frame, the world coordinate frame being a pose of the device determined in response to the device being powered on.]
[ 26. The electronic device of claim 25, wherein transforming the specified location of the virtual content in the second coordinate frame to the first coordinate frame comprises:
transforming the world coordinate frame to a head coordinate frame, the head coordinate frame being a pose of the device determined in response to the one or more sensors capturing data.]
[ 27. The electronic device of claim 26, wherein transforming the specified location of the virtual content in the second coordinate frame to the first coordinate frame comprises:
transforming the head coordinate frame to the first coordinate frame.]
[ 28. The electronic device of claim 27, wherein:
the first coordinate frame encompasses all pupil positions such that the virtual content can be displayed consistently regardless of pupil positions.]
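Claims 25 through 28 recite a chain of transformations from the content's frame through a world frame and a head frame to a camera frame that spans pupil positions; a minimal sketch of such chaining, with hypothetical transform names, follows:

import numpy as np

def chain(*transforms):
    # Compose 4x4 transforms left to right; the composite is applied right to left to a point.
    out = np.eye(4)
    for T in transforms:
        out = out @ T
    return out

# Hypothetical transforms mirroring the recited chain:
#   T_world_from_local  - content (second) frame to world frame (world origin: pose at power-on)
#   T_head_from_world   - world frame to head frame (head origin: pose when sensors capture data)
#   T_camera_from_head  - head frame to camera frame (spans pupil positions, per claim 28)
def content_to_camera(T_world_from_local, T_head_from_world, T_camera_from_head, p_local):
    T_camera_from_local = chain(T_camera_from_head, T_head_from_world, T_world_from_local)
    return (T_camera_from_local @ np.append(p_local, 1.0))[:3]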