(WO2018165767) SYSTEMS AND METHODS FOR AUGMENTED REALITY DISPLAY IN NAVIGATED SURGERIES
Note: Text based on automated optical character recognition processes. Only the PDF version has legal value.

Claims

What is claimed is:

1. A computer-implemented method to provide augmented reality in relation to a patient, the method comprising:

receiving, by at least one processor, images of a real 3D space containing the patient and one or more targets associated with respective objects and/or anatomical structures of the patient in the real 3D space, the images received from a single optical sensor unit having a field of view of the real 3D space containing the patient and one or more targets;

determining tracking information from the images for respective ones of the one or more targets;

registering an anatomical structure of the patient in a computational 3D space maintained by the at least one processor using tracking information for a respective target associated with the anatomical structure, generating a corresponding position and orientation of the anatomical structure in the computational 3D space from a position and orientation of the anatomical structure in the real 3D space;

aligning an overlay model of an augmented reality overlay to a desired position and orientation in the computational 3D space relative to the corresponding position and orientation of the anatomical structure; and

rendering and providing the augmented reality overlay for display on a display screen in the desired position and orientation.
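
By way of illustration only (this sketch is not part of the claimed subject matter), the pipeline of claim 1 can be pictured with poses as 4x4 homogeneous matrices, taking the optical sensor unit's frame as the world frame of the computational 3D space; all names here are hypothetical:

    import numpy as np

    def register_anatomy(T_target_in_camera: np.ndarray,
                         T_anatomy_in_target: np.ndarray) -> np.ndarray:
        # Corresponding pose of the anatomical structure in the
        # computational 3D space, derived from the tracked target pose.
        return T_target_in_camera @ T_anatomy_in_target

    def align_overlay(T_anatomy: np.ndarray,
                      T_overlay_in_anatomy: np.ndarray) -> np.ndarray:
        # Desired overlay pose, held fixed relative to the registered anatomy.
        return T_anatomy @ T_overlay_in_anatomy

Rendering then draws the overlay model at the pose returned by align_overlay and provides the result for display.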

2. The method of claim 1 comprising providing the images of the real 3D space for display on the display screen to simultaneously visualize the anatomical structure and the augmented reality overlay.

3. The method of any one of claims 1 and 2, wherein the optical sensor unit comprises calibration data to determine 3D measurements from the images of the real 3D space provided in 2D by the optical sensor unit, and the step of determining tracking information comprises using, by the at least one processor, the calibration data to determine the tracking information.
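
As an illustrative sketch of how calibration data lets a 6-DoF pose be recovered from 2D images (one possible approach, using OpenCV's solvePnP; the variable names are assumptions, not the patent's):

    import cv2
    import numpy as np

    def target_pose_from_image(marker_pts_3d, marker_pts_2d, K, dist_coeffs):
        # marker_pts_3d: Nx3 marker geometry in the target's own frame.
        # marker_pts_2d: Nx2 detected marker positions in the 2D image.
        # K, dist_coeffs: the optical sensor unit's calibration data.
        ok, rvec, tvec = cv2.solvePnP(marker_pts_3d, marker_pts_2d, K, dist_coeffs)
        if not ok:
            raise RuntimeError("pose estimation failed")
        R, _ = cv2.Rodrigues(rvec)
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = tvec.ravel()
        return T  # target pose in the sensor frame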

4. The method of any one of claims 1 to 3, comprising, in real time and in response to a relative movement of the anatomical structure and the optical sensor unit in the real 3D space, wherein a pose of the respective target associated with the anatomical structure continuously indicates a position and orientation of the anatomical structure in the real 3D space:

determining a moved position and orientation of the anatomical structure in the real 3D space using the images received from the optical sensor unit;

updating the aligning of the augmented reality overlay relative to the moved position and orientation of the anatomical structure to determine a moved desired position and orientation of the augmented reality overlay; and

providing the augmented reality overlay for display in the moved desired position and orientation.
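
A minimal sketch of the real-time behaviour in claim 4, assuming a frame stream and the hypothetical helpers from the sketch after claim 1:

    def update_loop(frames, track_anatomy, T_overlay_in_anatomy, render):
        # frames: iterable of images from the optical sensor unit.
        # track_anatomy(frame) -> 4x4 anatomy pose (via its target).
        # render(frame, T): draws the overlay at pose T over the frame.
        for frame in frames:
            T_anatomy_moved = track_anatomy(frame)            # moved pose
            T_overlay_moved = T_anatomy_moved @ T_overlay_in_anatomy
            render(frame, T_overlay_moved)                    # re-display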

5. The method of claim 4 wherein the respective target associated with the anatomical structure is either 1) attached to the anatomical structure such that one or both of the optical sensor unit and the anatomical structure are free to move in the real 3D space or 2) attached to another object while the location of the anatomical structure remains constant in the real 3D space and the optical sensor unit alone is free to move in the real 3D space.

6. The method of any one of claims 1 to 5, wherein the image of the real 3D space comprises an enlarged image and the augmented reality overlay is enlarged to match the enlarged image.

7. The method of any one of claims 1 to 6, wherein the anatomical structure is a femur and one of the targets associated with the anatomical structure is a femoral target attached to the femur.

8. The method of claim 7, wherein the overlay model is a 3D model of a generic or a patient-specific femur and the augmented reality overlay is an image representing a generic or a patient-specific femur, respectively.

9. The method of any one of claims 1 to 6, wherein the anatomical structure is a pelvis and the target associated with the anatomical structure is a pelvic target.

10. The method of claim 9, wherein the overlay model is a 3D model of a generic or a patient-specific pelvis and the augmented reality overlay is an image representing a generic or a patient-specific pelvis, respectively.

11. The method of any one of claims 1 to 6, wherein the overlay model is a 3D mechanical axis model and the augmented reality overlay is an image of a mechanical axis and/or a further axis or plane, a location of which is determined relative to a location of the mechanical axis of the anatomical structure.

12. The method of claim 11, comprising determining the mechanical axis of the anatomical structure using tracking information obtained from target images as the anatomical structure is rotated about an end of the anatomical structure.
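
One common way to realize the determination in claim 12 (a sketch, not the patent's stated method) is to fit a sphere to the target positions sampled while the bone is rotated about its end; the sphere's center estimates the center of rotation, and the mechanical axis runs from it toward a second landmark:

    import numpy as np

    def fit_center_of_rotation(pts: np.ndarray) -> np.ndarray:
        # pts: Nx3 target positions from the target images. Linear least
        # squares on |p|^2 = 2 c.p + (r^2 - |c|^2) yields the center c.
        A = np.hstack([2.0 * pts, np.ones((len(pts), 1))])
        b = (pts ** 2).sum(axis=1)
        x, *_ = np.linalg.lstsq(A, b, rcond=None)
        return x[:3]

    def mechanical_axis(center, distal_landmark):
        v = distal_landmark - center
        return v / np.linalg.norm(v)  # unit direction of the axis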

13. The method of claim 12, wherein the further axis and/or plane is a resection plane.

14. The method of claim 13, wherein the location of the resection plane along the mechanical axis model is adjustable in response to user input thereby to adjust the desired position and orientation of the resection plane in the augmented reality overlay.
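
The adjustability in claim 14 reduces, in one simple formulation (a sketch; the perpendicular-plane default is an assumption), to sliding a point-and-normal plane along the mechanical axis in response to user input:

    import numpy as np

    def adjust_resection_plane(axis_origin, axis_dir, offset_mm, normal=None):
        # Returns the plane (point, normal) translated offset_mm along the
        # mechanical axis; by default the plane is perpendicular to the axis.
        point = np.asarray(axis_origin) + offset_mm * np.asarray(axis_dir)
        return point, (np.asarray(axis_dir) if normal is None else normal)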

15. The method of any one of claims 11 to 14, wherein the anatomical structure is a femur.

16. The method of claim 15, comprising:

registering a tibia of a same leg of the patient in the computational 3D space, the tibia coupled to a tibia target of the one or more targets, the at least one processor determining a position and orientation of the tibia in the real 3D space to generate a corresponding position and orientation of the tibia in the computational 3D space from tracking information determined from images of the tibia target;

aligning a second overlay model of a second augmented reality overlay to a second desired position and orientation in the computational 3D space relative to the corresponding position and orientation of the tibia; and

providing the second augmented reality overlay for display on the display screen in the second desired position and orientation.

17. The method of claim 16, wherein the registering uses images of one of the targets attached to a probe, where the probe identifies first representative locations on the tibia with which to define a first end of the tibia and second representative locations about an ankle of the patient with which to define a second end and a mechanical axis of the tibia.
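
As an illustrative sketch of claim 17's probe-based registration (the tip offset and helper names are assumptions): the probe tip is located from the probe target's pose, representative locations are collected at each end, and the tibia's mechanical axis joins the two ends:

    import numpy as np

    def probe_tip(T_probe_target: np.ndarray, tip_in_target: np.ndarray):
        # Tip position in the sensor frame, given the probe target pose and
        # the calibrated tip offset expressed in the target's frame.
        return (T_probe_target @ np.append(tip_in_target, 1.0))[:3]

    def tibia_axis(first_locations, second_locations):
        # first_locations: points on the tibia (first end); second_locations:
        # points about the ankle (second end). Each end is the centroid.
        first_end = np.mean(first_locations, axis=0)
        second_end = np.mean(second_locations, axis=0)
        v = second_end - first_end
        return first_end, second_end, v / np.linalg.norm(v)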

18. The method of claim 16, comprising:

tracking movement of the position and orientation of the tibia in the real 3D space;

updating the corresponding position and orientation of the tibia in response to the movement of the position and orientation of the tibia in the real 3D space;

updating the aligning of the second augmented reality overlay relative to the position and orientation of the tibia as moved to determine the second desired position and orientation as moved; and

providing the second augmented reality overlay for display in the second desired position and orientation as moved.

19. The method of claim 18, comprising determining a location of each of the augmented reality overlay of the femur and the augmented reality overlay of the tibia and indicating a relative location to one another to denote at least one of proximity and intersection.
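
A brute-force sketch of the proximity/intersection indication in claim 19, comparing the two overlays' model points in the shared computational 3D space (threshold and names are assumptions; a production system would use proper mesh collision tests):

    import numpy as np

    def overlay_proximity(femur_pts, T_femur, tibia_pts, T_tibia, touch_mm=1.0):
        def xform(T, pts):  # apply a 4x4 pose to Nx3 points
            return pts @ T[:3, :3].T + T[:3, 3]
        a = xform(T_femur, np.asarray(femur_pts))
        b = xform(T_tibia, np.asarray(tibia_pts))
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2).min()
        return d, d <= touch_mm  # minimum distance and a contact flag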

20. The method of any one of claims 1 to 19, wherein the optical sensor unit comprises a single unit configured in accordance with one of the following:

multi-spectral camera (providing visible and tracking channels);

dual cameras (providing respective visible and tracking channels);

dual imager (using prism to split visible and tracking channels); and

tracking channel using visible light.

21. The method of any one of claims 1 to 20, wherein the anatomical structure is surgically modified and wherein the overlay model is a 3D model of a generic or patient-specific human anatomical structure prior to replacement by a prosthetic implant and the augmented reality overlay is an image representing a generic or a patient-specific human anatomical structure respectively; and wherein the method comprises providing images of the patient for display on the display screen to simultaneously visualize the anatomical structure and the augmented reality overlay.

22. The method of any one of claims 1 to 21, wherein the overlay model is a 3D model defined from pre-operative images of the patient.

23. The method of any one of claims 1 to 6, wherein the overlay model is a 3D model defined from pre-operative images of the patient and the pre-operative images of the patient show a diseased human anatomical structure and wherein the overlay model represents the diseased human anatomical structure without a disease.

24. A computer-implemented method to provide augmented reality in relation to a patient, the method comprising:

receiving, by at least one processor, images of a real 3D space containing the patient and one or more targets associated with respective objects and/or anatomical structures of the patient in the real 3D space, the images received from a single optical sensor unit having a field of view of the real 3D space containing the patient and one or more targets;

determining tracking information from the images for respective ones of the one or more targets;

providing, for simultaneous display on a display screen, i) images of the real 3D space from the optical sensor unit; and ii) renderings of an augmented reality overlay; wherein the augmented reality overlay is defined from an overlay model in a computational 3D space and displayed in an initial position and orientation within the field of view of the optical sensor unit as displayed on the display screen;

registering, by the at least one processor, an anatomical structure of the patient in the computational 3D space by receiving input to use tracking information to capture a pose of one of the targets in the field of view, the one of the targets attached to the anatomical structure, the input received when the anatomical structure as displayed is aligned with the initial position and orientation of the augmented reality overlay; and wherein the pose defines a position and orientation of the anatomical structure in the real 3D space to generate a corresponding position and orientation of the anatomical structure in the computational 3D space; and

associating, in the computational 3D space, a desired position and orientation of the augmented reality overlay relative to the corresponding position and orientation of the anatomical structure.
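
A sketch of the registration gesture in claim 24 (hypothetical names): the overlay is shown at a fixed initial pose, the patient is manoeuvred until the displayed anatomy matches it, and on user input the anatomy target's pose is captured:

    import numpy as np

    def register_on_align(T_target_at_input, T_overlay_initial):
        # At the input, the displayed anatomy coincides with the overlay,
        # so the overlay's initial pose is adopted as the anatomy's pose.
        T_anatomy = np.array(T_overlay_initial, copy=True)
        # Fixed target->anatomy relation, reused as the target later moves.
        T_anatomy_in_target = np.linalg.inv(T_target_at_input) @ T_anatomy
        return T_anatomy, T_anatomy_in_target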

25. A computer-implemented method to provide augmented reality in relation to a patient, the method comprising:

receiving, by at least one processor, images of a real 3D space containing the patient and one or more targets associated with respective objects and/or anatomical structures of the patient in the real 3D space, the images received from a single optical sensor unit having a field of view of the real 3D space containing the patient and one or more targets;

determining tracking information from the images for respective ones of the one or more targets;

providing, for simultaneous display on a display screen, i) optical sensor images of the real 3D space from the optical sensor unit; and ii) renderings of an augmented reality overlay; wherein the augmented reality overlay is defined from an overlay model in a computational 3D space and displayed in an overlay position and orientation relative to a pose of an overlay target in the field of view of the optical sensor unit, the overlay position and orientation moving in response to movement of the overlay target in the real 3D space;

registering, by the at least one processor, an anatomical structure of the patient in the computational 3D space by receiving input to use tracking information to capture a registration lock pose of the overlay target and a registration pose of an anatomical structure target associated with the anatomical structure, the input received to effect an aligning when said augmented reality overlay is aligned with an initial position and orientation of the anatomical structure in the real 3D space, generating a corresponding position and orientation of the anatomical structure in the computational 3D space comprising the aligning from the initial position and orientation of the anatomical structure in the real 3D space; and

associating, in the computational 3D space, a desired position and orientation of the augmented reality overlay relative to the corresponding position and orientation of the anatomical structure for use when subsequently rendering the augmented reality overlay.
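
A companion sketch for claim 25 (names are assumptions): here the overlay rides on a movable overlay target, and at the registration lock both poses are captured to derive the anatomy's pose and the desired overlay-to-anatomy relation:

    import numpy as np

    def register_with_overlay_target(T_overlay_target_lock, T_overlay_in_target,
                                     T_anatomy_target_lock):
        # Overlay pose at the lock; the user has moved the overlay target
        # until the overlay visually matches the anatomy on screen.
        T_overlay = T_overlay_target_lock @ T_overlay_in_target
        T_anatomy = T_overlay  # aligned at the lock, by construction
        # Desired overlay pose relative to the anatomy target, for use when
        # subsequently rendering the overlay.
        T_rel = np.linalg.inv(T_anatomy_target_lock) @ T_overlay
        return T_anatomy, T_rel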

26. The method of claim 24 or 25, comprising, in real time and in response to a relative movement of the anatomical structure and the optical sensor unit in the real 3D space, wherein a pose of the anatomical structure target associated with the anatomical structure continuously indicates a position and orientation of the anatomical structure in the real 3D space:

determining a moved position and orientation of the anatomical structure using the images received from the optical sensor unit;

updating the aligning of the augmented reality overlay relative to the moved position and orientation of the anatomical structure to determine a moved desired position and orientation of the augmented reality overlay; and

rendering and providing, for simultaneous display on the display screen, i) images of the real 3D space from the optical sensor unit; and ii) the augmented reality overlay in response to the moved desired position and orientation of the augmented reality overlay.

27. The method of claim 24 or 25 comprising performing an initial registration of the anatomical structure, an initial aligning of the augmented reality overlay to the anatomical structure and an initial rendering and providing such that the augmented reality overlay and anatomical structure are misaligned in the images of the real 3D space when displayed.

28. A computer-implemented method to provide augmented reality in relation to a patient, the method comprising:

receiving, by at least one processor, images of a real 3D space containing the patient, a bone removal tool and a target associated with an anatomical structure of the patient in the real 3D space, the images received from a single optical sensor unit having a field of view of the real 3D space containing the patient and the target;

determining tracking information from the images for the target;

registering the anatomical structure of the patient in a computational 3D space maintained by the at least one processor using the tracking information for the target associated with the anatomical structure, generating a corresponding position and orientation of the anatomical structure in the computational 3D space from a position and orientation of the anatomical structure in the real 3D space;

aligning an overlay model of an augmented reality overlay comprising a planned implant position to a desired position and orientation in the computational 3D space relative to the corresponding position and orientation of the anatomical structure; and

rendering and providing the planned implant position and the images of the real 3D space for display on a display screen to simultaneously visualize the planned implant position and the bone removal tool.

29. A computer-implemented method to provide augmented reality in relation to a patient, the method comprising:

receiving, by at least one processor, images of a real 3D space containing the patient and one or more targets associated with respective objects and/or anatomical structures of the patient in the real 3D space, the images received from a single optical sensor unit having a field of view of the real 3D space containing the patient and one or more targets;

determining tracking information from the images for respective ones of the one or more targets;

registering an anatomical structure of the patient in a computational 3D space maintained by the at least one processor using tracking information for a respective target associated with the anatomical structure, generating a corresponding position and orientation of the anatomical structure in the computational 3D space from a position and orientation of the anatomical structure in the real 3D space;

registering one or more of: a surgical plan and a tool;

aligning respective overlay models of augmented reality overlays to desired positions and orientations in the computational 3D space relative to the corresponding positions and orientations of the anatomical structure, the surgical plan and/or the tool;

determining desired display information based on receiving user input or context information; and

selectively, based on the desired display information, rendering and providing the augmented reality overlays for display on a display screen in the desired positions and orientations.
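
The selective rendering of claim 29 can be pictured as filtering a set of aligned overlays by predicates over user input and context information (a sketch; the predicate scheme is an assumption):

    def select_overlays(overlays, user_input=None, context=None):
        # overlays: name -> (pose, model, wanted), where wanted is a
        # predicate deciding visibility from the input/context information.
        return {name: (pose, model)
                for name, (pose, model, wanted) in overlays.items()
                if wanted(user_input, context)}

Each selected overlay is then rendered at its desired position and orientation for display, as in claim 1.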

30. A navigational surgery system comprising a computing unit, an optical sensor unit and one or more targets for tracking objects by the optical sensor unit, the optical sensor unit providing to the computing unit tracking images having tracking information for said targets and visible images of a procedure in a field of view of the optical sensor unit, the computing unit having at least one processor configured to perform a method in accordance with any one of claims 1 to 29.

31. The navigational surgery system of claim 30 comprising:

a platform to selectively, removably and rigidly attach one of the optical sensor unit and one of the trackers to an anatomical structure of the patient, the platform comprising a body having at least one surface, the at least one surface configured to provide an optically trackable pattern, a repeatable optical sensor mount and a repeatable target mount, wherein the optically trackable pattern extends into a field of view of the optical sensor unit when the optical sensor unit is mounted to the platform; and wherein:

a spatial relationship between the optically trackable pattern and the repeatable target mount is predefined by a target-pattern definition; and

the computing unit is configured to:

receive first images including features of the optically trackable pattern when the optical sensor unit is mounted to the platform;

perform operations to calculate a pose of the optically trackable pattern;

perform operations to calculate the pose of the repeatable target mount based on the pose of the optically trackable pattern and the target-pattern definition;

receive second images when the optical sensor unit is removed from the platform and one of the trackers is mounted to the platform, the second images including the one of the trackers mounted to the platform; and

track the anatomical structure to which the one of the trackers is attached.
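
The pose chain in claim 31 is a composition of transforms (a sketch; the matrix names are assumptions): the sensor observes the pattern, and the predefined target-pattern definition carries that pose to the repeatable target mount:

    def mount_pose(T_pattern_in_camera, T_mount_in_pattern):
        # T_mount_in_pattern is the target-pattern definition: the fixed,
        # predefined spatial relationship between the optically trackable
        # pattern and the repeatable target mount on the platform body.
        return T_pattern_in_camera @ T_mount_in_pattern

Once the optical sensor unit is removed and a tracker is mounted at that same repeatable mount, the tracker's known pose at the mount ties subsequent tracking of the anatomical structure back to the platform.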