
WO2020118276 - STEERABLE POSITIONING ELEMENT


STEERABLE POSITIONING ELEMENT

Related Applications

[0001] The present application claims priority to U.S. Provisional Patent Application No. 62/777,061, filed on December 7, 2018, and U.S. Provisional Patent Application No. 62/902,377, filed on September 17, 2019, and incorporates both applications by reference in their entirety.

Field of the Invention

[0002] The present application relates to near-eye display systems, and in particular to a steerable positioning element in a near-eye display.

Background

[0003] Near-eye displays have the competing requirements of displaying images at high resolution over a large field of view (FOV). For many applications in virtual and augmented reality, the field of view should be greater than 90 degrees, and ideally the binocular field of view would extend past 180 degrees. At the same time, the resolution of the display should match that of the human visual system so that little or no pixelation is perceived in the virtual images. Combining these two requirements in a single system presents a number of challenges. To avoid the appearance of pixelation, the resolution needs to be on the order of 0.01-0.02 degrees per pixel. Over a 90-degree square field of view, this corresponds to 4.5k x 4.5k pixels per eye or higher. Achieving such resolutions is challenging at the level of the panel, the drive electronics, and the rendering pipeline.

[0004] Additionally, optical systems that can project wide-FOV images to the user with sufficiently high resolution over the entire field of view are also difficult to design. System architectures that can present the user with high resolution images over a wide field of view, while simultaneously reducing the rendering, data rate, and panel requirements, will enable new applications for augmented and virtual reality systems.

List of Figures

[0005] The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements and in which:

[0006] Figure 1A is an illustration of a first embodiment of a steerable positioning element.

[0007] Figures 1B and 1C are a perspective view and a cross-section of another embodiment of a steerable positioning element.

[0008] Figure 1D is an illustration of another embodiment of the steerable positioning element.

[0009] Figure 1E is an illustration of another embodiment of the steerable positioning element.

[0010] Figure 1F is a cross-section of the embodiment of Figure 1E.

[0011] Figure 2 is a block diagram of one embodiment of the system.

[0012] Figure 3 is a block diagram of one embodiment of the steerable positioning element.

[0013] Figure 4A is a flowchart of one embodiment of using a steerable positioning element.

[0014] Figure 4B is a flowchart of one embodiment of positioning verification for the steerable positioning element.

[0015] Figure 4C is an illustration of one embodiment of the movement of the display, in a steerable display.

[0016] Figure 5 is a flowchart of another embodiment of using a steerable positioning element.

[0017] Figure 6 is a flowchart of one embodiment of controlling the use of the steerable element.

Detailed Description

[0018] The present application discloses a steerable positioning element which may be used to enable a steerable display. In one embodiment, the steerable positioning element may be a mirror, lens, prism, dichroic mirror, switchable crystal, or other positioning element. The steerable display in one embodiment is designed to be positionable to provide a high resolution display in the area where the user’s fovea is located. The “fovea” is the small depression in the retina of the eye where visual acuity is highest. In another embodiment, the steerable display may be positioned to provide a heads-up display, or a sprite, in a particular location. The location may be based on the user’s surroundings, the user’s gaze, other external data, or another factor. The steerable display may be used in a virtual reality and/or an augmented reality display, in one embodiment. The steerable display may also be used for any other purpose in which a high resolution display is designed to be positioned.

[0019] The following detailed description of embodiments of the invention makes reference to the accompanying drawings, in which like references indicate similar elements and which show, by way of illustration, specific embodiments of practicing the invention. Description of these embodiments is in sufficient detail to enable those skilled in the art to practice the invention. One skilled in the art understands that other embodiments may be utilized and that logical, mechanical, electrical, functional and other changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims.

[0020] Figure 1A illustrates one embodiment of a steerable positioning element. In one embodiment, the system 110 includes display element 120 supported by gimbals 155 and support structure columns 125.

[0021] The display element 120 may pivot along two axes. In one embodiment, the system includes two gimbals 155, each of which provides pivoting along one axis. The pivoting of the display element 120 is controlled by the piezoelectric elements 135 mounted to flexible arms 130, acting as the X-axis controller and the Y-axis controller. In one embodiment, the flexible arms 130 are made of metal. In one embodiment, the flexible arms support the piezoelectric elements 135. The flexible arms 130 provide a static force against the side of the assembly, to ensure that each piezoelectric element 135 applies a force normal to the driving surface of the display element 120 while actuating, and remains in contact with the display element 120 when at rest.

[0022] In one embodiment, the range of motion of the display element 120 may be +/- 10 degrees along both the X and Y axes. The drivers 145 drive the piezoelectric elements 135 to control motion.

[0023] In one embodiment, microcontroller 147 receives control data from the system, and controls the drivers 145 to drive the piezoelectric elements 135, to move the display element 120.
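
For illustration only, the sketch below shows one way the command path described above could look in software: a routine that clamps a requested (X, Y) tilt to the +/- 10 degree range of motion and hands normalized drive values to per-axis drivers. The driver interface, the linear angle-to-drive mapping, and all names are assumptions made for this sketch, not the actual firmware of the embodiment.

```python
# Hypothetical sketch of the microcontroller's command path: clamp a target
# tilt to the +/- 10 degree range and issue normalized drive commands to the
# X-axis and Y-axis piezo drivers. Interfaces are illustrative assumptions.

RANGE_DEG = 10.0  # +/- 10 degrees along both axes, per the embodiment above

def clamp(value: float, limit: float) -> float:
    return max(-limit, min(limit, value))

def angle_to_drive(angle_deg: float) -> float:
    """Map an angle in [-10, +10] degrees to a drive value in [-1, +1]."""
    return clamp(angle_deg, RANGE_DEG) / RANGE_DEG

def position_display_element(x_deg: float, y_deg: float, drivers) -> None:
    # One driver per axis; each drives the piezoelectric element for that axis.
    drivers["x"].set_drive(angle_to_drive(x_deg))
    drivers["y"].set_drive(angle_to_drive(y_deg))
```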

[0024] In one embodiment, position sensor 140 is used to verify the actual position of the display element 120. In one embodiment, position sensor 140 may be one or more magnetic sensors which can sense the relative change in a magnetic field of one or more magnets associated with the display device. In one embodiment, the magnets are positioned near the outer diameter of the display element 120. In one embodiment, two magnets are positioned 90 degrees apart radially. In one embodiment, the magnets and associated magnetic sensors are positioned opposite the drive surfaces. This provides minimal cross-coupling, and the most accurate measurement.

[0025] In one embodiment, the weight of the drive element is balanced by the weight of the magnet on the display element 120. The magnets may be rare earth magnets, in one embodiment. The magnetic sensors are placed in close proximity to the magnets. In another embodiment, four magnets may be used. In one embodiment, in a four-magnet configuration, two magnets are positioned opposite the drive elements and two additional magnets are placed further away from the display element. This adds more mass to the system, but provides the ability to cancel other magnetic fields in the space, including the earth’s magnetic field, for more accurate measurement of the changes in the magnetic field based on the movement of the display element 120. In one embodiment, the magnetic sensors are Hall effect sensors. In one embodiment, the magnetic sensors are magnetometers.

[0026] In one embodiment, one or more additional magnetic sensors are used to measure the earth’s magnetic field, or other ambient magnetic fields. In one embodiment, the impact of the ambient magnetic fields is subtracted, to negate their effect on the display element position measurements. In one embodiment, the additional sensors are oriented to be approximately aligned with the measurement sensors on the assembly. In one embodiment, a single 3-axis magnetic sensor may be used. In one embodiment, a differential sensor may be used.
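
As an illustration of this compensation, the sketch below subtracts a reference magnetometer’s ambient reading from a measurement sensor’s reading, leaving only the field attributable to the display-element magnet. The three-component vector representation and the sample values are assumptions.

```python
# Minimal sketch of ambient-field subtraction, assuming each sensor reports
# a 3-axis field vector and the reference sensor is approximately aligned
# with the measurement sensors (per [0026]).

from typing import Tuple

Vec3 = Tuple[float, float, float]

def subtract_ambient(measured: Vec3, ambient: Vec3) -> Vec3:
    """Remove the ambient magnetic field from a position-sensor reading."""
    return tuple(m - a for m, a in zip(measured, ambient))

# Hypothetical readings in microtesla: magnet field plus ambient field,
# and the ambient field alone from the additional sensor.
reading = (120.5, -3.25, 48.0)
ambient = (20.5, -1.25, 43.5)
print(subtract_ambient(reading, ambient))  # (100.0, -2.0, 4.5): magnet only
```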

[0027] In one embodiment, the calculation comparing the actual position to the instructed position occurs on a processor, as will be described with respect to Figure 2. In another embodiment, the positioning element may be controlled using an analog control circuit that does not utilize a microcontroller 147.

[0028] The display element 120 may be a mirror, lens, prism, holographic optical element (HOE), liquid crystal polymer, and/or another element utilized in directing light for a steerable display. In one embodiment, the display element 120 may be a Fresnel reflector, diffractive element, surface relief grating, light guide, waveguide, or volume hologram.

[0029] In one embodiment, piezoelectric elements 135 are actuators to move the display element 120. Alternatively, the piezoelectric elements 135 may be replaced by magnetic and/or inductive elements, nanomotors, electrostatic elements, or other devices which enable the movement of the display element 120 with the precision and speed needed for a display system.

[0030] Figure 1B is a top view of the steerable positioning element of Figure 1A.

[0031] Figure 1C illustrates another embodiment of the steerable positioning element, in which a flexible printed circuit board 152 is added and the microcontroller is moved to a separate board. In one embodiment, the flexible printed circuit board 152 weaves in, as shown. This makes the steerable positioning element a little lighter.

[0032] The tables below illustrate exemplary configurations of the optical and physical characteristics of one embodiment of the steerable positioning element. Note that while these tables show measurements, and in some instances ranges, of a preferred embodiment, variations from these ranges, and especially additional precision, may be preferred when possible. Additionally, while some ranges are provided, the system is not limited to these ranges, in one embodiment.

[0033] In one embodiment, a system may include two mirrors.

[0034] In one embodiment, a fast-moving element may be designed to match the movement of the eye in speed, with a small angle movement range of 0.3° in 300 µs and a large angle movement range of 2° - 20° in 300 µs. Such a fast-moving element may move every frame and can ignore saccades because the movement is fast enough that it is not perceptible by the user.

[0035] A medium fast-moving display element in one embodiment also can move every frame, with a small angle movement range of 0.3° in 4 ms and a large angle movement range of 2° - 20° in 8 ms - 50 ms. In one embodiment, this configuration permits the display movement to settle by the time the eye's post-saccade settling time begins.

[0036] A medium slow mirror in one embodiment has a small angle movement range of 0.6° in 9ms and a large angle movement range of 2° - 20° in 8ms - 50ms. In one embodiment, the medium slow mirror moves at approximately the same speed as the medium fast mirror over larger angles, but more slowly over smaller angles. However, even the medium slow mirror in one embodiment can move every frame.

[0037] A slow mirror has a small angle movement range of 0.15° in 16 ms and a large angle movement range of 2° - 20° in 20 ms - 100 ms. Because of its slower speed, the slow mirror utilizes a blank frame during movement, in one embodiment. In one embodiment, the blank frame may be a subframe for displays capable of subframe blanking. In one embodiment, a slow mirror utilizes saccadic masking, and relies on the eye settling time to ensure that the user does not perceive the motion of the display controlled by the mirror.

[0038] In one embodiment, the system is designed to move the mirror during the time the display is off. For most OLED-based VR displays, the duty cycle is in the range of 20% (that is, the display itself is only on for 20% of the frame time). The steerable display can be moved during the time when the display is off. The specific angles, speeds, and configuration details discussed are of course merely exemplary. A faster, slower, or intermediate mirror having different specifications may be used.
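
A small worked check of this timing budget is sketched below, using an assumed 90 Hz frame rate; the 20% duty cycle and the move times come from the embodiments above.

```python
# Sketch: does a mirror move fit in the display-off window of a frame?
# Frame rate is an assumed example value; duty cycle per the text above.

def off_window_ms(frame_rate_hz: float, duty_cycle: float) -> float:
    """Time per frame during which the display is off."""
    frame_time_ms = 1000.0 / frame_rate_hz
    return frame_time_ms * (1.0 - duty_cycle)

def move_fits(move_time_ms: float, frame_rate_hz: float, duty_cycle: float) -> bool:
    return move_time_ms <= off_window_ms(frame_rate_hz, duty_cycle)

# At 90 Hz with a 20% duty cycle the off window is about 8.9 ms, so the
# 4 ms small-angle move of the medium fast element fits, while a 20 ms
# large-angle move does not and would need a blanked frame.
print(off_window_ms(90.0, 0.20))    # ~8.9
print(move_fits(4.0, 90.0, 0.20))   # True
print(move_fits(20.0, 90.0, 0.20))  # False
```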

[0039] The below tables should be considered exemplary configurations. One of skill in the art would understand that these aspects may be varied without departing from the scope of the invention.

[0001] Figure 1D illustrates another embodiment of the steerable positioning element 111. In one embodiment, the system 111 includes display element 174 supported by flexible arms 170 and support structure base 160.

[0002] The display element 174 may pivot along two axes. The pivoting of the display element 174 is controlled by the piezoelectric elements 172. In one embodiment, the range of motion may be +/- 18 degrees along both the X and Y axes. The drivers 164 drive the piezoelectric elements 172 to control motion.

[0003] Microcontroller 176 receives the control data from the system and controls the drivers to move the display element 174. Position sensor 168 is used to verify the actual position of the display element 174. In one embodiment, the calculation comparing the actual position to the instructed position occurs on a processor, as will be described below.

[0004] The display element 174 may be a mirror, lens, prism, holographic optical element (HOE), liquid crystal polymer, adjustable mirror, tunable prism, acousto-optical modulator, adjustable display panel, a curved mirror, a diffractive element, a Fresnel reflector, and/or another element utilized in directing light for a steerable display. In one embodiment, the display element 174 may be a Fresnel reflector, diffractive element, surface relief grating, light guide, waveguide, or volume hologram.

[0005] In one embodiment, piezoelectric elements 172 are actuators to move the display element 174. Alternatively, the piezoelectric elements 172 may be replaced by magnetic and/or inductive elements, nanomotors, electrostatic elements, or other devices which enable the movement of the display element 174 with the precision and speed needed for a display system.

[0006] Figures 1E and 1F are a perspective view and a cross-section of another embodiment of a steerable positioning element. The display element 180 is supported in position by a plurality of positioning columns 185. The positioning columns 185 enable the movement of the display element 180. The positioning columns 185 are supported by base structure 182. Although not shown, this embodiment also includes a microcontroller and position sensor.

[0007] The cross-section in Figure 1F shows the elements of the positioning columns 185 and the central support 188 of the embodiment of Figure 1E. In one embodiment, the system includes two or more positioning columns 192, and central support 188. The central support 188 in one embodiment is positioned in the center of the display element 180 and provides a stable point around which the display element tilts. Each positioning column 192 in one embodiment includes an actuator 198, a moving support structure 196, and a tilt top 194.

[0008] In one embodiment, the actuator 198 is a piezoelectric element which moves the moving support structure 196 up and down. Alternatively, the actuator 198 may be a magnetic and/or inductive element, nanomotor, electrostatic element, or other actuator mechanism which enables the movement of the moving support structure 196 with the precision and speed needed for a display system.

[0009] The moving support structure 196 moves up and down and has a tilt top 194 attached. The tilt top 194 in one embodiment is round or has a rounded top which fits into a notch 190 in the display element 180. In one embodiment, the connection between the moving support structure 196 and the tilt top 194 is magnetic.

[0010] The tilt top 194 enables the display element 180 to be tilted by moving up and down. Because the tilt top 194 is smooth and fits into the notch 190, the tilt top maintains contact with the display element 180.

[0011] In one embodiment, the tilt top 194 is a freely rotating sphere, coupled to the moving support structure 196 and the notch 190 via magnetic force. In this way, the system can utilize an actuator with fast up-down motion capabilities to provide a range of motion to the display element 180. The range of motion and capabilities of the display element are discussed below with respect to Tables 5 and 6.

[0040] Tables 1 and 2 illustrate exemplary optical, physical, and other characteristics for a first configuration of the steerable mirror.

TABLE 1:


TABLE 2:


[0041] Tables 3 and 4 illustrate exemplary optical, physical, and other characteristics for a second configuration, associated with the steerable positioning element of Figure 1A.

TABLE 3:


TABLE 3: Continued


TABLE 4:


[0042] Tables 5 and 6 illustrate exemplary optical, physical, and other characteristics for a third configuration, associated with the steerable positioning element of Figures 1E and 1F.

TABLE 5:


TABLE 5: Continued


TABLE 6:


[0043] Note that the above tables describe mechanical, optical, and physical characteristics for a set of embodiments using various configurations of a steerable display element with a mirror as the positioning element. One of skill in the art would understand the modifications which may be made to the above ranges for a different positioning element.

[0044] Figure 2 illustrates one embodiment of the exemplary optical system 210, 280 and associated processing system 238. In one embodiment, the processing system may be implemented in a computer system including a processor. In one embodiment, the processing system 238 may be part of the display system. In another embodiment, the processing system 238 may be remote. In one embodiment, the optical system 210, 280 may be implemented in a wearable system, such as a head mounted display. The steerable display image is presented to the user’s eye through a right eye steerable display 220 and left eye steerable display 230, which direct the steerable display. In one embodiment, the steerable displays 220, 230 direct the steerable display image primarily toward the center of the field of view of the user’s eye. In another embodiment, the image may be directed to a different location, as will be described below. The steerable display image is a high resolution image, in one embodiment. In one embodiment, the steerable display image is a variable resolution image. In one embodiment, the variable resolution corresponds to the change in the maximum resolution perceived by the user’s eye, which drops off with distance from the center.

[0045] The image for the right eye is created using a first display element 222. In one embodiment, the display element is a digital micromirror device (DMD). In one embodiment, the display element 222 is a scanning micromirror device. In one embodiment, the display element 222 is a scanning fiber device. In one embodiment, the display element is an organic light-emitting diode (OLED). In one embodiment, the display element 222 is a liquid crystal on silicon (LCOS) panel. In one embodiment, the display element 222 is a liquid crystal display (LCD) panel. In one embodiment, the display element 222 is a micro-LED or micro light emitting diode (µLED) panel. In one embodiment, the display element is a scanned laser system. In one embodiment, the system is a hybrid system with an off axis holographic optical element (HOE). In one embodiment, the system includes a waveguide. In one embodiment, the waveguide is a multilayer waveguide. In one embodiment, the display element may include a combination of such elements. Figure 3 below discusses the display elements in more detail.

[0046] In one embodiment, the first display element 222 is located in a near-eye device such as glasses or goggles.

[0047] The focus and field of view for the steerable display are set using intermediate optical elements 224. The intermediate optical elements 224 may include, but are not limited to, lenses, mirrors, and diffractive optical elements. In one embodiment, the focus of the virtual image is set to infinity. In another embodiment, the focus of the virtual image is set closer than infinity. In one embodiment, the focus of the virtual image can be changed. In one embodiment, the virtual image can have two or more focal distances perceived simultaneously.

[0048] In one embodiment, the steerable display image is directed primarily toward the center of the field of view of the user’s eye. In one embodiment, the field of view (FOV) of the steerable display image is greater than 1 degree. In one embodiment, the FOV of the steerable display image is between 1 degree and 20 degrees. In one embodiment, the steerable display image may be larger than 5 degrees to address inaccuracies in eye tracking, to provide the region needed to successfully blend such that the user cannot perceive the blending, and to account for the time it takes to reposition the steerable display for the various types of eye movements.

[0049] In one embodiment, the system further includes a lower resolution field display image, which has a field of view of 20-220 degrees.

[0050] In one embodiment, the steerable display image is projected directly onto the user’s eye using a set of one or more totally or partially transparent positioning elements 226. In one embodiment, the positioning elements 226 include a steerable mirror, such as the steerable positioning element shown in Figure 1A. In one embodiment, the positioning elements 226 include a curved mirror. In one embodiment, the positioning elements 226 include a Fresnel reflector. In one embodiment, the positioning elements 226 include a diffractive element. In one embodiment, the diffractive element is a surface relief grating. In one embodiment, the diffractive element is a volume hologram. In one embodiment, the display 220 may include a focal adjustor 223, which enables the display to show image elements at a plurality of focal distances in the same frame. In one embodiment, the focal adjustor 223 may be an optical path length extender, as described in U.S. Patent Application No. 15/236,101 filed on 8/12/2016.

[0051] A similar set of elements is present for the left eye steerable display 230. In one embodiment, the right eye steerable display 220 and the left eye steerable display 230 are matched. In another embodiment, they may include different elements.

[0052] In one embodiment, an eye tracker 240 tracks the gaze vector of the user, e.g. where the eye is looking. In one embodiment, the eye tracking system is a camera-based eye tracking system 240. In one embodiment, the camera-based eye tracking system 240 includes a holographic optical element. In one embodiment, eye tracking system 240 is an infrared scanning laser with a receiving sensor. In one embodiment, the infrared scanning laser eye-tracking system 240 includes a holographic optical element. In one embodiment, eye tracking system 240 is an optical flow sensor. Other eye tracking mechanisms may be used. Position calculator 245 determines a center of the user’s field of view based on data from the eye tracking system 240.

[0053] In one embodiment, the adjustable positioning elements 226, 236 are used to adjust the right and left eye steerable displays 220, 230 to position the image to be directed primarily toward the center of the field of view of the user’s eye. In one embodiment, the adjustable position elements 226, 236 are used to adjust the right and left eye steerable displays 220, 230 to position the eye box or exit pupil toward the center of the field of view of the user’s eye. In one embodiment, the direction of the image is adjusted by changing the angle of a mirror, one of the position elements 226, 236. In one embodiment, the angle of the mirror is changed by using electromagnetic forces. In one embodiment, the angle of the mirror is changed by using electrostatic forces. In one embodiment, the angle of the mirror is changed by using piezoelectric forces, as illustrated in Figure 1A. In one embodiment, the adjustable element is the image source, or display element 222, 232, which is moved to position the image. In one embodiment, the image is positioned to be directed to the center of the field of view of the user’s eye. In another embodiment, another position element 226, 236 may be changed, such as a steering element 226, 236.

[0054] A field display 280 communicates with the processing system 238 via communication logics 270, 290. In one embodiment, there may be multiple displays. Here, two field displays are indicated, field display 285 and peripheral display 288. Additional levels of resolution may also be shown. In one embodiment, the field display 280 may include a single field display 285 viewed by both eyes of the user, or one field display per eye. In one embodiment, the field display 280 may have variable resolution. In one embodiment, the resolution drops off toward the outside of the display 280, corresponding to the drop in the maximum perceived resolution by the eye.

[0055] In one embodiment, when the field display 280 is a separate system, sync signal generator 292 is used to synchronize the display of the independent steerable display 210 with the display of the field display 280. In one embodiment, the sync signal generator 292 is used to synchronize the adjustable mirror, or other positioning element of the steerable display with the field display. This results in the synchronization of the displays. In one embodiment, field display 280 includes blender system 294 to blend the edges of the steerable display image with the field display image to ensure that the transition is smooth.

[0056] In one embodiment, the lower resolution field display image is presented to the user with a fully or partially transparent optical system. In one embodiment, this partially transparent system includes a waveguide optical system. In one embodiment, this partially transparent system includes a partial mirror which may be flat or have optical power. In one embodiment, this partially transparent system includes a diffractive optical element. In one embodiment, this image is presented to the user through a direct view optical system. In one embodiment, this partially transparent system includes inclusions to reflect or scatter light.

[0057] In one embodiment of the field display 280, an additional display sub-system is used to display images in the region of monovision peripheral display 288. In one embodiment, this sub-system is an LED (light emitting diode) array. In one embodiment, this sub-system is an OLED (organic LED) array. In one embodiment, this display sub-system uses a scanned laser. In one embodiment, this sub-system uses an LCD (liquid crystal display) panel. In one embodiment, the field display 280 is an LCOS (liquid crystal on silicon) display. In one embodiment, the field display is a DLP (digital light processing) display. In one embodiment, this sub-system has no intermediate optical elements to manipulate the FOV or focus of the image. In one embodiment, this sub-system has intermediate optical elements. In one embodiment, these intermediate optical elements include a micro-lens array.

[0058] The image data displayed by the steerable display 210 and field display 280 are generated by processing system 238. In one embodiment, the system includes an eye tracker 240. In one embodiment, an eye tracker 240 tracks the gaze vector of the user, e.g. where the eye is looking. In one embodiment, the eye tracking system is a camera-based eye tracking system 240. Alternatively, eye tracking system 240 may be infrared laser based. Foveal position calculator 245 determines a center of the user’s field of view based on data from the eye tracking system 240. In one embodiment, the foveal position calculator 245 additionally uses data from a slippage detection system. Slippage detection in one embodiment detects movement of the headset/goggles on the user’s head, and detects slippage or other shifting which displaces the real location of the user’s eye from the calculated location. In one embodiment, the foveal position calculator 245 may compensate for such slippage by adjusting the calculated foveal location, used by the system to position the steerable display.
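
For illustration, the sketch below applies a slippage correction of the kind described: the foveal estimate from the eye tracker is shifted by the offset reported by the slippage detector before the steerable display is positioned. The two-dimensional angular representation and all names are assumptions.

```python
# Hypothetical sketch of slippage compensation in the foveal position
# calculation: subtract the detected headset offset from the tracked
# gaze center (both in degrees, in display coordinates).

def compensated_fovea(gaze_center: tuple, slippage_offset: tuple) -> tuple:
    """Adjust the eye tracker's foveal estimate for headset slippage."""
    gx, gy = gaze_center
    sx, sy = slippage_offset
    return (gx - sx, gy - sy)

# If the headset slipped 0.4 degrees downward, the calculated foveal
# location is corrected before positioning the steerable display.
print(compensated_fovea((2.0, -1.0), (0.0, -0.4)))  # (2.0, -0.6)
```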

[0059] The processing system 238 in one embodiment further includes foveal position validator 247 which validates the positioning of the position elements 226, 236, to ensure that the displays 220, 230 are properly positioned. In one embodiment, this includes re-evaluating the steerable display location with respect to the center of the field of view of the user’s eye, in light of the movement of the steerable display. In one embodiment, the foveal position validator 247 provides feedback to verify that the positioning element has reached its target location, using a sensing mechanism. The sensing mechanism may be a camera, in one embodiment. The sensing mechanism may be gearing in one embodiment. The sensing mechanism in position validator 247 may be a magnetic sensor. The sensing mechanism may be another type of sensor that can determine the position of the optical element. In one embodiment, if the actual position of the steerable display is not the target position, the foveal position validator 247 may alter the display to provide the correct image data. This is described in more detail below.

[0060] In one embodiment, eye movement classifier 260 can be used to predict where the user’s gaze vector will move. This data may be used by predictive positioner 265 to move the steerable display 220, 230 based on the next position of the user’s gaze vector. In one embodiment, smart positioner 267 may utilize user data such as eye movement classification and eye tracking to predictively position the displays 220, 230. In one embodiment, smart positioner 267 may additionally use data about upcoming data in the frames to be displayed to identify an optimal positioning for the displays 220, 230. In one embodiment, smart positioner 267 may position the display 220, 230 at a position not indicated by the gaze vector, for example if the displayed frame data has only a small amount of relevant data (e.g. a butterfly illuminated on an otherwise dark screen) or the intention of the frame is to cause the viewer to look in a particular position.

[0061] The processing system 238 may further include a cut-out logic 250. Cut-out logic 250 defines the location of the steerable display 220, 230 and provides the display information with the cut-out to the associated field display 280. The field display 280 renders this data to generate the lower resolution field display image including the cut-out of the corresponding portion of the image in the field display. This ensures that there isn’t interference between the steerable display image and the field image. In one embodiment, when there is a cut-out, blender logic 255 blends the edges of the cut-out with the steerable image to ensure that the transition is smooth. In another embodiment, the steerable display may be used to display a sprite, a brighter element overlaid over the lower resolution field image. In such a case, neither the cut-out logic 250 nor blender logic 255 is necessary. In one embodiment, the cut-out logic 250 and blender logic 255 may be selectively activated as needed.
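
By way of illustration, a minimal blending sketch follows: inside a band around the cut-out border, the steerable image is cross-faded into the field image. The linear ramp and the 0.5 degree band width are assumptions; the text above does not specify the blend profile.

```python
# Sketch of edge blending at the cut-out border (per blender logic 255),
# assuming a linear cross-fade over a 0.5 degree band.

def blend_weight(dist_from_edge_deg: float, band_deg: float = 0.5) -> float:
    """Weight of the steerable image: 1 well inside the cut-out, 0 outside,
    ramping linearly across the blend band."""
    if dist_from_edge_deg >= band_deg:
        return 1.0
    if dist_from_edge_deg <= 0.0:
        return 0.0
    return dist_from_edge_deg / band_deg

def blend_pixel(steerable: float, field: float, dist_from_edge_deg: float) -> float:
    """Cross-fade one pixel value between the two displays."""
    w = blend_weight(dist_from_edge_deg)
    return w * steerable + (1.0 - w) * field

print(blend_pixel(1.0, 0.5, 0.25))  # 0.75: halfway across the blend band
```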

[0062] In one embodiment, the system may synchronize the steerable display 210 with an independent field display 280. In this case, in one embodiment, synchronization logic 272 synchronizes the displays. In one embodiment, the independent field display 280 is synchronized with the adjustable mirror, or other positioning element of the steerable display 210. This results in the synchronization of the displays. The field display 280 may receive positioning data. In one embodiment, there may not be a cutout in this case.

[0063] In one embodiment, the processing system 238 may include an optical distortion system 275 for the steerable display 210 with distortion that increases from the center to the edge of the image. This intentional distortion would cause the pixels to increase in perceived size moving from the center of the image to the edge. This change in perceived resolution would reduce the amount of processing required, as fewer pixels would be needed to cover the same angular area of the steerable display image. The optical distortion may help with the blending between the steerable display 210 and the field display 280. In another embodiment, the steerable display 210 including the optical distortion system 275 could be used without a field display. It also provides for an easier optical design, and saves processing on the blending.

[0064] In one embodiment, the variable resolution highly distorted image has a large ratio between center and edge. The total FOV of this display would be large (up to 180 degrees).

[0065] In one embodiment, roll-off logic 277 provides a roll-off at the edges of the display. Roll-off in one embodiment may include resolution roll-off (decreasing resolution toward the edges of the display area). In one embodiment, this may be implemented with magnification by the optical distortion system 275. Roll-off includes in one embodiment brightness and/or contrast roll-off (decreasing brightness and/or contrast toward the edges). Such roll-off is designed to reduce the abruptness of the edge of the display. In one embodiment, the roll-off may be designed to roll off into “nothing,” that is, gradually decrease from the full brightness/contrast to gray or black or environmental colors. In one embodiment, roll-off logic 277 may be used by the steerable display 210 when there is no associated field display. In one embodiment, the roll-off logic 297 may be part of the field display 280, when there is a field display in the system.
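
As an illustration of a brightness roll-off of this kind, the sketch below holds full brightness in the interior and decays smoothly to black at the edge. The half-cosine profile and the start/end angles are assumptions; the text does not specify the roll-off curve.

```python
# Hypothetical brightness roll-off: full brightness inside start_deg,
# rolled off "into nothing" by end_deg, with a smooth half-cosine decay.

import math

def rolloff_gain(angle_deg: float, start_deg: float = 18.0,
                 end_deg: float = 20.0) -> float:
    """Brightness multiplier as a function of angle from the display center."""
    if angle_deg <= start_deg:
        return 1.0
    if angle_deg >= end_deg:
        return 0.0
    t = (angle_deg - start_deg) / (end_deg - start_deg)
    return 0.5 * (1.0 + math.cos(math.pi * t))

print(rolloff_gain(10.0))  # 1.0: interior, full brightness
print(rolloff_gain(19.0))  # ~0.5: halfway through the roll-off band
print(rolloff_gain(21.0))  # 0.0: past the edge
```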

[0066] Figure 3 illustrates one embodiment of the position elements 300. The position elements in one embodiment include a separate position element for the right eye and the left eye of the user. In one embodiment, rather than having a steerable element 310 for each eye, the system may utilize two or more steerable elements 310 for each eye. In one embodiment, a two element system may include separate steerable elements 310 for the X-axis movement and the Y-axis movement for each eye. In one embodiment, two or more steerable elements 310 may be used, with each steerable element 310 having one or more axes of steerability.

[0067] The steerable element 310 may comprise one or more of a mirror, prism, Fresnel lens, or other element which is positioned so that light can be directed to a particular location. In one embodiment, the steerable element 310 is a curved mirror.

[0068] The X-axis attachment 320 provides the physical moving element for rotating around the X-axis, while the Y-axis attachment 350 provides the moving element for pivoting around the Y-axis. In one embodiment, the moving elements are pivots 150 and gimbals 155.

[0069] The X-axis controller 330 and Y-axis controller 360 control the movement, while the X-axis actuator 340 and Y-axis actuator 370 provide the physical movement. Piezoelectric elements in one embodiment serve as the actuators. The data for the movement comes from microprocessor 390. In one embodiment, microprocessor 390 is part of the main control circuitry of the steerable display.

[0070] In one embodiment, the system also includes a position validator 380 which verifies the actual position of the steerable element 310 along the X and Y axes. In one embodiment, validator 380 comprises a magnetic sensor, which senses the movement of magnets associated with the movable element. In another embodiment, the validator 380 may be coupled to the actuators 340, 370 or attachment 320, 350, and determine the position of the steerable element 310 based on the physical position of the elements supporting the steerable element 310. Other methods of determining the actual position of the steerable element 310 may be used.

[0071] In one embodiment, the validator 380 provides data to the microprocessor 390. The microprocessor may compare the data from the controllers 330, 360 with the data from the position validator 380. This may be used for recalibration, as well as to identify issues with the positioning of steerable element 310. In one embodiment, to enable position validator 380, the bottom of the steerable element 310 has markings which are used by position validator 380 to determine the actual position of the steerable element 310.
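
For illustration, a sketch of this comparison follows: the commanded angles from the controllers are checked against the measured angles from the validator, and a recalibration is flagged when the residual exceeds a tolerance. The tolerance value and interfaces are assumptions.

```python
# Hypothetical sketch of the commanded-versus-measured check in [0071].

TOLERANCE_DEG = 0.05  # assumed acceptable positioning error

def position_ok(commanded_deg: tuple, measured_deg: tuple) -> bool:
    """True if the steerable element is where it was instructed to go."""
    error_x = abs(commanded_deg[0] - measured_deg[0])
    error_y = abs(commanded_deg[1] - measured_deg[1])
    return error_x <= TOLERANCE_DEG and error_y <= TOLERANCE_DEG

# A 0.1 degree Y error exceeds the assumed tolerance, so this would
# trigger recalibration or be reported as a positioning issue.
print(position_ok((3.0, -2.0), (3.02, -2.1)))  # False
```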

[0072] Figure 4C illustrates one embodiment of the movement of the display over time. In one embodiment, the movement may correspond to the location of the user’s fovea as the user’s eye moves. At any time instance, there is a small zone to which the image is displayed. The location of the 5 degree display of high resolution (in one embodiment) is focused on the center of the user’s field of view. In one embodiment, a low resolution field image provides a large field of view. But because the relative resolution of the eye outside the foveal area is lower, the user perceives this combination image, including the small high resolution steerable image and the larger low resolution field image, as high resolution across the large field of view.

[0073] Figure 4A is a flowchart of one embodiment of utilizing the steerable display. The process starts at block 410. In one embodiment, prior to the start of this process the display system is fitted to the user. This initial set-up includes determining the interpupillary distance (IPD) and any prescription needed, to ensure that the “baseline” display for the user is accurate.

[0074] At block 415, the user’s eyes are tracked. In one embodiment, an IR camera is used for tracking eyes. In one embodiment, eye tracking identifies the gaze vector of the user, e.g. where the user is focused.

[0075] At block 420, the system calculates the gaze vector of the user. The eye tracking may identify left and right eye gaze vector/angle, and gaze center (derived from the L/R eye gaze vectors). In one embodiment, the eye tracking may determine the location (X, Y, Z) and orientation (roll, pitch, yaw) of the left and right eyes relative to a baseline reference frame. The baseline reference frame is, in one embodiment, established when the display is initially fitted to the user and the user’s interpupillary distance, diopters, and other relevant data are established.

[0076] At block 420, the location of the fovea is determined based on the gaze vector data. In one embodiment, the fovea location includes coordinates (X, Y, Z) and orientation (roll, pitch, yaw) for each eye.
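
As an illustration of deriving a gaze center from the left and right gaze vectors, the sketch below simply averages and renormalizes the two per-eye direction vectors. Averaging is one plausible combination; the text does not specify the method.

```python
# Hypothetical sketch: combine per-eye gaze directions into one gaze-center
# direction by normalized averaging of 3D direction vectors.

import math

def gaze_center(left: tuple, right: tuple) -> tuple:
    avg = tuple((l + r) / 2.0 for l, r in zip(left, right))
    norm = math.sqrt(sum(c * c for c in avg))
    return tuple(c / norm for c in avg)

# Slightly converged eyes looking straight ahead.
print(gaze_center((0.05, 0.0, 1.0), (-0.05, 0.0, 1.0)))  # (0.0, 0.0, 1.0)
```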

[0077] At block 425, the process determines whether the steerable display should be repositioned. This is based on comparing the current position of the steerable display with the user’s gaze vector or the intended position of the image. If they are misaligned, the system determines that the steerable display should be repositioned. If so, at block 430, the display is repositioned. The repositioning of the display is designed so the movement of the steerable display is not perceived by the user. In one embodiment, this may be accomplished by using a mirror that is fast enough to complete the movement in a way that the user cannot perceive it. In one embodiment, this may be accomplished by timing the movement to the user’s blink or eye movement. In one embodiment, if the intended display is moved more than a particular distance, the display is blanked during the move. This ensures that the user does not perceive the movement. In one embodiment, the particular distance is more than 0.5 degrees. In one embodiment, the intended display is not blanked if the movement is occurring while the user is blinking. Note that although the term “repositioning” is used, this corresponds to the movement of the positioning elements, to adjust the position of the display.
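
The decision just described can be sketched as follows, with the 0.5 degree blanking threshold taken from the text above; the 0.1 degree misalignment tolerance is an assumption for illustration.

```python
# Hypothetical sketch of the repositioning decision at blocks 425-430:
# move when misaligned; blank during moves larger than 0.5 degrees unless
# the user is mid-blink.

ALIGN_TOLERANCE_DEG = 0.1  # assumed; not specified in the text
BLANK_THRESHOLD_DEG = 0.5  # per the embodiment above

def plan_reposition(current_deg, target_deg, user_blinking: bool):
    dx = target_deg[0] - current_deg[0]
    dy = target_deg[1] - current_deg[1]
    distance = (dx * dx + dy * dy) ** 0.5
    needs_move = distance > ALIGN_TOLERANCE_DEG
    blank = needs_move and distance > BLANK_THRESHOLD_DEG and not user_blinking
    return needs_move, blank

print(plan_reposition((0.0, 0.0), (1.2, 0.0), user_blinking=False))  # (True, True)
print(plan_reposition((0.0, 0.0), (1.2, 0.0), user_blinking=True))   # (True, False)
```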

[0078] The process then continues to block 435, whether or not the display was repositioned.

[0079] At block 435, optionally the system cuts out the portion of the field display image that would be positioned in the same location as the steerable display image. This prevents the field display from interfering with the steerable display. The cut-out, in one embodiment, is performed at the rendering engine. In another embodiment, the image may be a sprite or other bright image element which does not need a cut-out to be clear. In that instance, this block may be skipped. In one embodiment, the cut-out is skipped if the user eye tracking indicates that the user’s gaze has moved substantially from the baseline reference. The baseline reference is the user’s default gaze position, from which the movement of the gaze is tracked. A substantial movement from the baseline reference means that the system cannot determine the user’s correct gaze position. In this instance, in one embodiment, the steerable display image may be dropped, or the steerable display may be turned off momentarily. In one embodiment, this may be done by blanking the steerable display so that it is not seen by the user. In various embodiments, this may be done by disabling a backlight, disabling a laser or LED illumination source, blanking the pixels, or through another method.

[0080] At block 440, in one embodiment, the edges between the steerable display image and the field image are blended. This ensures a smooth and imperceptible transition between the field image and the steerable display image. At block 445, the hybrid image is displayed to the user, incorporating the steerable display and the field display. The process then returns to block 410 to continue tracking and displaying. Note that while the description talks about a steerable display image and a field image, the images contemplated include the sequential images of video. Note also that while this description utilizes a combination of the steerable display and a field display in some embodiments, the steerable display may be used without the presence of a field display. In those instances, the process may include only blocks 415 through 430.

[0081] Figure 4B illustrates one embodiment of the corrective actions which may be taken when the display position validation indicates that the actual location of the steerable display does not match the intended location. The process starts at block 450.

[0082] At block 452, the steerable display positioning is initiated. In one embodiment, this corresponds to block 430 of Figure 4A. Returning to Figure 4B, at block 454, the actual position of the steerable display is verified. In one embodiment, one or more sensors are used to determine the location and orientation of the steerable display. In one embodiment, the sensors may include cameras, mechanical elements detecting the position of the adjustable mirror or other positioning element, etc. This is done, in one embodiment, by the position validator 380 of Figure 3.

[0083] At block 456 the process determines whether the steerable display is correctly positioned. Correct positioning has the steerable display in the calculated location, to display the image in the appropriate location for the user. If the steerable display is correctly positioned, at block 464 the image is displayed. In one embodiment, this includes displaying a hybrid image including the steerable display image in the calculated location and the associated field display image, as discussed above with respect to Figure 4A. The process then ends at block 475.

[0084] If, at block 456, the process determines that the steerable display was not correctly positioned, the process continues to block 458.

[0085] At block 458, the process determines whether there is enough time for the steerable display to be repositioned. This determination is based on the distance that needs to be moved, the speed of movement, and the time until the next image will be sent by the processing system.

[0086] In one embodiment, it also depends on the eye movement of the user. In one embodiment, the system preferentially moves the steerable display while the user is blinking, when no image is perceived. In one embodiment, the repositioning occurs within a blanking period of the display. For example, a movement of just one degree along one coordinate takes less time than moving the steerable display significantly and in three dimensions. If there is enough time, the process returns to block 452 to reposition the steerable display. Otherwise, the process continues to block 460.
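
For illustration, the time-budget check can be sketched as below; the movement speed, settling allowance, and frame budget are assumed example values.

```python
# Hypothetical sketch of the "enough time?" decision at block 458: the
# retry is allowed only if the remaining move plus settling fits before
# the next frame (or within a blink/blanking period).

def can_reposition(error_deg: float, speed_deg_per_ms: float,
                   time_to_next_frame_ms: float, settle_ms: float = 1.0) -> bool:
    """True if the move plus settling fits in the remaining frame budget."""
    move_time_ms = error_deg / speed_deg_per_ms
    return move_time_ms + settle_ms <= time_to_next_frame_ms

# A 1 degree correction at 0.5 deg/ms fits a 5 ms budget; a 10 degree
# correction does not, so the process falls through to block 460.
print(can_reposition(1.0, 0.5, 5.0))   # True
print(can_reposition(10.0, 0.5, 5.0))  # False
```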

[0087] At block 460, the process determines whether the actual position of the steerable display is within range of the intended position. In one embodiment, “within range” in this context means that the system is capable of adjusting the display for the difference. If it is within range, the process continues to block 462.

[0088] At block 462, the data processed for display on the steerable display is adjusted for rendering in the actual position. The adjusted image is then displayed at block 464. For example, in one embodiment, the original calculated image may be rendered in the wrong location if the position difference is very small, without causing visual artifacts. In another embodiment, the image may be adjusted to render appropriately at the actual location. For example, the image may be cropped, brightened, distorted, contrast adjusted, chromatic coordinate (white point) adjusted, and laterally shifted to account for the location difference.

[0089] In one embodiment, for a hybrid display, the radial location of the edge blending may be shifted or changed. In one embodiment, the system may over-render, e.g. render 5.5 degrees of visual image for a 5-degree steerable display, enabling a shift of 0.5 degrees without needing re-rendering.
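
The over-rendering adjustment can be illustrated as below. With 5.5 degrees rendered for a 5-degree display, the extra margin (0.25 degrees per side when split evenly, 0.5 degrees in total) absorbs small position errors by shifting the crop window instead of re-rendering. The uniform pixels-per-degree scale is an assumption.

```python
# Hypothetical sketch of absorbing a small positioning error by shifting
# the crop window inside an over-rendered buffer (per [0089]).

def crop_offset_px(error_deg: float, rendered_fov_deg: float = 5.5,
                   display_fov_deg: float = 5.0, px_per_deg: float = 60.0):
    """Crop shift in pixels, or None if the error exceeds the margin."""
    # Margin per side, assuming the extra rendered area is split evenly.
    margin_deg = (rendered_fov_deg - display_fov_deg) / 2.0
    if abs(error_deg) > margin_deg:
        return None  # out of range: fall back to the field display path
    return error_deg * px_per_deg

print(crop_offset_px(0.2))  # 12.0 pixel shift, no re-render needed
print(crop_offset_px(0.4))  # None: error exceeds the over-render margin
```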

[0090] If the steerable display is not within range, at block 466, in one embodiment the frame data is sent to the field display for rendering. At block 468, in one embodiment the steerable display image is not displayed. In one embodiment, the frame is dropped. In another embodiment, the steerable display is blanked momentarily. In one embodiment, the steerable display is not considered within range if the user eye tracking indicates that the user’s gaze has moved too far outside of the baseline reference.

[0091] At block 470, in one embodiment, the field display image is rendered without the image cut-out and without the display or rendering of the steerable display image. At block 472, the field display image is displayed. The process then ends.

[0092] Figure 5 is a flowchart of one embodiment of utilizing the steerable display, where positioning is not dependent on the user’s gaze vector. This may be applicable, for example, when the display is a heads-up type of display, or a sprite, or the only bright element on an otherwise dark display. Other reasons to provide positioning not based on the user’s gaze vector may be found. In one embodiment, this configuration may be combined with the configuration of Figure 4A discussed above, in which the positioning is based on the gaze vector. That is, the same system may vary between being gaze-vector based and not.

[0093] The process starts at block 510. In one embodiment, prior to the start of this process the display system is fitted to the user.

[0094] At block 515, the position for the steerable display is determined. This determination may be made based on external data (for example in a virtual reality display), or other determinations. In one embodiment, this decision may be made based on processor data.

[0095] At block 520, the process determines the current position of the steerable display.

[0096] At block 525, the process determines whether the steerable display should be repositioned. This is based on comparing the current position of the steerable display with the intended position of the image. If they are misaligned, the system determines that the steerable display should be repositioned. If so, at block 530, a display repositioning is triggered. The repositioning of the display is designed so the movement of the steerable display is not perceived by the user, in one embodiment. In one embodiment, this may be accomplished by using a mirror that is fast enough to complete the movement in a way that the user cannot perceive it, as described above. In one embodiment, this may be accomplished by timing the movement to the user’s blink or eye movement. In one embodiment, if the intended display is moved more than a particular distance, the display is blanked during the move. This ensures that the user does not perceive the movement. In one embodiment, the particular distance is more than 0.5 degrees. In one embodiment, the intended display is not blanked if the movement is occurring while the user is blinking. Note that although the term “repositioning” is used, this corresponds to the movement of the positioning elements, to adjust the position of the display.

[0097] The process then continues to block 535, whether or not the display was repositioned.

[0098] At block 535, optionally the system cuts out the portion of the field display image that would be positioned in the same location as the steerable display image. This prevents the field display from interfering with the steerable display.

[0099] The cut-out, in one embodiment, is performed at the rendering engine. In another embodiment, the image may be a sprite or other bright image element which does not need a cut-out to be clear. In that instance, this block may be skipped.

[00100] At block 540, in one embodiment, the system determines whether the edges between the steerable display image and a field image should be blended. This ensures a smooth and imperceptible transition between the field image and the steerable display image. This may not be relevant when there is no field display, or when the steerable display is a sprite or other overlay element. If the system determines that the edges should be blended, at block 545, the edges are blended.

[00101] At block 550, the image from the steerable display is displayed to the user, optionally incorporating data from the field display. The process then returns to block 510 to continue tracking and displaying. Note that while the description talks about a steerable display image and a field image, the images contemplated include the sequential images of video. Note also that while this description utilizes a combination of the steerable display and a field display in some embodiments, the steerable display may be used without the presence of a field display.

[00102] Figure 6 is a flowchart of one embodiment of controlling the use of the steerable element. In one embodiment, the system determines the type of eye movement, saccade or smooth pursuit. For smooth pursuit, in one embodiment, the system moves one frame at a time, and matches the eye movement so that the steerable display may be on during the movement. In one embodiment, this can be done for movements of up to three degrees per frame. For eye movement faster than that, in one embodiment, the steerable display may be blanked. For a saccade movement, in one embodiment the system blanks the steerable display temporarily for movement, to avoid visual aberrations. The system is designed to have a settling time that is faster than the user’s eye. Thus, the display is designed to be active again by the time the eye has settled after a saccade movement, and is back to full resolution. Figure 6 illustrates one embodiment of moving the steerable display for a saccade or other fast movement.
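
The policy in this paragraph can be sketched as a simple per-frame rule, with the three-degrees-per-frame smooth-pursuit limit taken from the text; the velocity-threshold classification itself is an assumption for illustration.

```python
# Hypothetical sketch of the movement policy in [00102]: track smooth
# pursuit with the display on, blank for saccades or fast movement.

SMOOTH_PURSUIT_LIMIT_DEG = 3.0  # per-frame limit cited above

def movement_policy(delta_deg_per_frame: float, is_saccade: bool) -> str:
    if is_saccade or delta_deg_per_frame > SMOOTH_PURSUIT_LIMIT_DEG:
        return "blank-and-move"  # hide the move behind blanking/saccadic masking
    return "track-while-on"     # follow the eye one frame at a time

print(movement_policy(1.5, is_saccade=False))  # track-while-on
print(movement_policy(1.5, is_saccade=True))   # blank-and-move
print(movement_policy(5.0, is_saccade=False))  # blank-and-move
```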

[00103] The process starts at block 605. In one embodiment, this process runs whenever the steerable display is active. At block 610, the user’s gaze position is monitored for the steerable display. In one embodiment, the steerable display is directed to the user’s fovea.

[00104] At block 615, a new gaze position is determined. In one embodiment, the gaze position is identified using a camera directed at the user’s eye.

[00105] At block 620, the degree of movement needed for the steerable display to match the new gaze vector is identified.

[00106] At block 625, the time to move the steerable display to the new location is determined. In one embodiment, a look-up table is used. In one embodiment, the “gaze vector” determined may be a plurality of gaze vectors over time, as in a smooth pursuit eye movement.

[00107] At block 630, the steerable display is blanked, and the movement is started. In one embodiment, the movement is only started after the steerable display is blanked. The steerable display may be blanked in one embodiment by turning off a light source. In another embodiment the steerable display may be turned off by blanking the mirror. In another embodiment, the steerable display may be blanked by disabling a backlight or illumination. In another embodiment, the steerable display may be blanked by setting the pixels to black.

[00108] At block 635, the steerable display is moved. During this time, since the steerable display is blanked, in one embodiment, the field display is filled in to cover the full display area. In another embodiment, there may not be a field display in which case this does not apply.

[00109] At block 640, the process determines whether the time has elapsed to complete the calculated movement, in one embodiment. If not, the process continues to move at block 635.

[00110] If the time has elapsed, in one embodiment, the system provides a signal to activate the steerable display, at block 645. In another embodiment, the signal timing may be based on the movement data from the microprocessor and position verifier.

[00111] When the signal to activate the display is received at block 645, at block 650 the process verifies that the display has stopped moving and has settled. Settling means that the display is steady and is not vibrating as a result of the movement. In one embodiment, this is a closed loop determination made by the microprocessor in the display.

[00112] If the display has settled, at block 655 the steerable display is activated. In one embodiment, if there is a field display it may be cut out for the area in which the steerable display image is shown. The process then continues to block 610, to continue monitoring the gaze position of the user, and to determine a new gaze position. In this way, the steerable display is moved to match the user’s gaze, while providing no visual indicators of movement.
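
Blocks 630 through 655 can be summarized in one illustrative sequence, sketched below. The display and mirror objects, the polling loop, and the sleep intervals are assumptions; the settling check itself is described in the text as a closed-loop determination by the microprocessor.

```python
# Hypothetical end-to-end sketch of the blank / move / settle / activate
# sequence: the move starts only after blanking, and the display becomes
# visible again only once the mirror has stopped vibrating.

import time

def blank_move_activate(display, mirror, target_deg, move_time_s: float) -> None:
    display.blank()                  # block 630: blank first, then move
    mirror.move_to(target_deg)       # block 635: perform the movement
    time.sleep(move_time_s)          # block 640: wait out the calculated time
    while not mirror.has_settled():  # block 650: steady, not vibrating
        time.sleep(0.0005)
    display.activate()               # block 655: steerable display back on
```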

[00113] In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.